Apple to launch child safety feature in the UK that scans images for nudity

Leigh Mc Gowran, 21 April 2022 at 07:11, Silicon Republic


Apple has announced that new features for the detection of child sexual abuse material (CSAM) will soon be available on its devices in the UK, The Guardian reported.

The tech giant said the feature – called “communication safety in Messages” – is designed to warn children when they receive or send photos that contain nudity. This feature was launched in the US last December, according to 9to5Mac.

If the Apple device detects nudity, the image is blurred and the child is warned about its potential content and presented with options to message someone they trust for help.
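Apple has not published the underlying implementation, but the flow it describes – an on-device check followed by a blurred preview and a warning – can be illustrated with a short, hypothetical Swift sketch. The `imageLikelyContainsNudity` function below is a stand-in for Apple's private on-device model, and the blur radius is an arbitrary assumption; none of this is Apple's actual code.

```swift
import UIKit
import CoreImage
import CoreImage.CIFilterBuiltins

// Illustrative sketch only: Apple has not documented how Messages performs this check.
struct IncomingImageScreen {

    /// Hypothetical stand-in for Apple's on-device classifier (e.g. a bundled Core ML model).
    /// Always returns false in this sketch.
    func imageLikelyContainsNudity(_ image: UIImage) -> Bool {
        return false
    }

    /// If the check fires, return a heavily blurred preview so the child sees a warning
    /// rather than the photo itself; otherwise pass the image through unchanged.
    func preparePreview(for image: UIImage) -> UIImage {
        guard imageLikelyContainsNudity(image),
              let input = CIImage(image: image) else {
            return image // nothing flagged: show the original
        }

        let filter = CIFilter.gaussianBlur()
        filter.inputImage = input
        filter.radius = 40 // assumed value; strong enough that the content is unrecognisable

        let context = CIContext()
        guard let output = filter.outputImage,
              let cgImage = context.createCGImage(output, from: input.extent) else {
            return image
        }

        // The warning UI ("message someone you trust for help") would be shown alongside this preview.
        return UIImage(cgImage: cgImage)
    }
}
```

Consistent with Apple's statement quoted below, a design like this keeps the classification and blurring entirely on the device, so no detection result needs to leave the phone.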

An example of the feature when it detects an image with nudity. Image: Apple

This follows Apple’s initial proposals for CSAM detection tools last August, which were postponed following a backlash from critics.

While Apple attempted to assure its detractors that the measures would be privacy-preserving, concerns were raised as to how they could open a backdoor into widespread surveillance and monitoring of content.

Apple has made a number of changes to the feature following the response from critics. For example, the initial announcement said parents and law enforcement would automatically be notified. These alerts are no longer mentioned in the update.

The communication safety feature is also switched off by default and has to be turned on by parents.

“Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” Apple said in a statement. “The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”

CSAM-related safeguards have also been added to Apple apps such as Siri, Spotlight, and Safari Search. These apps will now intervene if a user searches for queries related to child exploitation.

“These interventions explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple said.

Siri will also help users who ask how to report CSAM content, by directing them to resources on how to file a report.


