Apple announces new iPhone features to detect child sexual abuse


Following a report that the company was working on a tool to scan iPhones for child abuse images, Apple posted a message providing more details on its child safety efforts. With the release of iOS 15, watchOS 8, and macOS Monterey later this year, the company says it will introduce a variety of child safety features spanning Messages, Photos, and Siri.

To begin with, the Messages app will include new notifications that warn children, as well as their parents, when they send or receive sexually explicit photos. When someone sends a child an inappropriate image, the app blurs it and displays several warnings. “It’s not your fault, but sensitive photos and videos can be used to hurt you,” one of the notifications says, according to a screenshot shared by Apple.

As an added precaution, the company says Messages can also notify parents if their child decides to go ahead and view a sensitive image. “Similar protections are available if a child attempts to send sexually explicit photos,” according to Apple. The company notes that the feature uses on-device machine learning to determine whether a photo is explicit, and that Apple itself does not have access to the messages. The feature will be available for accounts set up as families in iCloud.
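To make that flow concrete, here is a minimal sketch in Swift of how such an on-device check might be structured. Apple has not published an API for this feature, so every name here (SensitivityVerdict, classify, handleIncomingPhoto) is hypothetical; the only point it illustrates is that both the classification and the blur/notify decision happen entirely on the device.

```swift
import Foundation

// Hypothetical verdict from the on-device classifier; Apple has not
// published an API for this feature, so these names are illustrative.
struct SensitivityVerdict {
    let isSexuallyExplicit: Bool
}

// Stand-in for the on-device ML model (in practice, a model evaluated
// locally, so the photo never leaves the phone).
func classify(_ imageData: Data) -> SensitivityVerdict {
    SensitivityVerdict(isSexuallyExplicit: false) // placeholder result
}

// The decision flow described above: blur the image, warn the child,
// and notify parents only if the child opts to view it anyway and the
// account is enrolled as an iCloud family.
func handleIncomingPhoto(_ imageData: Data,
                         childOptsToView: Bool,
                         isFamilyAccount: Bool) {
    guard classify(imageData).isSexuallyExplicit else { return }
    print("Photo blurred; warning shown to child.")
    if childOptsToView && isFamilyAccount {
        print("Parents notified.")
    }
}
```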

Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says matches will be reported to the National Center for Missing and Exploited Children (NCMEC), which will in turn work with law enforcement agencies across the United States. “Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind,” the company claims.

Rather than scanning photos as they are uploaded to the cloud, the system performs matching on the device against a database of hashes of “known” images provided by NCMEC and other child safety organizations. Each hash acts as a sort of fingerprint for an image, so the device never needs to hold the images themselves.
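As a rough illustration of hash-based fingerprinting, the sketch below checks an image's hash against a set of known hashes. Note that Apple's actual system uses a perceptual hash (so visually identical images match even after resizing or recompression) together with a blinded database; the exact-match SHA-256 used here is a deliberate simplification, and the empty hash set is a placeholder for the database shipped to the device.

```swift
import Foundation
import CryptoKit

// Simplified fingerprint: SHA-256 of the raw bytes. Apple's real system
// uses a perceptual hash, so this exact-match hash is a stand-in that
// shows the shape of the check, not the actual algorithm.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// The database shipped to the device contains only hashes; the device
// never holds the known images themselves.
func matchesKnownImage(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

// Usage: knownHashes would come from the on-device database provided
// by NCMEC and other organizations (placeholder here).
let knownHashes: Set<String> = []
print(matchesKnownImage(Data("example".utf8), knownHashes: knownHashes)) // false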

A cryptographic technique called private set intersection allows the device to determine whether there is a match without revealing the result to Apple. If there is a match, the iPhone or iPad creates a cryptographic safety voucher that encodes the match result, along with additional encrypted data about the image. Another technique, threshold secret sharing, ensures the company cannot see the contents of these vouchers unless an account crosses an unspecified threshold of known CSAM content. “The threshold is set to provide an extremely high level of accuracy and guarantees less than a one in a trillion chance per year of incorrectly flagging a given account,” according to the company.
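The threshold mechanism can be pictured roughly as follows. This is a conceptual sketch, not Apple's protocol: SafetyVoucher and the numeric threshold are illustrative (the real threshold is unspecified), and in the actual scheme the "shares" are cryptographic objects such that fewer than the threshold reveal nothing at all, rather than a simple count.

```swift
import Foundation

// Conceptual model of a safety voucher. In the real design the payload
// is opaque to Apple, and each voucher carries one share of a secret
// key under a threshold secret sharing scheme.
struct SafetyVoucher {
    let encryptedPayload: Data // match result plus encrypted image data
    let secretShare: Data      // one share of the account's inner key
}

// Illustrative value only; the actual threshold is unspecified.
let threshold = 30

// With threshold secret sharing, holding fewer than `threshold` shares
// yields no information; only at or above it can the key be rebuilt
// and the voucher payloads decrypted for human review.
func canReconstructKey(from vouchers: [SafetyVoucher]) -> Bool {
    vouchers.count >= threshold
}
```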

Only when that threshold is crossed will Apple be able to review the contents of the vouchers. At that point, the tech giant says it will manually review each report to confirm there is a match. Where there is one, it will deactivate the individual’s iCloud account and send a report to NCMEC. Users can appeal a suspension if they believe their account was flagged in error.

Siri Child Lock

Finally, Siri, along with the built-in search feature found in iOS and macOS, will direct users to child safety resources. For example, you can ask the company’s digital assistant how to report child exploitation. Apple also plans to update Siri to intervene when someone tries to perform CSAM-related searches. The assistant will explain “that interest in this topic is harmful and problematic”, and direct the person to resources that offer help with the issue.

Apple’s decision to, in effect, work with law enforcement will likely be viewed as a U-turn for the company. In 2016 it refused to help the FBI unlock the iPhone that had belonged to the man behind the San Bernardino terrorist attack. Although the government eventually turned to an outside company to gain access to the device, Tim Cook called the episode “chilling” and warned it could create a backdoor for more government surveillance down the road.
