Apple announces new iPhone features to detect child sexual abuse


Following a report on the company’s work to create a tool that scans iPhones for images of child abuse, Apple published an article providing more details on its child safety efforts. With the release of iOS 15, watchOS 8, and macOS Monterey later this year, the company says it will introduce a variety of child safety features in Messages, Photos, and Siri.

For starters, the Messages app will include new warnings that notify children, as well as their parents, when they send or receive sexually explicit photos. When someone sends a child an inappropriate image, the app will blur it and show several warnings. “It’s not your fault, but sensitive photos and videos can be used to hurt you,” one of the notifications says, according to a screenshot shared by Apple.

As an added precaution, the company says Messages can also notify parents if their child decides to go ahead and view a sensitive image. “Similar protections are available if a child attempts to send sexually explicit photos,” according to Apple. The company notes that the feature uses on-device machine learning to determine whether a photo is sexually explicit, and that Apple itself does not have access to the messages. The feature will be available for iCloud family accounts.

Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says it will use the technology to alert the National Center for Missing and Exploited Children (NCMEC), which in turn will work with law enforcement agencies across the United States. “Apple’s method for detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind,” the company says.

Rather than scanning photos as they are uploaded to the cloud, the system will use an on-device database of hashes of known images provided by NCMEC and other organizations. Each image in the database is converted into a hash, which acts as a kind of digital fingerprint for it.
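The basic matching idea can be sketched with an ordinary cryptographic hash. Note this is a simplification: Apple’s actual system uses a perceptual “NeuralHash” that tolerates resizing and recompression, whereas SHA-256 (used below purely for illustration) only matches byte-identical files.

```python
import hashlib

# Stand-in for the on-device database of hashes of known images.
# In Apple's system these would be perceptual hashes supplied by NCMEC;
# the byte strings here are purely illustrative.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

print(matches_known_database(b"known-image-bytes"))    # → True
print(matches_known_database(b"holiday-photo-bytes"))  # → False
```

Only the fingerprints are compared, never the images themselves, which is what lets the check run entirely on the device.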

A cryptographic technology called Private Set Intersection allows Apple to determine whether there is a match without seeing the result of the process. If there is a match, an iPhone or iPad will create a cryptographic safety voucher that encodes the match result, along with additional encrypted data about the image. Another technology called Threshold Secret Sharing ensures that the company cannot see the contents of these vouchers unless an account exceeds an unspecified threshold of CSAM content. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” according to the company.
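Threshold secret sharing in general works like Shamir’s classic scheme: a secret is split into shares such that any `threshold` of them recover it, while fewer reveal nothing. The sketch below illustrates that general idea only; it is not Apple’s actual construction, whose full details are not public.

```python
import random

# Shamir secret sharing over a prime field -- a minimal illustration of
# the "threshold" concept, not Apple's implementation.
PRIME = 2**61 - 1  # a Mersenne prime, large enough for this demo

def make_shares(secret: int, threshold: int, n_shares: int):
    """Split `secret` so that any `threshold` shares reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=3, n_shares=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 123456789
# With fewer than `threshold` shares, reconstruction yields a value that
# is (with overwhelming probability) unrelated to the secret.
```

The analogy to Apple’s description: each matching upload contributes one “share,” and only once enough shares accumulate can the vouchers’ contents be decrypted and reviewed.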

It is only when that threshold is crossed that the technology Apple plans to implement will allow the company to review the contents of the vouchers. At that point, the tech giant says it will manually review each report to confirm there is a match. Where there is one, it will disable the individual’s iCloud account and file a report with NCMEC. Users can appeal a suspension if they believe their account has been flagged in error.


Finally, Siri, along with the built-in search feature found in iOS and macOS, will direct users to child safety resources. For example, you could ask the company’s digital assistant how to report child exploitation. Apple is also planning to update Siri to intervene when someone tries to perform CSAM-related searches. The assistant will explain “that interest in this topic is harmful and problematic,” and direct the person to resources that offer help with the problem.

Apple’s decision to work effectively with law enforcement is likely to be seen as a flip-flop for the company. In 2016, it refused to help the FBI unlock the iPhone that belonged to the man behind the San Bernardino terrorist attack. Although the government eventually turned to an outside company for access to the device, Tim Cook called the episode “chilling” and warned it could create a backdoor for increased government surveillance down the road.


