
Apple remains silent on plans to detect known CSAM stored in iCloud Photos

Written by admin

It has been more than a year since Apple announced plans for three new child safety features, including a system to detect known child sexual abuse material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are now available, but Apple remains silent on its plans for the CSAM detection feature.

Apple initially said that CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on “feedback from customers, advocacy groups, researchers, and others.”

In September 2021, Apple posted the following update to its Child Safety page:

We previously announced plans for features to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers, and others, we’ve decided to take more time in the coming months to gather information and make improvements before we release these critically important child safety features.

In December 2021, Apple removed that update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson told The Verge that Apple’s plans for the feature had not changed. To our knowledge, however, Apple has not publicly commented on the plans since that time.

We’ve reached out to Apple to ask if the feature is still planned. Apple did not immediately respond to a request for comment.

Apple moved forward with its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the United Kingdom with iOS 15.5 and other software releases in May 2022.

Apple said its CSAM detection system was “designed with user privacy in mind.” The system would perform “on-device matching using a database of known CSAM image hashes” provided by child safety organizations, which Apple would transform into an “unreadable set of hashes that is securely stored on users’ devices.”

Apple planned to report iCloud accounts containing known CSAM image hashes to the National Center for Missing & Exploited Children (NCMEC), a nonprofit organization that works in collaboration with US law enforcement agencies. Apple said a “threshold” would ensure “less than a one in one trillion chance per year” of the system incorrectly flagging an account, and that flagged accounts would undergo manual human review.
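
To make the described flow concrete, here is a minimal, purely illustrative sketch in Swift of threshold-based matching against a set of known image hashes: each photo hash is checked against an on-device set, and an account is flagged only once the number of matches reaches the threshold. This is not Apple’s implementation; the published design also relied on a perceptual hashing algorithm and cryptographic safeguards that this sketch omits, and every type name, hash value, and threshold below is hypothetical.

```swift
// Illustrative sketch only; not Apple's actual CSAM detection code.
// All type names, hash values, and the threshold are hypothetical.

typealias ImageHash = String

struct ThresholdMatcher {
    // Stand-in for the "unreadable set of hashes" stored on the device.
    let knownHashes: Set<ImageHash>
    // Number of matches required before an account would be flagged.
    let reportingThreshold: Int

    // Counts how many of the user's photo hashes appear in the known set.
    func matchCount(in photoHashes: [ImageHash]) -> Int {
        photoHashes.filter { knownHashes.contains($0) }.count
    }

    // An account is flagged only when the match count meets the threshold.
    func shouldFlag(photoHashes: [ImageHash]) -> Bool {
        matchCount(in: photoHashes) >= reportingThreshold
    }
}

// Example with made-up hash strings.
let matcher = ThresholdMatcher(
    knownHashes: ["hash-a1", "hash-b2", "hash-c3"],
    reportingThreshold: 2
)
let uploadedHashes: [ImageHash] = ["hash-z9", "hash-a1", "hash-b2"]
print(matcher.shouldFlag(photoHashes: uploadedHashes)) // true: two matches meet the threshold
```

In Apple’s described design, the privacy properties came from a cryptographic protocol layered on top of the matching, so that match results would only become readable once the threshold was exceeded; that layer is beyond the scope of this sketch.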

Apple’s plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that Apple’s child safety features could create a “backdoor” into devices that governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility that someone could intentionally add CSAM imagery to another person’s iCloud account to get it flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.
