The latter two features are now available, but Apple remains mum on plans for a CSAM detection feature.
Apple initially said that CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company eventually delayed the feature based on feedback from customers, advocacy groups, researchers, and others.
In September 2021, Apple posted the following update on its child safety page: "Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
In December 2021, Apple removed the above update and all references to its CSAM detection plans from its child safety page, but an Apple spokesperson told The Verge that the company's plans for the feature had not changed. To the best of our knowledge, however, Apple has not publicly commented on the plans since that time. MacRumors has reached out to Apple to ask whether the feature is still planned, but Apple did not immediately respond to a request for comment.
Apple did move forward with the child safety features for the Messages app and Siri in iOS 15.2 and other software updates released in December 2021, and in May 2022 it expanded the Messages app feature to Australia, Canada, New Zealand, and the United Kingdom with iOS 15.5 and other software releases.
Apple said its CSAM detection system was designed with user privacy in mind. The system would perform on-device matching against a database of known CSAM image hashes provided by child safety organizations, which Apple would transform into an unreadable set of hashes stored securely on the user's device.
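To make the on-device matching concrete, here is a minimal Swift sketch of the general idea: the device hashes each photo and checks it against a locally stored set of hashes. This is a deliberately simplified illustration with hypothetical names; a plain SHA-256 digest stands in for Apple's perceptual NeuralHash, and the real design wraps the comparison in cryptography (private set intersection) so the device never learns the contents of the database.

```swift
import Foundation
import CryptoKit

/// Simplified stand-in for on-device CSAM hash matching (hypothetical).
/// A plain SHA-256 digest replaces Apple's perceptual NeuralHash, and a
/// local Set replaces the unreadable, blinded database in the real design.
struct HashMatcher {
    /// Hashes derived from the child safety database, stored on-device.
    private let knownHashes: Set<Data>

    init(knownHashes: Set<Data>) {
        self.knownHashes = knownHashes
    }

    /// Hash the photo's bytes on-device and test set membership.
    /// The image data itself never leaves the device for this check.
    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        return knownHashes.contains(Data(digest))
    }
}
```

A real implementation would rely on a perceptual hash so that resized or re-encoded copies of the same image still match, something an exact cryptographic digest like SHA-256 cannot do.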
Apple planned to report iCloud accounts with collections of known CSAM image hashes to the National Center for Missing & Exploited Children (NCMEC), a nonprofit organization that works with U.S. law enforcement agencies. Apple said there would be a "threshold" ensuring "less than a one in one trillion chance per year" of an account being incorrectly flagged by the system, along with manual review of flagged accounts.
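The threshold logic can be sketched in a few lines. Everything below is hypothetical, including the threshold value; it only illustrates the idea that no single match triggers a report.

```swift
/// Hypothetical threshold gate: individual matches accumulate silently,
/// and only once the count crosses the threshold would the account be
/// surfaced for human review (and, if confirmed, a report to NCMEC).
struct ThresholdGate {
    let threshold: Int             // placeholder value, not Apple's
    private(set) var matchCount = 0

    /// Record one match; returns true only when the threshold is reached.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

var gate = ThresholdGate(threshold: 30)   // illustrative threshold only
for _ in 0..<29 { _ = gate.recordMatch() }
print(gate.recordMatch())                 // true: the 30th match crosses it
```

In Apple's published design the count is enforced cryptographically rather than by a simple counter: each match produces an encrypted "safety voucher" that can only be decrypted once the threshold number of vouchers exists.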
Apple’s plans have been criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers and even some of Apple’s own employees.
Some critics argue that Apple's child safety features could create a "backdoor" into devices that governments or law enforcement agencies could use to surveil users. Another concern is false positives, including the possibility that someone could intentionally add CSAM imagery to another person's iCloud account to get their account flagged.