
Apple to Scan Photos for Child Sex Abuse

The technology would identify Child Sexual Abuse Material images stored in iCloud Photos.


Apple announced a child safety feature to limit the spread of Child Sexual Abuse Material (CSAM). The technology would identify CSAM images stored in iCloud Photos.

Apple says the system was designed with user privacy in mind. Rather than scanning images in the cloud, it performs on-device matching against a database of known CSAM image hashes.

The matching process, called "private set intersection," determines whether there is a CSAM match without revealing the result. The device then creates a cryptographic safety voucher encoding the match result and uploads it to iCloud Photos along with the image.
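
For readers curious what on-device matching and a safety voucher might look like in code, the sketch below is a heavily simplified illustration, not Apple's implementation: it hashes a photo, checks the hash against a local database, and packages the result as a voucher to accompany the upload. Apple's actual system uses a perceptual image hash and the private set intersection protocol so the device never learns the match result; the plain SHA-256 lookup and JSON packaging here are illustrative stand-ins.

    import hashlib
    import json
    from pathlib import Path

    # Hypothetical on-device database of hashes of known CSAM images.
    # In Apple's system the database is blinded so the device cannot read
    # match results; a plain Python set is used here for illustration only.
    KNOWN_IMAGE_HASHES: set[str] = set()

    def image_hash(photo_path: Path) -> str:
        # Stand-in for a perceptual hash: an ordinary SHA-256 of the file bytes.
        return hashlib.sha256(photo_path.read_bytes()).hexdigest()

    def make_safety_voucher(photo_path: Path) -> bytes:
        # Record whether the photo matches the database and "seal" the result.
        # A real safety voucher is produced cryptographically, so neither the
        # device nor Apple can read the outcome below the reporting threshold.
        digest = image_hash(photo_path)
        voucher = {"hash": digest, "matched": digest in KNOWN_IMAGE_HASHES}
        return json.dumps(voucher).encode()  # uploaded alongside the image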

A separate technique, "threshold secret sharing," ensures Apple cannot interpret the contents of the safety vouchers unless an iCloud Photos account crosses a threshold of known CSAM content. Once the threshold is exceeded and a manual review confirms the matches, Apple can disable the account and send a report to the National Center for Missing and Exploited Children (NCMEC).
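
Threshold secret sharing itself is a standard cryptographic building block. The toy Shamir-style sketch below, written over an arbitrary prime field with made-up parameters rather than anything Apple has published, shows the property the article describes: a secret is split into shares and only becomes recoverable once at least a threshold number of shares is available.

    import random

    PRIME = 2**61 - 1  # arbitrary large prime field for this toy example

    def make_shares(secret: int, threshold: int, total: int) -> list[tuple[int, int]]:
        # Split `secret` into `total` shares; any `threshold` of them suffice.
        # The shares are points on a random polynomial whose constant term
        # is the secret itself.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        return [
            (x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, total + 1)
        ]

    def reconstruct(shares: list[tuple[int, int]]) -> int:
        # Recover the secret by Lagrange interpolation at x = 0.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    # With at least `threshold` shares the secret is recovered exactly;
    # with fewer shares, interpolation yields an unrelated value.
    shares = make_shares(secret=42, threshold=3, total=5)
    assert reconstruct(shares[:3]) == 42

In Apple's design, what becomes reconstructible at the threshold is the material needed to decrypt the safety vouchers, which is why their contents stay unreadable until an account exceeds it.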

Users who believe their account was flagged in error can file an appeal.

Apple also announced two other safety features. The first gives parents tools to help children navigate online communication. The second updates Siri and Search to provide additional resources for staying safe online and getting help in unsafe situations.

The features will arrive later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.
