Apple hopes new feature will curb spread of child sexual abuse images
The image recognition will come in new versions of iOS and iPadOS - due to be released later this year
Apple has announced a new feature for iPhones and iPads that is intended to limit the spread of child sexual abuse material (CSAM) online.
Apple says the upcoming versions of iOS and iPadOS - due to be released later this year - will use 'new applications of cryptography' to detect CSAM images as they are uploaded to iCloud Photos, Apple's online storage service.
Before an image is stored in iCloud Photos, it will be matched on the device against a database of known CSAM images compiled by the US National Center for Missing and Exploited Children (NCMEC).
Apple is using a technology dubbed NeuralHash, which analyses an image and converts it into a hash key - a unique set of numbers. The key is then matched against NCMEC's database using cryptography.
Apple said the system ensures that it cannot learn about images that don't match the database.
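In broad strokes, the check amounts to deriving a hash from the image and testing it for membership in a list of known hashes before upload. The Swift sketch below illustrates that shape only - the names are hypothetical, SHA256 stands in for NeuralHash (a perceptual hash rather than a cryptographic one), and a plain in-memory set stands in for the blinded NCMEC database and the cryptographic matching step Apple describes.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only - not Apple's implementation. SHA256 stands in for
// NeuralHash, and a plain Set stands in for the on-device, blinded copy of the
// NCMEC hash database. All names here are hypothetical.

// Hypothetical database of known-image hash keys (hex strings); empty placeholder.
let knownImageHashes: Set<String> = []

// Derive a hash key from an image's bytes and test it against the database
// before the image is uploaded.
func shouldFlagBeforeUpload(imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hashKey = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hashKey)
}

let sample = Data("example image bytes".utf8)
print(shouldFlagBeforeUpload(imageData: sample))  // false with an empty database
```

The real system differs in an important way: the cryptographic protocol is designed so that neither the device nor Apple learns anything about images that do not match, which a straightforward lookup like this would not guarantee.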
A human reviewer will examine any images the system flags, to confirm a match.
If the reviewer confirms the image contains child sexual abuse material, the user's account will be disabled and the findings reported to NCMEC.
'At Apple, our goal is to create technology that empowers people and enriches their lives - while helping them stay safe,' Apple said.
'We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material.'
The company claimed that its system has an error rate of 'less than one in 1 trillion' per year, and that it protects user privacy.
However, privacy groups have voiced concerns that authoritarian governments could exploit the changes to Apple's operating systems to spy on dissidents or protesters.
Matthew Green, a cryptography researcher at Johns Hopkins University, said he was concerned the technology could be used to frame innocent people by sending them images engineered to register as matches for known CSAM.
Such images could fool Apple's algorithm and land innocent users in trouble, according to Mr. Green.
'This is a thing that you can do,' he said. 'Researchers have been able to do this pretty easily.'
Apple has also announced that its Messages app will get a new feature that warns children and their parents when they send or receive sexually explicit images, while keeping private communications unreadable by Apple.
The company is also adding features to Siri and Search that will intervene when users try to search for CSAM-related material.