The future of iOS: Apple will scan your pictures in the next update.
Apple has announced details of a system to detect child sexual abuse material (CSAM) on the devices of its US customers. Before an image is stored in iCloud Photos, the system will look for matches between pictures on the device and a database of known child abuse images. Apple said that if a match is found, a human reviewer will evaluate it and report the user to law enforcement.
However, the announcement has raised fears that the technology could be expanded to scan phones for other prohibited material, including political content on dissidents' devices.
Privacy experts expressed concern that authoritarian governments may use this technology to spy on their citizens.
Apple announced that the new versions of iOS and iPadOS, to be released later this year, will include "new cryptography applications to help limit the spread of child sexual abuse material online while protecting user privacy."
The system works by matching photos to a database of child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child protection organizations.
Those images are translated into numerical codes, known as hashes, that can be matched against an image on an Apple device.
The company added that the system would also catch edited but visually similar versions of the original images.
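To illustrate the general idea, here is a minimal sketch of perceptual hash matching in Python. This is not Apple's actual NeuralHash algorithm (which is proprietary and neural-network based); it uses a generic "average hash" over a toy 8x8 grayscale grid to show how a near-duplicate image can still match while an unrelated image does not.

```python
# Illustrative only: a generic "average hash" perceptual fingerprint,
# NOT Apple's NeuralHash. An image is modeled as a flat list of 64
# grayscale pixel values (0-255), standing in for an 8x8 thumbnail.

def average_hash(pixels):
    """Return a 64-bit fingerprint: each bit records whether a pixel
    is brighter than the image's average brightness."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a, b):
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

def is_match(hash_a, hash_b, threshold=5):
    """Treat two images as matching if their fingerprints differ in at
    most `threshold` bits -- this tolerance is what lets lightly edited
    copies of a known image still be detected."""
    return hamming_distance(hash_a, hash_b) <= threshold

# A toy "image" and a lightly edited copy (one pixel brightened):
original = [10 * i % 256 for i in range(64)]
edited = original[:]
edited[0] = min(255, edited[0] + 40)

h1, h2 = average_hash(original), average_hash(edited)
# Near-duplicates keep close fingerprints; an inverted (unrelated-looking)
# image produces a distant one and fails the match test.
```

The key design point, common to all such systems, is that matching happens on fingerprints rather than on the images themselves, and a small Hamming-distance tolerance absorbs minor edits such as cropping or recompression.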
"High level of precision"
"Before an image is stored in iCloud, that image will be checked against the number codes of known child abuse photos," Apple said.
The company noted that the system has an "extremely high level of accuracy," with "less than a one-in-one-trillion chance per year of incorrectly flagging a given account."
Apple said it would manually review each report to confirm there is a match. The company could then disable the user's account and report it to the authorities. Apple said the new technology offers "significant" privacy benefits over existing systems, as the company only analyzes users' photos if they have a collection of known child sexual abuse images in their iCloud account.
However, some privacy experts have raised concerns. "Regardless of Apple's long-term plans, they have sent an obvious signal. In their (very influential) opinion, it is okay to build systems that scan users' phones for banned content," said Matthew Green, a cryptography expert at Johns Hopkins University. "It doesn't matter if they are right or wrong on that point; this amounts to 'opening the floodgates,' and governments will ask for this technology."