Apple gives more detail on new iPhone photo scanning feature as controversy continues
Apple has released yet more details on its new photo-scanning features, as the controversy over whether they should be added to the iPhone continues.
Earlier this month, Apple announced that it would be adding three new features to iOS, all of which are intended to fight against child sexual exploitation and the distribution of abuse imagery. One adds new information to Siri and search, another checks messages sent to children to see if they might contain inappropriate images, and the third compares photos on an iPhone with a database of known child sexual abuse material (CSAM) and alerts Apple if it is found.
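At its core, the third feature is a set-membership check: a hash derived from each photo is compared against a list of hashes of known abuse imagery. The Swift sketch below illustrates that idea only; the struct, hash values and direct lookup are invented for the example, and Apple's real system performs the comparison blindly with cryptographic techniques so that the device never reads the database in the clear.

```swift
// Minimal sketch of the matching step described above, not Apple's implementation.
// All names and hash values are invented for illustration.

struct PhotoRecord {
    let filename: String
    let perceptualHash: String   // stand-in for the hash derived from the image
}

// Hypothetical database of hashes of known abuse imagery shipped with the OS.
let knownHashes: Set<String> = ["a1b2c3", "d4e5f6"]

func isKnownImage(_ photo: PhotoRecord, database: Set<String>) -> Bool {
    // The real comparison happens blindly on-device; this plain lookup only
    // illustrates the underlying set-membership check.
    return database.contains(photo.perceptualHash)
}

let photo = PhotoRecord(filename: "IMG_0001.HEIC", perceptualHash: "a1b2c3")
print(isKnownImage(photo, database: knownHashes))   // true
```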
It is the last of those three features that has proven especially controversial. Critics say that it contravenes Apple’s commitment to privacy, and that it could in future be repurposed to scan for other kinds of images, such as political material on the phones of people living under authoritarian regimes.
Apple has repeatedly said that it will not allow the feature to be used for any other material, that it will not apply to phones that do not store photos in the cloud, and that a number of safeguards exist to ensure the process preserves users’ privacy. Since the announcement, Apple has defended the feature in a range of interviews and publications, and says it is still adding it as planned.
Now it has published a new paper, titled ‘Security Threat Model Review of Apple’s Child Safety Features’, which aims to give further reassurance that the feature will only be used as intended. The document responds to a number of the security and privacy concerns that have been raised since the plans were announced.
One of the specific commitments in the paper is that the database of known images will not be drawn from a single country’s organisation. A picture will only be matched if it appears in the databases of at least two separate child-safety groups, which should ensure that no single government is able to inject other content into the database.
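In effect, the rule described in the paper amounts to taking the intersection of the lists supplied by different organisations. A minimal sketch of that idea, with invented hash values standing in for the real identifiers:

```swift
// Sketch of the "at least two databases" rule described above (our illustration,
// not Apple's code): only identifiers present in every provider's list end up in
// the database shipped to devices, so no single organisation can add entries alone.

let childSafetyOrgA: Set<String> = ["a1b2c3", "d4e5f6", "0ff1ce"]
let childSafetyOrgB: Set<String> = ["d4e5f6", "c0ffee", "a1b2c3"]

// Only hashes that both organisations supply are included.
let shippedDatabase = childSafetyOrgA.intersection(childSafetyOrgB)

print(shippedDatabase.sorted())   // ["a1b2c3", "d4e5f6"]
```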
Apple will also allow auditors to check that database: the full set of identifiers the feature looks for will be made available so that third parties can verify it covers only child abuse imagery. The same database will be included in every device running iOS and iPadOS, even though the feature is only active in the US, so there is no way for any one phone to be made to look for different images.
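One simple way such an audit could work, and this is our illustration rather than anything Apple has specified in detail, is to publish a digest of the on-device database so that anyone can hash their copy and compare. The sketch below uses SHA-256 from Apple's CryptoKit, with invented database contents and a placeholder for the published value:

```swift
import Foundation
import CryptoKit

// Illustrative audit check: hash the local copy of the database and compare it
// with a digest published for auditors. Both values below are hypothetical.

let onDeviceDatabase = Data("a1b2c3\nd4e5f6\n".utf8)
let publishedDigest = "placeholder-digest"   // hypothetical published value

let computedDigest = SHA256.hash(data: onDeviceDatabase)
    .map { String(format: "%02x", $0) }
    .joined()

if computedDigest == publishedDigest {
    print("database matches the published digest")
} else {
    print("database differs from the published digest")
}
```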
Apple’s own moderators will also be instructed not to report other kinds of images, the company says in the report, with much the same aim.
It also says that an account will only be flagged if its photo library contains at least 30 images that appear to be CSAM. That threshold is designed to keep false positives to a minimum, and should mean the chance of an account being incorrectly flagged is about one in a trillion.
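The safeguard is, in essence, a threshold check, although Apple enforces it cryptographically rather than with a plain counter. A minimal sketch of the rule, with the counting logic simplified for illustration:

```swift
// Sketch of the 30-match threshold described above (illustrative only): an account
// is considered for review only once the number of apparent matches reaches the
// stated threshold.

let matchThreshold = 30

func accountShouldBeReviewed(matchCount: Int) -> Bool {
    return matchCount >= matchThreshold
}

print(accountShouldBeReviewed(matchCount: 12))   // false
print(accountShouldBeReviewed(matchCount: 31))   // true
```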