Apple has announced that it is delaying the rollout of controversial child safety features which it had planned to launch later this year.
The length of the delay is not clear, but the company faced significant criticism following the announcement in August of its CSAM (child sexual abuse material) detection system, which would automatically scan pictures on users' iPhones before they were uploaded to iCloud.
Chief among the fears of academics and security experts was that the system could be modified to search for non-CSAM images that might be of interest to government authorities.
In a statement on Friday, Apple described the plans as “features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of CSAM”.
It said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
In an initial response to the fears raised, Apple said that it would “refuse any such demands” from governments to “force Apple to add non-CSAM images to the hash list” – a reference to the list of digital fingerprints used to identify abuse material without sharing the material itself.
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” the company added.
Subsequently, the company added that it would use a hash list of material provided by authorities in multiple countries, a measure intended to further reduce the risk that a single authority could exploit the system for surveillance purposes.
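For readers unfamiliar with the mechanism, the sketch below illustrates how hash-list matching and the multi-jurisdiction safeguard can work in principle. It is a minimal illustration under stated assumptions: Apple's actual system uses a perceptual hash (NeuralHash) designed to survive resizing and re-encoding, whereas a cryptographic hash stands in here purely for simplicity, and all names and data values are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Derive a fixed-size fingerprint; the image itself is never shared.
    # (A cryptographic hash is an illustrative stand-in for a perceptual one.)
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical hash lists supplied by authorities in two jurisdictions.
list_authority_a = {fingerprint(b) for b in (b"image-1", b"image-2")}
list_authority_b = {fingerprint(b) for b in (b"image-2", b"image-3")}

# Matching only against the intersection of the lists means no single
# authority can add a target unilaterally: image-2 appears in both lists,
# so it can be flagged; image-1 and image-3 each appear in only one.
blocklist = list_authority_a & list_authority_b

def should_flag(image_bytes: bytes) -> bool:
    return fingerprint(image_bytes) in blocklist

assert should_flag(b"image-2")
assert not should_flag(b"image-1") and not should_flag(b"image-3")
```

The design point the sketch captures is the one Apple emphasised: matching happens against fingerprints rather than images, and requiring agreement between independent sources raises the bar for any one government seeking to insert its own targets.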
Apple’s ability to resist government authorities’ demands was questioned by Professor Steven Murdoch at University College London, who noted that the company’s refusal “to build new functionality to unlock an iPhone” is “different from adding a hash to an existing database”.
Apple stated: “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”
Professor Murdoch noted similarities to a system in place in the UK, which internet service providers (ISPs) originally used to block abuse material but were subsequently compelled to expand to cover less serious crimes such as intellectual property infringement.
ISPs – including British Sky Broadcasting Limited, then the owner of Sky News – lost in court when they attempted to challenge this.
In his judgment, Mr Justice Arnold noted: “The orders would not require the ISPs to acquire new technology: they have the requisite technology already. Indeed, most of the ISPs now have greater technical capacity to implement such orders than they did three years ago.”