On Thursday, more than 90 policy and human rights organizations from across the world signed an open letter urging Apple to abandon its plans to check children’s communications for nudity and to scan adults’ phones for images of child sexual abuse.
“Though these capabilities are intended to protect children and reduce the spread of child sexual abuse material,” the groups wrote in the letter, “we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
The largest campaign to date over an encryption issue at a single company was organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT).
Some overseas signatories in particular are worried about the impact of the changes in nations with different legal systems, including some already hosting heated fights over encryption and privacy.
“It’s so disappointing and upsetting that Apple is doing this, because they have been a staunch ally in defending encryption in the past,” said Sharon Bradford Franklin, co-director of CDT’s Security & Surveillance Project.
According to an Apple spokeswoman, the firm addressed privacy and security concerns in a paper released on Friday that explains why the scanning software’s sophisticated design should withstand attempts to manipulate it.
Multiple organizations signed in Brazil, where courts have repeatedly blocked Facebook’s WhatsApp for failing to decrypt communications in criminal investigations, and where the Senate has passed a bill requiring message traceability, which would demand some form of content tagging. India approved a similar law this year.
“Our main concern is the consequence of this mechanism, how this could be extended to other situations and other companies,” said Flavio Wagner, president of the independent Brazil chapter of the Internet Society, which signed. “This represents a serious weakening of encryption.”
India, Mexico, Germany, Argentina, Ghana, and Tanzania were among the other signatories.
Since the initial uproar that followed its announcement two weeks ago, Apple has offered a series of explanations and documents arguing that the risk of erroneous detections is negligible.
Apple has said it will refuse requests to expand the image-detection system beyond child sexual abuse images flagged by clearinghouses in multiple countries, but it has not said it would withdraw from a market rather than comply with a court order.
Though most of the objections so far have centered on device scanning, the coalition’s letter also faults a change to iMessage in family accounts, which would try to identify and blur nudity in children’s messages, allowing them to view the images only if parents are notified.
The signers said the step could endanger children in intolerant homes or those seeking educational material. More broadly, they said the change will break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.
“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter says.
Other groups that signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.