Policy groups urge Apple to abandon plans to scan devices for child abuse imagery

An international coalition of policy and civil rights groups published an open letter Thursday asking Apple to “abandon its recently announced plans to build surveillance capabilities into iPhones, iPads and other Apple products.” The groups include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Earlier this month, Apple announced plans to use new technology within iOS to detect potential child abuse imagery, with the goal of limiting the spread of child sexual abuse material (CSAM) online. Apple also announced a new “communication safety” feature, which will use on-device machine learning to identify and blur sexually explicit images received by children in its Messages app. Parents of children aged 12 and under can opt to be notified if their child views or sends such an image.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter.

Apple’s new “Child Safety” page details the plans, which call for on-device scanning of images before they are backed up to iCloud. The scan runs only when a file is being uploaded to iCloud, and Apple says it receives data about a match only if the cryptographic vouchers (uploaded to iCloud along with the image) for an account meet a threshold of matches against known CSAM. Apple and other cloud email providers have used hash systems to scan for CSAM sent via email, but the new program would apply the same scans to images stored in iCloud, even if the user never shares or sends them to anyone else.
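The threshold idea can be illustrated with a short sketch. The Python below is a hypothetical simplification, not Apple’s actual design (which relies on a perceptual hash called NeuralHash plus cryptographic techniques such as private set intersection and threshold secret sharing); the hash function, threshold value, and function names here are invented for illustration only.

```python
# Hypothetical sketch of threshold-based hash matching.
# NOT Apple's actual protocol: names, the hash function, and the
# threshold value are assumptions made purely to illustrate the idea.

import hashlib

MATCH_THRESHOLD = 30  # assumed: an account is flagged only past this count


def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash. A real system would use a perceptual hash that is
    robust to resizing and recompression, not a cryptographic hash of
    the raw bytes, which any single-bit change would defeat."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploaded_images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many images being uploaded match the known-CSAM hash set."""
    return sum(1 for img in uploaded_images if image_hash(img) in known_hashes)


def should_flag_account(uploaded_images: list[bytes],
                        known_hashes: set[str]) -> bool:
    """Flag the account only once the match count crosses the threshold."""
    return count_matches(uploaded_images, known_hashes) >= MATCH_THRESHOLD
```

In Apple’s published design, the matching result is hidden inside the cryptographic vouchers so the server learns nothing about individual images until the threshold is crossed; the plain counter above conveys only the threshold concept, not that privacy property.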

In response to concerns about how the technology might be misused, Apple followed up by saying it would limit the system to detecting CSAM, adding that “we will not accede to any government’s request to expand it.”

Much of the pushback against the new measures has focused on the device-scanning feature, but the civil rights and privacy groups said the plan to blur nudity in children’s iMessages could endanger children and would break iMessage’s end-to-end encryption.

“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter states.
