Over 90 Activist Groups Ask Apple to Abandon CSAM Detection System

Over 90 privacy and rights groups from across the globe have asked Apple to nix its upcoming Child Sexual Abuse Material (CSAM) detection system in an open letter published Thursday — reports Reuters.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” reads the letter.

The letter is part of a campaign organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT), which organizers describe as the largest campaign to date over an encryption issue at a single company.

Apple’s proposed child safety features will scan minors’ iMessage conversations for sexually explicit images, and check photos on users’ devices against a database of known Child Sexual Abuse Material before they are uploaded to iCloud Photos. The measures will roll out across Apple’s ecosystem, including iCloud.
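For readers curious how this kind of on-device matching works in principle, the sketch below is a deliberately simplified, hypothetical illustration, not Apple’s implementation. Apple’s published design relies on a perceptual hash (NeuralHash), a blinded on-device hash database, and threshold secret sharing, none of which is reproduced here; every name, value, and function in the snippet is invented for illustration.

```python
# Illustrative sketch only: a toy "on-device" matcher that compares image
# fingerprints against a database of known-bad hashes before upload.
# Apple's actual system uses a perceptual hash plus cryptographic blinding;
# this sketch uses a plain file hash and an in-memory set instead.

import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for known prohibited images.
# In Apple's published design the database ships in blinded form, so the
# device cannot learn which hashes it actually contains.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Apple says an account is only reported after a threshold of matches.
MATCH_THRESHOLD = 3


def fingerprint(image_path: Path) -> str:
    """Toy fingerprint: SHA-256 of the raw file bytes.

    A production system would use a perceptual hash so that re-encoding or
    resizing an image does not change its fingerprint; a plain SHA-256 would
    be trivially defeated by any modification to the file.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def count_matches(upload_queue: list[Path]) -> int:
    """Count how many queued photos match the known-bad database."""
    return sum(1 for photo in upload_queue if fingerprint(photo) in KNOWN_BAD_HASHES)


def should_flag_account(upload_queue: list[Path]) -> bool:
    """Flag the account only once matches cross the threshold."""
    return count_matches(upload_queue) >= MATCH_THRESHOLD
```

The privacy groups’ objection is less about this matching step itself than about who controls the hash database and what it could be expanded to cover.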

“It’s so disappointing and upsetting that Apple is doing this, because they have been a staunch ally in defending encryption in the past,” said Sharon Bradford Franklin, co-director of CDT’s Security & Surveillance Project, of a company that has long touted itself as a champion of privacy.

Apple’s announcement of the new child safety features has caused considerable confusion and pushback, with concerns raised internally at the company and criticism from rivals such as Facebook.

Apple has since tried to address concerns about the features with support documents and through executives such as Senior Vice President of Software Engineering Craig Federighi and the iPhone maker’s head of privacy.

Apple maintains that the system is not prone to false positives and will not compromise user privacy.

The privacy groups that have signed the open letter, including the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, the Tor Project, and activist groups from India, Mexico, Germany, Argentina, Ghana, and Tanzania, don’t believe that to be the case.

These privacy advocates believe Apple’s CSAM detection system defeats the purpose of end-to-end encryption, and could, in the future, be subverted to target more than just Child Sexual Abuse Material or be abused by authoritarian governments.

“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” says the letter.

The petitioners also believe that the system could potentially endanger children living in orthodox or intolerant homes, or those simply seeking educational material.

Canada’s OpenMedia is also asking Apple to stop its CSAM detection system.

The coalition of advocacy groups sees Apple’s measures as a step that creates more problems than it solves, and urges Apple to abandon its plans.
