CSAM

Apple Needs to Drop CSAM Detection Plans Entirely, Says Digital Rights Group

On Friday, Apple announced it would delay its new child safety and Child Sexual Abuse Material (CSAM) detection features, taking "additional time over the coming months to collect input and make improvements." In response to the announcement, international digital rights group the Electronic Frontier Foundation said that delaying what essentially amounts to a backdoor in Apple's...

OpenMedia Petition Urges Apple to Scrap CSAM Detection System

Vancouver-based non-profit OpenMedia, a free-internet and digital rights advocate, has created a petition urging Apple to scrap its upcoming Child Sexual Abuse Material (CSAM) detection system, branding the move a complete reversal of Apple's "position on the importance of the privacy and security of their customers." Apple's proposed child safety features will scan minors' iMessage...

Over 90 Activist Groups Ask Apple to Abandon CSAM Detection System

Over 90 privacy and rights groups from across the globe have asked Apple to nix its upcoming Child Sexual Abuse Material (CSAM) detection system in an open letter published Thursday — reports Reuters. "Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that...

Apple Details its CSAM Detection System’s Privacy and Security

Apple has just provided a more detailed overview of its recently announced child safety features in a new support document, outlining the design principles, security and privacy requirements, and threat model considerations of its CSAM detection system (via MacRumors). The document addresses concerns raised against the company’s plan to detect known Child Sexual Abuse Material (CSAM)...

Apple Privacy Head Addresses Concerns About New CSAM Detection System

Apple has recently previewed a handful of upcoming child safety features for both iOS and macOS, including a Child Sexual Abuse Material (CSAM) detection system, which several researchers and companies, including WhatsApp, have labeled "very concerning." In an interview with TechCrunch, Apple Head of Privacy Erik Neuenschwander shared detailed answers to many...

WhatsApp Says Apple’s CSAM Detection System ‘Very Concerning’

WhatsApp has spoken out about Apple's new child safety features for its devices, condemning them as a "very concerning... surveillance system" — reports the Financial Times. As we previously reported, Apple has introduced all-new child safety features for both iOS and macOS, complete with a Child Sexual Abuse Material (CSAM) detection system that can scan a...

Apple Shares Expansion Plans for its New CSAM Detection System

Yesterday, Apple previewed its new child safety features coming to iOS and macOS with software updates later this year, including the Child Sexual Abuse Material (CSAM) detection system, which has sparked concerns among some security researchers. Although Apple said its method of detecting known CSAM is designed with user privacy in mind, researchers say the company could eventually...