WhatsApp Says Apple’s CSAM Detection System ‘Very Concerning’

WhatsApp has spoken out about Apple’s new child safety features for its devices, condemning them as a “very concerning… surveillance system” — reports the Financial Times.

As we previously reported, Apple has announced new child safety features for iOS and macOS, including a Child Sexual Abuse Material (CSAM) detection system that matches photos being uploaded to iCloud Photos against databases of known CSAM, alongside a separate Messages feature that flags sexually explicit images sent to or from children's accounts. Confirmed matches are reported to the National Center for Missing & Exploited Children (NCMEC), which works with law enforcement.
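To give a rough sense of what "matching against a database of known CSAM" means, here is a minimal, purely illustrative sketch. It is not Apple's implementation: Apple's system uses a perceptual "NeuralHash" plus cryptographic techniques such as private set intersection and threshold secret sharing, none of which are reproduced here; a plain SHA-256 lookup and the fingerprint database below are stand-ins for the sake of the concept.

```swift
import CryptoKit
import Foundation

// Hypothetical database of fingerprints of known material (hex-encoded digests).
// Placeholder value only; in a real system this would be supplied by child-safety
// organizations, not hard-coded.
let knownFingerprints: Set<String> = [
    "9b74c9897bac770ffc029102a200c5de6b3f2d7a4f1e8c0b2d5a6e7f8091a2b3"
]

/// Compute a fingerprint for a photo about to be uploaded.
/// (SHA-256 stands in for a perceptual hash such as NeuralHash.)
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if the photo's fingerprint appears in the known database.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```

The critics' point is that nothing in this matching step is specific to CSAM: whoever controls the fingerprint database controls what gets flagged.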

The reception of the new system has been heavily divided. While some governments and politicians have lauded the new features and pressured other tech companies to follow suit, security experts warn that the system could easily be repurposed to target far more than CSAM, giving governments and overzealous law enforcement a backdoor into users’ personal data.

Many users have also questioned whether iMessage conversations can still be considered end-to-end encrypted once the new system is in place.

“This approach introduces something very concerning into the world,” said Will Cathcart, head of WhatsApp, to the Financial Times. “This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. It’s troubling to see them act without engaging experts.”

“We will not adopt it at WhatsApp,” he added.

For Apple, a company that has long marketed privacy protection as a core feature of its products, to be among the first to move CSAM scanning onto users’ devices certainly came as a surprise to some, while others applauded the move.

Check out this excellent breakdown of Apple’s CSAM system from John Gruber here.
