Apple Confirms It’s Been Scanning iCloud Mail for CSAM Since 2019

Apple has officially confirmed that it scans iCloud Mail for CSAM, and that it has been doing so since 2019.

Recently, Apple announced that it would be introducing a CSAM-scanning feature for iCloud Photos. While it's a good thing that Apple is using its influence and reach to try to stem such activities, many feel there's potential for the system to be abused by governments to target non-CSAM material.

Now, according to a new report from 9to5Mac, the Cupertino company has confirmed that it has actually been scanning iCloud Mail for CSAM content since at least 2019.

The vast majority of iCloud Mail users likely had no idea this was happening, but Apple didn't keep it a secret; there were a number of clues that the company was doing some kind of CSAM scanning. An archived version of Apple's child safety page hinted at it:

Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.

Apple chief privacy officer Jane Horvath also confirmed the practice at CES in January 2020. "We are utilizing some technologies to help screen for child sexual abuse material," she said during a panel she participated in, without providing further details on the technology Apple was using at the time.

The reasons behind Apple’s decision to expand CSAM scanning still aren’t completely clear. But according to a conversation between Apple employees in 2020, uncovered as part of the company’s ongoing legal battle against Epic Games, anti-fraud chief Eric Friedman described Apple as “the greatest platform for distributing child porn.”
