Apple Delays Plans to Scan Users’ Libraries for CSAM

Apple today announced that it would delay its plan to scan iPhone users’ photo libraries for child sexual abuse material (CSAM).

According to a new CNBC report, Apple has announced that it won’t immediately launch iOS features designed to protect children from sexual predators. The move comes in response to feedback from researchers and users who voiced concerns about how the features would affect user privacy.

On August 5, Apple announced that a feature to combat CSAM would be coming to iOS 15 and macOS Monterey. It was designed to detect whether users have child exploitation images or videos stored on their device. The device would convert suspected images into “hashes” and compare them against a database of known CSAM hashes provided by the National Center for Missing & Exploited Children (NCMEC).
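To make the matching idea concrete, here is a minimal, illustrative sketch in Python of on-device hash matching against a set of known fingerprints. This is not Apple’s implementation: the real system uses a proprietary perceptual hash (NeuralHash) plus cryptographic techniques such as private set intersection and threshold secret sharing, whereas this sketch uses a plain SHA-256 of the file bytes and a hypothetical `KNOWN_HASHES` set purely to show the concept.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known fingerprints (hex digests). In Apple's design,
# these would be NeuralHash values supplied by NCMEC and matched on-device
# via private set intersection; a plain SHA-256 lookup is used here only
# to illustrate the general matching concept.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def fingerprint(image_path: Path) -> str:
    """Return a hex digest of the file's bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def scan_library(library_dir: Path) -> list[Path]:
    """Flag any image whose fingerprint appears in the known-hash set."""
    flagged = []
    for path in library_dir.glob("*.jpg"):
        if fingerprint(path) in KNOWN_HASHES:
            flagged.append(path)
    return flagged


if __name__ == "__main__":
    for match in scan_library(Path("Photos")):
        print(f"match found: {match}")
```

Note that a byte-level hash like the one above only matches identical files; a perceptual hash such as NeuralHash is designed to also match resized or re-encoded copies of the same image, which is part of what drew scrutiny from researchers.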

In a statement, Apple said it has decided to push back the launch and keep working on the CSAM protections it announced last month:

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

The statement is vague: it doesn’t say what changes Apple will make or which advocacy groups and researchers it will collect input from. But given the backlash from security researchers, privacy advocates, and customers, it seems likely that Apple will try to address concerns that the system could erode user privacy or give governments broader access to customers’ photos.

Some child safety advocates were disappointed by Apple’s announcement.

“We absolutely value privacy and want to avoid mass surveillance in any form from government, but to fail children and fail the survivors of child sexual abuse by saying we’re not going to look for known rape videos and images of children because of some extreme future that may never happen just seems wholly wrong to me,” said Glenn Pounder, chief operating officer of Child Rescue Coalition, a nonprofit that develops software to help law enforcement identify people downloading child sexual abuse material.

Apple’s CSAM detection was slated to launch as part of iOS 15, iPadOS 15, and macOS Monterey later this year.
