According to cryptography and security expert Matthew Green, Apple will soon introduce new photo identification features that use hashing algorithms to match photos in users’ photo libraries against known child sexual abuse material (CSAM), 9to5Mac reports.
In the past, Apple has said it applies hashing techniques to photos as they are uploaded to iCloud. The new system, however, would run client-side, on the user’s device. Since Apple has not yet officially announced the initiative, exact details of how it will work are not known.
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green)
It should be noted that hashing algorithms of this kind, which match visually similar images rather than exact file copies, are not foolproof and can turn up false positives.
As the report describes it: “Apple’s system will happen on the client — on the user’s device — in the name of privacy, so the iPhone would download a set of fingerprints representing illegal content and then check each photo in the user’s camera roll against that list. Presumably, any matches would then be reported for human review.”
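To make the described flow concrete, here is a minimal sketch of what client-side fingerprint matching could look like. This is not Apple’s actual algorithm; the fingerprint values, the 64-bit hash size, and the distance threshold are all hypothetical, chosen only to illustrate how a device might compare each photo’s perceptual hash against a downloaded list of known-bad fingerprints:

```python
# Illustrative sketch only -- not Apple's implementation. Each "fingerprint"
# is modeled as a 64-bit perceptual hash; two images count as a match when
# the Hamming distance between their hashes is at or below a small threshold.

BLOCKLIST = {
    0xA5A5_F00D_DEAD_BEEF,  # hypothetical known-bad fingerprints
    0x1234_5678_9ABC_DEF0,
}
MAX_HAMMING_DISTANCE = 3  # tolerance for minor edits (resizing, recompression)

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(photo_hash: int) -> bool:
    """True if the photo's hash is near any blocklisted fingerprint."""
    return any(
        hamming_distance(photo_hash, bad) <= MAX_HAMMING_DISTANCE
        for bad in BLOCKLIST
    )

# A hash one bit away from a blocklisted fingerprint still matches -- that
# tolerance is the point of perceptual hashing, and also the source of
# false positives on innocent images that happen to hash nearby.
print(matches_blocklist(0xA5A5_F00D_DEAD_BEEE))  # one bit off -> True
print(matches_blocklist(0x0000_0000_0000_0000))  # far from both -> False
```

The tolerance threshold is the trade-off Green alludes to: raise it and the system catches more edited copies of known images, but also flags more unrelated photos for human review.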
Green also points out that if governments were allowed to control the fingerprint database, the same system could be used to detect images of content other than CSAM.