Apple Said to Introduce New Photo Identification Features to Detect Child Abuse

According to cryptography and security expert Matthew Green, Apple will soon introduce new photo identification features that will use hashing algorithms to match the content of photos in users’ photo libraries with known child abuse materials, 9to5Mac is reporting.


In the past, Apple has noted that it uses hashing techniques as photos are uploaded to iCloud. This new system, however, would be done on the client-side, on the user’s device. Since Apple has not yet officially announced the new initiative, exact details of how this will work are not yet known.

It must be noted that such hashing algorithms are not foolproof: two different images can occasionally produce a matching fingerprint, so the system may turn up false positives.

Green explains:

“Apple’s system will happen on the client — on the user’s device — in the name of privacy, so the iPhone would download a set of fingerprints representing illegal content and then check each photo in the user’s camera roll against that list. Presumably, any matches would then be reported for human review.”
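The matching process described above can be sketched in a few lines of Python. This is purely illustrative: the fingerprint database, function names, and sample data below are hypothetical, and a real system would use a perceptual hash that survives resizing and re-encoding rather than the exact-match SHA-256 stand-in used here.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Exact-match stand-in for a perceptual image hash."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical set of fingerprints representing known illegal content,
# downloaded to the device ahead of time.
known_fingerprints = {fingerprint(b"known-flagged-image-bytes")}

def scan_library(photos: list[bytes]) -> list[int]:
    """Return the indices of photos whose fingerprint is in the database."""
    return [i for i, photo in enumerate(photos)
            if fingerprint(photo) in known_fingerprints]

library = [b"vacation-photo-bytes", b"known-flagged-image-bytes", b"cat-photo-bytes"]
matches = scan_library(library)  # -> [1]; matches would then go to human review
```

Note that because SHA-256 changes completely with any edit to the file, an exact-hash scheme like this would miss trivially altered copies; that is why perceptual hashing is used in practice, and also why false positives become possible.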

Green also points out that if Apple allows governments to control the fingerprint database, the same system could be used to detect images of content other than child abuse material.
