Child Abuse

Collision in New Child Abuse Detection Algorithm ‘Not a Concern’ Says Apple

A collision discovered by researchers in Apple’s newly announced CSAM detection system has raised fresh concerns about the integrity of the system. However, the iPhone maker says the finding is not a concern, The Verge reports. Highlighted by GitHub user Asuhariet Ygvar, the flaw affects the hashing algorithm, called NeuralHash, which allows...

Apple Starts Scanning iCloud Photos to Check for Child Abuse

Last year, Apple updated its privacy policy to note that it may scan iCloud images for child abuse material. Now, the company’s chief privacy officer, Jane Horvath, has revealed that every image backed up to the company’s online storage service is automatically scanned for illegal photos (via The Telegraph). While speaking...

Apple Pre-Screening Uploaded Content for Child Abuse Imagery

As pointed out by The Mac Observer, Apple recently updated its privacy policy to reflect that it scans all uploaded content for potentially illegal material, including child sexual exploitation imagery. While the iPhone maker may have been doing this for years, the source notes that this is the first time Apple has revealed it in...