
Apple Publishes ‘Expanded Protections for Children’ FAQ Document


Apple has published a FAQ document regarding its new ‘Expanded Protections for Children’ plan. The six-page document was created and published in hopes of extinguishing any privacy concerns users and publishers may have once Apple’s scanning features are implemented.

The document explains that the company’s goal is to “create technology that empowers people and enriches their lives — while helping them stay safe.” To that end, Apple is introducing scanning technology, announced last week, that can monitor and limit the spread and sharing of Child Sexual Abuse Material (CSAM) through Messages and on iCloud.

The first major point Apple addresses is the difference between communication safety in Messages and CSAM detection in iCloud Photos. Since the announcement, the distinction between the two features has often been lost. As such, Apple has described each tool and how they differ.

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.

The document then goes on to answer many questions that have arisen since the announcement. Other notable questions and answers in the FAQ include:

Does this mean Messages will share information with Apple or law enforcement?

No. Apple never gains access to communications as a result of this feature in Messages. This feature does not share any information with Apple, NCMEC or law enforcement. The communications safety feature in Messages is separate from CSAM detection for iCloud Photos — see below for more information about that feature.

Does this mean Apple is going to scan all the photos stored on my iPhone?

No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.

Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Apple has faced a number of criticisms since it announced the new set of tools and features. Organizations, including WhatsApp, have raised concerns. Additionally, an open letter to Apple has gathered more than 5,000 signatures from organizations and individuals in an effort to halt the release of these features. Apple will first integrate the Expanded Protections for Children plan into devices in the US. Only time will tell whether this FAQ changes the concerned minds of many.
