The change was announced by Apple SVP Craig Federighi, who said the company will collect information in a different way than before as it seeks to improve Siri's and the iPhone's ability to predict the information the user wants. Federighi touted this approach as differential privacy.
Wired published an extensive piece about what differential privacy means and how Apple plans to implement it. As it turns out, there are lots of unknowns in the game, but Re/code has managed to shed some light on a few.
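In broad strokes, differential privacy works by adding calibrated random noise to data before (or as) it is collected, so that aggregate patterns survive while no individual's contribution can be confidently recovered. Apple has not published its exact mechanism, but the classic randomized-response technique gives a minimal, purely illustrative sketch of the idea in Python (all names here are hypothetical, not Apple's API):

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.5) -> bool:
    """Report the true answer with probability p_truth; otherwise
    report a coin flip. The collector never learns which case occurred."""
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5  # random answer, independent of the truth

def estimate_proportion(reports, p_truth: float = 0.5) -> float:
    """Recover the true population proportion from noisy reports.

    Expected observed rate = p_truth * true_rate + (1 - p_truth) * 0.5,
    so we invert that linear relationship."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(0)
# Simulated population: 30% of 1,000 users have some sensitive attribute.
truth = [i < 300 for i in range(1000)]
reports = [randomized_response(t) for t in truth]

estimate = estimate_proportion(reports)
# The aggregate estimate lands near 0.30, even though any single
# report is plausibly deniable.
```

The key property is that each individual report carries deniability (a "yes" may just be the coin), yet over many users the noise averages out and the collector still learns the aggregate statistic.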
Apple says it is not using iOS users’ cloud-stored photos to power the image recognition features in iOS 10, instead relying on other data sets to train its algorithms. (Apple hasn’t said what data it is using for that, other than to make clear it is not using its users’ photos.)
As for what data is being collected, Apple says that differential privacy will initially be limited to four specific use cases: new words that users add to their local dictionaries, emoji typed by the user (so that Apple can suggest emoji replacements), deep links used inside apps (provided they are marked for public indexing), and lookup hints within Notes.
Differential privacy will debut with iOS 10; Apple has not collected such data until now. What’s interesting is that the feature will be optional: it will require the user’s opt-in consent, so you decide whether or not you want the benefits of the improved Siri and iPhone.