Apple Contractors Often Hear Sensitive Recordings Captured Accidentally by Siri: Report

Apple is in hot water after a new report detailed how the company's contractors often hear sensitive recordings accidentally captured by Siri.

According to a new report from The Guardian, the key issue is that Siri can be accidentally triggered by audio cues, anything from the sound of a zip to the word "Syria," as well as by gestures such as raising an Apple Watch at a certain angle. These inadvertent activations aren't just frequent; on the Watch, they also produce 30-second recordings, a fraction of which are shared with Apple's contractors for analysis, where reviewers "can gather a good idea of what's going on."

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” The Guardian's source said. “These recordings are accompanied by user data showing location, contact details, and app data.”

While contractors aren’t specifically listening for private activities in the recordings, the whistleblower claims that Apple’s safeguards are minimal: “If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

The source told The Guardian they were motivated to speak out by fears that such information could be misused.

“There’s not much vetting of who works there, and the amount of data we’re free to look through seems quite broad,” the source said. “It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.”

While Apple says, in response to the report, that the recordings taken by Siri and sent for grading are “pseudonymized,” “not associated with the user’s Apple ID,” and “analyzed in secure facilities” by reviewers working under “strict confidentiality requirements,” it’s a long way from the company’s promise that what happens on your iPhone stays on your iPhone.

The latest report follows similar revelations about Amazon and Google. Most recently, Google was forced to apologize after a contractor leaked more than 1,000 Google Assistant audio recordings, many of which were clearly captured by mistake. Amazon has also come under fire for not sufficiently anonymizing voice data collected by Alexa.
