Apple Reprogrammed Siri to ‘Deflect’ Questions on Feminism and #MeToo

Apple has re-engineered Siri, its voice assistant, to avoid answering questions on “sensitive topics” like the #MeToo movement and feminism.

Leaked internal guidelines from June 2018 instructed developers to have Siri handle questions about subjects like feminism and the #MeToo movement not by engaging, but by deflecting and ultimately "informing," according to a new report from The Guardian. The report noted that the guidelines were leaked by a contractor in Apple's grading program, which recently ended after privacy concerns were raised.

Apple told Siri’s design team there were three ways to respond to the questions: “disengage,” “deflect,” and “inform.” The company’s internal guidelines call for Siri to be “guarded when dealing with potentially controversial content,” while adding “care must be taken here to be neutral.”

When asked whether it was a feminist, Siri used to respond, "Sorry [user], I don't really know." Now, there are specific responses written for that query, including "I believe that all voices are created equal and worth equal respect" and "It seems to me that all humans should be treated equally."

The voice assistant’s software was also rewritten for topics related to the #MeToo movement. Previously, if a user called Siri a “slut,” it would respond: “I’d blush if I could.” Now, it says: “I won’t respond to that.”

"Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers," an Apple spokesperson said in an emailed statement. "Our approach is to be factual with inclusive responses rather than offer opinions."