Report: Apple Reprogrammed Siri To Deflect Questions About Feminism, #MeToo

Following criticism from feminist activists, Apple directed its developers to modify how the Siri voice assistant responds to questions about feminism, the #MeToo movement, and other “sensitive topics.”

Internal documents leaked to The Guardian by a former Siri “grader” reveal that an internal Apple project instructed developers to alter the program so that it never says the word “feminism” when asked about the topic, and instead responds that it is in favor of “equality.”

Previously, Siri would respond to “Are you a feminist?” by replying, “Sorry [user], I don’t really know.”

Now, when asked about feminism, Siri replies, “It seems to me that all humans should be treated equally.”

Similar responses are elicited when Siri is asked “How do you feel about gender equality?” and “What’s your opinion about women’s rights?”

In the past, Siri’s responses were more dismissive, stating, “I just don’t get this whole gender thing,” and “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.”

The guidelines, last updated by Apple in June 2018, advise developers to program Siri not to engage users on controversial issues and to avoid taking a stance on them, instead “deflect[ing]” and “inform[ing].”

Responses to questions on sensitive topics “can be deflected,” Apple states, “however, care must be taken here to be neutral.”

For feminism-related questions where Siri does not reply with deflections about “treating humans equally,” the document suggests the best outcome is neutrally presenting the “feminism” entry in Siri’s “knowledge graph,” which pulls information from Wikipedia and the iPhone’s dictionary, The Guardian reports.

After Apple received a slew of complaints from #MeToo activists about Siri’s responses to sexual harassment, the company reprogrammed Siri to disengage when users interact with it in an inappropriate manner.

When asked to define the word “slut,” the service previously responded, “I’d blush if I could.”

Now, Siri responds, “I won’t respond to that.”

Leah Fessler, a reporter with the publication Quartz At Work, sexually harassed Siri to see how it would respond and reported her findings in 2017.

As a result, the social network Care2 petitioned Apple to “reprogram their bots to push back against sexual harassment” and the tech company modified Siri’s responses.

The internal guidelines also instructed developers to ensure Siri is “non-human,” “incorporeal,” “placeless,” “genderless,” “playful,” and “humble,” noting that “in nearly all cases, Siri doesn’t have a point of view.”

The Siri grader was employed, along with thousands of other contractors, to monitor the voice assistant’s responses for accuracy.

The contractors were reviewing 7 million audio files amassed from iPads, with plans to review approximately the same number of clips from five other sources, including “cars, Bluetooth headsets, and Apple TV remotes,” The Guardian reports.

However, in August, Apple ended the practice of having human contractors listen to and “grade” Siri recordings, following complaints that contractors had eavesdropped on users in breach of privacy laws, regularly hearing medical information, drug deals, and recordings of couples having sex while performing quality control.