Apple’s Siri Is Programmed To Never Say “Feminism”


Apple’s Siri is known for making quips and offering funny responses to certain questions like, “What is zero divided by zero?” However, the company has also gone to extreme lengths to keep it from saying certain words, including the word “feminism.”

Apple’s Siri deflects questions about feminism

The Guardian obtained leaked internal documents from Apple revealing the lengths the company has gone to in programming its Siri digital assistant. The guidelines were last updated in June 2018. They were leaked by a former Siri “grader,” one of the contractors Apple used to screen Siri’s responses and ensure they were accurate. The U.K. newspaper raised privacy concerns about the grading program last month, and Apple ended it in response to those concerns.

The guidelines for how Apple’s Siri handles questions about feminism state that the assistant should be “guarded when dealing with potentially controversial content.” They also state that Siri should deflect sensitive questions and take care to appear “neutral.”


According to The Guardian, Siri’s responses to questions about feminism have been rewritten. The new deflections include remarks about “treating humans equally.” For example, Siri might say, “I believe that all voices are created equal and worth equal respect.” The digital assistant previously responded with statements such as, “I just don’t get this whole gender thing,” and “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.”

The leaked document suggests that in cases where Siri doesn’t deflect questions about feminism, it should present the “feminism” entry from its knowledge graph neutrally. The knowledge graph pulls from Wikipedia and the dictionary included in the iPhone.

The leaked documents also showed that Siri’s responses to questions related to the #MeToo movement were rewritten. For example, the digital assistant once responded to being called a “slut” by saying, “I’d blush if I could.” That response was later rewritten to, “I won’t respond to that.”

All about Siri

The leaked documents also reveal guidelines to help people write like Siri, explaining that “in nearly all cases, Siri doesn’t have a point of view.” The digital assistant is also “non-human,” “incorporeal,” “placeless,” “genderless,” “playful” and “humble.” Interestingly, the guidelines also claim Siri “definitely wasn’t a human invention” and that its “true origin is unknown.” The digital assistant’s main directive is “to be helpful at all times” but also to never make people think it is human.

The grader who leaked the documents was concerned about the program, which reportedly offered little guidance on how to handle personal information contained in the recordings and transcripts of Siri conversations. The program was also very extensive. Graders reviewed nearly 7 million clips captured from iPads alone across 10 different regions and were expected to go through a similar number of transcripts from several other audio sources, including cars, Apple TV remotes and Bluetooth headsets.

Apple announced sweeping reforms for the Siri grading program late last month. The company revealed that it would no longer use contractors and would require users to opt in to share their information.
