
Apple Stops Siri Grading That Lets Employees Listen To Your Queries


Apple CEO Tim Cook has called user privacy a “fundamental right,” and the company has run multiple ad campaigns positioning its products as the best choice for people who want to keep their private matters private. But an explosive report in The Guardian last week revealed that Apple has been doing the same thing Amazon and Google do to improve their voice assistants. Under its Siri grading program, the Cupertino company has been letting employees and contractors listen to users’ Siri conversations. What’s worse, its ambiguous terms of service do not explicitly state that a human may be listening to your Siri recordings.

After the matter came to light, Apple went into damage-control mode. The company announced that it would temporarily suspend the Siri grading program, in which human contractors listen to select Siri recordings to check accuracy. Citing a whistle-blower, The Guardian reported that Apple’s employees and contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex.”

The iPhone maker said in a statement to TechCrunch, “We are committed to delivering a great Siri experience while protecting user privacy.” The company is suspending the Siri grading program worldwide. Apple added that it would roll out a software update in the future to let users choose whether they want to participate in Siri grading.

The Siri grading program involves humans listening to snippets of conversations and grading Siri’s responses. It helps improve the AI assistant’s accuracy and reduce accidental activations. The snippets are not linked to your Apple ID. “Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” said Apple.

Apple claims less than 1% of daily Siri requests are sent to contractors for analysis. But the whistle-blower told The Guardian that they were motivated to go public “because of their fears that such information could be misused.” Though the audio snippets are not associated with your name or Apple ID, they could contain personal information such as details about your finances, medical conditions, and more.

It’s not uncommon for technology companies to send audio snippets to their employees or contractors for evaluation. Amazon and Google have both been doing it for a long time to improve their respective voice assistants, Alexa and Google Assistant. But unlike Apple, they give users the option to opt out of the program. Apple doesn’t yet offer users a way to opt out, though it has promised to do so in a future software update.

Though Apple has temporarily suspended the Siri grading program, it stopped short of clarifying whether it would stop saving recordings of conversations on its servers. According to The Verge, Apple keeps the recordings along with your identifying information for six months. After that, the identifying information is removed, but the recordings remain on its servers for about two years.

Apple says in its terms of service that “certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols.” It doesn’t explicitly state that it has assigned humans to listen and evaluate your Siri queries.

Separately, Google has also decided to temporarily stop listening to Google Assistant recordings in European countries after German regulators opened an inquiry into its practices. Smart voice assistants are still in the early stages of their evolution, and companies need a lot of data to improve them. Apple’s Siri lags far behind Google Assistant and Alexa in accuracy and natural-voice recognition.

The Siri grading controversy is likely to damage Apple’s image as a privacy-focused company. The iPhone maker needs to reassure consumers that their data is safe with it; otherwise, it is no better than Google and other companies. These revelations come at a time when regulators across the globe are closely scrutinizing the business models of technology heavyweights.


Vikas Shukla
