
Poll: Are you okay with Apple contractors listening to and grading Siri interactions?

A report yesterday from The Guardian placed Apple and Siri at the center of privacy concerns over voice-activated assistants. The report claimed that Apple contractors listen to Siri audio as part of Apple’s efforts to improve performance, and oftentimes hear sensitive conversations.

Apple defended the efforts in a statement, saying that “less than 1% of daily Siri activations” are used for grading. What do you think of this revelation?

The report detailed that many of the things heard by these reviewers are accidental activations of Siri – including private medical conversations, criminal dealings, and more. Recordings are also said to include location and contact details, as well as app data:

There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.

Furthermore, the people responsible for listening to and grading Siri responses are subcontractors, and yesterday’s report said that “there’s a high turnover” among this team.

In its statement, Apple said that Siri requests used for “grading” are not linked to any user’s Apple ID. It also noted that reviewers of the audio are held to Apple’s strict confidentiality requirements:

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

The report from The Guardian largely echoes a report from Bloomberg earlier this year, which detailed Amazon’s global team responsible for listening to Alexa audio clips.

On Six Colors, Jason Snell makes a strong case for at least having an opt-out process – or making this sort of Siri “grading” opt-in:

My feelings about this issue are the same as they are about Amazon: I’m not comfortable with the possibility that recordings made of me in my home or when I’m walking around with my devices will be listened to by other human beings, period. I’d much prefer automated systems handle all of these sorts of “improvement” tasks, and if that’s implausible, I’d like to be able to opt out of the process (or even better, make it opt-in).

I largely agree with Snell’s argument in this situation. If Apple finds it necessary to listen to Siri queries to improve the service, that’s fine, but at least give me the ability to opt out. Apple notes in its Siri privacy policy that users can turn off features like Location Services for Siri or turn off Siri altogether, but that isn’t a full solution.

What do you think of this situation? Are you okay with Apple listening to Siri queries in an effort to improve the service’s performance? Let us know in the poll below and down in the comments.

