
Published 12:20 IST, July 29th 2019

Apple workers eavesdrop on your sensitive private conversations to 'improve' performance of Siri voice assistant: Report

Apple's human contractors listen to your private Siri recordings, as part of their job to improve the quality of Apple's voice assistant, according to the Guardian

Reported by: Tech Desk

In what could be a shocking report, Apple's human contractors listen to your private Siri recordings as part of their job to improve the quality of Apple's voice assistant. According to the Guardian, Apple uses certain recordings of yours in an attempt to help Siri better understand what users have to say. Apparently, these recordings include confidential medical information, drug deals and, in some cases, recordings of people engaging in sexual acts.

Apple does not explicitly disclose this in its consumer-facing documentation. However, a small proportion of the recordings are passed on to Apple contractors around the world, and these human contractors are tasked with assessing Siri on a number of factors, such as accidental activation and appropriateness of responses. This is what Apple has to say:

"(The data) is used to help Siri and dictation … understand you better and recognise what you say.”

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” Apple told the Guardian.

READ | After Amazon, Google workers also listen to your private Google Home audio files, claims a new report

Apple also said that less than 1 per cent of daily Siri activations are used for grading Siri, and typically, those recordings are only a few seconds long. Meanwhile, an anonymous contractor tasked with grading Siri also raised privacy concerns about the lack of transparency and disclosure, especially considering the frequency at which accidental activations pick up extremely sensitive personal recordings.

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data," an anonymous contractor told the Guardian.

Recently, search giant Google acknowledged that its human workers can access some recordings of what you say to Google Assistant through Google Home smart speakers. Previously, Amazon was also caught allowing its workers to listen to at least some of what you say to Alexa, aiming to improve its voice recognition.
