Apple admits listening to Siri conversations: report

April 3, 2022

Apple says the data is “used to help Siri and dictation … better understand you and recognize what you are saying”.

After Google and Amazon, Apple has now been caught listening in on users’ conversations. A whistleblower working for Apple told The Guardian on condition of anonymity that contractors working for the iPhone maker regularly hear confidential recordings, including drug deals and couples’ intimate moments, as part of their job to provide quality control, or “grading”. These contractors grade the recordings on a variety of criteria, such as “whether the voice assistant activation was intentional or accidental, whether the query was something Siri could be expected to help with, and whether Siri’s response was appropriate”.

According to the whistleblower, the main reason private conversations end up with Apple is accidental activation. Apple’s AI-powered virtual assistant Siri is built into a number of Apple devices, including the HomePod smart speaker and the Apple Watch, which are said to be the most frequent sources of accidental recordings. “The regularity of accidental triggers on the watch is incredibly high. The watch can record some snippets that will be 30 seconds – not so long but you can get a good idea of what’s happening,” the whistleblower was quoted as saying.

Apple does not explicitly state in its consumer-facing privacy documentation that “a small proportion of Siri recordings” are passed on to contractors around the world. Apple says the data is “used to help Siri and dictation … better understand you and recognize what you are saying”. “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID,” the company said, adding that the recordings amount to less than 1 percent of daily Siri activations and are usually only a few seconds long.

Virtual assistants can become active by accident when they mishear their wake phrase – in Apple’s case, “Hey Siri”. In its privacy document, Apple states that Siri data “is not linked to other data that Apple may have from your use of other Apple services”. Furthermore, the recordings reportedly carry no name or information that could easily be linked to other recordings, meaning they are effectively anonymous.

Interestingly, a few weeks earlier, Google was embroiled in a similar controversy, when it was revealed that Google Assistant, the company’s AI-powered assistant, listens to users’ conversations. At the time, Google gave a similar explanation: “As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the subtleties and pronunciation of a particular language. These language experts review and transcribe a small set of queries to help us better understand those languages. It’s an important part of the process of creating speech technology, and it’s essential for creating products like Google Assistant,” Google said.
