Microsoft’s new privacy policy makes it clear that third-party contractors listen

April 3, 2022

One by one, technology companies are acknowledging that they are not only recording snippets of your voice data, but also sharing them with third parties for analysis.

Voice-based assistants and smart speakers have become extremely popular with consumers. Their rise brought growing concern about "companies listening in", with companies reassuring customers that this was not the case. As it turned out, it was. Companies also said at the time that random voice data was not being saved; it turns out it was. Now comes the third blow in this space: the revelation that third-party humans have heard some of those audio recordings. Apple and Facebook have both been caught doing it, and now Microsoft has come forward to say that it allows third parties to hear some of your voice data.

The voice recordings in question come from queries made to Microsoft's own AI assistant, Cortana. Cortana ships with every computer currently running Windows 10, and unless it is specifically turned off, the assistant is always listening for your commands. This voice data, Microsoft says, can sometimes be shared with third-party contractors after it has been anonymized. Microsoft formalized this by adding a statement to its privacy policy: "To build, train, and improve the accuracy of our automated methods of processing (including AI), we manually review some of the predictions and inferences produced by the automated methods against the underlying data from which the predictions and inferences were made. For example, we manually review short snippets of a small sampling of voice data that we have taken steps to de-identify to improve our speech services, such as recognition and translation."

While this means that every laptop running Windows 10 potentially lets Microsoft pick up snippets of your conversations, it is not all bleak. Microsoft does not offer a way to opt out of third parties listening to your audio, but you can turn off Cortana entirely if you wish. Additionally, you can visit the Microsoft Privacy Dashboard, where you will find any and all voice snippets the company has stored, along with a record of what Microsoft, or a third party, believes it heard.
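For readers who do want Cortana off entirely, one commonly cited route is the AllowCortana group policy, which can be set through the registry. This is a sketch based on the well-known policy key rather than anything described in this article, and the policy is honored on Windows 10 Pro and Enterprise editions:

```
Windows Registry Editor Version 5.00

; Disables Cortana system-wide by setting the AllowCortana policy to 0.
; Save as a .reg file and import, or set the value via Group Policy.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Windows Search]
"AllowCortana"=dword:00000000
```

A restart (or sign-out) is typically needed for the change to take effect. Note that disabling Cortana locally does not delete voice snippets Microsoft has already collected; the Privacy Dashboard is the place to review and remove those.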

At this point, it is perhaps not surprising that Microsoft is also having third parties audit snippets of audio from your Cortana queries. Amazon was caught doing the same after denying that it saved audio recordings. Unfortunately, with no legislation in place to govern how voice data is handled, consumers are left at the mercy of technology companies when it comes to protecting their data.
