Google is going back to having people analyze and rate anonymized audio clips from its users. But it has also taken the big step of automatically opting every user out of the setting that allows Google to store their audio. That's why you may receive an email today: Google wants you to opt back in to the program and is trying to provide clearer information about what it actually involves.
This is a big move that affects a huge number of people – though Google says the exact number of users receiving the email is confidential. It should land in the inbox of anyone who has interacted with a product that uses Google Assistant.
Here is the email, sent to virtually anyone who has ever spoken into a microphone with a Google logo next to it, which reads in part:
To keep you in control of the audio recording setting, we have disabled it for you until you can review the updated information. Visit your Google Account to review and enable audio recording settings if you choose.
The email links to this URL (which I'm listing because you should never just click a link to an account setting without double-checking it): https://myactivity.google.com/consent/assistant/vaa
It's hard to remember now, but last summer one of the biggest stories in tech was how every major company was using humans to review the quality of their AI assistants' transcriptions. When some of those audio recordings started leaking, it rocked Google, Amazon, Apple, Microsoft, and Facebook.
Tech's scandalous summer of 2019 was marked by technical explanations of how machine learning works, apologies, outrage, walkbacks, and, in the end, every company finally making it easier for users to know what data is stored and how to delete it. I've put a bunch of those stories in a sidebar just to give you a sense of how intense it was.
All of these companies got significantly better at providing real disclosures about how audio data was used, and made it easier to delete that data or opt out of providing it entirely. Most of the major tech companies also went back to using human reviewers to improve their services – with disclosures and/or by asking users to opt in again.
But Google did not bring back human reviewers after pausing the practice globally in September of last year. When it paused, it promised: "We won't include your audio in the human review process unless you've re-confirmed your [Voice & Audio Activity] VAA setting as on." Today's email is Google making good on that promise – albeit much later than everyone else.
Clicking the link in the email takes you to a very short page with a YouTube video explaining Google's policy. You can also click through to more detailed information about how Google stores and uses audio.
If you choose to let Google store your audio, it will be used in two ways. First, there is a period where it is linked to your account. Google uses this data to improve voice matching, and you can go there to review or delete any of it. As of June 2020, the default period after which this data is automatically deleted is 18 months.
Your audio may then be chopped up and "anonymized" so it can be sent to human reviewers to check the accuracy of the transcription. And since it has been a point of controversy, I'll add that some of those reviewers work for third-party contractors. Only anonymized data is sent to humans, Google says.
One strange caveat to all of this: even though Google has flipped the setting to store audio recordings off for everyone, that doesn't change the policies for audio that has already been uploaded. If you want it removed, you'll have to go do that yourself. If you're wondering, though, Google says humans will not review any audio uploaded during the pause.
If you want to opt out of or delete your data from any of these big companies, here are some links to get you started: