Another major tech company is coming forward to say it cares about your privacy. This time around it’s Google with regards to its Google Assistant.
Consumers recently discovered that voice assistants such as Amazon’s Alexa, Apple’s Siri and Google’s Assistant were listening to us all the time. Even worse, the devices were recording our conversations, which were then turned over to third-party contractors whose employees listened in and transcribed what they heard.
After public outcry, Amazon, Apple and Google announced ways you could protect your privacy. Google is going a step further with its Assistant. We’ll tell you what the company has planned and how you can take advantage of the changes.
Google pauses human reviews
In response to users’ concerns about humans listening to their conversations, Google immediately paused human transcriptions around the world, according to a company blog post. The company stated that it conducted a full review of its systems and controls.
That review has resulted in additions to the company’s user policy. Unlike Alexa, Assistant does not store your conversations by default.
Audio data can only be stored if a user opts into the Voice & Audio Activity (VAA) setting on the Assistant. Soon, the company will add language informing users that by turning on VAA they are also agreeing to allow human reviewers to listen to “snippets” of their audio.
The company emphasized that humans review only about 0.2% of all audio snippets, and those snippets are never associated with user accounts. Google stated that it plans to add more security protections to the human review process, including an extra layer of privacy filters.
If you have already opted into VAA, Google said it will not resume the human-review process until you have re-confirmed that your VAA setting is switched to the “On” position.
To turn VAA on or off, go to your Google Account. In the left navigation panel, click Data & Personalization. In the Activity controls panel, click Voice & Audio Activity, then toggle the switch to your preferred setting.
If you opt into VAA, you can still delete recordings at any time.
Decreasing collection of audio data
In its blog post, Google also stated that it is updating its policy to reduce the amount of audio data it stores for users who have opted into VAA. It said that starting later this year, it will delete the vast majority of recorded data that’s older than a few months.
That recorded audio is supposedly used to improve the Assistant’s performance. All virtual assistants use machine-learning technology that must be trained to understand certain words, languages and accents, which is why human transcribers are brought in.
Related: How to tell Alexa to delete what you say
It’s interesting to note that Google is promoting its updated Assistant policies at a time when it is getting some blowback over another Google Assistant-enabled device. The Google Nest Hub Max now features Face Match, a facial recognition program, on its smart display.
The front-facing camera on the Hub Max’s smart display enables Google’s Face Match. Google said this allows the Hub Max to recognize each face in a household so it can anticipate each person’s needs or commands.
If that sounds too intrusive to you, Google said Face Match is not enabled by default. Anyone who wants to use it has to set it up by taking photos of themselves in the Google Home app on their phone and then sending them to the hub.
Google said that if you later want to opt out of Face Match, the hub will delete your facial data from the smart display. Google further emphasized that all facial recognition is done locally on the device, and nothing is sent to or stored in the cloud.
Whether you use Assistant, Alexa or Siri, it is important to know the recording and data-storage policies of each service. Here at Komando.com, we strive to put this information right at your fingertips.