Whenever a game-changing device or product comes out, skeptics are always the first to ask “What’s the catch?” Nothing is free in this world, after all. With technology, similar logic should apply. When something truly revolutionary is released, it’s always smart to ask yourself “What am I giving up in exchange for this?”
Many people have welcomed Amazon’s Alexa system into their homes without so much as a second thought. The device provides instant web services by voice command, which is a feature people have dreamed of since Star Trek first aired on TV.
These features are groundbreaking, so there has to be some sort of trade-off behind the scenes, right?
As it turns out, the truth is a bit more sinister. Alexa isn’t just smart because of its programming — it’s been getting human help on the back end. Thousands of employees around the world are analyzing audio clips from Alexa devices every day, and some of the things they’re picking up are private, personal, and disturbing.
“Alexa, are you listening?”
Alexa’s engineers are constantly tweaking the voice recognition algorithm that powers the home speaker, but doing so requires more than just coding knowledge. Real audio data is needed to build Alexa’s vocabulary and improve its ability to understand things like slang and regional dialects.
According to a new report by Bloomberg, this is why Amazon has employed a global team of analysts who listen to and transcribe audio samples from Alexa owners — including samples captured when the machine wasn’t activated, or was turned on by accident!
Working with as many as 1,000 audio clips each shift, these analysts report a mostly mundane workflow peppered with occasionally embarrassing or distressing content. The voice review process works by scanning harvested clips for “keywords” that Alexa is already familiar with, such as a brand name or musical artist. From here, the analysts listen to, transcribe and annotate the clips to improve Alexa’s overall recognition.
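The keyword pre-filter described above can be sketched in a few lines. This is purely illustrative, assuming a simple substring match against a list of known terms; the function and keyword names are hypothetical, not Amazon’s actual tooling.

```python
# Hypothetical sketch of a keyword pre-filter like the one described above.
# KNOWN_KEYWORDS and flag_for_review are illustrative names, not Amazon's code.

KNOWN_KEYWORDS = {"taylor swift", "spotify", "echo"}  # brands/artists the system already knows

def flag_for_review(transcript: str) -> bool:
    """Return True if a clip's rough transcript contains a known keyword,
    meaning a human analyst would go on to transcribe and annotate it."""
    text = transcript.lower()
    return any(keyword in text for keyword in KNOWN_KEYWORDS)

clips = [
    "play taylor swift on spotify",
    "random background chatter",
]
flagged = [c for c in clips if flag_for_review(c)]
```

Only clips that trip the filter would reach an analyst’s queue, which is how a single worker can plausibly churn through up to 1,000 clips per shift.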
However, the clips they receive aren’t always from a normal voice command. Due to the nature of sound recognition software, false positives can trigger the device to record audio. This has led to awkward recordings, like a woman singing off-key in the shower.
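The false-positive problem comes from how wake-word detection works: the device continuously scores incoming audio against the wake word and starts recording when the score crosses a threshold, so phrases that merely sound similar can slip through. A minimal sketch, with an entirely made-up threshold value:

```python
# Illustrative only: wake-word detectors assign a similarity score to audio
# and begin recording once the score crosses a threshold. The threshold
# value here is invented for the example.
WAKE_THRESHOLD = 0.85

def should_start_recording(wake_word_score: float) -> bool:
    """Trigger recording when the detector's confidence crosses the threshold."""
    return wake_word_score >= WAKE_THRESHOLD

should_start_recording(0.99)  # a clear "Alexa" -> True
should_start_recording(0.87)  # a similar-sounding phrase -> True (false positive)
should_start_recording(0.30)  # ordinary conversation -> False
```

Setting the threshold is a trade-off: raise it and the device misses real commands; lower it and more accidental recordings get made.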
Some of the clips analysts have received are darker in nature, like a child screaming for help. In one case, they heard what sounded like a sexual assault.
Amazon claims to have very strict protocols for how it handles the randomly audited clips it collects, saying your full name and address are never attached. The company allegedly has a workflow in place for analysts who find upsetting content, urging them to decompress in an internal chatroom with fellow employees.
How to stop Alexa from recording you
For those of you wondering how Amazon is getting away with this, the process is outlined in its terms and conditions (those lovely fine-print agreements you need to click before you can get to the good stuff). Alexa’s privacy settings acknowledge that your voice recordings might be analyzed during regular reviews of Alexa’s performance — even if you opt out of sharing clips with Amazon.
If you’re looking to get as much of your privacy back from Alexa as possible, your best bet is to access your stored recordings and delete them. Tap or click here to learn how to hear all your Alexa recordings and delete them, too. Scroll down to #4.
These are the same clips that analysts would be combing through, so by curating what analysts can and cannot access, you can enjoy your Alexa device minus the major creep factor.
Alexa might seem like something out of Orwell, but at least Amazon gives you the option to pull the plug on some of Big Brother’s listening habits.
The easiest way to ensure Alexa isn’t listening is to simply turn off the microphone on your Alexa-enabled device when you’re not using it. There is a button located on top of your Echo or Echo Dot that will turn off the mic.