7 things you should never ask Siri, Google Assistant or Alexa
You’re suddenly thrown into a situation where you must perform CPR to save a life. Oh, no — you don’t remember anything from that course 15 years ago.
You might think a quick “Hey, Siri” would pull up the instructions clearly, but that’s absolutely the worst thing to do. In a recent study, researchers asked voice assistants about cardiac arrest emergencies. Yep, it was a complete disaster.
I don’t want you to make this mistake
When someone needs CPR, call 911. Period. Only nine of the 32 responses in the study suggested this critical step. A whopping 88% of responses instead pointed to a website where you could read the steps to perform CPR. Really?
🏥 If you need the steps or want to take a refresher course, check the Red Cross website. You may have heard that “Stayin’ Alive” by the Bee Gees is an excellent song to sing while doing CPR, since its tempo of roughly 100 beats per minute matches the recommended rate for chest compressions.
It’s great, but here are a few other recommendations you might remember better:
- “Baby Shark” by Pinkfong
- “Dancing Queen” by ABBA
- “Girls Just Want to Have Fun” by Cyndi Lauper
- “I Will Survive” by Gloria Gaynor
- “Sweet Home Alabama” by Lynyrd Skynyrd
The idea that your smart assistant would direct you to a website in an emergency got me thinking about other commands you shouldn’t ask. Here are seven things you’re better off handling yourself.
1. Play doctor
You’re better off not asking Siri, Google or Alexa for any medical advice — not just lifesaving advice. Trusting those smart assistants might just make things worse. It’s always best to call or book a telehealth appointment with your doctor.
2. How to hurt someone
Don’t ask your smart assistant about harming someone, even if you’re just venting. Those chats with Siri or Google Assistant could come back to bite you if you end up on the wrong side of the law. Keep those kinds of thoughts to yourself.
3. Anything that ends up with your mugshot
Don’t ask Alexa where to buy drugs, where to hide a body or anything else suspicious. Like asking your smart assistant how to hurt someone, these types of questions could be used against you.
4. Be your telephone operator
If you need to call your closest Home Depot to see if something is in stock, find the number yourself. The same goes for asking your assistant to call emergency services. Dialing 911 takes two seconds.
5. Deal with your money
Although voice assistants can connect to your financial apps, voice data comes with plenty of security issues. Savvy cybercriminals can hack into your phone, clone your voice and use it to access your accounts. Just log into your bank’s website or mobile app and call it a day.
6. “Will I die if I eat this?”
If you’re on a hike wondering whether the berries you found would make a good snack, voice assistants aren’t reliable sources. There’s conflicting information online about poisonous foods and plants, and taking their advice could land you in the hospital.
7. “Get rid of this.”
Don’t ask Alexa or Siri to clear your search history, delete an app or remove photos. I’ve had a few mishaps where a simple misunderstanding led to something important getting wiped out. Trust me, it’s worth the extra minute to do it manually.
Smart assistants record everything
You can switch off those features if you don’t want Big Tech companies getting their virtual ears on what you say. Here’s how.
Some things are better left to human judgment. Stay smart with your smart assistants!
Tags: apps, cybercriminals, Google, home, Home Depot, security