Why asking Siri, Google or Alexa for medical advice is a terrible idea

CPR training
© Rawpixelimages | Dreamstime.com

You’re suddenly thrown into a situation where you must perform CPR to save a life. Oh no. You don’t remember anything from that course 15 years ago.

You might think a quick “Hey Siri” would pull up the instructions quickly and clearly, but that’s absolutely the worst thing to do. In a recent study, researchers asked voice assistants questions related to cardiac arrest emergencies. Yep. It was a complete disaster.

I don’t want you to make this mistake. I know this isn’t the most exciting topic to cover, but it is so very important. 

When someone needs CPR, call 911. Period. Somehow, only nine of the 32 responses from the assistants suggested this critical step. A whopping 88% of responses instead pointed to a website where you could read the steps to do CPR. Really?  

🏥 If you need the steps or want to take a refresher course, here’s the link to the Red Cross website. You may have heard that “Stayin’ Alive” by the Bee Gees is an excellent song to sing when doing CPR. It is, but here are a few other recommended songs you might remember better:

  • “Baby Shark” — Pinkfong
  • “Dancing Queen” — ABBA
  • “Girls Just Want to Have Fun” — Cyndi Lauper
  • “I Will Survive” — Gloria Gaynor
  • “Sweet Home Alabama” — Lynyrd Skynyrd

The idea that your smart assistant would direct you to a website in an emergency got me thinking about other commands you shouldn’t ask. Here are seven things you’re better off handling yourself.

Play doctor: You’re better off not asking Siri, Google or Alexa for medical advice. Trusting those smart assistants might just make things worse. It’s always best to call your doctor or set up a telehealth visit.

How to hurt someone: Don’t ask your smart assistant about harming someone, even if you’re just venting. You never know. Those chats with Siri or Google Assistant could come back to bite you if you end up on the wrong side of the law. Keep those kinds of thoughts to yourself. 

Anything that ends up with your mug shot: Don’t ask Alexa where to buy drugs, where to hide a body or anything else suspicious. Like asking your smart assistant how to hurt someone, asking these types of questions could be used against you.

Be your telephone operator: If you need to call your closest Home Depot to see if they have something in stock, find the number yourself. The same goes for asking your assistant to call emergency services. Dialing 911 yourself takes two seconds. 

Deal with your money: Although voice assistants can connect to bank or credit apps, there are many security issues with voice data. Savvy cybercriminals can hack into your phone, steal your voice and use it to drain your accounts. Just log into your bank’s website or mobile app and call it a day.

Will I die if I eat this? If you’re on a hike wondering whether the berries you found would make a good snack, Siri and the others aren’t reliable sources. There’s conflicting information online about poisonous foods and plants, and taking their advice could land you in the hospital.

Get rid of this: Don’t ask Alexa or Siri to remove your search history, an app or photos. I’ve had a couple of mishaps where a simple misunderstanding led to something important getting wiped out. Trust me, it’s worth the extra minute to do it manually and save yourself the heartache.

Smart assistants record everything

If you don’t want Big Tech companies getting their virtual ears on what you say, you can switch those features off. Here’s how.

Some things are better left to human judgment. Stay smart with your smart assistants!

Tags: Amazon Alexa, Apple Siri, apps, cybercriminals, emergency, Google, medical advice, Red Cross, security, security issues, smart assistants, telehealth, voice assistants