Why asking Siri, Google or Alexa for medical advice is a terrible idea
You’re suddenly thrown into a situation where you must perform CPR to save a life. Oh no. You don’t remember anything from that course 15 years ago.
You might think a quick “Hey Siri” would pull up the instructions quickly and clearly, but that’s absolutely the worst thing to do. In a recent study, researchers asked voice assistants questions related to cardiac arrest emergencies. Yep. It was a complete disaster.
I don’t want you to make this mistake. I know this isn’t the most exciting topic to cover, but it is so very important.
When someone needs CPR, call 911. Period. Somehow, only nine of the 32 responses suggested this critical step. A whopping 88% of responses instead pointed to a website where you could read the steps to do CPR. Really?
🏥 If you need the steps or want to take a refresher course, here’s the link to the Red Cross website. You may have heard that “Stayin’ Alive” by the Bee Gees is an excellent song to sing when doing CPR. It is, since its tempo matches the recommended 100 to 120 chest compressions per minute, but here are a few other recommendations you might remember better:
- “Baby Shark” — Pinkfong
- “Dancing Queen” — ABBA
- “Girls Just Want to Have Fun” — Cyndi Lauper
- “I Will Survive” — Gloria Gaynor
- “Sweet Home Alabama” — Lynyrd Skynyrd
The idea that your smart assistant would direct you to a website in an emergency got me thinking about other commands you shouldn’t ask. Here are five things you’re better off handling yourself.
Play doctor: You’re better off not asking Siri, Google or Alexa for medical advice. Trusting those smart assistants might just make things worse. It’s always best to call your doctor or set up a telehealth appointment.
How to hurt someone: Don’t ask your smart assistant about harming someone, even if you’re just venting. You never know. Those chats with Siri or Google Assistant could come back to bite you if you end up on the wrong side of the law. Keep those kinds of thoughts to yourself.
Anything that ends up with your mug shot: Don’t ask Alexa where to buy drugs, where to hide a body or anything else suspicious. Like asking your smart assistant how to hurt someone, asking these types of questions could be used against you.
Be your telephone operator: If you need to call your closest Home Depot to see if they have something in stock, find the number yourself. Same goes for asking that assistant to call emergency services. Dialing 911 yourself takes two seconds.
Deal with your money: Although voice assistants can connect to bank or credit apps, voice data comes with plenty of security issues. Savvy cybercriminals can hack into your phone, clone your voice and use it to drain your accounts. Just log into your bank’s website or mobile app and call it a day.