7 things you should never ask Siri, Google Assistant or Alexa
You’re suddenly thrown into a situation where you must perform CPR to save a life. Oh, no — you don’t remember anything from that course 15 years ago.
You might think a quick “Hey, Siri” would pull up clear instructions in seconds, but that’s absolutely the worst thing to do. In a recent study, researchers asked voice assistants about cardiac arrest emergencies. Yep, it was a complete disaster.
I don’t want you to make this mistake
When someone needs CPR, call 911. Period. Only nine of the 32 assistant responses in the study suggested this critical step. A whopping 88% pointed to a website where you could read the steps to perform CPR. Really?
If you need the steps or want to take a refresher course, here’s the link to the Red Cross website. You may have heard that “Stayin’ Alive” by the Bee Gees is an excellent song to keep in your head while doing CPR, since its tempo of roughly 100 beats per minute matches the recommended rate of 100 to 120 chest compressions per minute.
It’s great, but here are a few other recommendations you might remember better:
“Baby Shark” by Pinkfong
“Dancing Queen” by ABBA
“Girls Just Want to Have Fun” by Cyndi Lauper
“I Will Survive” by Gloria Gaynor
“Sweet Home Alabama” by Lynyrd Skynyrd
The idea that your smart assistant would direct you to a website in an emergency got me thinking about other commands you shouldn’t ask. Here are seven things you’re better off handling yourself.
1. Play doctor
You’re better off not asking Siri, Google or Alexa for any medical advice — not just lifesaving advice. Trusting those smart assistants might just make things worse. It’s always best to call or book a telehealth appointment with your doctor.
2. How to hurt someone
Don’t ask your smart assistant about harming someone, even if you’re just venting. Those chats with Siri or Google Assistant could come back to bite you if you end up on the wrong side of the law. Keep those kinds of thoughts to yourself.
3. Anything that ends up with your mug shot
Don’t ask Alexa where to buy drugs, where to hide a body or anything else suspicious. Like asking your smart assistant how to hurt someone, these types of questions could be used against you.
4. Be your telephone operator
If you need to call your closest Home Depot to see if they have something in stock, find the number yourself. Same goes for asking that assistant to call emergency services. Dialing 911 takes two seconds.
5. Deal with your money
Although voice assistants can connect to your financial apps, voice data comes with serious security risks. Savvy cybercriminals who get into your phone or clone your voice can use it to get at your accounts. Just log into your bank’s website or mobile app and call it a day.
6. “Will I die if I eat this?”
If you’re on a hike wondering whether the berries you found would make a good snack, voice assistants aren’t reliable sources. There’s conflicting information online about poisonous foods and plants, and taking your assistant’s advice could land you in the hospital.
7. “Get rid of this.”
Don’t ask Alexa or Siri to clear your search history, delete an app or remove photos. I’ve had a few mishaps where a simple misunderstanding led to something important getting wiped out. Trust me, it’s worth the extra minute to do it manually.
Smart assistants record everything
You can switch off those features if you don’t want Big Tech companies getting their virtual ears on what you say. Here’s how.
Some things are better left to human judgment. Stay smart with your smart assistants!
Keep your tech-know going
My popular podcast is called “Kim Komando Today.” It’s a solid 30 minutes of tech news, tips, and callers like you from all over the country with tech questions. Search for it wherever you get your podcasts. For your convenience, hit the link below for a recent episode.
PODCAST PICK: This fear keeps Sam Altman up at night
Plus, your AI girlfriend collects a lot of data. Kim and Andrew also talk about the White House’s plan to tackle deepfakes and take a look back at the first kiss ever recorded.
Check out my podcast “Kim Komando Today” on Apple, Google Podcasts, Spotify, or your favorite podcast player.
Sound like a tech pro, even if you’re not! Award-winning popular host Kim Komando is your secret weapon. Listen on 425+ radio stations or get the podcast. And join over 400,000 people who get her free 5-minute daily email newsletter.
Copyright 2024, WestStar Multimedia Entertainment. All rights reserved.