Grandma got played, but not by bingo

Let me tell you a true story that’ll shake you and hopefully protect someone you love from heartbreak or a well-rehearsed scam call.

It happened to a 90-year-old grandmother in Nashville. She picked up the phone because that’s what she’s always done. On the other end was a young woman, her voice trembling, the connection staticky. “Grandma, help! I was in a car accident. I need you to talk to a lawyer right now.”

Her granddaughter Amanda had just graduated from Auburn University. The voice? Familiar enough, especially with the poor signal. And Amanda calling from an emergency? Of course, Grandma was going to listen.

Next comes the ‘lawyer’ 

He said Amanda needed $6,000 immediately or she could end up in serious legal trouble. So the woman did what any terrified grandmother might do … she complied. She went to the bank and withdrew the cash.

An “errand person” came to her house and picked up the money. Yes, you read that right. Someone came to her front door. That’s how elaborate this scam was.

But later that night, something gnawed at her. She called the number again and said, “I want to talk to my granddaughter.” The scammer hung up. That’s when she dialed the real Amanda and realized the heartbreaking truth.

The good news

Amanda was OK. Her voice had been deepfaked by the scammers. The bad news? Grandma was scammed out of $6,000. Lied to. Manipulated. Her family was furious, not at the scammers, but at her.

She said her daughters made her feel embarrassed, even ashamed, that she fell for such a scam. That might be the worst part of this entire story.

Let me say this loud and clear: It is not her fault. This wasn’t a mistake; it was a targeted heist. A well-rehearsed act designed to prey on love and urgency.

You need to do this


Trump’s crypto push — July 5th, Hour 1


Will the U.S. be the world’s crypto capital? Here’s what to know before you invest. Plus, streaming fails, Taylor Swift vanity phone numbers, and a viral airport theory. Holly from Phoenix says her brother lost $400K to a Jennifer Aniston deepfake scam.

🤖 Deepfake diplomacy panic: Someone faked Marco Rubio, using AI-generated voice and messages to DM world leaders and U.S. officials via Signal. At least five targets bit, including three foreign ministers. The fake account even left voicemails. No word yet on who did it or if they got anything.

$539

Lost to the average deepfake call. Criminals use AI to impersonate Medicare workers, politicians, Amazon reps, insurance agents, you name it. When in doubt, hang up.

Deepfake red flags: Here’s how to spot if someone on a Google Meet, Zoom call or Teams meeting is really an AI bot. Ask them to wave their hand across their face. This can trigger a glitch. Watch for their lips not matching what they’re saying, changes in lighting, robotic movements or if they say, “AI is my master.”

🎭 Who owns your face? Get this. Denmark’s rewriting copyright law to give people ownership over their face, voice and vibe. Yes, really. If a deepfake of you pops up without consent, you can make platforms take it down. It’s the first law of its kind in Europe, and the U.S. might want to take notes. This is what ID theft will look like in 2035.

Deepfake dames on the loose: Award-winning actress Helen Mirren is warning fans after scammers used her name to send “charity” emails from drogogo91(at)gmail.com (yes, really). She says, again and in all caps: IT’S NOT HER. If you believe that 79-year-old Dame Helen is emailing you about crypto, it’s time for you to get off the internet, forever.

🪞 Deepfake boss attack: A crypto employee thought they were on a Zoom call with their company’s C-suite. Turns out it was North Korean hackers deepfaking the entire leadership team. That “Zoom extension” they asked you to download? Straight malware on macOS. Someone out there is cosplaying your manager to steal your crypto and mess with your M1 chip.

Deepfake p*rn is now a crime


The Take It Down Act is a big win for victims, but good luck getting shady sites to actually take your image down.

23andMe data sold for $256M


Your DNA is now in the hands of biotech giant Regeneron. They say they’ll protect it. Plus, Owen Wilson deepfake scams, Meta lets fraud off the hook, and phone-free vacations. Got T-Mobile? Here’s how to claim your part of the $350M data breach settlement.

🚨 Listen up: The FBI says scammers are now using deepfake audio to impersonate government officials. They clone voices that sound shockingly real to trick you into sending money or giving up personal info. Bottom line? If something feels off, hang up.

Deepfake Elon stole millions — May 17th, Hour 1


Want billions like Musk? Don’t fall for this crypto scam. Some people are paying $217K to freeze their bodies. I’ll also share five signs your phone might be tapped. Plus, I talk to Randy from Oregon, who wants to cash in on a viral video.

Deepfake dumpster fire: Jamie Lee Curtis went full Final Girl on Meta after a sketchy AI ad used her image in a fake endorsement. She posted to Insta, tagged Zuck directly and got the ad pulled. Lesson here: Don’t mess with someone who’s survived multiple maniac attacks in Halloween and nonstop sequels and reboots.

Posting deepfake nudes is now a federal crime


Finally, victims have protections. That doesn’t mean the internet will forget.

Elon Musk crypto scam - March 22nd, Hour 4


Did you see Elon Musk promote his new crypto on YouTube? That’s not him: it’s a deepfake. Plus, tasting food in VR, why you need a burner phone, and fake job postings.

2 in 2,000 

People could spot every deepfake image and video of faces. About 39% of people over 65 hadn’t even heard of deepfakes, and 60% of younger people (18-34) were way too confident they could spot fakes. Take the quiz yourself.

This is a Pisa work: Scammers created a deepfake voice of Italy’s defense minister to call big names like Giorgio Armani and Prada’s Patrizio Bertelli (paywall link). The fake minister said journalists had been kidnapped and they needed cash to pay the ransom. At least one fell for it, wiring over $1 million to scammers.

😡 Justice, served: This is horrifying. A woman received a webpage filled with deepfake porn of herself, along with detailed, disgusting fantasies, and the page included her phone number, address and other private info. She went into full detective mode. The culprit? A close friend from college. A judge threw the sicko in prison for nine years.

We’re No. 1, sadly: The U.S. is the leading nation for using AI to create sexually explicit images. In 2024 alone, Americans visited deepfake sites 59.7 million times to upload pics and create fake nudes, mostly of people they know and of celebrities. India and Japan came in second and third, respectively.

November 23rd, 2024


The DOJ is going after Google, calling for changes that could lead to breaking up the tech giant or even selling off Chrome. A deepfake scandal shuts down a school, Missouri cops are in trouble for searching women’s phones for nudes, and Meta deleted millions of scam accounts.