Scams are an underground industry

You already know the usual scams. Romance, crypto, tech support, job offers, fake investments, and the list goes on. But what you might not realize is that a growing number of people behind these scams aren’t evil masterminds. 

They’re victims, too.

Across Southeast Asia, scam compounds have become a massive underground industry. These operations run out of hotels and warehouses and are worth up to $19 billion a year in Cambodia alone. That’s more than half the country’s entire GDP.

People are being lured in by fake job ads, kidnapped and forced into these human trafficking compounds where they’re made to scam people all day long. They are beaten, locked up and told to meet daily scam quotas. 

If they don’t, they’re punished. Some have even died trying to escape.

🎯 Now they’re after kids 

This isn’t just about fake emails and text messages anymore. Scam networks are using AI and deepfake tech to create fake explicit content of tweens and teens. They use it to blackmail families in sextortion scams, threatening to release real and fake explicit images unless they get paid. 

Some kids have died by suicide after realizing they shared an intimate photo with a scammer who then threatened to send it to friends and family, or post it on social media. This is getting darker by the day.

🛡️ Protect yourself and your family

The next time you get a sketchy message from someone you don’t know, it might not be a criminal in a hoodie. It could be someone who was tricked, trafficked and forced to work inside one of these scam mills.

  • Stop and think. Scammers target emotion, desperation, loneliness, fear, even love. If it feels off, walk away.
  • Don’t click links in messages. If it’s legit, you’ll be able to find it from the source.
  • Talk to your family. Especially your tweens and teens. They need to know that they are being targeted. Often, it’s a swap of a nudie pic that leads to sextortion.

Scams are only profitable if people fall for them. If less money flows into these organizations, maybe the scales will tip against such hostile industries. 

Deepfake Elon Musk stole her money

Laura from California fell for a crypto scam you won’t believe. Why Gen Z grads are saying no to Big Tech jobs, Tesla’s 24-hour charging diner, Apple’s foldable iPhone. Plus, a cool tip to get your own voice on Waze.

Make AI dad go away: Zelda Williams slammed fans for sending AI-generated clips of her late father, Robin Williams, calling them “disgusting, over-processed hotdogs” of human art. She said he’d never want his voice or face used that way, and I get it. Let him rest, folks. Chill out on the deepfake Sora 2 celeb videos. 

$36 million

That’s how much AI “nudify” sites are making each year turning regular photos into fake nudes. New research examined 85 deepfake websites where someone could take your selfie and, with a few clicks, turn it into something you definitely didn’t sign off on.

🎭 Deepfake stole her home: A 66-year-old California woman lost her life savings and home after scammers used AI deepfakes to impersonate soap star Steve Burton. You know the drill: "Steve" said he was in love and they'd be together forever. But he needed money. She sent him $81K, then he pushed her into selling her $350K condo for quick cash. By the time her daughter intervened, the house was long gone.

🌀 Grift of gab: An AI deepfake of their grandson’s voice convinced an 83-year-old Pennsylvania woman and her husband to hand over $18K in cash. Scammers even used rideshare drivers to ferry them to the bank, twice. Police have the footage, but the cash is gone. Family code words could’ve saved them. 

🗣️ AI’s new favorite party trick? Stealing your TikTok rants, word-for-um-filled-word, and deepfaking them with a totally different face and voice. A wild “incinerator at Alligator Alcatraz” video hit 20M views, copied from a real person, made by a robot. And TikTok barely flinched. Next up: a deepfake of you reacting to this deepfake.

Fake views, real gas money: A couple drove three hours to visit a scenic cable car spot … that doesn’t exist. It was conjured by Google’s Veo 3 AI video maker and posted by a fake AI news channel on TikTok. Deepfake scams are up 2,137% since 2022, and now your vacation plans might be fake, too.

🙊 Catfishing finance: Scammers are using deepfake Zoom calls (paywall link) and cloned exec voices to steal millions. Companies keep falling for it. One Hong Kong firm wired $25M after a video call with “executives” who turned out to be AI sock puppets. Congrats to the bots, you now officially have LinkedIn clout.

🤖 Deepfake diplomacy panic: Someone faked Marco Rubio, using AI-generated voice and messages to DM world leaders and U.S. officials via Signal. At least five targets bit, including three foreign ministers. The fake account even left voicemails. No word yet on who did it or if they got anything.

Trump’s crypto push — July 5th, Hour 1

Will the U.S. be the world’s crypto capital? Here’s what to know before you invest. Plus, streaming fails, Taylor Swift vanity phone numbers, and a viral airport theory. Holly from Phoenix says her brother lost $400K to a Jennifer Aniston deepfake scam.

🎭 Who owns your face? Get this. Denmark’s rewriting copyright law to give people ownership over their face, voice and vibe. Yes, really. If a deepfake of you pops up without consent, you can make platforms take it down. It’s the first law of its kind in Europe, and the U.S. might want to take notes. This will be ID theft in 2035.

Deepfake dames on the loose: Award-winning actress Helen Mirren is warning fans after scammers used her name to send “charity” emails from drogogo91(at)gmail.com (yes, really). She says, again, in all caps: IT’S NOT HER. If you believe 79-year-old Dame Helen is emailing you about crypto, it’s time for you to get off the internet, forever.

🪞 Deepfake boss attack: A crypto employee thought they were on a Zoom call with their company’s C-suite. Turns out it was North Korean hackers deepfaking the entire leadership team. That “Zoom extension” they asked you to download? Straight malware on macOS. Someone out there is cosplaying your manager to steal your crypto and mess with your M1 chip.

Deepfake p*rn is now a crime

The Take It Down Act is a big win for victims, but good luck getting shady sites to actually take your image down.

23andMe data sold for $256M

Your DNA is now in the hands of biotech giant Regeneron. They say they’ll protect it. Plus, Owen Wilson deepfake scams, Meta lets fraud off the hook, and phone-free vacations. Got T-Mobile? Here’s how to claim your part of the $350M data breach settlement.

🚨 Listen up: The FBI says scammers are now using deepfake audio to impersonate government officials. They clone voices that sound shockingly real to trick you into sending money or giving up personal info. Bottom line? If something feels off, hang up.

Deepfake Elon stole millions — May 17th, Hour 1

Want billions like Musk? Don’t fall for this crypto scam. Some people are paying $217K to freeze their bodies. I’ll also share five signs your phone might be tapped. Plus, I talk to Randy from Oregon, who wants to cash in on a viral video.

Deepfake dumpster fire: Jamie Lee Curtis went full Final Girl on Meta after a sketchy AI ad used her image in a fake endorsement. She posted to Insta, tagged Zuck directly and got the ad pulled. Lesson here: Don’t mess with someone who’s survived multiple maniac attacks in Halloween and nonstop sequels and reboots.

Posting deepfake nudes is now a federal crime

Finally, victims have protections. That doesn’t mean the internet will forget.