Deepfake bot 'undresses' 100K women with fake nude photos on messaging app
Deepfakes have been a topic of controversy ever since they first emerged on the web. To make one, creators take ordinary photos of a person and digitally stitch them onto video footage using machine-learning software. The result: an uncanny clip of someone who was never filmed.
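Curious what that “stitching” looks like under the hood? Here’s a deliberately simplified Python sketch using OpenCV: it finds a face in each frame of a video and blends a replacement face over it. Real deepfakes rely on deep autoencoder or GAN models rather than this crude overlay, and the file names below are just placeholders.

```python
# Toy illustration of face "stitching" -- NOT a real deepfake pipeline.
# Real deepfakes train autoencoder/GAN models; this simply blends a still
# face over any face detected in a video. File names are placeholders.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
source_face = cv2.imread("source_face.jpg")   # face to paste in (placeholder)

cap = cv2.VideoCapture("target_clip.mp4")     # video to modify (placeholder)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        patch = cv2.resize(source_face, (w, h))
        # Blend instead of hard-pasting so the edges are less jarring
        frame[y:y+h, x:x+w] = cv2.addWeighted(
            frame[y:y+h, x:x+w], 0.3, patch, 0.7, 0)
    cv2.imshow("face swap (toy)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```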
Prior to 2020, security analysts feared that deepfakes would be deployed to interfere with elections. Those fears led Facebook to ban deepfake videos outright earlier this year. Tap or click here to see how Facebook made this decision.
Believe it or not, deepfakes are still being used every day — but not for political purposes. Instead, cybercriminals are taking ordinary photos from social media to create deepfake revenge porn that anyone can use for blackmail. Here’s what you need to know so you can protect your reputation.
Deepfakes find their niche: revenge porn
If you thought deepfakes would be used to trick you into voting one way or another, think again. Sensity, an online deepfake monitoring firm, recently uncovered a bot service that creates custom deepfake porn based on images uploaded by users. This means any photo, no matter how tame, could be merged with explicit imagery.
The service was discovered on Telegram, a messaging app that offers encrypted communications between users. To create a custom deepfake, a user would send the bot a photo of a woman they wanted to see nude. The bot would then use AI to strip the clothing from the image, generate a fake nude body, and merge it with the original picture.
Have more questions on deepfakes? Tap or click here to see Kim’s in-depth coverage.
All images generated by the bot were watermarked, so users had to pay a fee to download them without one. According to a poll run by the bot, most users were interested in deepfakes of women they knew personally rather than celebrities. Yikes!
This service wasn’t limited to a small group of customers, either. Sensity found that nearly 101,000 members were using the service by the end of July 2020 and that most of the users were based in Russia and Eastern Europe. What’s more, at least 680,000 women had their likenesses stolen from social media and used by the bot.
The network still appears to be active, and many more women may end up victimized without ever knowing it. Unlike real revenge porn, which is addressed by laws in multiple states and countries, deepfake porn is fabricated rather than authentic, so it often falls outside those statutes and can’t be fought as effectively in court.
How do I know if my images were used? What can I do to protect myself?
Right now, the only way you’ll know your pictures were used in a deepfake is if you come across the file itself. Unless you wander the shady corners of the web, you’ll probably never find it. Not every customer who uses the bot publicizes the media they purchase; some may keep the files and never share them.
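If you do stumble across a suspect image, one rough way to check whether it was built from one of your own photos is perceptual hashing, which measures how visually similar two images are even after edits. Here’s a minimal Python sketch assuming the Pillow and imagehash packages; the file names are placeholders, and keep in mind that a heavily manipulated deepfake may still slip past this kind of check.

```python
# Minimal sketch: compare a suspect image against your own photo using a
# perceptual hash. A small Hamming distance suggests the suspect image was
# derived from your original. File names are placeholders.
from PIL import Image
import imagehash

my_hash = imagehash.phash(Image.open("my_profile_photo.jpg"))
suspect_hash = imagehash.phash(Image.open("suspect_image.jpg"))

distance = my_hash - suspect_hash   # imagehash overloads '-' as Hamming distance
print(f"Hamming distance: {distance}")
if distance <= 10:                  # rough heuristic threshold
    print("The suspect image is likely derived from your photo.")
```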
Working from home? You need a VPN now more than ever
You’ve heard Kim talk about how important it is to have a virtual private network (VPN). Now that many of you are working from home, it’s more important than ever to choose a VPN you can trust.
The coronavirus pandemic has prompted many companies across the U.S. to have employees work from home (WFH). That means it’s not just the personal information stored on your device that’s at risk from security threats, but your work data, too.
5 alternatives to FaceTime for Android
There’s just something about video chat that makes you feel close to faraway friends and family. Apple’s FaceTime is one of the best options out there for connecting via video, but what if you use an Android phone?
First off, don’t be fooled by apps in the Google Play Store branded as FaceTime. They’re fakes, and they could harm your device. Tap or click here to learn about other dangerous apps you need to avoid.