This new AI software makes you look like someone else in real time
When I saw this video on social media, it stopped me in my tracks. A new, free AI tool can turn a person into someone else with just one click.
Before you panic, let’s take a look at how it works and how you can protect yourself and your identity.
The easiest cloning software yet
Picture an app that, with a few clicks, turns anyone into a very convincing video clone of a mega-celebrity like Karl Urban (that handsome actor from New Zealand) or Elon Musk.
Deep-Live-Cam was created by a developer known as “Hacksider,” who has released around 70 similar projects since 2016. He says it’s a tool to help artists and animators work more efficiently. But like any technology, it can do real damage when the wrong people get their hands on it.
In this case, we’re talking misinformation and fraud. Deep-Live-Cam is far from the first tool of its kind, but it’s scarily accurate and by far the most popular. After the video demo went viral, it shot to the top of GitHub’s trending charts.
The scariest thing about it is how easy it is to install. Download a zip file, unpack it, run the installer, and you’re ready to be someone else.
Rather than lose sleep over the possibilities that come with accessible fraud tools, though, it’s important to stay vigilant. Here are a few ways to spot fake videos.
A deeper look
Deep-Live-Cam combines two inputs (a live webcam feed and a single still image of the person to imitate) into one output video. These fakes are so good that you’ll have to really analyze them to catch the few things AI still struggles to replicate:
- Facial hair and glasses: Most AI software is confused by complex facial features and accessories, slightly warping the illusion.
- Hair and skin tone: The closer the match, the better the final vid. If the subject of the video and the source in the image have different complexions, you might notice a plasticky look.
- Unnatural movement: Since the video is rendered in real time, the illusion breaks when anything passes in front of the face, so people using Deep-Live-Cam avoid touching theirs. Take a mental note if someone never touches their face and looks a little “off.”
Now you’ll have better luck spotting fake videos, but what about criminals using your likeness to make one?
Quick security checklist
It’s scary that anyone can get their hands on tools like this. That’s why it’s up to you to protect yourself.
- Go private: Unless you’re an online influencer (or trying to be one), set all your social media accounts to private.
- Monitor new followers: That new account with no profile picture might just be trying to steal your identity and trick the people closest to you. Don’t add anyone you don’t know.
- Stop posting nice headshots: If identity thieves can’t find a good enough photo to map to your face, they may just give up and move on.
- Use a tool that blocks AI: Glaze, for example, adds subtle, nearly invisible changes to your images that confuse AI programs trying to learn from them.
- Watermark your photos: Adding some text to your face in pictures takes just a few seconds but can save you a lot of trouble. This can be as hidden or visible as you like.
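If you’re comfortable with a little code, a visible watermark takes only a few lines with Python’s Pillow imaging library. This is a minimal sketch, not a polished tool; the function name, the placeholder handle and the filenames are all made up for the example:

```python
from PIL import Image, ImageDraw

def add_watermark(image, text="@my_handle", opacity=128):
    """Stamp semi-transparent text in a loose grid across a photo,
    so a simple crop can't remove every copy."""
    base = image.convert("RGBA")
    # Transparent overlay the same size as the photo
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    step = max(base.size) // 4 or 1  # grid spacing; avoid 0 on tiny images
    for x in range(0, base.width, step):
        for y in range(0, base.height, step):
            draw.text((x, y), text, fill=(255, 255, 255, opacity))
    # Blend the text layer onto the photo and drop the alpha channel
    return Image.alpha_composite(base, overlay).convert("RGB")

# Example with a blank in-memory image; swap in Image.open("your_photo.jpg")
photo = Image.new("RGB", (400, 300), (40, 40, 40))
add_watermark(photo).save("watermarked.jpg")
```

Lowering `opacity` makes the mark subtler; raising it makes the photo harder to repurpose but more cluttered. Repeating the text in a grid matters, because a single corner watermark can simply be cropped out.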
Perhaps most importantly, talk to the people in your life about this. Close friends, coworkers and family members are often targets for AI scams, like cloned voice calls and fake videos.
Don’t get left behind – Stay tech ahead
Award-winning host Kim Komando is your secret weapon for navigating tech.
- National radio show: Find your local station or listen to the podcast
- Daily newsletter: Join 575,000 people who read The Current (free!)
- Watch: On Kim’s YouTube channel
- Podcast: “Kim Komando Today” – Listen wherever you get podcasts
Tags: accessories, family, misinformation, scams, security, social media