Terrifyingly realistic deepfake video is a scary celebrity mashup

Whenever you hear a piece of news that sounds too crazy to be true, your first reaction is probably that you need proof. That's the most sensible thing to do.

Look for a piece of audio, or better yet, a video that captured whatever supposedly went down. But there's technology now that can fool you into thinking something happened, when it actually didn't.

These clips are known as deepfake videos, and the technology is getting scary good. A recent example blends a couple of celebrities together and has been tricking tons of people on social media.

Deepfake video depicts Jennifer Buscemi

Ok, so this deepfake video isn't malicious in any way. It's not going to stir up controversy that could lead to World War III.

But if you watch the video, you can see just how much the technology has improved over the years. In this clip, someone took a speech given by Jennifer Lawrence at the 2016 Golden Globe Awards and replaced her face with Steve Buscemi's. It's crazy how well it works.

Check out the clip below to see this deepfake video:

The video has gone viral on social media. Thousands of people have been sharing it on sites like Facebook and Twitter, with some calling it the stuff nightmares are made of.

It was originally posted on Reddit by someone with the username VillainGuy. He said, "Steve Buscemi + Jennifer Lawrence discussing her favorite/least favorite housewives on the Bravo channel. Trained on my custom model, trying to achieve more detail."

What is deepfake video technology?

Deepfake video technology uses facial mapping, artificial intelligence and deep machine learning to create ultra-realistic fake videos. It can make people appear to say and do things they've never actually said or done.

All it takes is a collection of images and video clips of the target. Deepfake software processes that material, then mimics the person's voice, facial expressions and even individual mannerisms.
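In broad strokes, many face-swap deepfakes rest on one idea: a shared encoder learns a compact code for "a face in general," and a separate decoder per person learns to rebuild that person's face from the code. Feed person A's code into person B's decoder and you get A's expression on B's face. Here is a toy sketch of that mechanic using made-up random data and a simple linear autoencoder; it is an illustration of the concept, not the code of any real deepfake tool:

```python
# Toy sketch of the shared-encoder / two-decoder idea behind face swaps.
# All data here is random stand-in "faces"; DIM and CODE are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

DIM, CODE = 64, 8  # flattened-frame size and latent-code size (toy values)

# Stand-in frame collections for two people.
faces_a = rng.normal(size=(100, DIM))
faces_b = rng.normal(size=(100, DIM))

# One encoder shared by both people: every face maps into the same code space.
encoder = rng.normal(size=(DIM, CODE))

def fit_decoder(faces, encoder):
    """Least-squares decoder mapping shared codes back to this person's faces."""
    codes = faces @ encoder                        # (n, CODE)
    decoder, *_ = np.linalg.lstsq(codes, faces, rcond=None)
    return decoder                                 # (CODE, DIM)

decoder_a = fit_decoder(faces_a, encoder)
decoder_b = fit_decoder(faces_b, encoder)

# The "swap": encode a frame of person A, but decode it with B's decoder,
# producing person B's face driven by person A's expression code.
frame_a = faces_a[0]
swapped = (frame_a @ encoder) @ decoder_b
print(swapped.shape)
```

Real tools replace the linear maps with deep convolutional networks trained on thousands of frames, which is why results like the Jennifer Lawrence/Steve Buscemi clip look so convincing.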

The technology is getting to be so good, some are worried that it could lead to problems. Imagine if someone created a deepfake video of a world leader saying things that could lead to real chaos. For example, a nation's leader could be made to say their country just launched nuclear missiles at another country. How would the world react?

Not only do we have to worry about these types of fake videos tricking us, but what about real videos of people doing things they don't want to own up to? Once deepfake video technology is perfected and we can't distinguish between what's real and fake, a person will be able to claim a real video is fake. Not good!

Bonus: The serious threat of deepfake videos

Ultra-realistic but fake videos are being made of people saying and doing things they’ve never done. It’s deepfake technology and in this Consumer Tech Update, Kim explains why it’s even more disturbing and dangerous than fake news.

