Can you spot election deepfakes? Here’s how not to be duped


There was the deepfake audio robocall of President Biden telling you to hold your vote. And just last week, a phony video of Donald Trump with Black voters made the rounds.

AI deepfakes are a massive problem this election season, and it’s easy to get taken — especially when your news and social media feeds are full of this junk.

By the way, you’re not alone if you’ve been fooled. Nearly two-thirds of people can’t tell AI-generated images and voices from the real thing. Those are awful odds. Here are some rules of thumb to protect your vote.

Viral doesn’t mean verified

Almost all AI-generated slop online is peddled for clicks on social media, not published by major news outlets. Sure, those outlets still get tripped up now and then, but it’s rare.

I’m all for citizen journalism, but when it comes to our elections, stick to publications you know you can trust. Be wary of anonymous accounts that post without a legitimate person or organization attached to them.

If it’s some random person you’ve never heard of on Facebook, do your homework before you hit share.

Look for other coverage

Scammers can put together a convincing image or video but can’t fake the context. When President Biden or Donald Trump says something, I promise it will be reported a hundred times and recorded from 20 angles — especially if it’s outlandish.

Pro tip: Search for related keywords on Google and social media platforms like YouTube, TikTok and Instagram. If you’re not sure what to search for, take screenshots of key moments in the video and run a reverse image search.

Slow down

We’re all busy and in a hurry, but it’s worth slowing down — especially if something triggers a strong emotional reaction. Deepfakes are often created with exactly that in mind. The point is to make you mad, sad or scared enough to share.

When it comes to political figures, pay attention to mannerisms. They’re as unique as fingerprints. President Obama’s signature head lift and slight frown were present whenever he said, “Hi, everybody,” in his weekly addresses. If the star of a video seems like an impersonator, they very well could be.

When in doubt, use this AI image checklist

Election fakes are particularly tricky to spot because there’s so much public footage of politicians speaking in front of similar backgrounds to copy. But these guidelines can still help you figure out whether something is AI-generated:

  • Backgrounds: A vague, blurred background, smooth surfaces or lines that don’t match up are immediate red flags that an image is AI-generated.
  • Context: Use your head — if the scenery doesn’t match the current news, the season or what’s physically possible, it’s probably fake.
  • Proportions: Check for objects that look mushed together or seem too large or small. The same goes for features, especially ears, fingers and feet.
  • Angle: Deepfakes are the most convincing when the subject faces the camera directly. Glitches may appear once a person starts to turn to the side and move.
  • Text: AI can’t spell. Look for garbled or misspelled words on signs and labels.
  • Chins: Yep, you heard me. The lower half of the face is the No. 1 giveaway on AI-generated candidate videos. It’s subtle, but check to see if their chin or neck moves unnaturally or in an exaggerated way.
  • Fingers and hands: Look for weird positions, too many fingers, extra-long digits or hands out of place.

If you spot it, don’t spread it

I get that some of these images and videos are shocking or even hilarious — but they’re putting our elections at risk. Don’t contribute to the “Great American Fake-Off.” If you’re going to share something you know is AI-generated, call it out clearly in your text or post. You’re better off not sharing it at all.
