There are already enough concerns out there to keep people with anxiety up at night. The coronavirus is a perfect example: it’s spreading across the globe and scaring the bejesus out of everyone.
As if the coronavirus wasn’t bad enough on its own, criminals are now using public fears of catching the virus to rip people off. Tap or click here to find out how to avoid being tricked by coronavirus scams.
Think things can’t get any worse? Well, you’re wrong. This new tech could really be a game-changer for scammers.
We’ve discussed deepfake videos before. Facial mapping, artificial intelligence and deep machine learning are used to create ultra-realistic fake videos of people doing and saying things they haven’t actually said or done.
These deepfakes can be used for blackmail. Creepers have even been caught posting deepfake videos on popular porn sites and demanding payment from victims in exchange for removing them. Tap or click here for details on this shady extortion scam.
Now, a new twist on deepfake technology is emerging and it could lead to all kinds of problems.
We’re talking about AI-enabled voice cloning technology, and it could become the next big thing in security scams.
Pindrop, a company that focuses on voice fraud, is warning that voice cloning technology is becoming a huge threat. Criminals are cloning people’s voices and using them to commit scams.
During a recent presentation, Pindrop’s CEO said, “We’re starting to see deepfake audios emerge as a way to target particular speakers, especially if you’re the CEO of a company and you have a lot of YouTube content out there. What these fraudsters are starting to do is use that to start synthesizing your audio.”
Deepfake audio is being used in conjunction with familiar Business Email Compromise (BEC) attacks.
The FBI describes BEC as a sophisticated scam that targets businesses “working with foreign suppliers and/or businesses that regularly perform wire transfer payments.”
Basically, a BEC scammer tricks employees into wiring money or handing over sensitive information by impersonating an executive’s email account. These attacks are launched through social engineering, email spoofing or malware, and target employees at companies across the U.S.
You can probably see where this is heading.
Yep, with the help of voice cloning, BEC attacks become much more convincing. In theory, scammers can use a cloned copy of a CEO’s voice to convince employees to send money. The CEO never ordered the transfer, of course, but unsuspecting employees won’t know that.
The only good news is that the technology is still in its early stages and rarely seen in scams. But how long will that last? Cybercriminals always use the most sophisticated tools at their disposal, and once voice cloning goes mainstream you can bet they’ll use it.
Voicemails and phone calls might need to become a thing of the past if these scams become rampant. Face-to-face meetings could end up being the only secure way to conduct business.