Don’t believe your ears. Is voice cloning the future for clever scammers?
There are already enough worries out there to keep anxious people up at night. Think things can’t get any worse? Well, you’re wrong. New voice impersonation technology could be a game-changer for scammers.
Who dis?
We’ve discussed deepfake videos before. Artificial intelligence, facial mapping, and deep machine learning enable the creation of ultra-realistic fake videos that show individuals doing or saying things they have never actually done or said.
Scammers can use these deepfakes to blackmail people. In fact, there have been cases where creepers created deepfake videos and posted them on popular porn sites, extorting victims into paying money in exchange for their removal. Tap or click here for details on this shady extortion scam.
Now, a new twist on deepfake technology is emerging and it could lead to all kinds of problems.
We’re talking about AI-enabled voice impersonation technology, and it could become the next big thing in security scams.
Pindrop, a security company that specializes in detecting voice fraud, is warning that voice cloning technology is becoming a serious threat. Criminals are cloning people’s voices and using them to commit scams.
During a recent presentation, Pindrop’s CEO said, “We’re starting to see deepfake audios emerge as a way to target particular speakers, especially if you’re the CEO of a company and you have a lot of YouTube content out there. What these fraudsters are starting to do is use that to start synthesizing your audio.”
Scammers are using deepfake audio in combination with the familiar Business Email Compromise (BEC) attack. The FBI describes BEC as a sophisticated scam that targets businesses “working with foreign suppliers and/or businesses that regularly perform wire transfer payments.”
Basically, a BEC scammer tries to trick employees into wiring money or handing over sensitive information by impersonating an executive’s email account. Scammers launch these attacks using social engineering tricks, email spoofing, or malware, and they target employees at companies across the U.S.
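To make the spoofing angle concrete: one common BEC trick is registering a lookalike domain so the From line survives a quick glance. Here’s a minimal, illustrative Python sketch of a filter that flags senders whose domain merely resembles a trusted one. The domain names and similarity threshold are assumptions for the example, not a real product or the FBI’s method.

```python
from difflib import SequenceMatcher
from email.utils import parseaddr

# Illustrative: the one domain this hypothetical company actually owns.
TRUSTED_DOMAINS = {"examplecorp.com"}

def lookalike_score(domain: str, trusted: str) -> float:
    """Similarity ratio (0-1) between two domain strings."""
    return SequenceMatcher(None, domain, trusted).ratio()

def is_suspicious_sender(from_header: str, threshold: float = 0.8) -> bool:
    """Flag senders whose domain resembles, but isn't, a trusted domain.

    BEC scammers often register lookalikes (say, 'examplec0rp.com' with
    a zero) so the From line passes a quick glance.
    """
    _, address = parseaddr(from_header)
    domain = address.rpartition("@")[2].lower()
    if domain in TRUSTED_DOMAINS:
        return False  # exact match: the real domain
    return any(lookalike_score(domain, t) >= threshold for t in TRUSTED_DOMAINS)

print(is_suspicious_sender("CEO <ceo@examplecorp.com>"))  # False
print(is_suspicious_sender("CEO <ceo@examplec0rp.com>"))  # True
```

A real mail gateway would combine a check like this with SPF, DKIM and DMARC verification, but the idea is the same: exact matches are fine, near-misses are the red flag.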
How voice impersonation tech can change the game
Yep, with the help of voice cloning, BEC attacks become much more convincing. In theory, scammers can use a cloned copy of a CEO’s voice to convince employees to send money. Of course, the real CEO never ordered the transfer, but unsuspecting employees won’t know that.
The only good news is that the technology is still in its early stages and rarely shows up in real-world scams. But how long will that last? Cybercriminals always use the most sophisticated tools at their disposal, and once voice cloning goes mainstream, you can bet they’ll use it.
Voicemails and phone calls might need to become a thing of the past if these scams become rampant. Face-to-face could end up being the only secure way to conduct business.
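Short of meeting in person, the most practical defense is a simple policy: never act on a voice or email request alone. As a rough sketch of that rule in Python (the channel names and data structure here are made up for illustration, not a standard), it looks like this:

```python
from dataclasses import dataclass

# Channels a cloned voice or spoofed email could arrive through (illustrative).
UNVERIFIED_CHANNELS = {"email", "phone", "voicemail"}

@dataclass
class TransferRequest:
    amount: float
    requester: str
    channel: str                         # how the request arrived
    confirmed_out_of_band: bool = False  # e.g., a callback to a number on file

def should_execute(request: TransferRequest) -> bool:
    """Only act on risky channels after an out-of-band confirmation.

    A cloned voice can pass an inbound phone call; it can't answer a
    callback to the number the company already has on file.
    """
    if request.channel in UNVERIFIED_CHANNELS:
        return request.confirmed_out_of_band
    return True

# The "CEO" calls asking for a wire transfer: blocked until someone
# calls the real CEO back on a known number.
urgent = TransferRequest(50_000, "CEO", "phone")
print(should_execute(urgent))  # False
urgent.confirmed_out_of_band = True
print(should_execute(urgent))  # True
```

The point isn’t the code itself; it’s that the confirmation has to travel over a channel the scammer doesn’t control.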