
Scammers use voice-cloning AI to mimic relatives

In the near future, you may receive a call from a relative in urgent need of help, asking you to send money quickly. And you might be convinced it's really them, because you know their voice.

Artificial intelligence is changing that. New generative AI tools can create all manner of output from simple text prompts, including essays written in the style of a particular author, images worthy of art awards and, with just a snippet of a person's voice to work with, speech that sounds convincingly like a specific person.

In January, Microsoft researchers demonstrated a text-to-speech AI tool that can accurately simulate a person's voice given just a three-second audio sample. They haven't shared the code for others to play around with; instead, they warned that the tool, dubbed VALL-E, "may carry potential risks in misuse … such as spoofing voice identification or impersonating a specific speaker."

But a similar technology is already in the wild – and scammers are taking advantage of it. If they find 30 seconds of your voice online somewhere, there’s a good chance they can clone it — and get it to say anything.

"Two years ago, even a year ago, it took a lot of audio to clone a person's voice. Now … if you have a Facebook page … or if you've recorded a TikTok and your voice is in it for 30 seconds, people can clone your voice," Hany Farid, a professor of digital forensics at the University of California, Berkeley, told The Washington Post.

“The money is gone”

The Post reported on the danger this weekend, detailing how a Canadian family fell victim to scammers using AI voice cloning and lost thousands of dollars. The elderly parents were told by a "lawyer" that their son had killed an American diplomat in a car accident, was in jail and needed money for legal fees.

The supposed lawyer then purportedly put the son on the phone, who told the parents he loved them, appreciated them and needed the money. The cloned voice sounded "close enough for my parents to actually believe they were talking to me," the son, Benjamin Perkin, told the Post.

The parents sent more than $15,000 through a bitcoin terminal to — well, scammers, not their son as they thought.

“The money’s gone,” Perkin told the newspaper. “There is no insurance. There is no going back. It’s gone.”

ElevenLabs, a company that offers a generative AI voice tool, tweeted on Jan. 30 that it was seeing "an increasing number of voice cloning misuse cases." The next day, it announced that voice cloning would no longer be available to users of the free version of its VoiceLab tool.

"Almost all of the malicious content was generated by free, anonymous accounts," it wrote. "Additional identity verification is necessary. For this reason, VoiceLab will only be available on paid tiers." (Subscriptions start at $5 a month.)

Card verification won't stop every bad actor, the company conceded, but it would make users less anonymous and "force them to think twice."
