AI voice-cloning scams: A persistent threat with limited guardrails
With just seconds of audio, artificial intelligence voice cloning programs can copy a voice so well that the result is virtually indistinguishable from the original to the human ear.
Why it matters: That tech can have legitimate accessibility and automation benefits, but it can also be an easy-to-use tool for scammers. Despite that threat, many products' guardrails can be easily sidestepped, a new assessment found.
- Imposter scams that weaponize voice cloning tech to impersonate a victim's loved ones, known to experts as "granny scams," are not a new phenomenon.
- However, "the pace at which it's now happening and the believability of the voice has fundamentally changed," says Rahul Sood, the chief product officer at Pindrop, a security company that develops authentication and fraud detection tools.
- It's not just individuals who are at risk, he noted. The corporate sector faces many cyber threats, from account takeover scams targeting call centers to recruiting impersonation.
Zoom in: A study out this week from Consumer Reports found many leading voice-cloning technology products lacked significant safeguards to prevent fraud or misuse.
- For four of the six products in the test set, researchers were able to "easily create" a voice clone using publicly accessible audio. Those services had no technical mechanism to confirm the speaker had consented to the cloning, or to limit cloning to the user's own voice.
- For four of those services, it was free to create a custom voice clone.
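The consent check the researchers found missing is often implemented, where it exists, as a "read this phrase aloud" challenge: the service generates a random statement, the user records themselves saying it, and the transcript is matched against the prompt. Here is a minimal, hypothetical sketch of that idea in Python; the phrase list and function names are illustrative, not any vendor's API, and the speech-to-text step (which would produce `transcript`) is out of scope.

```python
import difflib
import secrets

# Hypothetical pool of challenge phrases a service might prompt users to read.
PHRASES = [
    "blue river morning seven",
    "quiet lantern crosses the field",
    "paper windows open at noon",
]

def issue_consent_challenge() -> str:
    """Pick a random phrase the user must read aloud in the uploaded audio."""
    return secrets.choice(PHRASES)

def verify_consent(expected: str, transcript: str, threshold: float = 0.8) -> bool:
    """Fuzzy-match a speech-to-text transcript of the user's recording
    against the challenge phrase. A real system would obtain `transcript`
    from an ASR engine; here it is passed in directly."""
    ratio = difflib.SequenceMatcher(
        None, expected.lower().split(), transcript.lower().split()
    ).ratio()
    return ratio >= threshold

challenge = issue_consent_challenge()
# Suppose ASR of the uploaded recording returned these transcripts:
print(verify_consent("blue river morning seven", "blue river morning seven"))  # True
print(verify_consent("blue river morning seven", "totally unrelated speech"))  # False
```

Because the phrase is random and fresh per request, an attacker cannot satisfy the check with a pre-existing public recording of the target's voice, which is exactly the gap the study flagged.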
By the numbers: While the Federal Trade Commission does not have specific data on voice-cloning scams, over 845,000 imposter scams were reported in the U.S. in 2024.
The intrigue: Scams and spoofs using AI voice cloning and deepfake technology also often impersonate well-known individuals, like celebrities, CEOs and politicians.
- After former President Biden's voice was cloned using AI in fake robocalls discouraging voting in the New Hampshire primary, the Federal Communications Commission unanimously outlawed the use of AI-generated voices in scam robocalls.
- In July, Elon Musk shared a fake Kamala Harris ad that featured a phony voice that sounded just like the then-vice president, teeing up a debate over whether such media is obvious "parody" or dangerous AI.
What they're saying: Such scams on social media platforms are only growing, and voice cloning "is far more mature" and widely accessible today than facial cloning technology, Sood says.
- The commercial services Pindrop tracks are often "very easy to use," Sood added.
- He said the quality of voice cloning has now passed the so-called "uncanny valley," meaning the human ear can no longer detect the difference between what is human and what is machine-generated.
Philadelphia attorney Gary Schildhorn detailed to a Senate panel in 2023 how he almost became the victim of a voice-cloning imposter scam, when he received a call from his "son," who tearfully told him he was in a car accident with a pregnant woman and was in jail.
- What ensued was a multi-layer scam that ended with Schildhorn being told to wire money to the man claiming to be his son's attorney.
- "I'm a father; I'm a lawyer," he said. "My son's in trouble, a pregnant woman was hurt, he's in jail; I'm in action mode."
- But before he could send the money, Schildhorn received a call from his real son, who had not been in an accident and was not in jail.
The Consumer Reports assessment recommended mitigation practices that include requiring unique audio consent statements and watermarking AI-generated audio.
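To make the watermarking recommendation concrete: the idea is that a generator stamps every output file with a machine-readable mark identifying it as synthetic, which detection tools can later read back. The toy sketch below hides a payload in the least significant bit of 16-bit PCM samples. This is purely illustrative and labeled as such: production audio watermarks use robust, psychoacoustically informed schemes designed to survive compression and re-recording, which LSB embedding does not.

```python
def embed_watermark(samples: list[int], payload: bytes) -> list[int]:
    """Hide `payload` in the least significant bit of successive 16-bit
    PCM samples. Toy illustration of the watermarking concept only."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(samples):
        raise ValueError("audio too short for payload")
    out = list(samples)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # overwrite the lowest bit
    return out

def extract_watermark(samples: list[int], n_bytes: int) -> bytes:
    """Read the payload back out of the low bits, in the same bit order."""
    return bytes(
        sum((samples[b * 8 + i] & 1) << i for i in range(8))
        for b in range(n_bytes)
    )

audio = [100, -350, 22, 0, 512, -7, 90, 31] * 4  # stand-in PCM samples
tagged = embed_watermark(audio, b"AI")
print(extract_watermark(tagged, 2))  # b'AI'
```

The one-bit change per sample is inaudible, but any tool that knows the scheme can flag the clip as AI-generated, which is the property the Consumer Reports assessment is asking vendors to provide in a tamper-resistant form.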
- For individuals, experts told Axios, it can be prudent to rethink how they share their voice online, such as in custom voicemail greetings.
Yes, but: Steve Grobman, McAfee's chief technology officer, acknowledges it's not practical in a digital world to expect everyone to erase their voice from the internet.
- "I think of it a little bit like developing a healthy skepticism," he said, recommending a family code word to verify a caller's identity.
The bottom line: Grobman highlighted the legitimate, powerful benefits voice cloning tech can have: providing a voice for those who may not be able to speak, bridging language divides and saving time and resources.
- "I think in many ways, we have to think about our voice being out there as something that is a cost of doing business for all the great things the digital world of 2025 can bring to us," he added.
Go deeper: IBM researchers use AI voices to hijack phone calls