A New Wave of Scams
Jennifer DeStefano, a mother from Arizona, received a frantic call from someone who sounded exactly like her 15-year-old daughter, Brianna. The voice pleaded for help before a man took over the call, claiming to have kidnapped her daughter and demanding as much as $1 million in ransom. DeStefano later discovered it was all a hoax, orchestrated using AI to clone her daughter's voice.
Similar stories abound: one family received a call in what sounded unmistakably like their daughter's voice, pleading for $5,000. Americans lost nearly $9 billion to scams in the past year, a 150% increase over two years that underscores how quickly these sophisticated frauds are spreading.
How AI Voice Cloning Works
AI voice-cloning technology can recreate a person’s voice with uncanny accuracy using just a few seconds of audio, often sourced from publicly available online content. These tools can now produce real-time responses, allowing scammers to simulate live conversations that feel authentic and personal.
Cybersecurity expert Pete Nicoletti demonstrated how easy it is to replicate a voice, even from old news clips. In one test, a cloned voice of a journalist successfully tricked the journalist's own mother into sharing sensitive information.
Protecting Yourself from AI Scams
As voice-cloning scams become more prevalent, experts recommend these steps to safeguard against them:
- Establish a Family Safe Word: Use a unique word or phrase known only to family members for identity verification during emergencies.
- Limit Social Media Exposure: Set accounts to private and avoid posting audio or video that could be exploited.
- Verify Suspicious Calls: If you receive an alarming call, hang up and contact the person directly through a trusted channel.
- Be Skeptical: Don’t trust voices, photos, or videos at face value, as AI can convincingly mimic all three.
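The safe-word idea above can be sketched in code. This is a minimal, hypothetical illustration (the phrase and helper names are made up): it normalizes what was spoken so harmless differences in casing and spacing don't cause a mismatch, then compares against the agreed phrase.

```python
import hmac
import unicodedata


def normalize(phrase: str) -> str:
    # Case-fold, Unicode-normalize, and trim so "Blue Heron " matches "blue heron".
    return unicodedata.normalize("NFKC", phrase).strip().casefold()


def verify_safe_word(spoken: str, expected: str) -> bool:
    # hmac.compare_digest compares in constant time; overkill for a family
    # phrase, but a good habit whenever secrets are compared.
    return hmac.compare_digest(normalize(spoken).encode(), normalize(expected).encode())


if __name__ == "__main__":
    FAMILY_SAFE_WORD = "blue heron at midnight"  # hypothetical example phrase
    print(verify_safe_word("  Blue Heron at Midnight ", FAMILY_SAFE_WORD))  # True
    print(verify_safe_word("purple heron", FAMILY_SAFE_WORD))  # False
```

The real defense, of course, is the human step: a caller who cannot produce the phrase fails verification no matter how familiar the voice sounds.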
A Stark Warning for the Future
Experts warn that as AI technology improves, scammers will be able to respond instantly during calls, making their deceptions even more realistic. Nicoletti noted, “You can’t trust the voice, the photo, or the video anymore,” emphasizing the importance of vigilance in the digital age.
AI voice-cloning scams are a sobering reminder of how technology can be weaponized. Awareness, caution, and preventive measures are critical to staying one step ahead of these evolving threats.
Reference: CBS News. (2023, July 13). How phone scammers are using AI to imitate voices [Video]. YouTube. https://www.youtube.com/watch?v=pJZYd_65xs4