Fraudsters are using artificial intelligence (AI) to mimic voices and pull off lucrative scams. Voice cloning technology can record and imitate a person’s voice, which scammers then use to extort money. For example, Jennifer DeStefano of Phoenix recently testified to Congress that she received a phone call from scammers who claimed to have kidnapped her daughter and who used AI to imitate the girl’s voice.
Virtual kidnapping scams are on the rise across the US; according to the Federal Trade Commission, Americans lost around $2.6 billion last year to these and similar scams. Other reported scenarios include phone calls ostensibly from family members saying they are in trouble and need money. The voice is imitated either by capturing samples from social media or by phoning a person’s home and recording whoever answers.
Hany Farid, a digital forensics professor at the University of California, Berkeley, said, “If you made a TikTok video with your voice on it, that’s enough. This is part of a continuum. We started with the spam calls, then email phishing scams, then text message phishing scams. So this is the natural evolution of these scams.”
The Federal Trade Commission has issued guidance on how to avoid falling victim to these schemes. It suggests that families agree on a code word to confirm that a caller is who they claim to be, and that, when answering a call, you let the caller speak first. A caller asking for money is also a red flag that should be independently verified. Farid suggests answering only calls from familiar numbers or calls that have been arranged in advance.
Makers of voice-imitation software acknowledge the dangers. One such company, Eleven Labs, says voice-related fraud is on the rise; it produces identity verification software, though this is available to users only for an extra fee. Microsoft’s Vall-E, which is not yet publicly available, will “include a protocol to ensure that the speaker approves the use of their voice.”