Jennifer DeStefano, the mother of a 15-year-old girl, received a disturbing phone call on January 20, 2023, while taking her younger daughter, Aubrey, 13, to a dance rehearsal in Scottsdale, Arizona. The caller ID showed an “Unknown number,” yet she heard a familiar voice on the other end of the line. It belonged to her older daughter, Brianna DeStefano.
“Mom! I messed up!” screamed a girl’s voice.
“What did you do? What happened?” the mother asked.
“The voice sounded just like Brie’s, the inflection, everything,” DeStefano told CNN in an interview. “Then, all of a sudden, I heard a man say, ‘Lay down, put your head back.’” The deep male voice began to list off commands and threats: “Listen here. I have your daughter. You call the police, you call anybody, I’m gonna pop her so full of drugs. I’m gonna have my way with her then drop her off in Mexico, and you’re never going to see her again.”
At her mother’s request, Aubrey used her phone to try to call Brianna and text her father, who was with Brianna at a ski resort 110 miles away in northern Arizona. A puzzled Brianna called to tell her mother that she didn’t know what the fuss was about and that everything was fine. After a series of chaotic events that included a $1 million ransom demand, a 911 call and a frantic effort to reach Brianna, the “kidnapping” was exposed as a scam.
Imposter scams have been around for years with varying levels of sophistication. Some have involved a caller reaching out to grandparents and claiming their grandchild had been in an accident and needed money. Fake kidnappers have used generic recordings of people screaming.
But federal authorities warn these schemes are getting more sophisticated, and many recent cases have relied on a common tactic: cloned voices. The growth and adoption of readily available, accessible AI programs has allowed con artists to clone voices and create snippets of dialogue that sound like their purported captives. “The threat is not hypothetical — we are seeing scammers weaponize these tools,” said Hany Farid, a computer science professor at the University of California, Berkeley and a member of the Berkeley Artificial Intelligence Lab. “A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough,” he added.
According to her mother, Brianna has a social media presence: a private TikTok account and a public Instagram account with photos and videos from her ski racing events, though her followers are mostly close friends and family. With the help of AI software, voice cloning can be done for as little as $5 a month, making it easily accessible to anyone, Farid said. The growing use of AI in social engineering attacks is reflected in the increasing number of complaints to the Federal Trade Commission (FTC).
In the three months following ChatGPT’s release, the FTC saw a significant increase in social engineering complaints, with a 34% increase in reports related to imposter scams and a 50% increase in those related to government imposter scams. These scams often use AI-enabled tools to impersonate trusted institutions or individuals, making it easier for attackers to gain victims’ trust and steal their personal information.
Recommendation
RSM Defense Intelligence Analysts recommend several methods that could help mitigate similar tactics targeting personnel at either the business or personal level. When answering a phone call from an unknown number, keep responses short or remain silent until the caller speaks; this helps prevent robocalls from capturing a voice sample that could later be cloned. Also, use a challenge-and-password code phrase to authenticate the caller: a predetermined word or phrase, agreed on in advance, that validates the person on the other end. This is a well-known military tactic for verifying an individual on the battlefield, and the “coded challenge” seen in many WWII films can serve the same purpose when a technology option, such as verifiable and encrypted email, is not available. A minimal sketch of the same challenge-and-response idea is shown below.
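Where technology is available, the code-phrase tactic maps onto a standard challenge-response exchange built on a pre-shared secret. The Python sketch below is only an illustration of that principle, assuming a hypothetical secret value and example function names; it does not describe any specific RSM tooling.

# Illustrative sketch only: a pre-shared-secret challenge-response check,
# the digital analogue of the verbal code phrase described above.
# The secret value and function names are hypothetical examples.
import hmac
import hashlib
import secrets

# Both parties agree on this secret in advance, out of band (e.g., in person).
PRE_SHARED_SECRET = b"example-family-code-phrase"  # hypothetical value

def issue_challenge() -> str:
    """The verifier sends a random, single-use challenge to the caller."""
    return secrets.token_hex(16)

def compute_response(challenge: str, secret: bytes = PRE_SHARED_SECRET) -> str:
    """The genuine caller answers with an HMAC of the challenge under the shared secret."""
    return hmac.new(secret, challenge.encode(), hashlib.sha256).hexdigest()

def verify_response(challenge: str, response: str, secret: bytes = PRE_SHARED_SECRET) -> bool:
    """The verifier recomputes the HMAC and compares it in constant time."""
    expected = compute_response(challenge, secret)
    return hmac.compare_digest(expected, response)

if __name__ == "__main__":
    challenge = issue_challenge()          # verifier asks the caller to answer a random value
    answer = compute_response(challenge)   # only someone holding the secret can produce this
    print("Caller verified:", verify_response(challenge, answer))        # True
    print("Impostor verified:", verify_response(challenge, "a guess"))   # False

The point of the sketch is the same as the verbal code phrase: a random challenge plus a secret known only to the legitimate parties means a cloned voice alone is not enough to pass verification.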