CNN  — 

Jennifer DeStefano’s phone rang one afternoon as she climbed out of her car outside the dance studio where her younger daughter Aubrey had a rehearsal. The caller showed up as unknown, and she briefly contemplated not picking up.

But her older daughter, 15-year-old Brianna, was away training for a ski race and DeStefano feared it could be a medical emergency.

“Hello?” she answered on speaker phone as she locked her car and lugged her purse and laptop bag into the studio.

She was greeted by yelling and sobbing.

“Mom! I messed up!” screamed a girl’s voice.

“What did you do?!? What happened?!?” DeStefano asked.

“The voice sounded just like Brie’s, the inflection, everything,” she told CNN recently. “Then, all of a sudden, I heard a man say, ‘Lay down, put your head back.’ I’m thinking she’s being gurnied off the mountain, which is common in skiing. So I started to panic.”

As the cries for help continued in the background, a deep male voice started firing off commands: “Listen here. I have your daughter. You call the police, you call anybody, I’m gonna pop her so full of drugs. I’m gonna have my way with her then drop her off in Mexico, and you’re never going to see her again.”

DeStefano froze. Then she ran into the dance studio, shaking and screaming for help. She felt like she was suddenly drowning.

After a chaotic, rapid-fire series of events that included a $1 million ransom demand, a 911 call and a frantic effort to reach Brianna, the “kidnapping” was exposed as a scam. A puzzled Brianna called to tell her mother that she didn’t know what the fuss was about and that everything was fine.

But DeStefano, who lives in Arizona, will never forget those four minutes of terror and confusion – and the eerie sound of that familiar voice.

“A mother knows her child,” she said later. “You can hear your child cry across the building, and you know it’s yours.”

Artificial intelligence has made kidnapping scams more believable

The call came in on January 20 around 4:55 p.m. DeStefano had just pulled up outside the dance studio in Scottsdale, near Phoenix.

DeStefano now believes she was a victim of a virtual kidnapping scam that targets people around the country, frightening them with altered audio of loved ones’ voices and demanding money. In the United States, families lose an average of $11,000 in each fake-kidnapping scam, said Siobhan Johnson, a special agent and FBI spokesperson in Chicago.

Overall, Americans lost $2.6 billion last year in imposter scams, according to data from the Federal Trade Commission.

In audio of the 911 call provided to CNN by the Scottsdale Police Department, a mom at the dance studio tries to explain to the dispatcher what’s happening.

“So, a mother just came in, she received a phone call from someone who has her daughter … like a kidnapper on the phone saying he wants a million dollars,” the other mom says. “He won’t let her talk to her daughter.”

Jennifer DeStefano, right, was at a dance rehearsal for her younger daughter, Aubrey, center, when she got a call claiming that her older daughter, Brianna, left, had been kidnapped.

In the background, DeStefano can be heard shouting, “I want to talk to my daughter!”

The dispatcher immediately identified the call as a hoax.

“So that is a very popular scam,” she said. “Are they asking for her to go get gift cards and things like that?”

Imposter scams have been around for years. Sometimes, the caller reaches out to grandparents and says their grandchild has been in an accident and needs money. Fake kidnappers have used generic recordings of people screaming.

But federal officials warn such schemes are getting more sophisticated, and that some recent ones have one thing in common: cloned voices. The growth of cheap, accessible artificial intelligence (AI) programs has allowed con artists to clone voices and create snippets of dialogue that sound like their purported captives.

“The threat is not hypothetical — we are seeing scammers weaponize these tools,” said Hany Farid, a computer science professor at the University of California, Berkeley, and a member of the Berkeley Artificial Intelligence Lab.

“A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough,” he added. “The trend over the past few years has been that less and less data is needed to make a compelling fake.”

With the help of AI software, voice cloning can be done for as little as $5 a month, making it easily accessible to anyone, Farid said.

The Federal Trade Commission warned last month that scammers can get audio clips from victims’ social media posts.

“A scammer could use AI to clone the voice of your loved one,” the agency said in a statement. “All he needs is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you … (it will) sound just like your loved one.”

DeStefano: ‘It was … the sound of her voice’

Until that day, DeStefano had never heard of virtual kidnapping schemes. Law enforcement has not verified whether AI was used in her case, but DeStefano believes scammers cloned her daughter’s voice.

She’s just not sure how they got it.

Brianna has a social media presence – a private TikTok account and a public Instagram account with photos and videos from her ski racing events. But her followers are mostly close friends and family, DeStefano said.

“It was obviously the sound of her voice,” she said. “It was the crying, it was the sobbing. What really got to me is that she’s not a wailer. She’s not a screamer. She’s not a freak out. She’s more of an internal, try-to-contain, try-to-manage person. That’s what threw me off. It was the voice, matching with the crying.”

Jennifer DeStefano, right, with her daughter, Brianna: "A mother knows her child," she said.

That day in the dance studio, DeStefano persuaded the caller to lower the ransom amount. She asked her daughter, Aubrey, to use her phone to call Brianna or her dad, who was with her at a ski resort 110 miles away in northern Arizona.

Aubrey, 13, was shaking and crying as she listened to the screams she believed were her sister’s.

“Aubrey was … hearing all the vulgarities of what they were gonna do with her sister. A lot of swearing, threats,” DeStefano said.

Another mom took Aubrey’s phone and tried to reach DeStefano’s husband and Brianna. At that point, the threat still seemed real.

Many such scams originate in Mexico, the FBI says