
U.S. News

Scammers Use Sophisticated New Technology To Terrorize California Family

This article was originally published elsewhere and is republished here under syndication. Publications approved for syndication, such as Microsoft News, Yahoo News, Newsbreak, UltimateNewswire and others, have permission to republish this article. To learn more about syndication opportunities, visit About Us.

A California family fell victim to scammers who used AI to replicate their son’s voice, leading his mother to believe he was in a car accident and needed bail money.

The scammers used the mother’s emotional distress to manipulate her into withdrawing $15,500.

The incident highlights the growing threat of AI-powered voice scams, and experts warn that such scams will likely increase.

The family, initially convinced by the scammers, eventually realized it was a hoax and reported the incident to the police.

The use of AI in such crimes is expected to rise, and experts advise caution, suggesting that individuals verify requests before taking any action involving money.

“It was my son’s voice on the phone crying, telling me, ‘Mom, mom, I’ve been in a car accident,'” mother Amy Trapp said.

“Scams like this are growing every day as deepfake technology improves and becomes easier to access,” Pioneer Development Group chief analytics officer Christopher Alexander said.

“Scams like this succeed because immense stress is placed on the victims in the hopes that imperfections in the deepfake go undetected.”

“Scam calls have been a consistent thorn in the side of Americans for years, and no one expects to be the person falling for it,” Bull Moose Project president Aiden Buzzetti said.

“The proliferation of AI voice replication is going to make even the most tech-savvy user susceptible to deception, especially when human instinct comes into play.”

“The criminals behind these robocalls fabricate high-stakes situations to keep parents and grandparents short-sighted, emotionally aggravated and prevent them from thinking clearly. We need to keep encouraging the federal government and state agencies to crack down on robocalls by prosecuting services that allow for foreign spam callers.”

“We are witnessing an alarming explosion in AI-powered voice scams, which now pose a growing threat to families. Criminals and scammers are capitalizing on advancements in generative AI to impersonate individuals with frightening accuracy,” research associate Jake Denton said.

“These scams tend to prey on unsuspecting people’s emotions, often mimicking the voice of a family member pleading urgently for help. To stay safe from these scams, families should take proactive measures now to protect themselves.”

“He said, ‘Don’t tell [the bank] why you’re getting the money because you don’t want to tarnish your son’s reputation,’” Trapp said, emphasizing that she “would have done anything he said” in her desperation.

“It all comes down to that voice being recognized by his own mother, who he speaks to several times a week,” father Andy Trapp said. “I never, ever, thought I would ever fall for anything like that.”

When the couple was told a courier was coming to pick up the money, alarm bells went off.

“That sounded totally wrong,” Andy said.

“Where is my son? Where is my son?” Amy screamed.

“Yo, what’s up?” Will Trapp said when answering the parents’ call.

“Phishing scams remain a serious issue. But now that AI enables bad actors to use sophisticated voice cloning technology, everyone with a cellphone is a potential target,” The Federalist’s Samuel Mangold-Lenett said. “More needs to be done to hold the people who run these scams accountable. Laws already on the books can likely be applied to situations like these, but perhaps more specific regulations ought to be implemented.”

“Many seniors and older people will not take the time to understand AI because they won’t use it, which will make them ripe targets for scams such as this,” Center for Advanced Preparedness and Threat Response Simulation (CAPTRS) founder Phil Siegel said. “Whether it’s being in jail, in a car accident, needing home repair money or many other scams, this will happen more often in the next few years.”

“The lesson is to always call the person back before doing anything,” Siegel said. “Never give someone you don’t know or haven’t hired any cash, wire money, Venmo money or cryptocoins. And, of course, if it is anything to do with law enforcement, check with them first.”

“It’s hard to imagine how that could be used because it’s not like my speaking voice,” son Will Trapp said. “It’s really scary.”

