
Fla. pol targeted in elaborate ‘car crash’ AI scam — which almost fooled his dad into forking over $35K

An attorney has issued a warning over an elaborate AI voice-cloning scam which fooled his own dad into nearly handing over $35,000.

Scammers impersonated Jay Shooster, 34, and called his dad Frank, 70, convincing him his son had been in a serious car accident, was arrested and needed bail money.

Terrified Frank, a retired attorney, said he was convinced it was his “hysterical” son and has been deeply traumatized by the scam.

Jay is running to represent Florida’s 91st District in the state House of Representatives, and thinks scammers managed to create a fake voice from his 15-second TV campaign ad.


Frank, also from Boca Raton, Florida, who was visiting his daughter in New York at the time, said: “Just as the Uber car arrived to take me into New York City, I got a phone call.

“It was my son, Jay. He was hysterical, but I knew his voice immediately.

“He said he had been in an accident, broke his nose, had 16 stitches, and was in police custody because he tested positive for alcohol after a breathalyzer.

“He blamed it on the cough syrup he had taken earlier.”

During the September 28 call, the impersonator, posing as Jay, pleaded with Frank not to tell anyone about the situation.

Moments later, a man identifying himself as ‘Mike Rivers’, a supposed attorney, called and said Jay needed a $35,000 cash bond to avoid being held in jail for several days.

The scam escalated when ‘Rivers’ instructed Frank to pay the bond via a cryptocurrency machine — an unconventional request that heightened Frank’s suspicions.

“I became suspicious when he told me to go to a Coinbase machine at Winn-Dixie,” Frank says. “I didn’t understand how that was part of the legal process.”

Frank eventually realized something was wrong after his daughter Lauren, Jay’s twin sister, and her friend discovered that AI voice-cloning scams were on the rise.

He ultimately hung up the phone.

“It’s devastating to get that kind of call,” said Frank.

“My son has worked so hard, and I was beside myself, thinking his career and campaign could be in ruins.”

Jay, who has presented on scams like this as an attorney, was shocked to find himself a target.


He speculated the scammers might have cloned his voice from his recent campaign ad, which had aired on television just days before the incident.

“I’ve been paying attention to AI and its effects on consumers, but nothing prepares you for when it happens to you,” Jay says.

“They did their research. They didn’t use my phone number, which fit the story that I was in jail without access to my phone.”

The scam’s sophistication left Jay stunned.

“All it takes is a few seconds of someone’s voice,” he said.

“The technology is so advanced that they could have easily pulled my voice from my 15-second campaign ad.

“There’s also other video footage of me online, so they could’ve used any of that to clone my voice.”

Jay is advocating for changes in AI regulation to prevent such scams from harming others.

“There are three key policy solutions we need,” he says. “First, AI companies must be held accountable if their products are misused.

“Second, companies should require authentication before cloning anyone’s voice. And third, AI-generated content should be watermarked, so it’s easily detectable, whether it’s a cloned voice or a fake video.”

If elected to the Florida House of Representatives, Jay plans to take action against the rising misuse of AI technology, including voice-cloning scams.

He aims to introduce legislation that would hold AI companies liable for misuse, ensuring they implement necessary safeguards such as voice authentication and watermarking.


“We need to create clear regulations to stop these types of crimes from happening,” Jay says. “It’s not just about technology — it’s about protecting people from the trauma and financial damage that can result from these scams.

“I want to push for more stringent requirements for AI developers to ensure their tools are not used maliciously.”

As AI technology rapidly evolves, Jay and Frank hope their story serves as a warning for others to stay vigilant.

“This shows how important it is to stay calm and think things through carefully,” Frank notes. “You have to listen and ask questions if something doesn’t add up. Scams like this are becoming more sophisticated, but we can’t let our guard down.”
