

Teens using AI to make fake nudes of classmates

They’ve turned tech into a weapon — and no one’s safe from the scandal.

Teens are using artificial intelligence to whip up disturbingly realistic nude images of their classmates — and then share them like digital wildfire, sending shockwaves through schools and leaving experts fearing the worst.

The AI-powered tools, often dubbed “nudify” apps, are as sinister as they sound. With just a headshot — often lifted from a yearbook photo or social media profile — these apps can fabricate explicit deepfake images that appear scarily real.

And yes, it’s already happening in schools.

AI “nudify” apps are fueling a disturbing trend among teens: generating fake nude images of classmates and spreading them with devastating consequences.

These hyper-realistic images, forged with AI tools, are turning bullying into a high-tech nightmare.

“We’re at a place now where you can be doing nothing and stories and pictures about you are posted online,” Don Austin, superintendent of the Palo Alto Unified School District, told Fox News Digital.

“They’re fabricated. They’re completely made up through AI and it can have your voice or face. That’s a whole other world.”

This is a full-blown digital crisis. Last summer, the San Francisco City Attorney’s office sued 16 so-called “nudify” websites for allegedly violating laws around child exploitation and nonconsensual images. 

Those sites alone racked up more than 200 million visits in the first half of 2023.


But catching the tech companies behind these tools? That’s like playing a game of Whac-A-Mole. 

Most have skated past current state laws, though some — like Minnesota — are trying to pass legislation to hold them accountable for the havoc they’re wreaking.

Still, the tech moves faster than the law — and kids are getting caught in the crossfire.

AI apps are making bullying disturbingly easy — no skills needed, just a face and a few taps to create shockingly real fake nudes.

Josh Ochs, founder of SmartSocial — an organization that trains families on online safety — told Fox News Digital that AI-generated nudes are causing “extreme harm” to teens across the country.

“Kids these days will upload maybe a headshot of another kid at school and the app will recreate the body of the person as though they’re nude,” Ochs revealed to the outlet.

“This causes extreme harm to that kid that might be in the photo, and especially their friends as well and a whole family,” he noted.

He said parents need to stop tiptoeing around their children’s digital lives — and start laying down some boundaries.

“Before you give your kids a phone or social media, it’s time to have that discussion early and often. Hey, this is a loaner for you, and I can take it back at any time because you could really hurt our family,” Ochs said.

In February, the U.S. Senate unanimously passed a bill to criminalize publishing — or even threatening to publish — nonconsensual AI deepfake porn. 

It now awaits further action.

Austin said the only way to get ahead of the curve is to keep talking — with parents, teachers, students, and anyone else who will listen.

“This isn’t going away,” he warned. “It’s evolving — and fast.”
