
‘Take It Down Act’ targets deepfake perverts exploiting teens online

Elliston Berry was 14 years old when a classmate used an AI editing app to turn her social media photo into a deepfake nude. He circulated the fake image on Snapchat. The next day, similar deepfake images of eight more girls spread among classmates.

The victims’ parents filed a Title IX complaint. Authorities charged the student who created the images with a class A misdemeanor. Still, the deepfake nudes stayed online. Berry’s mother appealed to Snapchat for more than eight months to remove the images. Only after U.S. Sen. Ted Cruz (R-Texas) personally contacted the company did Snapchat finally take the pictures down.

As AI becomes cheaper and more accessible, anyone can create exploitative digital content — and anyone can become a victim. In 2023, one in three deepfake tools allowed users to produce AI-generated pornography. With just one clear photo, anyone could create a 60-second pornographic video in under 25 minutes for free.

The explosion of deepfake pornography should surprise no one. Pornography accounted for 98% of all online deepfake videos in 2023. Women made up 99% of the victims.

Even though AI-generated images are fake, the consequences are real — humiliation, exploitation, and shattered reputations. Without strong laws, explicit deepfakes can haunt victims forever, circulating online, jeopardizing careers, and inflicting lifelong damage.

First lady Melania Trump has made tackling this crisis an early priority — and she’s right. In the digital age, technological advancement must come with stronger protections for kids and families online. AI’s power to innovate also carries a power to destroy. To curb its abuse, the first lady has championed the Take It Down Act, a bipartisan bill sponsored by Cruz and Sen. Amy Klobuchar (D-Minn.).

The bill would make it illegal to knowingly publish “nonconsensual intimate imagery” depicting real, identifiable people on social media or other online platforms. Crucially, it would also require websites to remove such images within 48 hours of receiving notice from a victim.

The Take It Down Act marks an essential first step in building federal protections for kids online. Pornography already peddles addiction in the guise of pleasure. AI-generated pornography, created without the subject's knowledge or consent, takes the exploitation even further. Deepfake porn spreads like wildfire. One in eight teenagers ages 13 to 17 knows someone who has been victimized by fake nudes.

The bill also holds AI porn creators accountable. Victims would finally gain the legal means to demand removal of deepfake images from social media and pornography sites alike.

Forty-nine states and Washington, D.C., ban the nonconsensual distribution of real intimate images, often called “revenge porn.” As AI technology advanced, 20 states also passed laws targeting the distribution of deepfake pornographic images.

State laws help, but they cannot fully protect Americans in a borderless digital world. AI-generated pornography demands a federal solution. The Take It Down Act would guarantee justice for victims no matter where they live — and force websites to comply with the 48-hour removal rule.

We are grateful that the first lady has fought for this cause and that the Senate has acted. Now the House must follow. With President Trump’s signature, this critical protection for victims of digital exploitation can finally become law.


