

AI-driven deepfake sites that ‘undress’ women and girls face landmark lawsuit

San Francisco officials filed a landmark lawsuit against popular deepfake websites that use artificial intelligence to “undress” images of clothed women and girls.

The city attorney’s office is suing 16 of the most viewed AI “undressing” sites – which were collectively visited more than 200 million times in just the first half of 2024, the suit said.

The websites allow users to upload images of real, clothed people – which the AI then “undresses” and turns into fake nude images. 

San Francisco City Attorney David Chiu announced Thursday that his office is suing 16 deepfake nude website operators in a landmark case. (AP)

“[I]magine wasting time taking her out on dates, when you can just use [our website] to get her nudes,” one of the deepfake websites said, according to the lawsuit.

The lawsuit said the deepfake nudes are made without consent and are used to intimidate, bully and extort women and girls in California and across the country.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” San Francisco City Attorney David Chiu said in a statement.

“This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible,” he said.

The “undressing” sites have broken federal and state laws banning revenge pornography, deepfake pornography and child pornography, the lawsuit said.

The suit claimed the defendants also broke California’s unfair competition law because “the harm they cause to consumers greatly outweighs any benefits associated with those practices.”

The city attorney’s office is seeking civil penalties and the removal of the deepfake websites, as well as measures to prevent the site owners from creating deepfake pornography in the future.

The lawsuit referenced a case from February 2024, when five students were expelled from a California middle school after creating and sharing AI-generated nude images of 16 eighth-grade students.

A deepfake shows singer Katy Perry attending the Met Gala. (Katyperry/Instagram)

“I feel like I didn’t have a choice in what happened to me or what happened to my body,” a victim of deepfake nudes said, according to the lawsuit.

Another said she and her family live in “hopelessness and perpetual fear that, at any time, such images can reappear and be viewed by countless others.”

The lawsuit is the first to take on deepfake nude generators head-on. 

As the AI industry ramps up, deepfakes – or AI-generated and manipulated images – have become more mainstream. 

AI-generated images often spread misinformation like wildfire.

A deepfake image of the pope confused social media users in 2023, until media outlets confirmed it was made by AI. (AP)

A deepfake of Pope Francis in a white Balenciaga puffer jacket went viral in 2023; many believed it was real until news outlets began reporting otherwise.

But often, deepfakes turn sinister.

Fake nude images of children landed at the top of some Microsoft and Google search results, according to an NBC News report in March.

Non-consensual deepfake nudes of celebrities like Taylor Swift have circulated the web.

The AI-generated nude images can also fuel sextortion schemes, in which a victim is forced to pay money to prevent the release of the fake images.

“We have to be very clear that this is not innovation – this is sexual abuse,” Chiu said. “We all need to do our part to crack down on bad actors using AI to exploit and abuse real people, including children.”
