San Francisco Sues Adult Sites That Use AI to Generate Content

Will artificial intelligence kill online porn? Since most of you are not porn addicts, you probably don't know that the fastest-growing moneymaker in porn today is "nudifying" clothed people using AI. It's not only an invasion of privacy; the city of San Francisco says it's "sexual abuse."

The city attorney's office is suing 16 of the most visited websites on the internet that allow users to generate deepfake pornographic images of real women and even children. The city is also suing 50 "John Doe" individuals engaged in deepfake commerce. According to the suit, the website operators violate "revenge pornography" laws that prohibit producing explicit images of people without their consent.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” City Attorney David Chiu said in a statement shared with The San Francisco Standard. “We have to be very clear that this is not innovation—this is sexual abuse.”

The lawsuit asks a judge to shut down the websites and the services that generate such images, award monetary damages to victims, and add more defendants as they are discovered. The suit also seeks to permanently restrain the defendants from operating such sites and to recover civil penalties and legal costs.

The origins of this lawsuit are unique. The chief deputy city attorney in San Francisco, Yvonne Meré, has a 16-year-old daughter, and after reading about the deepfakes high school boys are generating to "nudify" their female classmates, Meré talked her co-workers into crafting a lawsuit that would shut down the sites used to create the explicit images.

New York Times:

The sites’ A.I. models have been trained using real pornography and images depicting child abuse to create the deepfakes, Mr. Chiu said. In mere seconds, the sites can make authentic-looking images of breasts and genitalia under real faces.

The technology has been used to create deepfake nudes of everyone from Taylor Swift to ordinary middle-school girls with few apparent repercussions. The images are sometimes used to extort victims for money or humiliate and harass them. Experts have warned that they can harm the victims’ mental health, reputations and physical safety, and damage their college and job prospects.

“You can be as internet-savvy and social media-savvy as you want, and you can teach your kids all the ways to protect themselves online, but none of that can protect them from somebody using these sites to do really awful, harmful things,” said Sara Eisenberg, the mother of a 9-year-old girl and "the head of the unit in the city attorney’s office that identifies major social problems and tries to solve them through legal action," according to The Times.

Once the images are "live," there's virtually no way to track them to their origin. The young women who are victims of this insidious commerce can't sue anyone, and anyone who paid to generate the deepfake is safe from discovery.

This is not about the First Amendment. It's not a matter of free speech. The images clearly injure individuals and are not protected in any way. It's the most personal violation of someone's privacy, and finding these perpetrators and fining them is the least that should be done.
