San Francisco City Attorney David Chiu has announced that his office is suing the operators of 16 websites that use AI to create and distribute non-consensual deepfake nude images of women and girls. The move comes amid heightened attention on the creation and spread of non-consensual AI-generated imagery.
The lawsuit, which Chiu says is the first of its kind in San Francisco, accuses the websites' operators of violating state and federal laws prohibiting deepfake pornography, child pornography, and revenge pornography, as well as California's unfair competition law.
Chiu wants to raise the alarm about the practice
According to the New York Times, the initiative was the idea of chief deputy attorney Yvonne Mere, who rallied her colleagues to craft a lawsuit seeking the closure of the 16 websites.
The websites' names were redacted in the copy of the lawsuit made available to the public on Thursday.
While the city attorney's office said it has yet to identify most of the websites' owners, officials expressed optimism about uncovering the proprietors' names and holding them accountable.
During a press conference Thursday, Chiu said the sites produce non-consensual "pornographic" material. Beyond raising the alarm about this form of "sexual abuse," Chiu indicated that the lawsuit aims to shut the websites down.
"This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation," Chiu said.
On these websites, users upload photos of fully clothed real people, and AI is used to generate images depicting them nude. The sites' AI models were trained on real pornography and images of child abuse in order to produce the deepfakes.
According to the lawsuit, one of the sites openly promotes the non-consensual nature of its images, asserting: "Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes."
The availability of open-source AI models has made it easy for anyone to access and adapt AI-powered engines for their own purposes. This has resulted in sites and apps that can generate deepfake nudes from scratch or "nudify" existing photographs in realistic ways, often for a fee.
San Francisco is not the only place facing this challenge
In January, deepfake apps made headlines when fake nude photos of Taylor Swift went viral online. Many other, far less famous people have been targeted both before and since.
Chiu noted that the "proliferation of these images has exploited a shocking number of women and girls across the globe," from celebrities to middle school students.
Through its own investigation, the city attorney's office found that the websites were visited over 200 million times in the first six months of this year. The office also raised the concern that once an image is online, it becomes difficult for victims to determine which websites were used to "nudify" their photographs.
This is because the images carry no unique or identifying marks that can be traced back to the websites. In addition, it is difficult for victims to remove the photographs from the internet, which harms their self-esteem and leaves a lasting mark on their digital footprint.
Earlier this year, five Beverly Hills eighth-graders were expelled for generating and sharing deepfake nude images of 16 eighth-grade girls, overlaying the girls' faces onto AI-generated bodies.
Similar cases have been reported at schools in California, New Jersey, and Washington, with the images used to humiliate, bully, and threaten women and girls.
The net effect on victims, the city attorney's office said, has been devastating: shattered reputations, damaged mental health, loss of self-esteem, and, in some instances, suicidal thoughts.