AI-Generated Fake Nudes Surge, Posing Threats to Privacy and Lives


TL;DR

  • AI-powered fake pornographic content is surging, posing privacy threats to women and teens.
  • Legal gaps and challenges in penalizing creators leave victims with limited recourse.
  • Stronger regulations are urgently needed to combat the proliferation of AI-generated fake nudes.

The rise of artificial intelligence (AI) tools has fueled a disturbing surge in fake pornographic images and videos, raising significant concerns about privacy and exploitation. Victims, primarily women and teens, find themselves ill-prepared for the visibility that AI-generated fake nudes bring.

The invasion of privacy

Gabi Belle, a YouTube influencer, discovered that her image was being used in a fake nude photograph circulating on the internet. The image depicted her in a compromising situation, even though she had never consented to such content. Belle's experience is not unique: many individuals, including celebrities and ordinary teenagers, have fallen victim to AI-generated fake nudes. These fake images can be used to shame victims, extort money from them, or fuel private fantasies.

Unprecedented growth in fake nudes

Artificial intelligence has enabled an unprecedented boom in fake pornographic images and videos. On the top 10 websites known for hosting AI-generated porn photos, the number of fake nudes has surged by over 290 percent since 2018, according to industry analyst Genevieve Oh. These sites feature not only celebrities and political figures but also unsuspecting individuals who have had their likenesses exploited without consent.

Lack of legal protections

Victims of AI-generated fake porn have limited recourse due to the absence of federal laws governing deepfake porn. Only a handful of states have enacted regulations, leaving a significant legal gap. President Biden's recent AI executive order recommends that companies label AI-generated content but does not make labeling mandatory. Legal scholars argue that AI-generated fake images may not fall under copyright protections for personal likenesses, since the models that produce them draw from vast data sets.

Women and teens at particular risk

AI-generated fake nudes pose a significant risk to women and teenagers, who often find themselves unprepared for the sudden visibility. Research by Sensity AI, a deepfake monitoring company, reveals that 96 percent of deepfake images are pornographic, with 99 percent targeting women. Genevieve Oh’s examination of the top 10 websites hosting fake porn images found that over 415,000 such images had been uploaded in 2023 alone, amassing nearly 90 million views.

Explosion of AI-generated porn videos

AI-generated porn videos have also proliferated across the web. In 2023, more than 143,000 fake videos were added to the 40 most popular websites featuring deepfake content, surpassing the total number of new videos from 2016 to 2022. These videos have garnered over 4.2 billion views. The Federal Bureau of Investigation (FBI) has warned of a rise in sexual extortion, with over 26,800 people falling victim to “sextortion” campaigns as of September.

Google’s limited protections

Google has policies in place to prevent nonconsensual sexual images from appearing in search results. However, its protections against deepfake images are less robust, allowing AI-generated fake porn to feature prominently in search results. Google says it is working on enhancing these protections and aims to build safeguards that do not require victims to file individual requests for content removal.

Challenges in penalizing creators

Penalizing creators of AI-generated fake porn content has proved challenging. Section 230 of the Communications Decency Act shields social media companies from liability for content posted on their platforms, leaving little incentive for websites to police such images. Victims can request the removal of their likenesses, but because AI-generated content draws from extensive data sets, proving copyright violations or personal likeness infringement is difficult.

State-level efforts and varied regulations

At least nine states, including California, Texas, and Virginia, have passed legislation targeting deepfakes. However, these laws differ in scope, with some allowing victims to press criminal charges and others permitting only civil lawsuits. Identifying the responsible parties for legal action can be a complex task.

The need for stronger regulations

The push to regulate AI-generated images and videos primarily aims to prevent mass distribution and address concerns related to election interference. However, these rules do little to mitigate the harm caused by deepfake porn, which can severely impact an individual’s life, even if shared in small groups.

The proliferation of AI-generated fake nudes represents a grave threat to privacy and personal integrity, particularly for women and teenagers. The absence of comprehensive federal laws and the challenges in penalizing creators of such content highlight the urgent need for stronger regulations and protections. As AI continues to advance, addressing the growing issue of AI-generated fake porn becomes increasingly critical to safeguard individuals from exploitation and harm in the digital age.

Glory Kaburu

Glory is an extremely knowledgeable journalist proficient with AI tools and research. She is passionate about AI and has authored several articles on the subject. She keeps herself abreast of the latest developments in Artificial Intelligence, Machine Learning, and Deep Learning and writes about them regularly.
