Artificial Desires: How AI is Shaping our Consumption of Pornography

By: Devina Stone

The rise of Artificial Intelligence (AI) in creating pornography has introduced novel challenges, threatened a growing movement toward ethical creation and consumption, and remains largely ungoverned by law. The law must address the accountability of those using AI to exploit the likenesses and content of legitimate creators. For now, the best option for victims seeking justice may be the right of publicity, which allows civil action against perpetrators, though plaintiffs face significant challenges.

The world of adult entertainment has long presented ethical questions, and the pornography industry is frequently perceived as vulgar and indecent. Statistics, however, say that most of us consume it anyway; in fact, 92% of men and 60% of women report consuming some form of pornography monthly, whether visual, auditory, or written. PornHub, the most popular video website, sees 42 billion visits in a year.

In the past decade, Millennials have become the largest share of adults worldwide, and Gen Z has emerged as a cohort of ethically motivated, progressive young adults. Social issues have moved to the forefront of marketing, politics, and nearly every other aspect of our lives. This recent wave of socially conscious dispositions has influenced how the adult entertainment industry approaches pornography creation and consumption. Ethically focused creators have emerged, producing pornography rooted in consent, where actors are regularly tested for STDs and paid fairly, and intimacy coordinators monitor actors’ wellbeing and treatment. One such director, Erika Lust, states “[p]orn forms part of a healthy sexual experience…[it] can also be artistic and beautiful.” Apps like Quinn and Dipsea and streaming platforms like Make Love Not Porn have attracted people who feel traditional pornography is too graphic, unrealistic, and crass. Queer stories have surpassed “girl on girl” videos, and diversity in race, gender, and sexuality has taken a front seat. Liberated female creators online have taken control of their own narratives, with sites like OnlyFans allowing women to produce the content they want, without the pressure of a director, set, or others’ expectations. Currently, OnlyFans boasts 2.1 million creators and 500,000 new viewers every day.

It would seem, then, that the porn industry is in the midst of a tectonic shift toward sensitivity. The rise of AI, however, threatens this shift. AI has presented ethical concerns from the start, from privacy and surveillance, to bias and discrimination, and, of course, the role of human judgment. Add the charged nature of sexual content into the mix, and the potential use of AI is downright worrying. Ethically, human porn creators consent to the acts they participate in; an AI-generated performer is not real, so there is no one to consent. One can effectively order the sexual content one wants and have it delivered, a transaction that in no way reflects how sexual experiences occur in the real world. Moreover, this directs traffic away from legitimate, ethical creators and toward the free, easily accessible content produced by generative AI.

An AI user can request the creation of pornography that uses the faces of real people, from celebrities to children. The result is content that is, at best, deeply embarrassing for the subject and, at worst, downright illegal. Not to mention that AI must learn from existing content on the web, so it inevitably incorporates the content and faces of existing porn actors without their consent.

The law is trying to catch up, but it lags behind. First came deepfakes: manipulated media in which a real person’s face is convincingly pasted onto a video or photo of someone else. Only after a deepfake of Taylor Swift engaging in sexual acts went viral did the DEFIANCE Act appear. Passed by the Senate this July, the DEFIANCE Act allows victims of deepfake pornography to file civil suits against perpetrators. Criminal penalties are left to the states: some have passed new laws, some have expanded existing law, and others have yet to legislate. This progress offers hope for victims of deepfake pornography, aiming to put power back in their hands.

The more obscure issue is harder to legislate: AI learns from existing content without the consent of its creators, using actual faces and bodies to generate “fake” content. Most laws regarding nonconsensual porn, and even the new deepfake legislation, focus on a single identifiable victim. This means there is no penalty for AI users or developers when an AI model uses existing content, against the wishes of the person pictured, to create new, unrecognizable content. Not only is this a violation of privacy and choice, but it allows for the creation of content that would not otherwise exist, like rape pornography and child sexual abuse material, which risks encouraging real-world offenses of the same kind.

Exposure to violent pornography has a profound and tangible impact. Teen boys who reported consuming sexually violent content were two to three times more likely to perpetrate “teen dating violence” against their real-world partners. Consumption of “sexually aggressive pornography contributes to increased hostility toward women, acceptance of rape myths, decreased empathy, and compassion for victims and an increased acceptance of physical violence toward women.”

Legislators and law enforcement around the country have begun pushing for legislation to criminalize the creation of this content, including the Child Exploitation and Artificial Intelligence Expert Commission Act of 2024, which would create a commission to explore the issue and propose “appropriate safety measures and updates to existing laws.” But this solution still ignores adult creators whose content is used to create nonconsensual, AI-generated pornographic videos and images.

The right of publicity “allows individuals to control the commercial exploitation of their identity and reap the rewards associated with their fame or notoriety by requiring others to obtain permission (and pay) to use their name, image, or likeness.” Cases like White v. Samsung Electronics have extended the right of publicity even where the unpermitted use of a likeness is not identical but strongly suggestive of a particular person: in White, a robot with blonde hair and a long gown, turning over letters on a game show, was found to violate Vanna White’s right of publicity. However, generative AI may produce a result that is not explicitly recognizable as any given person, in which case the right of publicity may not apply.

Moreover, creative content featuring celebrities has sometimes been held not to violate the right of publicity when courts balance the use of celebrity images against the creator’s right to expression under the First Amendment. Tiger Woods could not sue an artist who painted and sold images of him, because the work was substantially creative. AI users who create sexually explicit content could potentially use loopholes like this to evade civil liability for their use of real faces.

Today, there is no clear way to prevent such exploitation. But the law is an ever-evolving body, and hopefully the vigor with which legislators brought forth the DEFIANCE Act and the Child Exploitation and Artificial Intelligence Expert Commission Act will continue, so that creators are protected soon.