Deepfakes – A Disastrous Merger of AI and Porn


By David O’Hair

First appearing on Reddit, a new trend called “deepfakes” has captured the public’s attention with one of the internet’s oldest promises – nude celebrity photos. Intimate celebrity images appearing online is nothing new in and of itself. A 2014 hack exposed hundreds of nude celebrity images, while Gawker notoriously posted Hulk Hogan’s sex tape.

However, deepfakes present a novel issue: the images, and often videos, of the celebrities are fake – but the underlying porn is real. Deepfakes use artificial intelligence combined with facial-mapping software to essentially copy and paste someone’s face into preexisting porn content. The AI software is sophisticated enough that the content it creates, i.e., deepfakes, can be virtually indistinguishable from an authentic porn video featuring a specific celebrity. Celebrities are frequent deepfake victims because the technique requires massive amounts of “raw footage” of the target’s face to map into the pornographic video, and a celebrity likely has far more recorded video available than the average person. But non-public figures can be the victims of deepfakes too.

The moral harms of using AI and facial-mapping technology for such purposes are self-evident, but the legal remedies available to deepfake victims do not present themselves as easily. Many legal causes of action, such as misappropriation of likeness, defamation, and invasion of privacy, are theoretically available to deepfake victims. But while celebrities may have the means to bring lengthy and expensive civil suits against deepfake creators or distributors, the average person may not. What happens when someone is the victim of a deepfake and does not have the resources to bring a civil suit?

Deepfakes present the appropriate motivation for the Washington State Legislature to update the State’s revenge porn laws. Currently, neither Washington’s civil nor its criminal revenge porn law contains any language that would trigger liability for creating or distributing “fake” intimate images. Washington’s civil revenge porn law covers intimate images that were either distributed by the victim with the intent that they remain private (e.g., sending a significant other a nude photo) or obtained by unauthorized access to the victim’s property (e.g., hacking). To address the new threat of deepfakes, a provision should be added imposing liability for creating or distributing fake or falsely created intimate content. The intent element of revenge porn law would be sufficient for punishing the creation or distribution of deepfakes.

The current revenge porn laws apply to “any person who distributes an intimate image of another . . . and at the time of such distribution knows or reasonably should know that disclosure would cause harm to the depicted person shall be liable to that other person for . . . damages.”

Washington’s civil revenge porn law can be amended to punish creators and distributors of deepfakes, but the reform should not stop with civil remedies. Because of the substantial financial burden a civil suit brings, many non-public figures will not have the resources to litigate one. Washington’s criminal revenge porn law should be amended to add an additional deepfake provision to deter the creation and distribution of this content.

Amending the revenge porn laws is the systemic way to combat the proliferation of deepfakes, but private industry can play a pivotal role as well. Websites where deepfakes could find a home can amend their Terms and Conditions to disallow any non-consensual content, effectively purging their sites of deepfakes. Reddit and other industry leaders have already taken this step by removing non-consensual deepfake content and banning the genre altogether.
