Senators Hassan, Cornyn, King, Butler Introduce Bipartisan Bill to Combat Non-Consensual “Deepfake Porn”

Legislation Establishes Both Civil and Criminal Penalties for Sharing Artificially Generated Intimate Images Online Without Consent

WASHINGTON — U.S. Senators Maggie Hassan (D-NH), John Cornyn (R-TX), Angus King (I-ME), and Laphonza Butler (D-CA) today introduced legislation to hold accountable people who create and share non-consensual intimate deepfake images online, often referred to as “deepfake porn.” Advanced technology such as artificial intelligence has made it easier than ever to create convincingly realistic fake photos and videos of people, known as deepfakes, and bad actors have used this technology to create intimate, sexualized imagery of people without their knowledge. Women across the country, including several public figures, have been victimized by having these deepfake images posted online without their consent.

This legislation would not only establish a new criminal offense for distributing these images, but it would also create a “private right of action” allowing victims to file a lawsuit against anyone who intentionally distributes these images, including a website that knowingly hosts them. The criminal penalties can include a fine and up to two years in prison in most cases, and the civil penalties can reach up to $150,000 in most cases, with higher amounts possible in some circumstances.

“The sharing of intimate images without consent can cause extraordinary emotional distress and harm and can put victims at risk of stalking and assault. Especially as technology advances to the point where it is hard to tell which photos and videos are real and which have been entirely faked, we need stronger guardrails that protect people’s safety, privacy, and dignity and prevent non-consensual intimate images from proliferating across the internet,” said Senator Hassan. “This bipartisan bill provides tools to hold accountable – both financially and criminally – the people and websites who are knowingly sharing these images without consent, and I urge my colleagues to support it.”

“While there are many benefits to artificial intelligence, the use of deepfake technology to generate nonconsensual and realistic intimate images of actual people poses a growing threat,” said Senator Cornyn. “This legislation will help safeguard against the malicious use of this technology by closing loopholes in revenge porn laws and criminalizing the creation and spread of nonconsensual intimate deepfakes.”

“Artificial intelligence is rapidly helping to advance critical components of society, but it’s also being used maliciously to victimize innocent Americans,” said Senator King. “The Preventing Deepfakes of Intimate Images Act would ensure that Maine people, and Americans nationwide, have legal civil and criminal recourse in the event they become victims of fake content posted online. In the age of digital ingenuity and innovation, legislation is needed to protect individuals from bad actors exploiting new technology.”

“As artificial intelligence continues to advance, we must take steps to prevent its misuse,” said Senator Butler. “That’s why we need this legislation to protect victims and hold perpetrators accountable.”

This bipartisan legislation has been endorsed by the Cyber Civil Rights Initiative.

In 2022, Senator Hassan helped pass into law her bipartisan measure creating a private right of action for people whose intimate images were shared online without their consent, a practice often called “revenge porn,” allowing victims to seek compensation and relief in federal court. The bill introduced today builds on that measure by including deepfake imagery as a cause for civil lawsuits. Additionally, earlier this year, the U.S. Senate passed legislation that Senator Hassan helped introduce to prevent child abuse. Senator Hassan also helped pass into law a bill to stop perpetrators from pushing survivors of sexual harassment and assault into secretive, forced arbitration proceedings.

###