Can victims of explicit deepfakes take legal action against their abusers? That’s the question at the heart of a new legislative push in the U.S., and the answer may soon be yes. This week, lawmakers reintroduced the DEFIANCE Act, a bill that would give victims of explicit deepfakes the power to sue those who create, share, or distribute nonconsensual sexual content, whether AI-generated or authentic.
Following the recent signing of the Take It Down Act—which forces platforms to remove nonconsensual explicit imagery within 48 hours—the DEFIANCE Act builds on that momentum, addressing a crucial gap: civil recourse.
What Is the DEFIANCE Act and Why Does It Matter?
The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would give victims a way to pursue civil lawsuits against perpetrators. Co-led by Representative Alexandria Ocasio-Cortez, the bill is designed to empower survivors of deepfake pornography and other intimate image abuse, regardless of how the content is distributed.
While the Take It Down Act addresses public posts, it doesn’t cover more targeted attacks, such as images emailed to employers or shared in private networks. The DEFIANCE Act would close this gap, allowing victims to claim compensation for job loss, therapy costs, reputational harm, and other damages.
Why the Need for Civil Lawsuits?
Unlike criminal charges, which depend on prosecutors and carry a higher burden of proof, civil cases let victims take control. According to Omny Miranda Martone, founder of the Sexual Violence Prevention Association, “Civil recourse is essential because it puts power directly in the hands of victims.”
When Martone was targeted with deepfake pornography, the images were both posted online and sent directly to her boss—with the intent to damage her career. While the Take It Down Act could help remove public content, it didn’t protect her from private distribution or reputational fallout. “The DEFIANCE Act fills this gap,” she explains.
Deepfake Pornography: A Growing Threat
Artificial intelligence has made it alarmingly easy to create realistic explicit images from innocent photos—without consent. Research shows that the vast majority of these deepfakes target women. A 2019 study by DeepTrace found 96% of deepfake videos online were pornographic and nonconsensual. By 2023, that figure had risen to 98%, with women accounting for 99% of those targeted, according to cybersecurity firm Security Hero.
These aren’t just digital pranks—they’re real violations with emotional, professional, and financial consequences. Victims face harassment, job threats, and long-term psychological trauma.
How Deepfakes Impact Careers and Reputations
This issue isn’t just about privacy—it’s also about power and control. Sharing explicit deepfakes is often used to undermine women, especially in professional or political settings. Studies show that objectification makes women appear less competent, less trustworthy, and less relatable. In politics, candidates who’ve been objectified receive less voter support and are seen as lacking leadership qualities.
The damage isn’t just emotional—it’s reputational and economic. The DEFIANCE Act would allow victims to seek financial restitution for therapy, lost income, and increased security costs. It’s a legal lifeline in a system that often overlooks online abuse.
A Step Toward Justice—and Prevention
By allowing civil lawsuits, the DEFIANCE Act would put real consequences in place for people who misuse AI or share intimate images without permission. It would also offer something that’s been missing in many cases: a path to healing.
For survivors of digital abuse, the ability to sue can be life-changing. Not only does it hold offenders accountable, but it may also deter future harm by signaling that these actions come with real penalties.
What’s Next for Victims of Deepfake Abuse?
If passed, the DEFIANCE Act would mark a major shift in digital rights and online safety—especially for women. It recognizes that the internet isn’t separate from real life, and that the harm caused by explicit deepfakes is serious, far-reaching, and deserving of justice.
If you or someone you know has been affected by deepfake content, stay informed on your rights. These new laws may soon offer support, protection, and a path forward.
Found this helpful? Share this post, explore more content on digital safety, or join the conversation in the comments below.