Deepfake & AI Porn: What the Law Says
The Take It Down Act makes the U.S. the first country with federal criminal penalties for non-consensual deepfake pornography. Here's everything the adult industry needs to know about the new legal landscape.
What Is the Take It Down Act?
Signed into law on May 19, 2025, the TAKE IT DOWN Act is the first major federal legislation in the United States directly targeting non-consensual intimate imagery (NCII), including AI-generated deepfakes.
Historic Legislation
The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act (TAKE IT DOWN) passed with overwhelming bipartisan support—409 to 2 in the House and unanimously in the Senate. President Trump signed it into law with First Lady Melania Trump, who championed the bill as part of her "Be Best" anti-cyberbullying initiative.
The law creates two major mechanisms: criminal penalties for those who create or distribute non-consensual intimate imagery, and mandatory takedown requirements for platforms hosting such content.
First Federal AI Law
The Take It Down Act is the first major federal law on artificial intelligence in the United States, setting a precedent for how Congress may approach AI regulation in other domains.
Key Provisions & Penalties
The law creates new federal crimes for publishing intimate images without consent—whether real photographs or AI-generated deepfakes—with penalties up to 3 years in prison.
| Offense | Penalty | Details |
|---|---|---|
| Publishing NCII | Up to 2 years | Intimate images without consent |
| Publishing Deepfakes | Up to 2 years | AI "digital forgeries" without consent |
| Aggravated Offense | Up to 3 years | Prior offenses, harassment intent |
| Threatening to Publish | Up to 2 years | Blackmail/coercion without publishing |
| Minor Victims | Enhanced penalties | Intent to degrade, harass, or arouse |
Key Definitions
Consent must be knowing, voluntary, and free of coercion. The law recognizes that consent can be withdrawn at any time.
Digital Forgery means an intimate image created or altered with software or AI that a reasonable person would find indistinguishable from an authentic depiction; this is the statute's term for a "deepfake."
Identifiable Individual refers to a person recognizable in the image by their face, likeness, or distinguishing features such as birthmarks or tattoos.
Intimate Visual Depiction includes uncovered genitals, pubic area, anus, or post-pubescent female nipple of an identifiable person.
The criminal provisions took effect immediately upon signing on May 19, 2025. Anyone who knowingly publishes non-consensual intimate images—real or AI-generated—is now subject to federal prosecution.
Platform Obligations
The law's most significant feature is Section 3, which creates mandatory notice-and-takedown requirements for platforms hosting user-generated content.
The 48-Hour Rule
Platforms must remove reported non-consensual intimate content within 48 hours of receiving a valid takedown request. They must also prevent the same content from being re-uploaded in the future.
By May 19, 2026, all covered platforms must have a formal notice-and-removal system in place. This includes adult content sites, social media platforms, and any website hosting user-uploaded material that could contain intimate imagery.
| Requirement | Details |
|---|---|
| Removal Time | 48 hours from valid notice |
| Re-upload Prevention | Must prevent same content from being re-posted |
| System Deadline | May 19, 2026 for full implementation |
| Required Notice Info | Physical/electronic signature, content location, good faith statement, contact info |
| Enforcement | FTC treats non-compliance as unfair/deceptive practice |
| Safe Harbor | Platforms protected for good faith compliance efforts |
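The notice requirements above can be sketched as a simple validation routine. This is a hypothetical illustration, not any platform's actual system; every field and function name below is invented for the example, and the statute's exact language governs what counts as a valid notice.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: field names are illustrative, not the statute's text.
# A valid notice must include a signature, the content's location, a good
# faith statement, and contact information for the requester.
@dataclass
class TakedownNotice:
    signature: str             # physical or electronic signature
    content_url: str           # where the reported content is located
    good_faith_statement: str  # statement that the depiction is non-consensual
    contact_info: str          # how the platform can reach the requester
    received_at: datetime      # when the platform received the notice

REMOVAL_WINDOW = timedelta(hours=48)

def is_valid(notice: TakedownNotice) -> bool:
    """A notice missing any required element need not start the 48-hour clock."""
    return all([notice.signature, notice.content_url,
                notice.good_faith_statement, notice.contact_info])

def removal_deadline(notice: TakedownNotice) -> datetime:
    """Content must come down within 48 hours of a valid notice."""
    return notice.received_at + REMOVAL_WINDOW

notice = TakedownNotice(
    signature="J. Doe",
    content_url="https://example.com/video/123",
    good_faith_statement="I state in good faith this depiction is non-consensual.",
    contact_info="jdoe@example.com",
    received_at=datetime(2026, 6, 1, 12, 0, tzinfo=timezone.utc),
)
assert is_valid(notice)
print(removal_deadline(notice).isoformat())  # 2026-06-03T12:00:00+00:00
```

A real intake system would also log the notice, queue human review for edge cases, and track the deadline against actual removal time for FTC compliance records.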
Legislative Oversight
Legal experts note that the takedown system as written only explicitly applies to photographs, not deepfakes—likely an oversight that may require amendment. However, the criminal provisions clearly cover both.
The Deepfake Porn Crisis
The legislation responds to an explosion of AI-generated non-consensual pornography. The numbers reveal a crisis disproportionately affecting women and girls.
Who Gets Targeted
99-100% of deepfake pornography victims are women. This is overwhelmingly a form of gender-based digital violence. South Korean women, particularly K-pop idols, are the most frequently targeted demographic globally due to their high visibility and fan following.
School-aged victims are rising rapidly. During the 2023-24 school year, 71% of teachers reported that students caught spreading deepfake intimate imagery faced suspension, expulsion, or referral to law enforcement. The Take It Down Act was partly inspired by Texas high schoolers who were victimized by a classmate.
Creation is now trivially easy. It takes less than 25 minutes and costs nothing to create a convincing deepfake pornographic video using just a single clear face image of the victim.
Human detection rates for high-quality video deepfakes are just 24.5%. AI detection tools lose 45-50% effectiveness against real-world deepfakes outside controlled lab conditions. Legislation without detection technology may be difficult to enforce.
Other Legislation in 2025
The Take It Down Act isn't alone. Multiple federal and state laws are creating a complex regulatory landscape for AI-generated content and deepfakes.
| Legislation | Status | Key Provisions |
|---|---|---|
| TAKE IT DOWN Act | Signed May 2025 | Federal criminal penalties, 48-hour takedowns |
| DEFIANCE Act | Reintroduced May 2025 | Civil damages up to $250,000 for victims |
| NO FAKES Act | Introduced Apr 2025 | Criminalizes unauthorized AI voice/likeness |
| Tennessee ELVIS Act | Effective Jul 2024 | Protects voice as property, civil remedies |
| UK Online Safety Act | Enforced Jul 2025 | Creating deepfake porn now criminal offense |
| PA Act 35 | Effective Sep 2025 | Criminal penalties for deepfakes w/ harmful intent |
The DEFIANCE Act
Reintroduced in May 2025 after passing the Senate in 2024, the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would give victims a federal civil cause of action. Damages include:
- $150,000 in base statutory damages for identifiable victims.
- $250,000 if connected to sexual assault, stalking, or harassment.
- A 10-year statute of limitations, running from discovery or from the victim turning 18.
The bill also allows pseudonym filings to protect victim privacy and doesn't preempt stronger state laws.
State-Level Landscape
As of May 2025, 28 states have enacted laws regulating deepfakes in political communications, and all 50 states plus D.C. have some form of law banning image-based sexual abuse.
California, New York, and Texas lead with comprehensive deepfake legislation covering criminal penalties, civil remedies, and platform obligations. The Take It Down Act creates a federal floor that state laws can exceed.
Impact on the Adult Industry
Legitimate adult content platforms face new compliance requirements, but the law also provides clearer boundaries for legal operation.
What Platforms Must Do
Implement takedown systems by May 2026. Any platform hosting user-uploaded content that could include intimate imagery must create a formal notice-and-removal process. This includes adult tube sites, fan platforms, and social networks.
Act within 48 hours. Once a valid takedown request is received, content must be removed within two days. Platforms must also implement systems to prevent re-uploads.
Document consent. While the law targets non-consensual content, platforms should strengthen consent verification and documentation processes to demonstrate good faith compliance.
| Requirement | Deadline | Priority |
|---|---|---|
| Takedown request system | May 19, 2026 | Critical |
| 48-hour response capability | May 19, 2026 | Critical |
| Re-upload prevention tech | May 19, 2026 | Critical |
| Consent documentation | Immediate | High |
| AI content detection | Ongoing | High |
| Staff legal training | Immediate | High |
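Re-upload prevention is typically built on content fingerprinting. The minimal sketch below uses an exact SHA-256 match for brevity; real deployments rely on perceptual hashing (industry tools such as PhotoDNA or PDQ) so that re-encoded or slightly edited copies are also caught. All names here are illustrative, not a reference implementation.

```python
import hashlib

# Blocklist of fingerprints for content removed under takedown notices.
# A plain SHA-256, used here for brevity, only blocks byte-identical
# re-uploads; perceptual hashes are needed to catch altered copies.
blocklist: set[str] = set()

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def take_down(content: bytes) -> None:
    """Record removed content so identical copies are rejected later."""
    blocklist.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Reject any upload matching previously removed content."""
    return fingerprint(content) not in blocklist

video = b"...reported video bytes..."
take_down(video)
print(allow_upload(video))     # False: identical re-upload blocked
print(allow_upload(b"other"))  # True: unrelated content allowed
```

The design choice to hash rather than store removed content matters legally: retaining copies of non-consensual imagery creates its own risk, while a fingerprint database does not.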
Safe Harbor Protection
Platforms that make good faith compliance efforts are protected from liability. The law doesn't punish platforms for hosting content they didn't know about—only for failing to act once properly notified.
Critics including the EFF and ACLU have raised concerns that the law could be abused to remove legitimate content, particularly political speech or satire. The law includes exceptions for content shared as part of legal proceedings or authorized investigations, but the boundaries remain untested.
Key Takeaways
- The Take It Down Act is now federal law — Signed May 19, 2025, it creates criminal penalties up to 3 years for publishing non-consensual intimate images, whether real photos or AI deepfakes.
- Platforms have 48 hours to remove content — Once notified, sites must take down non-consensual intimate imagery within 48 hours and prevent re-uploads. Full systems required by May 2026.
- This is the first major federal AI law — The Take It Down Act sets precedent for how Congress approaches AI regulation, treating AI-generated content the same as real images.
- 98% of deepfakes are pornographic, 99% target women — The law responds to a documented crisis: deepfake porn videos grew 464% from 2022 to 2023, overwhelmingly targeting women and girls.
- Civil remedies may follow — The DEFIANCE Act, if passed, would allow victims to sue for up to $250,000 in damages, adding civil liability to criminal penalties.
- All 50 states now have related laws — The federal law creates a floor; states can and have enacted stronger protections. 28 states also regulate political deepfakes.
- Good faith compliance provides protection — Platforms acting in good faith to remove content are protected. The FTC enforces non-compliance as an unfair or deceptive practice.