On May 19, 2025, President Donald Trump signed the TAKE IT DOWN Act into law—a sweeping new federal measure that targets the online distribution of nonconsensual intimate images, including AI-generated deepfakes. The legislation is the first of its kind at the federal level, creating both criminal penalties for offenders and legal responsibilities for websites and platforms.

At National Security Law Firm (NSLF), we’re proud to be national leaders in the fight against revenge porn, digital exploitation, and image-based abuse. This law represents a major step forward in victim protection—but like any complex legal measure, it raises questions about enforcement, scope, and your rights.

In this post, we walk through the full text of the TAKE IT DOWN Act and explain exactly what the law does, how it works, who it protects, and what challenges may lie ahead. Here’s what you need to know, in plain English.


SECTION 1: What Is the Law Called?

Official Title: Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act — aka the “TAKE IT DOWN Act.”

Translation: This law criminalizes the publication of intimate images and deepfakes without the depicted person’s consent and requires platforms to remove that content from the internet.


SECTION 2: Criminalizing Nonconsensual Intimate Image Disclosure

This section creates new federal criminal offenses under the Communications Act for publishing intimate images or deepfakes without consent.

Key Definitions:

  • Consent: Must be knowing, voluntary, and free of coercion.
  • Digital Forgery: An image created or altered with AI or other software that a reasonable viewer could mistake for an authentic image of the person.
  • Identifiable Individual: A person who can be recognized in the image by their face, likeness, birthmark, or other distinguishing feature.

Two Core Crimes:

  1. Publishing Real Intimate Images Without Consent
    • Applies when the person had a reasonable expectation of privacy
    • Does not apply to images the person voluntarily exposed in a public or commercial setting
    • Must cause harm (psychological, reputational, etc.) or be intended to
  2. Publishing AI-Generated Sexual Images (Deepfakes)
    • Same rule: no consent + causes/intends harm = crime

Special Protection for Minors:

  • Publishing intimate content of a minor is a crime when done with intent to abuse, humiliate, harass, or degrade the minor, or to arouse or gratify sexual desire.

Exceptions (Legal Safe Zones):

  • Law enforcement or intelligence uses
  • Good-faith reporting to police, legal use in court, education, or medicine
  • The person posts their own intimate images
  • Existing laws against child sexual abuse material (child pornography) still apply separately

Penalties:

  • Up to 2 years for adults
  • Up to 3 years for crimes involving minors
  • Threats to publish: Up to 18 months (adult victims) or 30 months (minor victims)

Gray Areas:

  • “Public concern” exemption is vague—who decides what’s “newsworthy”?
  • How will courts assess “intent to harm”?
  • What happens when content is shared anonymously or virally?

SECTION 3: Platform Removal Duties

This section requires websites and apps (“covered platforms”) to remove reported nonconsensual intimate content.

Platforms Must:

  1. Set up a removal request system within 1 year of enactment
  2. Provide a visible, plain-English takedown process
  3. Remove content within 48 hours of a valid request
  4. Make reasonable efforts to remove duplicates/copies

What Makes a Takedown Request Valid?

  • A physical or electronic signature
  • Enough information to identify the victim and locate the content
  • A statement, made in good faith, that the content was published without consent

Protections for Platforms:

  • Not liable for taking content down in good faith—even if it’s later found not to be illegal

Enforcement:

  • Platforms that don’t comply face FTC enforcement: noncompliance is treated as an unfair or deceptive practice under consumer protection law

Gray Areas:

  • No penalties for false claims (risk of abuse?)
  • No clarity on how “duplicate” content is identified
  • Unclear how this applies to encrypted/private content sharing

SECTION 4: Who Does This Apply To?

A “covered platform” includes:

  • Public-facing websites/apps
  • Services that host user-generated content (e.g., images, videos, chats)

Exemptions:

  • Email services
  • Broadband providers
  • Content platforms where user interaction is secondary (e.g., streaming services with comments off)

SECTION 5: Severability

If one section of the law is ruled invalid or unconstitutional, the rest still stands.


⚖️ Final Analysis from NSLF

The TAKE IT DOWN Act creates a long-overdue federal framework for combating revenge porn and deepfake abuse. It offers strong tools for victims, puts real obligations on platforms, and imposes criminal penalties on offenders.

But as with any broad law, enforcement questions remain:

  • Will platforms over-remove content to avoid liability?
  • How will “intent to harm” be proven in ambiguous or anonymous cases?
  • How can victims without legal representation or resources navigate the takedown system?

At National Security Law Firm, we continue to lead the fight for online dignity, digital safety, and victim-centered justice.

If you’re a victim of revenge porn, deepfake abuse, or image-based blackmail, we’re here to help.

📅 Book a Free, Confidential Consultation