The bill strengthens protections for victims of nonconsensual intimate images and deepfakes, adding speedy takedown and enforcement options, but at the cost of potential free-speech chill, privacy and safety risks for those who report, legal ambiguities, and compliance burdens for smaller platforms.
Victims of nonconsensual intimate-image publication — including minors — can pursue criminal penalties, restitution, and forfeiture against perpetrators, increasing deterrence and potential compensation for harm.
People whose intimate images were shared without consent can request prompt takedown (48 hours) via a standardized notice-and-request process, with platforms required to explain how to submit requests.
Platforms that remove content in good faith receive liability protection, and the FTC is empowered to enforce compliance and impose penalties, creating a federal enforcement pathway to encourage platform responsiveness.
Rapid 48-hour removal deadlines combined with criminal and civil liability exposure may push platforms to over-remove or block lawful, newsworthy, or public-interest speech, raising significant free-speech and press concerns.
The verification required for takedown requests (a signature and contact information) could expose victims to privacy and safety risks and deter reporting, especially among vulnerable populations.
Key legal standards (e.g., "intended to cause harm," "reasonable expectation of privacy," and tests for whether AI-generated images are indistinguishable from real ones) and their conditional exclusions are ambiguous, risking uneven enforcement, legal uncertainty, and inconsistent outcomes for defendants and platforms.
Based on analysis of 5 sections of legislative text.
Creates new federal crimes for knowingly publishing nonconsensual intimate photos, videos, or digitally forged "deepfake" images of people, and adds criminal penalties, forfeiture, and victim restitution.

Requires websites and apps that host user content or publish nonconsensual intimate images to run a clear notice-and-takedown process and to remove reported items and known copies within 48 hours of a valid request, with violations enforceable by the Federal Trade Commission.

Defines key terms (consent, digital forgery, intimate visual depiction, minor, covered platform), sets differing mental-state and penalty rules for adults versus minors, preserves specified lawful and good-faith exceptions (law enforcement, medical/educational uses, self-published material), and requires courts to order forfeiture of distributed material and restitution to victims.
Introduced January 16, 2025 by Rafael Edward Cruz · Last progress May 19, 2025