Introduced January 22, 2025 by Maria Elvira Salazar · Last progress January 22, 2025
The bill strengthens remedies and enforcement against nonconsensual publication of intimate images and deepfakes, giving victims faster takedowns, criminal penalties, and FTC oversight, while imposing new compliance costs and enforcement risks on platforms and creating the potential for overbroad criminalization or over-removal that could chill speech.
People whose intimate images are published without consent (disproportionately women and children) gain a new federal criminal remedy, potential restitution, and heightened penalties for offenses involving minors and digitally forged (deepfake-style) images.
Victims can require covered platforms to remove nonconsensual intimate images within 48 hours, and platforms must publish an accessible takedown process, providing faster relief and a clear federal enforcement avenue through the FTC.
Platforms that act in good faith to remove unlawful intimate images are shielded from liability, encouraging faster takedowns and reducing legal risk for services that comply promptly.
People who publish or share images online (including journalists, researchers, students, and ordinary platform users) could face new federal criminal penalties in cases where consent or the public interest is ambiguous, risking prosecutions that chill speech.
Platforms (including smaller services) will face substantial new compliance costs, such as searching for identical copies, developing removal technology, staffing takedown operations, and meeting FTC requirements; these costs may be passed on to users or advertisers, or may lead to reduced services.
The statutory shield for good-faith removals may incentivize platforms to err on the side of takedown, increasing the risk of over-removal of lawful speech and harming users and creators.
Based on analysis of 5 sections of legislative text.
Creates federal crimes for knowingly publishing nonconsensual intimate images and sexually explicit deepfake depictions of identifiable people, with stronger penalties when victims are minors. Requires websites and apps that host user content to provide a clear notice-and-removal process and to remove reported nonconsensual intimate images, along with known identical copies, within 48 hours. Gives the Federal Trade Commission authority to enforce the notice-and-removal requirements against noncompliant platforms and preserves exceptions for law enforcement, medical and scientific uses, and certain legal disclosures.