The bill significantly strengthens federal protections, removal processes, and remedies for victims of nonconsensual intimate images (including deepfakes and images of minors), but it does so at the cost of new compliance burdens, expanded enforcement authority, and legal uncertainties that could chill lawful speech and strain smaller platforms and organizations.
People whose intimate images are shared without consent — including AI-generated deepfakes and images of minors — gain federal criminal protections, penalties, and potential restitution/forfeiture, giving victims clearer remedies and creating stronger deterrents against misuse.
Victims can request removal, and platforms must publish plain-language request procedures and take down reported intimate images within 48 hours; the FTC can enforce those rules — speeding takedowns, reducing further spread, and making the process more accessible.
The bill provides clearer statutory definitions (e.g., "minor," "intimate visual depiction") and defines which services qualify as covered platforms, improving clarity about who is protected and when platforms can be held to account.
Online platforms and smaller websites face new criminal liability, stricter takedown timelines, identification and notice requirements, and greater moderation burdens — raising compliance costs that may be passed to users or reduce available services.
Ambiguous definitions (e.g., "identifiable individual," "digital forgery") and the criminalization of certain publications risk overbroad prosecution or a chilling effect on lawful speech, reporting, and research, because platforms and publishers uncertain about their legal exposure may over-remove content.
Platforms acting in good faith on takedown requests may temporarily remove lawful content, potentially harming users whose speech is wrongly taken down and creating due-process concerns.
Based on analysis of 5 sections of legislative text.
Creates federal crimes for publishing nonconsensual intimate images and deepfake forgeries and requires covered platforms to implement a notice-and-removal process with 48-hour takedowns for valid requests.
Introduced January 22, 2025 by Maria Elvira Salazar · Last progress January 22, 2025
Creates federal crimes for knowingly publishing nonconsensual intimate images and AI-generated deepfake intimate images of identifiable adults or minors, with penalties, forfeiture, and restitution. Requires covered online platforms to post a clear notice-and-removal process and to remove reported intimate images depicting a person without that person's consent as soon as possible, and no later than 48 hours after a valid request; gives the FTC authority to enforce the removal requirements as unfair or deceptive acts.