The bill strengthens protections for victims of nonconsensual intimate-image sharing, speeds takedowns, and increases platform accountability (including for AI-generated content), but it does so at the cost of significant compliance burdens, privacy risks, and the potential for over-removal and legal uncertainty that may fall hardest on smaller platforms and free expression.
Survivors of nonconsensual intimate-image sharing (especially women and youth) will get faster removals and stronger prevention: platforms must implement takedown and prevention processes and remove reported intimate images (and attempt to remove known identical copies) within 48 hours.
All users gain clearer notice and a standardized removal process because platforms must publish conspicuous, plain‑language notices and maintain written notice‑and‑removal procedures, and the FTC will issue implementing rules with public input.
Courts and law enforcement (and related responders) will have better-preserved evidence and clearer removal channels because providers must log relevant data and implement notice-and-removal procedures.
Small platforms and online businesses will face substantial new compliance costs (detection, logging, 48‑hour takedowns), which could reduce competition, force consolidation, or raise fees for users.
Users and creators face increased risk of over‑removal and chilled speech because stricter liability and tight takedown deadlines incentivize platforms to err on the side of removal.
Expanded logging and data‑retention requirements raise privacy risks: stored logs could be breached or misused, harming users (including people with disabilities).
Based on analysis of 4 sections of legislative text.
Conditions §230 immunity on platform processes to prevent/remove cyberstalking and nonconsensual intimate-image deepfakes, narrows deepfake exceptions, and requires FTC regulations within 180 days.
Introduced December 1, 2025 by Jake Auchincloss · Last progress December 1, 2025
Conditions online platform immunity (47 U.S.C. §230) on having written processes to prevent and respond to cyberstalking and nonconsensual intimate-image deepfakes, expands the definition of content “creation or development” to include solicitation or use of generative models, tightens the criminal and civil rules for sexually explicit digital forgeries, and requires covered platforms to adopt notice-and-removal procedures with a 48-hour takedown rule for valid requests. The FTC must write implementing regulations within 180 days; the law applies to content made available on or after enactment and includes statutory definitions and limited liability protections for good-faith removals. Covered platforms must also implement logging and removal capabilities, while certain communications services (email, messaging, broadband, storage) are explicitly excluded from the covered-platform definition. The bill also narrows exceptions to the criminal prohibition on creating virtually indistinguishable intimate digital forgeries and directs federal agencies to issue rules and guidance to enforce the new requirements.