The bill gives victims of nonconsensual, AI-generated intimate forgeries stronger recognition, clear civil remedies, and privacy protections, improving accountability and relief. It does so at the cost of higher litigation and compliance burdens, and its broad reach risks overbroad enforcement that could chill lawful expression and raise due-process concerns.
Identifiable victims of nonconsensual or AI-manipulated sexual images (e.g., women, students, young adults) can sue and obtain civil remedies because the bill recognizes deepfakes as image-based sexual abuse and creates an explicit private right of action.
Victims can obtain clear statutory monetary remedies (liquidated damages of $150,000, or $250,000 for enhanced conduct) and fee-shifting, improving compensation and deterrence against creators/distributors of intimate forgeries.
Courts can order deletion/destruction of intimate depictions and allow pseudonymous filings, providing practical privacy protections and immediate relief to reduce ongoing harm.
Creators, publishers, and satirists could see lawful expression chilled, because broad definitions and strict liability for indistinguishable forgeries (regardless of labeling) may sweep in satire, journalism, or political speech.
Large statutory damages and attorney-fee recovery will likely increase litigation volume and costs, burdening defendants and insurers and potentially raising costs for middle-class families and taxpayers.
New compliance and enforcement obligations could impose costs on platforms, developers, and apps (content moderation, takedowns, verification), which may be passed on to users or limit service availability.
Based on analysis of 4 sections of legislative text.
Expands federal civil liability and definitions to cover nonconsensual sexually explicit AI/edited "intimate digital forgeries," allowing victims to sue creators, possessors, and distributors.
Introduced May 21, 2025 by Alexandria Ocasio-Cortez · Last progress May 21, 2025
Creates a federal cause of action and legal definitions aimed at nonconsensual sexually explicit synthetic images and videos (so‑called deepfakes). The bill adds new legal definitions for “identifiable individual” and “intimate digital forgery,” expands who can sue and on what grounds, and clarifies that labeling or disclosure does not negate liability for realistic fabricated intimate images. It also includes a severability rule and an explicit statement that the law does not change intellectual property rules. The practical effect is to broaden civil remedies for victims of image‑based sexual abuse committed with modern AI and editing tools, and to place more legal exposure on creators, possessors (with intent to disclose), distributors, and recipients of nonconsensual intimate forgeries.