TAKE IT DOWN Act
Introduced on January 22, 2025, by Rep. Maria Elvira Salazar
AI Summary
This bill, called the TAKE IT DOWN Act, aims to stop people from posting intimate images online without consent, including fake images made with AI (“deepfakes”). It makes it a crime to post these images of adults when the image was private, was shared without consent, and was intended to cause harm or actually causes harm; it also covers real and computer‑generated images of minors when the goal is to abuse, harass, or humiliate the minor, or to arouse sexual desire. Threatening to post these images is also illegal.
Websites and apps that serve the public and primarily host user content must set up a clear, easy way for people to report nonconsensual intimate images within one year of the law’s enactment. After a valid request, they must remove the image within 48 hours and make reasonable efforts to remove identical copies. The Federal Trade Commission can enforce these takedown rules, and platforms are protected when they remove content in good faith.
- Who is affected: Adults and minors pictured in intimate images; people who post or threaten to post them; and online platforms that host user‑generated content.
- What changes: Posting or threatening to post nonconsensual intimate images (real or AI‑made) becomes a federal crime; courts can order restitution to victims and forfeiture of any proceeds and property used to commit the crime; penalties include fines and prison time (up to 2 years for images of adults and up to 3 years for images of minors).
- Consent clarified: Saying yes to create a photo does not mean it’s okay to publish it; sharing a photo with someone doesn’t give them the right to post it.
- Exceptions: The rules do not apply to law enforcement activity, legal proceedings, medical or educational uses, reporting unlawful content, seeking help, or sharing one’s own image.
- When: Platforms must build the reporting process within 1 year of enactment and remove flagged images within 48 hours of a valid request.