The bill increases transparency and speeds removal of terrorist-linked content while generating oversight data, but it does so by imposing heavy reporting and penalty-driven compliance obligations on large platforms, raising risks of privacy exposure, over-removal of lawful speech, economic costs, and the displacement of harms onto smaller services.
Platform users (broad public) gain clearer, platform-specific rules on how content tied to designated terrorist entities is handled, improving transparency and predictability about enforcement.
State and local governments, researchers, and the public get triannual, detailed DOJ action reports that increase oversight of platform enforcement practices.
Platform users can flag suspected terrorist-linked content and have clearer expectations about response times, which may speed removal of dangerous material and improve public safety.
Large platforms face substantial compliance costs and risk daily civil penalties of up to $5 million per violation, which could translate into higher consumer prices, reduced services, or slower innovation.
Detailed reporting obligations risk exposing user data or other sensitive information, creating privacy concerns for platform users despite confidentiality requirements.
To avoid heavy penalties and scrutiny, platforms may over-remove content or over-enforce their rules, chilling lawful speech and harming expression, particularly for vulnerable groups such as immigrants.
Based on analysis of 2 sections of legislative text.
Requires covered social platforms to publish user-facing enforcement terms for terrorism-linked content and submit detailed triannual reports to the Attorney General, with DOJ posting them publicly.
Requires covered social media companies to publish platform-specific terms of service (or state that none exist) and clear, user-facing information about how the platform enforces rules for accounts or content linked to designated foreign terrorist organizations and specially designated global terrorists. Companies must submit detailed triannual reports to the Attorney General with counts and breakdowns of flagged content, enforcement actions, views and shares, appeals, methods of flagging and actioning, and a longitudinal evaluation of trends; the Department of Justice will publish these reports in a searchable public repository. The bill sets deadlines for publication and reporting: platform materials must be published within 180 days of enactment, the first detailed report is due within 360 days, and subsequent reports are due Jan. 31, Apr. 30, and Oct. 31 each year. It also specifies required data fields and disaggregation by content category, media type, flagging and actioning methods, and outcomes (including appeals and reversals).
Introduced October 3, 2025 by Josh S. Gottheimer · Last progress October 3, 2025