The bill changes how online platforms and users are treated in lawsuits about publishing or moderating content. It narrows certain Section 230 protections by shifting the burden onto a provider or user to prove they were not the creator of contested content, strips immunity where a platform appears to censor political speech in specified ways, and adds statutory definitions for “legitimate law enforcement purpose” and “national security purpose.” The result would raise legal risk for platforms and users involved in content-moderation decisions and would likely change how companies set moderation rules, preserve records, and respond to government requests.
Amends subsection (c)(1)(A) of Section 230 by replacing the existing text so that subparagraph (A) reads: "In general Subject to subparagraph (B), no provider;" (the quotation appears truncated in the source file).
Adds an affirmative defense rule: in any criminal or civil action that treats a provider or user as the publisher or speaker of information, the provider or user must prove they are not an "information content provider" with respect to that information for purposes of the amended subparagraph (A).
In subsection (c), paragraph (2)(B) is amended by replacing the reference "paragraph (1)" with "subparagraph (A)."
Adds a new paragraph (3) that removes subsection (c) protection for a provider that restricts access to or availability of material when the restriction (A)(i) reasonably appears to express, promote, limit the visibility of, or suppress legitimate political speech (including a discernible viewpoint), and (A)(ii) results from a communication sent to the provider by one of several listed entities (the subclauses are partly redacted in the file), where the applicable entity sends the communication only to that provider and not to other entities. An exception in (B) provides that communications for a legitimate law enforcement purpose or a national security purpose are not treated as communications described in (A)(ii).
Adds a definition for "legitimate law enforcement purpose": a communication whose purpose is to enable a law enforcement agency, acting within its lawful authority, to investigate a criminal offense.
Who is affected and how:
Platforms and platform operators: Face higher litigation risk and costs because immunity is narrower and the law can require them to prove they did not create content. They may change moderation policies (removing more content to limit publisher-style liability, or removing less to avoid accusations of censoring political speech), improve documentation of moderation decisions, or restrict features that could create creator-like liability.
Platform users and content creators: Users who post or moderate content (including volunteer moderators and moderators on smaller sites) could become defendants in more suits or face subpoenas and discovery. Users may self-censor or change how they engage on platforms.
Developers of moderation tools and algorithmic systems: May need to modify how automated systems flag or remove political content, add logging and audit trails, and incorporate legal risk assessments into design (a minimal logging sketch follows this list).
Advertisers, sellers, and third‑party businesses on platforms: Could see changes in content visibility and moderation that affect reach and revenue; they may demand clearer moderation rules or compensation for unexpected removals.
Courts and legal system: Likely to see increased caseloads and new interpretive questions (e.g., who counts as a content creator, what constitutes political-speech censorship, how the new definitions interact with existing law). Discovery burdens and pretrial disputes may grow.
Civil liberties and public interest outcomes: Depending on how platforms respond, the change could chill moderation of political misinformation or, conversely, encourage over-removal of lawful political speech. The balance between protecting political speech and limiting harmful speech may shift unpredictably.
Net effect: The amendment would increase legal uncertainty and compliance burdens, particularly for smaller platforms, and would push both industry and courts to develop new norms and practices for moderation, transparency, and cooperation with law enforcement and national-security requests.
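As a minimal sketch only, assuming a platform wants an append-only, discovery-friendly record of each moderation decision, the following Python illustrates one way to structure such an audit trail. The ModerationDecision record, its fields, and the log_decision helper are hypothetical names invented for illustration, not any real platform's API; the external_request_source field reflects the amendment's focus on whether a government communication prompted a restriction.

import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone
from pathlib import Path
from typing import Optional

@dataclass
class ModerationDecision:
    # Hypothetical record of one moderation action; field names are illustrative.
    content_id: str                 # platform identifier for the item acted on
    action: str                     # e.g. "remove", "limit_visibility", "label"
    policy_rule: str                # written rule the action was taken under
    automated: bool                 # True if an algorithm, not a human, decided
    rationale: str                  # free-text explanation kept for the record
    # If a government entity's communication prompted the restriction, record
    # its source: the amendment makes who sent such a request, and for what
    # purpose, legally significant.
    external_request_source: Optional[str] = None
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def log_decision(decision: ModerationDecision, log_path: Path) -> None:
    # Append one decision as a JSON line; an append-only file is simple to
    # preserve, timestamp, and produce in discovery.
    with log_path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(decision)) + "\n")

if __name__ == "__main__":
    log_decision(
        ModerationDecision(
            content_id="post-12345",
            action="limit_visibility",
            policy_rule="civic-integrity/3.2",
            automated=True,
            rationale="Classifier flagged likely coordinated inauthentic behavior.",
        ),
        Path("moderation_audit.jsonl"),
    )

The JSON-lines file is only a stand-in; the design point is that each record captures the rule applied, whether the decision was automated, and any external request behind it.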
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Introduced January 9, 2025 by Eric Stephen Schmitt · Last progress January 9, 2025
Introduced in Senate