Redesignates existing subsection (f) as subsection (g) and inserts a new subsection (f), "Algorithmic product design accountability." The new subsection imposes a duty of care on providers of social media platforms for recommendation-based algorithms; provides that violations result in loss of liability protection under subsection (c)(1); establishes a private right of action for bodily injury or death arising from the operation of such algorithms; invalidates predispute arbitration agreements and joint-action waivers for disputes under the new subsection; and adds definitions, severability, and relationship-to-other-laws provisions.
Amends the definitions provision by replacing a longer descriptive phrase identifying the Act with the short-form name "Trademark Act of 1946."
Replaces a descriptive phrase in section 3(b)(1) of the Webb-Kenyon Act with the short-form name "Webb-Kenyon Act."
Amends subsections (a) and (b) of 18 U.S.C. 2421A by striking existing text and inserting new text (the exact replacement text is not shown in this section).
Amends paragraph (6) of 31 U.S.C. 5362 by striking and inserting replacement text (the exact new text is not provided in this section).
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Introduced in the Senate on November 18, 2025 by John R. Curtis · Last progress November 18, 2025
Amends Section 230 of the Communications Act of 1934 to hold for-profit social media platforms responsible when their recommendation-based algorithms cause or foreseeably contribute to bodily injury or death. Platforms would be required to exercise "reasonable care" in designing and operating recommendation algorithms; failure to do so can strip Section 230 immunity and expose platforms to private lawsuits and civil liability for injuries or deaths caused by their recommendations. The amendment defines key terms (such as "recommendation-based algorithm" and "social media platform"), creates a private right of action for injury or death, and establishes loss of Section 230 protection as an enforcement mechanism. It targets algorithmic recommendations rather than all content moderation or hosting activities, creating both new legal exposure for platforms and incentives to change recommendation design and operation to reduce foreseeable physical harms.
Redesignates current subsection (f) of 47 U.S.C. 230 as subsection (g) and inserts a new subsection (f) titled "Algorithmic product design accountability."
Duty of care in algorithmic design: A provider of a social media platform must exercise reasonable care in the design, training, testing, deployment, operation, and maintenance of a recommendation-based algorithm to prevent bodily injury or death that (i) was reasonably foreseeable by the provider and (ii) is attributable, in whole or in part, to the design characteristics or performance of the recommendation-based algorithm.
Covered bodily injury or death: Defined as bodily injury to or the death of a user of the social media platform, or bodily injury or death inflicted by a user of the social media platform upon another person, where the harm arises from the operation of the recommendation-based algorithm.
Exceptions to the duty: The duty does not apply to algorithmic actions that (I) sort information strictly chronologically or reverse‑chronologically, or (II) respond to an individual search initiated by a user; however, the search exception is limited to the initially populated search results and does not protect provider algorithmic activity after the user navigates beyond those initial results.
First Amendment protection: The subsection does not authorize the Commission to enforce the duty based on the viewpoint of a user or information content provider when that viewpoint is protected speech under the First Amendment.
Who is affected and how:
For‑profit social media platforms: Directly affected because the law targets their recommendation systems. They would face a new legal duty to design and operate algorithms with reasonable care, increased exposure to civil suits, potential loss of Section 230 immunity in covered cases, higher compliance costs (safety engineering, audits, legal defense), and possible changes to product design and business models.
Algorithm developers and deployers: Companies and teams that design, train, test, and deploy recommendation systems will need to implement risk assessments, safety testing, documentation, and mitigation measures; third‑party recommender vendors may face contractual and liability ripple effects.
Platform users (general public and specific vulnerable groups): Could see safer on-platform experiences if platforms reduce amplification of content or interactions that lead to physical harm; however, users may also encounter less personalized content, reduced discovery, or increased content filtering.
Injured persons and families: Gain a clearer legal path to seek damages when a recommendation algorithm foreseeably causes bodily injury or death, rather than being blocked by Section 230 immunity.
Small and startup platforms: May face disproportionately high compliance and litigation costs relative to resources, potentially favoring larger incumbents and accelerating market consolidation unless definitions or thresholds exempt smaller actors.
Legal system and insurers: Courts will need to develop standards for algorithmic causation, foreseeability, and reasonable care in a new body of case law; insurers will reassess coverage and premiums for platform liability.
Public safety and civil liberties: The legislation aims to reduce real-world harm tied to platform recommendations, but it may also create pressure that reduces online speech discovery, discourages experimental recommendation techniques, or eliminates features with mixed societal value.
Net effect: The amendment shifts legal risk onto platforms and their algorithmic systems, incentivizing engineering and policy changes to limit foreseeable physical harms while creating regulatory and litigation uncertainty until courts interpret the new standards.