Requires social media platforms and online marketplaces to publish clear, machine-readable terms of service and consumer-protection policies that explain how the service works, what content and products are allowed, and how users are protected. Platforms must run formal consumer-protection programs, name responsible officers, adopt safeguards and review processes, and (if they meet size/revenue thresholds) file annual certified reports with the Federal Trade Commission. Gives the FTC rulemaking and enforcement authority, creates a private right of action for consumers to sue for actual damages, allows state attorneys general to bring enforcement actions, and removes the Section 230 defense for violations of this law. It also requires the FTC to study and potentially require short-form disclosures and icons, and includes definitions for key terms used throughout the law.
Require each social media platform and online marketplace to establish, maintain, and make publicly available at all times terms of service in a machine-readable format, written in clear, plain, and concise language (an illustrative sketch of one possible machine-readable format appears after this list).
Require the terms of service to include any terms or conditions of use, any policies regarding the service or its use, and the consumer protection policy described in subsection (b).
Require the terms of service to cover behavior-related issues and, at minimum, include terms related to payment methods; content ownership (including user-generated content); policies on sharing user content with third parties; any disclaimers, limitations, notices of nonliability, or consequences of not agreeing to the terms; and any other topic the Commission deems appropriate.
For social media platforms, require the consumer protection policy to describe permitted and prohibited content and behavior (for both the platform and users); whether and on what grounds content may be blocked, removed, or modified or service terminated; whether and how people can request blocking, removal, modification, or termination; how users will be notified of and can respond to such requests; whether and how requesters are informed of actions taken and the reasons; how to appeal moderation or termination decisions and how appeal results will be communicated; and tools and support available to users who have experienced cyber harassment.
For online marketplaces, require the consumer protection policy to describe allowed and disallowed products, product descriptions, and marketing materials; whether and on what grounds product listings may be blocked, removed, or modified or service may be terminated; and whether and how users will be notified of recalled or dangerous products.
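The bill requires terms of service to be "machine-readable" but does not prescribe a schema or file format. As a purely illustrative sketch, a platform might expose its terms as structured data along the lines below; every type and field name here is a hypothetical assumption for illustration, not anything the bill or the FTC has specified.

```typescript
// Hypothetical shape for a machine-readable terms-of-service document.
// All names are illustrative; the bill does not define a schema.
interface ConsumerProtectionPolicy {
  permittedContent: string[];   // categories of allowed content or products
  prohibitedContent: string[];  // categories of disallowed content or products
  moderationGrounds: string[];  // grounds for blocking, removing, or modifying content or listings
  reportingProcess: string;     // how users request blocking, removal, or modification
  appealsProcess: string;       // how moderation or termination decisions can be appealed
  harassmentSupport?: string;   // tools/support for users experiencing cyber harassment (social media)
  recallNotification?: string;  // how users learn of recalled or dangerous products (marketplaces)
}

interface MachineReadableTerms {
  serviceName: string;
  lastUpdated: string;          // ISO 8601 date, e.g. "2025-04-10"
  paymentMethods: string[];
  contentOwnership: string;     // who owns user-generated content
  thirdPartySharing: string;    // policy on sharing user content with third parties
  disclaimers: string[];        // disclaimers, limitations, notices of nonliability
  consumerProtectionPolicy: ConsumerProtectionPolicy;
}
```

A document of this shape could be published as JSON at a stable URL, which is one plausible way a platform might satisfy both the machine-readable and publicly-available requirements; the actual format would depend on any FTC rulemaking.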
Who is affected and how:
Social media platforms and online marketplaces: Primary targets. They must update terms of service into plain-language, machine-readable formats; draft and publish consumer-protection policies; create and run formal consumer protection programs; appoint named officers; maintain safeguards and appeal processes; and, if they meet thresholds, submit annual certified filings to the FTC. These changes mean operational, compliance, legal, and reporting costs and may require new staffing (e.g., consumer protection officers) and system upgrades to make policies machine-readable.
Platform users (consumers, advertisers, sellers): Stand to gain clearer information about platform rules, moderation, reporting and appeals processes, and remedies for harassment or unsafe products. Better disclosures and icons could make it easier to understand risks and rights. At the same time, platforms may change product listings, content moderation, or access models in response to higher compliance risk.
Sellers and third-party merchants on marketplaces: Must comply with marketplace reporting and remedy procedures (recalls, notices, product restrictions). They may face stricter listing rules, more active compliance checks, and potential liability exposure mediated through marketplace processes.
Federal Trade Commission: Gains explicit rulemaking and enforcement responsibilities, deadlines for studies/rules, and oversight of annual platform filings. The FTC will need resources to manage rulemaking, review filings, and enforce compliance.
State attorneys general and courts: State AGs get an enforcement pathway; individuals can bring private lawsuits. That broadens enforcement activity and creates litigation risk for covered platforms.
Likely consequences and tradeoffs:
Consumer protection gains: Clearer policies and mandatory program structures could improve transparency, user safety, and response to harmful products and harassment.
Compliance burden and market effects: Smaller platforms and marketplaces could face disproportionate compliance costs, potentially favoring larger incumbents that can absorb reporting and legal expenses. That could reduce entry or prompt changes in product offerings and moderation policies.
Legal and regulatory uncertainty: Removing Section 230 protection for violations of this Act and adding private and state enforcement may increase litigation and encourage more conservative content- and product-moderation choices by platforms seeking to limit liability.
Implementation timeline: The FTC must study and potentially adopt short-form disclosures within defined deadlines; the timing and substance of FTC rules will materially determine operational impact.
Referred to the House Committee on Energy and Commerce.
Introduced April 10, 2025 by Janice D. Schakowsky · Last progress April 10, 2025
Introduced in House