Requires online service providers to publish clear, easy-to-find acceptable use policies that explain when and why they may suspend, restrict, or cancel user accounts, and, in most cases, to notify users before taking such actions. The bill also mandates yearly public enforcement reports in human- and machine-readable form and gives the Federal Trade Commission authority to enforce these rules under the FTC Act, with compliance guidance due within 180 days.
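The bill does not prescribe a schema for the "human- and machine-readable" enforcement reports, so as a rough illustration only, the sketch below shows one way a provider might derive both forms from the same data. Every field name, provider name, and figure is a hypothetical assumption, not anything specified in the Act.

```python
import json

# Hypothetical enforcement-report data; the Act does not define these fields.
enforcement_report = {
    "provider": "ExampleCo",                 # hypothetical provider
    "reporting_year": 2025,
    "accounts_suspended": 120,
    "accounts_terminated": 15,
    "actions_with_advance_notice": 110,
    "actions_without_notice": 25,            # e.g., safety/legal exceptions
    "third_party_moderation_vendors": ["ModVendorX"],  # hypothetical vendor
}

# Machine-readable form: the same data serialized as JSON.
machine_readable = json.dumps(enforcement_report, indent=2)

# Human-readable form: a plain-text summary derived from the same data.
human_readable = (
    f"In {enforcement_report['reporting_year']}, "
    f"{enforcement_report['provider']} suspended "
    f"{enforcement_report['accounts_suspended']} accounts and terminated "
    f"{enforcement_report['accounts_terminated']}; "
    f"{enforcement_report['actions_without_notice']} actions were taken "
    f"without advance notice."
)

print(machine_readable)
print(human_readable)
```

Keeping one data source for both outputs avoids the two forms drifting apart between reporting years.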
The Act aims to ensure that consumers, businesses, and organizations seeking to use an online service provider’s products or services have sufficient information about the provider’s commercial business standards, processes, and policies related to unilateral termination, suspension, or cancellation of user accounts or the ability to use the provider’s products or services.
The section states that providing this information will allow consumers to make informed choices about using or purchasing online services or products and will promote a competitive marketplace for those services and products.
The term “Commission” means the Federal Trade Commission.
The term “nonprofit organization” has the meaning given that term in section 201(i) of title 35, United States Code.
The term “online service provider” means the provider of a public-facing website, online service, or online application that is directed to a consumer or organization and meets the criteria listed in subparagraph (A).
Who is affected and how:
Platform operators: Directly subject to the new requirements. They must draft or revise acceptable use policies, implement user-notice systems, track enforcement metrics, and produce annual human- and machine-readable reports. Compliance will impose operational and administrative costs (policy drafting, notifications, recordkeeping, reporting systems), especially for smaller providers without established compliance teams. Providers that rely on third-party moderation or automated signals must disclose those relationships and data sources in their reports.
Platform users: Gain clearer information about what behavior is prohibited, how enforcement works, and whether appeals are available. Users will often receive advance notice before restrictions, giving them a chance to respond or prepare. In some safety or legal-exception cases, users may be restricted without advance notice.
Consumers and the public: Benefit from greater transparency about how services moderate content and suspend accounts, potentially improving consumer choice and competitive comparison across services. Public disclosure of notices (where used) may increase scrutiny of platform enforcement practices.
Businesses and third parties (advertisers, moderators, content partners): May face indirect effects because platforms might change moderation practices to manage notification and reporting burdens. Third-party moderation vendors must be tracked and disclosed in enforcement descriptions.
Federal Trade Commission: Gains explicit enforcement authority and workload tied to issuing guidance and pursuing violations under the FTC Act; the agency will need resources and expertise to evaluate compliance and enforce rules.
TERMS Act
Introduced June 10, 2025 by Rep. Craig A. Goldman
Referred to the House Committee on Energy and Commerce.
Last progress: June 10, 2025