Prohibits social media platforms from creating or maintaining accounts for children under 13, bans platforms from using children’s and teens’ personal data to make personalized algorithmic recommendations (with narrow exceptions), and requires platforms to stop retaining or using children’s data after account termination. Gives the Federal Trade Commission primary enforcement authority and permits state attorneys general to bring civil actions. Requires elementary and secondary schools that receive discounted internet services under section 254(h) to certify that school-managed networks, devices, and services block student access to social media; directs the Federal Communications Commission to adopt implementing rules and publish a public database of schools’ Internet safety policies. Includes a severability clause and an effective date for the child-account prohibition one year after enactment.
Short title: The Act may be cited as the "Kids Off Social Media Act."
Defines "personalized recommendation system" as an automated system that suggests, promotes, or ranks content based on users' personal data.
Defines "child" as an individual under the age of 13, and "teen" as an individual over the age of 12 and under the age of 17.
Defines "Commission" as the Federal Trade Commission.
Defines "personal data" by reference to the Children’s Online Privacy Protection Act (COPPA) definition.
Who is affected and how:
Social media platforms and online services: Must prevent under-13 account creation, redesign age-verification and onboarding systems, stop using children’s and teens’ personal data for personalized recommendations (or implement permitted exceptions), and purge or cease using children’s data after account termination. This will require engineering, compliance, and product changes and could affect recommendation-driven engagement metrics.
Children and teens: Children under 13 will be barred from platform accounts; older teens may still use platforms but with limits on personalized algorithmic recommendations. The measures aim to reduce exposure to algorithmically tailored content and limit data collection and retention for minors.
Parents and families: May see reduced platform features for younger children and may need to seek alternative supervised or family-oriented services; parental choices around supervised accounts, age verification, or family product alternatives may increase.
K‑12 schools and IT staff: Schools receiving discounted broadband/services under section 254(h) (E‑rate recipients) must certify blocking of social media on school-managed networks/devices. That requires technical controls, policy updates, documentation, and interaction with the FCC database. Smaller or under-resourced districts may face administrative and technical burdens.
Federal agencies and state attorneys general: The FTC gains primary enforcement authority over platform compliance; the FCC will conduct implementing rulemaking, set deadlines, and maintain the public database of schools' Internet safety policies; state attorneys general may bring civil actions to enforce the law on behalf of their residents.
Civil society and industry stakeholders: Child privacy advocates will likely welcome stronger protections; technology companies, civil liberties groups, and some industry stakeholders may raise concerns about free expression, age verification privacy trade-offs, compliance costs, and effects on beneficial content and educational use cases.
Overall effects and trade-offs:
Benefits: Increased privacy protections for children, reduced use of algorithmic recommendations aimed at minors, and clearer responsibilities for schools to limit student access on school-managed systems. Could reduce some harms associated with recommendation algorithms and data-driven targeting of minors.
Costs and challenges: Implementation will require engineering and programmatic changes by platforms, burdens on school IT operations to implement and certify blocks, potential litigation over definitions (e.g., who is a “teen,” what counts as a “personalized recommendation”), and agency rulemaking disputes. The lack of explicit new funding may mean costs fall on private companies and school districts.
Enforcement uncertainty: The law empowers FTC enforcement and state civil suits, but practical enforcement will depend on agency rulemaking, resource allocation, and litigation outcomes about definitions and First Amendment or preemption claims.
Introduced January 28, 2025, by Sen. Brian Schatz · Last progress: January 28, 2025
Placed on Senate Legislative Calendar under General Orders. Calendar No. 108.
Committee on Commerce, Science, and Transportation. Reported by Senator Cruz without amendment. With written report No. 119-33.
Committee on Commerce, Science, and Transportation. Ordered to be reported without amendment favorably.
Read twice and referred to the Committee on Commerce, Science, and Transportation.