Introduced May 14, 2025 by Marsha Blackburn · Last progress May 14, 2025
The bill strengthens default protections, parental controls, enforcement, and algorithmic transparency to better protect children online and give users more control. These gains come at the cost of increased compliance and litigation risk for platforms (which may raise prices or reduce services), potential privacy tradeoffs from age-verification pressure, and uneven state-level implementation.
Children and minors: platforms will adopt stronger default privacy and safety settings, restrict ads for certain products (e.g., tobacco, alcohol, gambling, drugs) from being shown to known minors, and publish audited reports on minors' access and time use, reducing exposure to addictive or age-inappropriate content.
Parents and families: the bill establishes parental controls and clear tools to view and manage kids' accounts, restrict purchases, and monitor screen time, with limited exemptions for traditional streaming services that implement comparable parental controls.
Consumers and governments: State attorneys general gain a clear enforcement path (injunctions, damages, restitution) and the FTC can coordinate and use its full authorities, creating combined state–federal enforcement to hold platforms accountable for minors' protections.
Users, advertisers, and small platforms: complying with new default settings, audits, parental tools, transparency toggles, and expanded enforcement could raise platforms' operational and legal costs, which may be passed on to users or advertisers and disproportionately burden smaller companies.
Platforms, states, and consumers: the bill's enforcement design (non-binding FTC guidance, expanded multi-state liability, and preserved state authority) creates legal uncertainty and risks inconsistent state-level litigation and a patchwork of standards across the country.
Children and families: enforcement based on a 'totality of circumstances' standard, combined with pressure to demonstrate compliance, may incentivize platforms to collect more age and identity data, creating privacy and security tradeoffs for minors.
Based on analysis of 8 sections of legislative text.
Creates a federal Kids Online Safety framework that requires online platforms reasonably likely to be used by minors to adopt design, data, disclosure, and parental-control safeguards to reduce foreseeable harms to children. Platforms must default to the most protective settings for children, offer parental controls and time limits, restrict targeted advertising of certain products to known minors, prohibit dark patterns, and provide clear notices and opt-outs for personalized recommendation systems. The bill gives the Federal Trade Commission primary enforcement authority, allows State attorneys general to sue as parens patriae with prior notice to the FTC, and requires large platforms to publish annual third-party-audited transparency reports. It also prohibits opaque recommendation algorithms unless users receive notice and an opt-out, directs federal studies and FTC guidance, establishes a Kids Online Safety Council to advise Congress, and includes preemption and severability rules. Many protections take effect 12–18 months after enactment, with specific timelines for agency guidance and platform responses to reports of harm.