The bill strengthens accountability, auditing, and parental access to vetted safety tools for children on large platforms, reducing some online harms and monetization risks. At the same time, it leaves many children on smaller platforms unprotected, creates privacy tradeoffs through mandated data sharing, and imposes compliance and legal burdens that may raise costs and reduce choices.
Parents and children (under 17) can authorize vetted third‑party safety apps, which receive the child’s user data in machine‑readable transfers (at least hourly) through interoperable APIs, giving families timely control and monitoring tools to reduce children’s exposure to online harms.
Children and parents gain a clear federal enforcement pathway: the FTC can pursue large platforms for unfair or deceptive practices and require biennial compliance assessments, while registered third‑party providers face audits and Commission review, increasing accountability and oversight.
Platforms and registered third‑party providers must adopt state‑of‑the‑art safeguards, limit data use, and submit to independent audits, which strengthens data security and reduces some risks of misuse.
Children on many smaller apps and services are excluded because the Act only covers “large social media platforms” meeting high user or revenue thresholds, leaving sizable numbers of children without these new protections.
Allowing third‑party safety tools to access children’s data and enabling frequent transfers increases the risk of data breaches, misuse, or over‑exposure of sensitive information (including mental‑health indicators), potentially harming children’s privacy and willingness to seek help.
Narrow authorization rules for parental safety tools and a ban on state‑mandated API delegation could exclude useful tools and limit what families and states can require, reducing options for protecting children in real time.
Based on analysis of 5 sections of legislative text.
Requires very large social media platforms to provide real‑time APIs through which children (or their parents) can delegate account controls to registered safety apps and transfer limited child data; the FTC enforces the requirement.
Introduced March 20, 2026 by Jon Husted · Last progress March 20, 2026
Requires very large social media platforms to give children (and parents of children under 13) a way to delegate the same in‑app account controls they have to registered third‑party safety apps, and to allow those apps to receive the child’s user data in a common machine‑readable format at least hourly. The Federal Trade Commission enforces the rule, conducts compliance checks every two years, and must set up complaint procedures; the law also preempts states from imposing their own API‑mandate requirements on platforms.
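The bill mandates interoperable APIs and a “common machine‑readable format” but, per this summary, does not specify a schema; those details would come out of FTC rulemaking. As a purely illustrative sketch, every type name, field, endpoint path, and the bearer‑token authentication below are hypothetical assumptions, not drawn from the legislative text. It shows roughly what a registered safety app’s delegation grant and at‑least‑hourly data pull could look like in TypeScript:

```typescript
// Hypothetical schema: the Act requires a "common machine-readable format"
// but does not define one; every name below is an assumption for illustration.

// Account controls the child (or a parent, for children under 13)
// delegates to an FTC-registered safety app.
interface DelegationGrant {
  childAccountId: string;
  safetyAppId: string;           // registered third-party provider identifier
  controls: Array<"mute" | "block" | "contentFilter" | "timeLimit">;
  grantedAt: string;             // ISO 8601 timestamp
  grantedBy: "child" | "parent";
}

// One machine-readable export, delivered at least hourly under the mandate.
interface ChildDataExport {
  childAccountId: string;
  exportedAt: string;            // ISO 8601 timestamp
  events: Array<{
    kind: string;                // e.g. "message", "follow", "contentView"
    occurredAt: string;
    metadata: Record<string, unknown>;
  }>;
}

// Sketch of the hourly pull a registered safety app might perform.
// The endpoint URL shape and bearer-token auth are assumptions.
async function fetchHourlyExport(
  platformBaseUrl: string,
  grant: DelegationGrant,
  accessToken: string,
): Promise<ChildDataExport> {
  const res = await fetch(
    `${platformBaseUrl}/safety-api/v1/exports/${grant.childAccountId}/latest`,
    { headers: { Authorization: `Bearer ${accessToken}` } },
  );
  if (!res.ok) {
    throw new Error(`Export fetch failed: ${res.status}`);
  }
  return (await res.json()) as ChildDataExport;
}
```

Whatever concrete interface emerges, the summary’s requirements of an interoperable API plus at‑least‑hourly machine‑readable transfers imply a stable published schema and an authenticated transfer channel along these general lines.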