Introduced April 3, 2025 by Debbie Wasserman Schultz · Last progress April 3, 2025
The bill expands legal authority, oversight, and technical pathways for parents and vetted third‑party apps to protect children on the largest social platforms. At the same time, it raises significant privacy and security risks, increases compliance costs that may shrink the market for safety tools, leaves coverage gaps, and concentrates regulatory control at the federal level.
Parents and teens (13+) gain a clear legal path to authorize vetted third‑party safety apps to manage account settings and content for children (the bill defines a child as a user under 17), giving families direct control and a recognized delegation mechanism.
Registered, audited US‑based third‑party safety apps can receive user data transfers at frequent intervals (at least hourly) and are subject to Commission oversight and biannual platform assessments, enabling timelier monitoring and interventions for imminent risks to children.
Children and parents get an enforceable complaint avenue to the FTC and stronger enforcement tools (investigations and penalties) to hold large platforms accountable for unfair or deceptive practices affecting kids.
Children and families face increased privacy and security risks: frequent transfers of detailed user data to third parties, centralized registration, and required data handling all raise the chance of misuse or breaches of sensitive child information.
Compliance obligations, US‑only processing and storage, and audit/registration requirements will raise costs for platforms and third‑party developers, likely reducing the number of available safety apps, discouraging smaller providers, and harming competition and innovation.
Key scope and data limits (high user/revenue thresholds, exemptions for some services, and a 30‑day window on data access) may leave many kids unprotected or create loopholes that let harmful content slip through or hinder necessary investigations.
Based on analysis of 7 sections of legislative text.
Requires large platforms to provide real-time APIs so registered third-party safety software can manage child accounts and transfer recent user data when delegated, enforced by the FTC.
Requires large social media platforms to provide real-time application programming interfaces (APIs) so registered third-party safety software can manage a child’s account and regularly transfer a child’s recent user data when a child (age 13+) or the child’s parent/guardian delegates permission. The Federal Trade Commission (FTC) enforces the law, must issue guidance and assess compliance, and the law takes effect when the FTC issues that guidance. Platforms must limit third-party actions to protecting the child, notify account holders when delegations occur, and adopt policies to mitigate data-transfer risks. The law also prevents states from imposing their own API requirements while preserving state consumer-protection and related authorities.