Introduced on April 3, 2025 by Debbie Wasserman Schultz
This bill aims to help parents keep kids safer on large social media platforms. It would require big platforms to open secure, real-time connections that approved safety apps can use when a parent—or a child age 13 or older—gives permission. With that permission, the app can manage the child’s interactions, content, and account settings and receive the child’s data as often as every hour to look for risks. The goal is to protect kids from harms like cyberbullying, trafficking, illegal drugs, sexual harassment, and violence. Platforms must tell the child and parent what data is being shared and keep transfers secure, and any control by the safety app is limited to protecting the child, such as improving privacy and marketing settings.
Safety app companies must register with the Federal Trade Commission (FTC), be based in the U.S., store data only in the U.S., delete it quickly, and use it only to protect the child. They face security reviews and yearly independent checks. These apps can share data only in limited cases—like with parents to warn about serious risks such as self-harm, violence, sexual abuse, fraud, or trafficking—or when required by law or to prevent an immediate threat. The FTC would enforce the rules, accept complaints, and issue guidance; the law takes effect when that guidance is issued.