Last progress September 4, 2025
Introduced on September 4, 2025, by Sen. Jon Husted (R-OH)
Read twice and referred to the Committee on Commerce, Science, and Transportation.
This bill sets rules for "companion" AI chatbots, apps that act like friends or counselors. It makes companies verify users' ages, create protections for kids, and clearly tell people they're chatting with a bot, not a human. Companies must require user accounts, freeze existing accounts until age is verified, and classify each user as a minor or an adult.

If a user is a minor, the account must be linked to a verified parent account, with parental consent required. Parents must be alerted if a chat suggests the child is thinking about self-harm, and kids must be blocked from bots that share sexual content. Age data collected just to verify age must be kept to the minimum needed and protected. Companies must watch for signs of suicidal thoughts in chats and give both the child and the parent the National Suicide Prevention Lifeline's contact information. Finally, every chat must show a clear pop-up, at the start and at least every 60 minutes, telling the user they are talking to an AI, not a person.
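To make the timing and gating requirements concrete, here is a minimal sketch in Python of how an operator might implement the recurring AI disclosure and the minor content block. The bill specifies the obligations, not the implementation, so every name here (Session, AgeClass, disclosure_if_due, may_access_bot) is hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum
from typing import Optional

# Hypothetical constants: the bill mandates the notice at chat start and
# at least every 60 minutes, but does not prescribe names or wording.
DISCLOSURE_INTERVAL = timedelta(minutes=60)
AI_DISCLOSURE = "Notice: you are chatting with an AI, not a human."

class AgeClass(Enum):
    UNVERIFIED = "unverified"  # account stays frozen until age is checked
    MINOR = "minor"            # must be linked to a verified parent account
    ADULT = "adult"

@dataclass
class Session:
    age_class: AgeClass
    last_disclosure: Optional[datetime] = None

def disclosure_if_due(session: Session, now: datetime) -> Optional[str]:
    """Return the disclosure text if it must be shown now: at the start
    of every chat and at least every 60 minutes thereafter."""
    if session.last_disclosure is None or now - session.last_disclosure >= DISCLOSURE_INTERVAL:
        session.last_disclosure = now
        return AI_DISCLOSURE
    return None

def may_access_bot(session: Session, bot_shares_sexual_content: bool) -> bool:
    """Gate access: frozen (unverified) accounts are blocked outright,
    and minors are blocked from bots that share sexual content."""
    if session.age_class is AgeClass.UNVERIFIED:
        return False
    if session.age_class is AgeClass.MINOR and bot_shares_sexual_content:
        return False
    return True

if __name__ == "__main__":
    session = Session(age_class=AgeClass.MINOR)
    start = datetime.now()
    print(disclosure_if_due(session, start))                          # shown at chat start
    print(disclosure_if_due(session, start + timedelta(minutes=30)))  # None: not due yet
    print(disclosure_if_due(session, start + timedelta(minutes=61)))  # shown again
    print(may_access_bot(session, bot_shares_sexual_content=True))    # False
```

The sketch treats the 60-minute window as a per-session timer that resets each time the notice is shown; the bill's actual text may permit other schedules, such as fixed wall-clock intervals.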
The Federal Trade Commission (FTC) must issue guidance within 180 days and can enforce these rules, and state attorneys general can also take action. A violation counts as an unfair or deceptive practice under existing law. Companies that act in good faith on user-provided age information, follow FTC guidance, and meet industry standards for age checks get a "safe harbor." The law takes effect one year after it is enacted.