The bill strengthens protections for minors by limiting advertising, profiling, and manipulative chatbot features and by expanding enforcement and research, but at the cost of higher compliance burdens, legal uncertainty, and limits on personalization that could reduce useful services and raise costs for developers and users.
Children and teens will face stronger limits on targeted advertising, data profiling, retention, and manipulative/addictive chatbot features, reducing privacy risks and harms to mental health.
Parents, state enforcers, and the public gain clearer enforcement tools: FTC rulemaking and guidance, an explicit enforcement pathway, and private rights of action provide more ways to stop harmful or deceptive chatbot practices affecting minors.
Developers and deployers get a clearer regulatory scope and some privacy-protective definitional limits (e.g., a narrowed definition of 'publicly available' for sensitive material), reducing ambiguity about which data and actors are covered.
Developers, deployers, and smaller firms will face substantial new compliance costs and higher litigation and liability risk, which could reduce free/low-cost chatbot options or raise prices for users.
Strict limits on personalization, data retention, and training on minor-provided inputs may reduce or eliminate beneficial tailored features (e.g., tutoring, accessibility, or individualized mental-health support) for children and students.
Ambiguous definitions and a low 'knowledge' standard, combined with potentially overlapping state and FTC enforcement (and partial federal preemption), create legal uncertainty and higher legal risk for firms and states.
Based on analysis of 10 sections of legislative text.
Introduced March 25, 2026 by Edward John Markey · Last progress March 25, 2026
Requires AI chatbots and their makers to follow special rules when they know a user is under 18: tell the user they are talking to an AI, limit how the chatbot collects and uses a minor’s data, ban targeted ads and profiling of minors, and stop features designed to create compulsive use. Gives the Federal Trade Commission enforcement authority, allows state attorneys general to sue, and lets parents sue for violations. Also directs federal health agencies to add questions about chatbot use to national surveys and authorizes funding for certain youth mental health efforts. The FTC must issue guidance and regulations on timing and details within set deadlines, while companies are protected from being forced to collect extra age data beyond normal business practices.