The bill increases protections for minors online by requiring age verification and strengthening oversight, but does so at the cost of significant privacy risks, compliance burdens for small sites, and potential over‑blocking of lawful speech.
Children and teens would face reduced access to pornographic and other content labeled "harmful to minors" because platforms must implement age‑verification and blocking measures.
Consumers and the public gain stronger oversight and enforcement—through FTC enforcement tools, required audits, and a GAO review—making it more likely that platforms follow the law and that violations are addressed.
Platforms and regulators get clearer, more evidence‑based rules and technical guidance (definitions, consultative standards, and implementation guidance), increasing regulatory predictability and technical interoperability for verification systems.
Children, parents, and other users face increased privacy and surveillance risks because age‑verification will require collecting sensitive personal or device‑linked data and create centralized/third‑party aggregation points that could be breached or misused.
Small platforms, nonprofits, and niche sites may incur substantial compliance, audit, and legal costs—and some could shut down or raise prices—reducing online competition and services available to users.
Lawful adults, creators, and users of sexual‑health or educational material risk chilling effects and over‑blocking because the statute's vague "harmful to minors" standard and liability concerns could push platforms to block wide swaths of content.
Based on analysis of 9 sections of legislative text.
Requires platforms that create or host visual content harmful to minors to implement technological age‑verification measures and subjects violations to FTC enforcement.
Introduced February 26, 2025 by Mary E. Miller · Last action February 26, 2025
Requires online services that create, host, or make available visual sexual content that is harmful to minors to implement technical age‑verification measures within one year. The Federal Trade Commission enforces the rule, issues guidance, and audits compliance, and the Government Accountability Office must report on effectiveness and harms within two years of platform compliance. The bill defines covered platforms and "harmful to minors" content, sets data‑security and minimal‑retention rules for verification data, and requires platforms to publicly disclose verification methods while remaining legally responsible for any contractors they use.