H.R. 1623
119th CONGRESS 1st Session
To require certain interactive computer services to adopt and operate technology verification measures to ensure that users of the platform are not minors, and for other purposes.
IN THE HOUSE OF REPRESENTATIVES · February 26, 2025 · Sponsor: Mrs. Miller of Illinois · Committee: Committee on Energy and Commerce
Table of contents
SEC. 1. Short title
- This Act may be cited as the “Shielding Children’s Retinas from Egregious Exposure on the Net Act” or the “SCREEN Act”.
SEC. 2. Findings; sense of Congress
- (a) Findings
- Congress finds the following:
- Over the 3 decades preceding the date of enactment of this Act, Congress has passed several bills to protect minors from access to online pornographic content, including title V of the Telecommunications Act of 1996 (Public Law 104–104) (commonly known as the “Communications Decency Act”), section 231 of the Communications Act of 1934 (47 U.S.C. 231) (commonly known as the “Child Online Protection Act”), and the Children’s Internet Protection Act (title XVII of division B of Public Law 106–554).
- With the exception of the Children’s Internet Protection Act (title XVII of division B of Public Law 106–554), the Supreme Court of the United States has struck down the previous efforts of Congress to shield children from pornographic content, finding that such legislation constituted a “compelling government interest” but that it was not the least restrictive means to achieve such interest. In Ashcroft v. ACLU, 542 U.S. 656 (2004), the Court even suggested at the time that “blocking and filtering software” could conceivably be a “primary alternative” to the requirements passed by Congress.
- In the nearly 2 decades since the Supreme Court of the United States suggested the use of “blocking and filtering software”, such technology has proven to be ineffective in protecting minors from accessing online pornographic content. The Kaiser Family Foundation has found that filters do not work on 1 in 10 pornography sites accessed intentionally and 1 in 3 pornography sites that are accessed unintentionally. Further, it has been proven that children are able to bypass “blocking and filtering software” by employing strategic searches or measures to bypass the software completely.
- Additionally, Pew Research has revealed studies showing that only 39 percent of parents use blocking or filtering software for their minor’s online activities, meaning that 61 percent of children only have restrictions on their internet access when they are at school or at a library.
- 17 States have now recognized pornography as a public health hazard that leads to a broad range of individual harms, societal harms, and public health impacts.
- It is estimated that 80 percent of minors between the ages of 12 to 17 have been exposed to pornography, with 54 percent of teenagers seeking it out. The internet is the most common source for minors to access pornography with pornographic websites receiving more web traffic in the United States than Twitter, Netflix, Pinterest, and LinkedIn combined.
- Exposure to online pornography has created unique psychological effects for minors, including anxiety, addiction, low self-esteem, body image disorders, an increase in problematic sexual activity at younger ages, and an increased desire among minors to engage in risky sexual behavior.
- The Supreme Court of the United States has recognized on multiple occasions that Congress has a “compelling government interest” to protect the physical and psychological well-being of minors, which includes shielding them from “indecent” content that may not necessarily be considered “obscene” by adult standards.
- Because “blocking and filtering software” has not produced the results envisioned nearly 2 decades ago, it is necessary for Congress to pursue alternative policies to enable the protection of the physical and psychological well-being of minors.
- The evolution of our technology has now enabled the use of age verification technology that is cost efficient, not unduly burdensome, and can be operated narrowly in a manner that ensures only adults have access to a website’s online pornographic content.
- (b) Sense of Congress
- It is the sense of Congress that—
- shielding minors from access to online pornographic content is a compelling government interest that protects the physical and psychological well-being of minors; and
- requiring interactive computer services that are in the business of creating, hosting, or making available pornographic content to enact technological measures that shield minors from accessing pornographic content on their platforms is the least restrictive means for Congress to achieve its compelling government interest.
SEC. 3. Definitions
- In this Act:
- The terms “child pornography” and “minor” have the meanings given those terms in section 2256 of title 18, United States Code.
- The term “Commission” means the Federal Trade Commission.
- The term “covered platform”—
- The term “harmful to minors”, with respect to a picture, image, graphic image file, film, videotape, or other visual depiction, means that the picture, image, graphic image file, film, videotape, or other depiction—
- The terms “information content provider” and “interactive computer service” have the meanings given those terms in section 230(f) of the Communications Act of 1934 (47 U.S.C. 230(f)).
- The terms “sexual act” and “sexual contact” have the meanings given those terms in section 2246 of title 18, United States Code.
- The term “technology verification measure” means technology that—
- The term “technology verification measure data” means information that—
SEC. 4. Technology verification measures
- (a) Covered platform requirements
- Beginning on the date that is 1 year after the date of enactment of this Act, a covered platform shall adopt and utilize technology verification measures on the platform to ensure that—
- users of the covered platform are not minors; and
- minors are prevented from accessing any content on the covered platform that is harmful to minors.
- (b) Requirements for age verification measures
- In order to comply with the requirement of subsection (a), the technology verification measures adopted and utilized by a covered platform shall do the following:
- Use a technology verification measure in order to verify a user's age.
- Provide that requiring a user to confirm that the user is not a minor shall not be sufficient to satisfy the requirement of subsection (a).
- Make publicly available the verification process that the covered platform is employing to comply with the requirements under this Act.
- Subject the Internet Protocol (IP) addresses, including known virtual private network (VPN) IP addresses, of all users of a covered platform to the technology verification measure described in paragraph (1) unless the covered platform determines based on available technology that a user is not located within the United States.
- (c) Choice of verification measures
- A covered platform may choose the specific technology verification measures to employ for purposes of complying with subsection (a), provided that the technology verification measure employed by the covered platform meets the requirements of subsection (b) and prohibits a minor from accessing the platform or any information on the platform that is obscene, child pornography, or harmful to minors.
- (d) Use of third parties
- A covered platform may contract with a third party to employ technology verification measures for purposes of complying with subsection (a), but the use of such a third party shall not relieve the covered platform of its obligations under this Act or from liability under this Act.
- (e) Rule of construction
- Nothing in this section shall be construed to require a covered platform to submit to the Commission any information that identifies, is linked to, or is reasonably linkable to a user of the covered platform or a device that identifies, is linked to, or is reasonably linkable to a user of the covered platform.
- (f) Technology verification measure data security
- A covered platform shall—
- establish, implement, and maintain reasonable data security to—
- protect the confidentiality, integrity, and accessibility of technology verification measure data collected by the covered platform or a third party employed by the covered platform; and
- protect such technology verification measure data against unauthorized access; and
- retain the technology verification measure data for no longer than is reasonably necessary to utilize a technology verification measure or what is minimally necessary to demonstrate compliance with the obligations under this Act.
SEC. 5. Consultation requirements
- In enforcing the requirements under section 4, the Commission shall consult with the following individuals, including with respect to the applicable standards and metrics for making a determination on whether a user of a covered platform is not a minor:
- Individuals with experience in computer science and software engineering.
- Individuals with experience in—
- advocating for online child safety; or
- providing services to minors who have been victimized by online child exploitation.
- Individuals with experience in consumer protection and online privacy.
- Individuals who supply technology verification measure products or have expertise in technology verification measure solutions.
- Individuals with experience in data security and cryptography.
SEC. 6. Commission requirements
- (a) In general
- The Commission shall—
- conduct regular audits of covered platforms to ensure compliance with the requirements of section 4;
- make public the terms and processes for the audits conducted under paragraph (1), including the processes for any third party conducting an audit on behalf of the Commission;
- establish a process for each covered platform to submit only such documents or other materials as are necessary for the Commission to ensure full compliance with the requirements of section 4 when conducting audits under this section; and
- prescribe the appropriate documents, materials, or other measures required to demonstrate full compliance with the requirements of section 4.
- (b) Guidance
- (1) In general
- Not later than 180 days after the date of enactment of this Act, the Commission shall issue guidance to assist covered platforms in complying with the requirements of section 4.
- (2) Limitations on guidance
- No guidance issued by the Commission with respect to this Act shall confer any rights on any person, State, or locality, nor shall operate to bind the Commission or any person to the approach recommended in such guidance. In any enforcement action brought pursuant to this Act, the Commission shall allege a specific violation of a provision of this Act. The Commission may not base an enforcement action on, or execute a consent order based on, practices that are alleged to be inconsistent with any such guidelines, unless the practices allegedly violate a provision of this Act.
SEC. 7. Enforcement
- (a) Unfair or deceptive act or practice
- A violation of section 4 shall be treated as a violation of a rule defining an unfair or deceptive act or practice under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
- (b) Powers of the Commission
- (1) In general
- The Commission shall enforce section 4 in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this title.
- (2) Privileges and immunities
- Any person who violates section 4 shall be subject to the penalties and entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).
- (3) Authority preserved
- Nothing in this Act shall be construed to limit the authority of the Commission under any other provision of law.
SEC. 8. GAO report
- Not later than 2 years after the date on which covered platforms are required to comply with the requirement of section 4(a), the Comptroller General of the United States shall submit to Congress a report that includes—
- an analysis of the effectiveness of the technology verification measures required under such section;
- an analysis of rates of compliance with such section among covered platforms;
- an analysis of the data security measures used by covered platforms in the age verification process;
- an analysis of the behavioral, economic, psychological, and societal effects of implementing technology verification measures;
- recommendations to the Commission on improving enforcement of section 4(a), if any; and
- recommendations to Congress on potential legislative improvements to this Act, if any.
SEC. 9. Severability clause
- If any provision of this Act, or the application of such a provision to any person or circumstance, is held to be unconstitutional, the remaining provisions of this Act, and the application of such provisions to any other person or circumstance, shall not be affected thereby.