Makes it a federal crime and a consumer-protection violation to use realistic AI- or computer-generated audio, video, or other online impersonation to trick people into giving up money or other valuables. It creates criminal penalties (including prison, fines, and forfeiture), gives the FTC authority to pursue civil consumer-protection cases, requires technical best-practice guidance led by NIST, and directs U.S. agencies to pursue international cooperation while preserving core First Amendment protections like parody and journalism. The law includes exemptions for authorized U.S. and state law‑enforcement and intelligence activities, extends U.S. criminal reach to offenses that begin abroad, and requires public reporting and ongoing guidance updates to help detect and prevent digital impersonation fraud.
The bill strengthens consumer protections and cross-border enforcement against realistic AI impersonation scams while preserving agencies' investigative tools, but it raises significant free-speech and legal-uncertainty risks for creators, increases compliance and enforcement costs, and could expose individuals abroad to prosecution for conduct that is legal where they are located.
Consumers and businesses will face stronger protections against high-fidelity AI audio/video impersonation scams through new federal prohibitions and available remedies (criminal penalties, forfeiture, and FTC enforcement), reducing fraud losses and improving deterrence.
Law enforcement and national-security agencies retain authority to use synthetic media in lawful investigations and intelligence operations, avoiding impediments to criminal and protective activities.
Consumers, businesses, and institutions gain publicly available technical guidance, standardized practices, and regular updates (plus stakeholder workshops) to better detect, avoid, and respond to digital impersonation scams.
Americans harmed by foreign-origin impersonation fraud can benefit from improved cross-border cooperation and targeted enforcement focused on top source countries, which may increase successful cross-border remedies and deterrence.
Creators, journalists, artists, and everyday users risk criminal exposure or chilling legal uncertainty because a broad definition of “digital impersonation” could sweep in legitimate parody, commentary, or ambiguous uses.
Businesses and technology developers may face substantial compliance pressure and costs (from FTC enforcement and de facto adoption of NIST guidance) that can be passed to consumers or reduce services.
Creating a new federal crime and expanding enforcement and reporting obligations will increase federal caseloads and administrative costs, potentially diverting DOJ/FTC staff and resources from other priorities and increasing taxpayer burden.
Extraterritorial reach and stronger cross-border enforcement could expose U.S. travelers, expats, and remote workers to prosecution for conduct legal where they are located and may create diplomatic friction with targeted countries.
Establishes the official short title of the Act as the "AI Fraud Accountability Act."
Redesignates current subsection (i) of 47 U.S.C. § 223 as subsection (j).
Inserts a new subsection (i) into 47 U.S.C. § 223 establishing a criminal prohibition on using digital impersonations to commit fraud.
Defines 'digital impersonation' as a visual or audio depiction of an identifiable individual, or of an imaginary individual, created or altered by software, machine learning, artificial intelligence, or other technology, such that a reasonable person would find it indistinguishable from an authentic depiction.
Defines 'identifiable individual' to mean someone who appears or is heard in a digital impersonation and whose face, likeness, voice, or other distinguishing characteristic is displayed or heard.
Who is affected and how:
Victims and consumers: Individuals targeted by realistic AI-driven impersonation scams (including seniors and other fraud-prone groups) should see stronger legal tools for criminal prosecution, civil enforcement, and international cooperation to reduce cross-border scams.
Financial institutions and small businesses: Banks, payment platforms, and small businesses that suffer losses from impersonation fraud may see fewer attacks over time and could face new evidence/cooperation channels in investigations; they may also be asked to cooperate with investigations and adopt improved fraud-prevention practices.
Technology and platform companies (AI developers, social platforms, messaging services): Firms that build, host, or distribute AI-generated content may face compliance and operational costs to implement detection, labeling, or mitigation measures, and could be drawn into enforcement or civil cases under the FTC authority. The law also incentivizes development of detection tools and provenance/tracing capabilities.
Law enforcement and justice system: Federal and state law‑enforcement agencies will have a new federal offense to pursue and new international authorities to negotiate; prosecutorial and investigative resources may be required to handle cases, and DOJ will be tasked with strengthening international assistance agreements.
Standards bodies and researchers: NIST-led guidance and public workshops will shape technical norms and provide industry and researchers with prioritized best practices; this may speed development of detection and attribution tools.
International partners: The FTC and State Department will engage foreign counterparts, but negotiating effective cross-border cooperation may be slow and face legal and diplomatic obstacles.
Potential benefits and trade-offs:
Overall, the measure focuses on reducing fraud from realistic digital impersonation while building technical and international capacity to detect, prevent, and prosecute offenders. Implementation will require agency resources and industry cooperation, and effectiveness will depend on international partnerships and the pace of technical development.
Referred to the Committee on Energy and Commerce, and in addition to the Committees on the Judiciary, Science, Space, and Technology, and Foreign Affairs, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Introduced March 4, 2026 by Vernon G. Buchanan · Last progress March 4, 2026
This section defines 'digital impersonation' by reference to the meaning given in 47 U.S.C. 223(i), as added by section 2 of this Act, and establishes a NIST-led working group along with reporting and termination provisions tied to enforcement of section 223(i).
Incorporates the enforcement authority, jurisdiction, powers, duties, penalties, privileges, and immunities of the Federal Trade Commission Act into this section: violations are treated as violations of a rule defining unfair or deceptive acts or practices, and the Commission is directed to enforce this section as if the FTC Act's terms were part of it.
Treats a violation of the section's prohibition on digital impersonation as a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the FTC Act.
Defines 'digital impersonation' and 'identifiable individual' by referencing the meanings given in 47 U.S.C. 223(i) as added by section 2 of this Act.
Modifies cross-reference language in subsection (e)(1) to include the new subsection (i).
Specifies that section 413 of the Controlled Substances Act (21 U.S.C. 853), with the exception of subsections (a) and (d), applies to criminal forfeiture of property under the new subsection (i) of 47 U.S.C. 223.
Introduced in House