Introduced September 19, 2025 by Yvette Diane Clarke · Last progress September 19, 2025
The bill strengthens consumer protections, transparency, and federal technical capacity to detect and mitigate harmful algorithmic decisions. At the same time, it imposes substantial compliance and reporting burdens, creates new privacy risks, and leaves regulatory uncertainty and costs that could fall on small firms, taxpayers, and innovation.
Consumers (including low-income people and members of protected groups) gain clearer protections, transparency, and recourse because the Act defines covered algorithms, requires pre- and post-deployment impact assessments, mandates notice/opt-out/contest mechanisms, and strengthens FTC enforcement tools.
Regulators and the public get better information and oversight: standardized, machine-readable summary reports, a public repository, and required testing/metrics improve auditability, transparency, and the ability of researchers and advocates to detect and address algorithmic harms.
Federal capacity for technical oversight increases because the Commission will hire technologists, share summary reports with NIST/OSTP/federal agencies, and develop guidance, templates, and periodic reviews to improve standards and enforcement.
Covered companies and organizations face substantial new compliance costs, reporting burdens, and increased exposure to enforcement actions and penalties; these costs are likely to be passed on to consumers through higher prices or reduced services.
Broad or vague rulemaking authorities, undefined timelines, and discretionary formatting/criteria risk creating regulatory uncertainty, inconsistent enforcement, and potential forum-shopping between agencies and states.
Collecting, retaining, and publishing assessment details and sensitive provenance or demographic information increases privacy and security risks (including re-identification) for individuals represented in datasets unless safeguards are airtight.
Based on analysis of 11 sections of legislative text.
Requires businesses that meet size, data, or revenue thresholds and that deploy certain machine-learning or algorithmic systems that make or materially influence important decisions to run documented pre- and post-deployment impact assessments, mitigate likely material harms, and submit annual machine-readable summary reports to the Federal Trade Commission.

The FTC must write implementing regulations (in consultation with NIST, OSTP, and other stakeholders), publish an anonymized public repository of certain report elements, stand up a Bureau of Technology to support enforcement, and may bring enforcement actions; State attorneys general may also sue on behalf of residents.

Defines covered algorithms, covered entities (with revenue, equity, and data thresholds and CPI adjustments), and critical decision categories (health, housing, employment, finance, education, utilities, family planning, legal services, and decisions with comparable effects).

Regulations must be promulgated within two years and become effective two years after promulgation; some provisions (such as non-preemption and agency coordination duties) take effect on enactment.