On March 17, Colorado Governor Jared Polis released a draft bill that would substantially overhaul the Colorado AI Act, replacing its core requirements with a narrower regime of disclosure, recordkeeping, and consumer notice obligations for “automated decision-making technology” (“ADMT”).  The proposal, which is still in draft form and not yet introduced in the General Assembly, marks the latest stage in a multi-year effort to reform the Colorado AI Act ahead of its June 30, 2026, effective date.

As enacted in May 2024, the current Colorado AI Act imposes separate, overlapping requirements on developers and deployers of “high-risk AI systems” that are used to make “consequential decisions” about consumers, including a duty of reasonable care for developers and deployers to protect consumers from “algorithmic discrimination.”  The draft bill would abandon that framework, repealing the duty of care and eliminating the incident reporting, impact assessment, and risk management requirements.  Instead, the draft bill favors a lighter-touch approach focused on transparency and consumer privacy, with narrower disclosure obligations for consumer-facing entities.  The proposal, if adopted, would more closely align with the approach taken by the California Privacy Protection Agency in its California Consumer Privacy Act (CCPA) ADMT Regulations.

Covered ADMT.  The draft bill would apply solely to “covered ADMT,” or ADMT “used to materially influence a consequential decision.”  While the draft bill’s definitions are arguably broader than their Colorado AI Act equivalents, the draft bill also contains new exceptions that would narrow its scope.

  • ADMT.  Similar to the CCPA ADMT Regulations, the draft bill defines “ADMT” broadly as “any technology that processes personal information and uses computation” to make or assist in decisions about individuals.  Unlike the Colorado AI Act’s “AI system” definition, however, the draft bill’s “ADMT” definition would exempt certain summarization tools, chatbots subject to acceptable use policies, and other technologies that do not rely on machine learning, regardless of whether the technologies are used to make consequential decisions “when deployed.”
  • Consequential Decision.  The draft bill expands this term to refer to a “decision, determination, or action for a consumer, employee, or applicant,” including a decision or action that “materially affects” the provision of, or “materially influences” the terms of, a covered opportunity or service.  The draft bill also would provide several exceptions to this term not found in the Colorado AI Act, including exceptions for “low-stakes or routine decisions, actions, and business processes” and “advertising, marketing, differentiated product recommendations, search, or content moderation.”
  • Material Influence.  The draft bill defines “materially influences” as an “ADMT output” that (1) is a “non-de minimis factor that is used in making a consequential decision” and (2) “affects the outcome of the decision.”  Unlike the Colorado AI Act’s “substantial factor” test, the draft bill’s “materially influences” definition would exclude “incidental, trivial or clerical uses” of ADMT.

Covered Entities.  The draft bill would define “developer” as any person that (1) “develops, offers, sells, leases, licenses, or otherwise makes commercially available a covered ADMT”; (2) “develops a component” for use as part of a covered ADMT in consequential decisions; or (3) “intentionally and substantially modifies an ADMT such that it becomes a covered ADMT.”  At the same time, the draft bill adds exemptions not found in the Colorado AI Act, including for developers of ADMT solely for internal purposes, and generally limits developer liability for downstream uses of ADMT “not intended, documented, marketed, advertised, configured, or contracted for by the developer.”  The draft bill also expands the Colorado AI Act’s definition of “consumer” to expressly include employees and job applicants.

Developer Obligations.  The draft bill would significantly narrow developer obligations compared to the current Colorado AI Act, applying only where a developer markets, provides, or intends for ADMT to be used in consequential decisions.  Covered developers would be required to (1) provide deployers with information regarding intended uses, limitations, and training data; (2) notify deployers of material updates, substantial modifications, and changes to that information; and (3) retain records demonstrating compliance for at least three years.

Deployer Obligations.  Covered deployers would be subject to a similar three-year record retention requirement, starting from the “date of the consequential decision.”  The draft bill’s other deployer obligations focus solely on consumer-facing notices and disclosures, while removing the Colorado AI Act’s deployer impact assessment and incident reporting requirements:

  • Point-of-Interaction Notice or Public Posting.  Covered ADMT deployers would be required to provide clear and conspicuous notice to consumers that the deployer “uses covered ADMTs for consequential decisions” and how to obtain additional information.  Alternatively, deployers may satisfy this requirement by maintaining a “prominent public notice” at points of consumer interaction.
  • Post-Adverse Decision Disclosure. If a deployer’s use of ADMT for a consequential decision results in an “adverse outcome” for a consumer, the deployer would be required to provide the consumer with information about the consequential decision, the role of ADMT, and how the consumer can request additional information, request and correct their personal data, and request meaningful human review or reconsideration.

Enforcement.  As under the Colorado AI Act, the proposal would grant the Colorado Attorney General exclusive enforcement authority under Colorado’s consumer protection statute.  Unlike the existing law, however, the draft bill would (1) provide a 90-day cure period; (2) narrow the Act’s creditor exemption to entities that provide an equivalent notice under federal financial regulations; (3) narrow the Act’s exemption for HIPAA covered entities to those that notify patients of the use of ADMT; and (4) allocate liability between developers and deployers “based on their relative fault for the violation.”  It also would prohibit indemnification provisions between developers and deployers related to uses of ADMT that violate Colorado’s antidiscrimination laws.

The draft bill’s future is far from certain.  Upon signing the Colorado AI Act in May 2024, Governor Polis called on the General Assembly to amend the Act; in June 2024, Governor Polis and other state officials announced a “process to revise” the Act to harmonize its definitions with federal and state frameworks and reduce compliance burdens.  Despite these concerted efforts, Colorado lawmakers failed to pass significant amendments to the Act in the 2025 legislative session, instead opting to extend the Act’s effective date to June 30, 2026, to allow time to consider further amendments this year.  The ongoing effort to preempt “onerous” state AI laws under President Trump’s AI Preemption Executive Order, which expressly criticized the Colorado AI Act for requiring “ideological bias within models,” could lead Colorado lawmakers to push the draft bill – or other substantive amendments to the Act – over the finish line in 2026.

Matthew Shapanka

Matthew Shapanka draws on more than 15 years of experience – including on Capitol Hill, at Covington, and in state government – to advise and counsel clients across a range of industries on significant legislative, regulatory, and enforcement matters. He develops and executes complex, multifaceted public policy initiatives for clients seeking actions by Congress, state legislatures, and federal and state government agencies, many with significant legal and political opportunities and risks.

Matt rejoined Covington after serving as Chief Counsel for the U.S. Senate Committee on Rules and Administration, where he advised Chairwoman Amy Klobuchar (D-MN) on all legal, policy, and oversight matters within the Committee’s jurisdiction, including federal election law and campaign finance, and oversight of the Federal Election Commission, legislative branch agencies, security and maintenance of the U.S. Capitol Complex, and Senate rules and regulations.

Most significantly, Matt led the Rules Committee staff work on the Electoral Count Reform and Presidential Transition Improvement Act – landmark bipartisan legislation to update the antiquated process of certifying and counting electoral votes in presidential elections that President Biden signed into law in 2022.

As Chief Counsel, Matt was a lead attorney on the joint bipartisan investigation (with the Homeland Security and Governmental Affairs Committee) into the security planning and response to the January 6, 2021 attack on the Capitol. In that role, he oversaw the collection review of documents, led interviews and depositions of key government officials, advised the Chairwoman and Committee members on two high-profile joint hearings, and drafted substantial portions of the Committees’ staff report on the attack. He also led oversight of the Capitol Police, Architect of the Capitol, Senate Sergeant at Arms, and executive branch agencies involved in implementing the Committees’ recommendations, including additional legislation and hearings.

Both in Congress and at the firm, Matt has prepared many corporate and nonprofit executives, academics, government officials, and presidential nominees for testimony at legislative, oversight, or nomination hearings before congressional committees, as well as witnesses appearing at congressional depositions and transcribed interviews. He is also an experienced legislative drafter who has composed dozens of bills introduced in Congress and state legislatures, including several that have been enacted into law across multiple policy areas.

In addition to his policy work, Matt advises and represents clients on the full range of political law compliance and enforcement matters involving federal election, campaign finance, lobbying, and government ethics laws, the Securities and Exchange Commission’s “Pay-to-Play” rule, as well as the election and political laws of states and municipalities across the country.

Before law school, Matt worked as a research analyst in the Massachusetts Recovery & Reinvestment Office, where he worked on all aspects of state-level policy, communications, and compliance for federal stimulus funding awarded to Massachusetts under the American Recovery & Reinvestment Act of 2009. He has also worked for federal, state, and local political candidates in Massachusetts and New Hampshire.

Vanessa Lauber

Vanessa Lauber is an associate in the firm’s New York office and a member of the Data Privacy and Cybersecurity Practice Group, counseling clients on data privacy and emerging technologies, including artificial intelligence.

Vanessa’s practice includes partnering with clients on compliance with federal and state privacy laws and FTC and consumer protection laws and guidance. Additionally, Vanessa routinely counsels clients on drafting and developing privacy notices and policies. Vanessa also advises clients on trends in artificial intelligence regulations and helps design governance programs for the development and deployment of artificial intelligence technologies across a number of industries.

August Gweon

August Gweon counsels national and multinational companies on data privacy, cybersecurity, antitrust, and technology policy issues, including issues related to artificial intelligence and other emerging technologies. August leverages his experiences in AI and technology policy to help clients understand complex technology developments, risks, and policy trends.

August regularly provides advice to clients for complying with federal, state, and global privacy and competition frameworks and AI regulations. He also assists clients in investigating compliance issues, preparing for federal and state privacy regulations like the California Privacy Rights Act, responding to government inquiries and investigations, and engaging in public policy discussions and rulemaking processes.

Irene Kim

Irene Kim is an associate in the firm’s Washington, DC office, where she is a member of the Privacy and Cybersecurity and Advertising and Consumer Protection Investigations practice groups. She advises clients on a broad range of issues, including U.S. state and federal AI legislation, comprehensive state privacy laws, and regulatory compliance matters.