Technology

Colorado’s AI compromise would focus regulations on informing consumers when the technology is used

Colorado lawmakers propose AI rules requiring companies to disclose their use of automated systems, stepping back from stricter regulations in a bipartisan compromise.

Image: GlobalBeat / 2026

Sarah Mills | GlobalBeat

State lawmakers unveiled a compromise bill that scraps earlier plans to ban “high-risk” artificial intelligence systems and instead requires companies to disclose when customers interact with AI.

The retreat comes after tech industry lobbyists warned that original provisions requiring bias audits and algorithmic impact assessments would drive startups out of Colorado. Governor Jared Polis had threatened to veto the first draft, according to two people briefed on private meetings.

Colorado’s pivot marks the second time this year that a U.S. state has diluted aggressive AI oversight plans. Utah watered down its own proposal in March after Amazon and Adobe argued the measure would create a patchwork of conflicting rules across the 50 states.

Senate Bill 24-205 now focuses on “consumer transparency,” sponsor Senator Robert Rodriguez told reporters. Any company using AI to make “consequential decisions” about hiring, lending, housing, or education must notify Coloradans that an algorithm played a role. Businesses must also give customers an opt-out option and explain how the system works in plain language.

The definition of “consequential” narrowed during closed-door negotiations. The final text covers decisions that produce “legal or similarly significant effects” on individuals, a phrase borrowed from the European Union’s General Data Protection Regulation. Early drafts had included marketing recommendations and customer service chatbots, according to legislative staff who asked not to be named discussing internal deliberations.

Republicans who opposed the first bill signaled they could accept the disclosure-only approach. “At least we’re not micromanaging math anymore,” Senator Mark Baisley said after a committee hearing. Democrats still lack unanimous support, with two progressive members pushing to restore penalties for discriminatory algorithms.

Trade groups praised the shift. “Mandating opt-out rights respects both innovation and consumer choice,” said Jace Johnson, Rocky Mountain region director for TechNet, whose members include Google, Meta, and Salesforce. The Colorado Chamber of Commerce called the bill “workable” but wants the attorney general to issue guidance before any rules take effect.

Consumer advocates called the rewrite a capitulation. “Transparency without enforcement is worthless,” said Danny Katz, executive director of the Colorado Public Interest Research Group. His organization wanted private citizens to sue companies that deploy biased AI, a provision that vanished from the final text. The ACLU of Colorado accused lawmakers of “choosing tech donors over Black and Latino renters who get rejected by algorithmic screening tools.”

The bill still requires developers of “high-risk” AI systems to use “reasonable care” to avoid algorithmic discrimination. The standard is intentionally vague, bill drafters said, giving courts flexibility as case law develops. Attorney General Phil Weiser would enforce violations under the state’s consumer protection act, with fines up to $20,000 per incident.

Industry lawyers warned the negligence clause could still trigger litigation. “Any plaintiff can claim an algorithm harmed them and force a company to prove it acted reasonably,” said Courtney Lang, a Denver-based partner at Holland & Hart who represents fintech lenders. Startups lack legal departments to fight those suits, she added.

Smaller tech firms lobbied hardest for the overhaul, according to spending disclosures. Accountable Tech, a 12-person Boulder startup that builds compliance software for HR algorithms, spent $45,000 on lobbyists this session. Founder Devin Wozniak testified that the original bill would have forced him to leave Colorado. “We can’t afford third-party bias audits on every model update,” he told lawmakers.

National civil-rights groups split on the compromise. The Leadership Conference on Civil and Human Rights urged Polis to veto anything without private enforcement, while the NAACP Legal Defense Fund backed the bill as “a critical first step.” Both groups want Congress to pass federal legislation pre-empting state laws, but Washington remains gridlocked.

Background

Colorado first attempted to regulate AI in 2022 after a ProPublica investigation found that healthcare algorithms used by Denver-area hospitals favored white patients for specialized care. Legislators introduced a sweeping proposal that would have required companies to prove their systems worked equally across racial groups. Tech lobbyists defeated the measure by arguing it would ban virtually all automated decision-making.

The issue returned in January when Democrats secured larger majorities in both legislative chambers. Original sponsor Senator Rodriguez modeled this year’s bill on New York City’s Local Law 144, which mandates annual bias audits for hiring algorithms. Colorado’s version went further by covering housing, credit, education, and insurance decisions.

What’s Next

The full Senate will vote on the compromise bill next week. If passed, the measure moves to the House, where Speaker Julie McCluskie has scheduled hearings starting May 15. Governor Polis has until June 7 to sign or veto legislation before the session adjourns. Tech industry lobbyists expect at least a dozen other states to introduce similar disclosure-only bills in 2027 legislative sessions.

Silicon Valley investors are watching Colorado as a test case for whether statehouses can agree on baseline AI rules without stifling innovation. Venture firm Andreessen Horowitz warned portfolio companies to prepare for “a patchwork of transparency mandates” even if the federal government remains paralyzed. Consumer groups plan to push ballot initiatives in 2028 if lawmakers continue watering down enforcement provisions.

Sarah Mills
Technology & Science Editor

Sarah Mills is GlobalBeat’s technology and science editor, covering artificial intelligence, cybersecurity, public health, and climate research. Before joining GlobalBeat, she reported for technology desks across Europe and North America. She holds a degree in Computer Science and Journalism.