Technology

Colorado’s AI compromise would drop requirement that companies explain how their technology works

Colorado lawmakers drop proposed AI rule forcing firms to disclose algorithms, easing tech industry concerns, CPR reports.


Image: GlobalBeat / 2026


Sarah Mills | GlobalBeat

Colorado lawmakers scrapped a requirement that AI developers explain how their systems make decisions, under a compromise bill unveiled late Monday.

The change strikes what tech lobbyists called the bill’s most toxic clause, clearing its path to Governor Jared Polis’ desk after months of industry resistance.

The original language would have forced firms to disclose proprietary training data and model weights each time citizens asked why an algorithm denied them housing, credit, or a job. Software trade groups argued the mandate would leak trade secrets to competitors and expose companies to industrial espionage.

State Senator Robert Rodriguez, the Denver Democrat who carried the measure, told CPR News the explanatory duty never had the votes to survive. “We traded transparency for passage,” he said after the House Finance Committee signed off on the amended bill 8-3 shortly before midnight.

The revised bill still labels certain “high-risk” AI tools used in hiring, lending, and housing decisions as potential discriminatory practices. It keeps a clause letting the state attorney general sue firms whose algorithms produce biased outcomes. Companies must notify Colorado residents when artificial intelligence makes consequential decisions about them and offer an appeals process staffed by humans.

What vanished is any mention of the word “explain.” The old text ordered developers to provide “a meaningful explanation of the basis for the decision” including “the data inputs that materially contributed to the outcome.” That single sentence drew fire from Microsoft, IBM, Palantir, and dozens of Colorado startups who warned legislators that revealing training datasets would hand Chinese rivals a blueprint to copy American code.

Cathryn Hazouri, executive director of the ACLU of Colorado, called the retreat “a gut punch for civil rights.” She said her group backed the bill only because it promised citizens a window into opaque systems that increasingly decide who gets a mortgage or makes it past a résumé screen. Without the narrative requirement, the law “is mostly theater,” Hazouri said.

Tech lobbyists saw the reversal differently. “Colorado almost criminalized math,” said Daniel Castro, vice president of the Information Technology and Innovation Foundation. He argued forcing firms to publish source code would have driven AI investment out of the state and into jurisdictions with looser rules.

The legislation now heads to the full House where Democrats hold a 43-22 majority. Party leaders said they expect a floor vote Thursday. If passed, Colorado would join California, Texas, and Connecticut in passing algorithmic-accountability laws, though each state targets different sectors and none compels public disclosure of model mechanics.

Privacy lawyers predicted the softer language could become a template for other states eager to police bias without provoking Silicon Valley. “Politicians want to say they regulated AI, just not the part that kills jobs,” said Cameron Russell, a partner at Hogan Lovells who watched Monday’s hearing from Denver.

Industry opposition intensified after European Union lawmakers approved stricter AI rules last year that impose transparency audits on high-risk systems. U.S. firms feared Colorado’s wording would usher in a similar regime state-by-state unless they fought early. The Software Alliance, whose members include Adobe, Oracle, and Salesforce, spent $340,000 on Colorado lobbying in the first quarter of 2026, state disclosures show.

Smaller AI vendors worried they lacked the legal teams to parse customer requests for explanations. “We make hiring software for truck stops,” said Dana Wu, founder of 18-person Aurora startup FleetHR. “If a driver asks why he didn’t get a job, I’d need outside counsel to decode my own algorithm.”

Republicans on the committee opposed the bill even after the compromise, calling any AI compliance burdens premature while Congress debates federal standards. “We’re creating a patchwork,” said Representative Lisa Frizell of Castle Rock. She offered an amendment to delay enforcement until Washington acts, which failed on party lines.

Democrats countered that Washington’s gridlock forces states to lead. “We can’t wait for Ted Cruz to figure out machine learning,” said Representative Alex Valdez of Denver, referencing the Senate Republican who chairs the chamber’s tech subcommittee. Federal proposals stalled last session after House Republicans balked at private-right-of-action clauses that let citizens file bias lawsuits.

The bill keeps an unusual enforcement structure. Instead of empowering individuals to sue, it authorizes only the state attorney general to investigate and fine violators up to $25,000 per incident. That mirrors Colorado’s 2021 privacy statute, which consumer groups criticize as toothless because the AG’s office lacks resources to police thousands of data brokers.

Attorney General Phil Weiser, a Democrat who requested the bill, insists his consumer-protection unit will prioritize AI cases. He pointed to a $10 million budget hike legislators approved last year to hire technologists. Still, his office would have 60 days to respond to citizen complaints, and the bill sets no minimum number of investigators.

Progressive Democrats wanted bigger teeth. Representative Elisabeth Epps of Aurora proposed an amendment restoring a private right of action, allowing workers or borrowers to sue for algorithmic discrimination. It failed 5-6 as two moderate Democrats joined Republicans, underscoring the party’s fragile discipline on tech issues.

Labor unions split. The Colorado AFL-CIO endorsed Epps’ amendment, arguing workers need court access to prove bias. But the state building-trades council opposed it, fearing trial lawyers would pummel contractors who screen applicants with AI tools already deemed legal under federal civil-rights precedent.

National consumer groups watched closely. Linda Lacewell, former New York Department of Financial Services superintendent, said Colorado’s compromise shows “transparency lost the battle to expediency.” She warned other states will copy the weakened language, cementing a disclosure-free standard across the country before Congress returns from campaign season.

Background

Colorado began eyeing AI regulation in 2023 after a ProPublica investigation found mortgage algorithms in Denver charged Latino borrowers higher interest rates even when credit scores matched white applicants. The AG’s office settled with two lenders for $8 million but lacked a statute explicitly covering machine-learning models.

Democrats introduced the first draft in January 2026, one week after ChatGPT-maker OpenAI announced plans to open a 400-employee Boulder campus, complicating the politics. Governor Polis praised the investment while promising to balance innovation with consumer safeguards. The bill has undergone seven rewrites, ballooning from 15 to 42 pages as lobbyists inserted carve-outs for medical devices, video games, and cybersecurity software.

What’s Next

The House must approve the bill by Friday or it dies when the session ends next Monday. If it passes, Polis has 30 days to sign or veto. He declined Monday to say whether he supports the stripped-down version but has hinted any AI law must preserve “competitive secrets.”

Sarah Mills
Technology & Science Editor

Sarah Mills is GlobalBeat’s technology and science editor, covering artificial intelligence, cybersecurity, public health, and climate research. Before joining GlobalBeat, she reported for technology desks across Europe and North America. She holds a degree in Computer Science and Journalism.