Colorado’s Concerning Consumer Protections in Interactions with Artificial Intelligence Systems Act (the Act) will impose obligations on developers and deployers of artificial intelligence (AI). The Colorado Artificial Intelligence Impact Task Force recently issued a report outlining potential areas where the Act can be “clarified, refined[,] and otherwise improved.”

The Task Force’s mission is to review issues related to AI and automated decision systems (ADS) affecting consumers and employees. The Task Force met on several occasions and prepared a report summarizing its recommendations, including:

  • Revise the Act’s definition of the types of decisions that qualify as “consequential decisions,” as well as the definitions of “algorithmic discrimination,” “substantial factor,” and “intentional and substantial modification;”
  • Revamp the list of exemptions to what qualifies as a “covered decision system;”
  • Change the scope of the information and documentation that developers must provide to deployers;
  • Update the triggering events and timing for impact assessments, and change the requirements for deployer risk management programs;
  • Consider replacing the duty of care standard for developers and deployers (i.e., consider whether such standard should be more or less stringent);
  • Consider whether to narrow or expand the small business exemption (the Act currently exempts businesses with fewer than 50 employees);
  • Consider whether businesses should be provided a cure period for certain types of non-compliance before Attorney General enforcement under the Act; and,
  • Revise the trade secret exemptions and provisions related to a consumer’s right to appeal.

As it stands, the requirements for AI developers and deployers under the Act take effect on February 1, 2026. However, the Task Force recommends reconsidering the law’s implementation timing. We will continue to track this first-of-its-kind AI law.

Kathryn Rattigan is a member of the Business Litigation Group and the Data Privacy+ Cybersecurity Team. She concentrates her practice on privacy and security compliance under both state and federal regulations and advising clients on website and mobile app privacy and security compliance. Kathryn helps clients review, revise and implement necessary policies and procedures under the Health Insurance Portability and Accountability Act (HIPAA). She also provides clients with the information needed to effectively and efficiently handle potential and confirmed data breaches while providing insight into federal regulations and requirements for notification and an assessment under state breach notification laws. Prior to joining the firm, Kathryn was an associate at Nixon Peabody. She earned her J.D., cum laude, from Roger Williams University School of Law and her B.A., magna cum laude, from Stonehill College. She is admitted to practice law in Massachusetts and Rhode Island. Read her full rc.com bio here.