Last year, the New York City Council passed Local Law Int. No. 1894-A, which amended the City’s administrative code to afford new protections to employees during the hiring and promotion processes. The law protects those individuals from unlawful bias when employers use automated employment decision tools. Employers must audit such AI tools to confirm that they are not biased, and the results of those audits must be published on publicly available websites. Employers must also disclose the data that an AI tool collects, either publicly or in response to an inquiry.


However, the City has yet to issue guidance on the expectations or steps necessary to prepare for compliance. In particular, the law does not define what is meant by an “independent auditor.” Employers will likely rely on law firms and consulting firms to perform the audit, since the law’s only requirement is that the auditing party be “independent.” Compounding this lack of specificity, many employers use third-party vendors’ automated tools, so the audit must also cover outside vendors’ software and practices. The challenge will be determining what those vendors did during the construction of the AI tools. The audit will likely need to involve technical experts who understand how the tools function as well as lawyers or consultants who are well versed in potential discrimination claims.

Employers can also look to the U.S. Equal Employment Opportunity Commission’s technical assistance document, which includes guidance on AI hiring tools as well as questions to ask vendors about such software.

This law will be enforced by the City’s Office of the Corporation Counsel; noncompliant employers face a $500 fine for the first violation and $1,500 for each subsequent violation. Fines multiply by the number of AI tools used and by the number of days the employer fails to correct the non-compliance. The law provides no private right of action, but if the City issues fines for using discriminatory tools, affected individuals could still bring discrimination class actions in federal court. If you have yet to determine how you will comply with this new law, now is the time.
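To see how quickly these penalties can accumulate, here is a rough back-of-the-envelope sketch in Python. It assumes, purely for illustration, that each tool and each day of continued non-compliance counts as a separate violation; the function name and the exact accrual rules are hypothetical assumptions, not a statement of how the City will calculate fines, and nothing here is legal advice.

```python
def estimated_penalty(num_tools: int, days_noncompliant: int) -> int:
    """Illustrative estimate of accumulated fines under the law's schedule.

    Assumes (hypothetically) that each tool accrues its own first
    violation of $500, and that every additional day of uncorrected
    non-compliance is a subsequent violation at $1,500.
    """
    total = 0
    for _ in range(num_tools):
        # First violation: $500; each subsequent day: $1,500.
        total += 500 + 1500 * (days_noncompliant - 1)
    return total


# Under these assumptions, two tools left uncorrected for 30 days:
# 2 * ($500 + 29 * $1,500) = $88,000
print(estimated_penalty(2, 30))
```

Even a single uncorrected tool compounds fast under this reading, which is why prompt remediation after a first notice matters as much as the initial audit.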

Kathryn Rattigan


Kathryn Rattigan is a member of the Business Litigation Group and the Data Privacy+ Cybersecurity Team. She concentrates her practice on privacy and security compliance under both state and federal regulations and advising clients on website and mobile app privacy and security compliance. Kathryn helps clients review, revise and implement necessary policies and procedures under the Health Insurance Portability and Accountability Act (HIPAA). She also provides clients with the information needed to effectively and efficiently handle potential and confirmed data breaches while providing insight into federal regulations and requirements for notification and an assessment under state breach notification laws. Prior to joining the firm, Kathryn was an associate at Nixon Peabody. She earned her J.D., cum laude, from Roger Williams University School of Law and her B.A., magna cum laude, from Stonehill College. She is admitted to practice law in Massachusetts and Rhode Island. Read her full rc.com bio here.