10 Things You Need to Know About the NYC Mandatory Bias Audits

Airlie Hilliard
Holistic AI Publication
5 min read · Oct 31, 2022

To address some of the concerns about the use of automated employment decision tools (AEDTs) in making employment decisions, the New York City Council has taken decisive action and passed legislation that mandates bias audits of these tools. Local Law 144, colloquially known as the NYC Bias Audit law, comes into effect on 1st January 2023. To clarify the requirements of this legislation, the Department of Consumer and Worker Protection has proposed some additional rules and is holding a hearing to provide further clarification. This blog post outlines the 10 key things you need to know about the legislation and the proposed rules.

1. What is a bias audit?

An impartial evaluation of an automated employment decision tool carried out by an independent auditor that should include (but is not limited to) assessing for disparate impact against Component 1 categories (race/ethnicity and sex/gender, at a minimum). Employers using automated employment decision tools to assess candidates residing in New York City must publish a summary of this audit on their website and must inform candidates of the key features of the tool before using it.

The proposed rules specify that bias should be determined using impact ratios based on subgroup selection rate (the percentage of individuals in the subgroup who are hired), subgroup average score, or both. Ratios are calculated by dividing the subgroup average score/selection rate by the average score/selection rate of the group with the highest score/rate.
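As a rough sketch of how such impact ratios might be computed for a selection-based tool, the Python snippet below applies the calculation described above to purely hypothetical subgroup counts; the group names and numbers are illustrative only and are not drawn from the legislation or the proposed rules.

```python
# Hypothetical example: selection rates and impact ratios for a selection-based AEDT.
# impact ratio = subgroup selection rate / selection rate of the most-selected group
# All counts below are invented for illustration.

applicants = {
    # subgroup: (number assessed, number selected)
    "Hispanic or Latino": (200, 40),
    "White": (300, 90),
    "Black or African American": (250, 50),
    "Asian": (150, 42),
}

# Selection rate = share of individuals in each subgroup who are selected
selection_rates = {
    group: selected / assessed
    for group, (assessed, selected) in applicants.items()
}

# Divide each subgroup's rate by the highest subgroup rate
highest_rate = max(selection_rates.values())
impact_ratios = {group: rate / highest_rate for group, rate in selection_rates.items()}

for group in applicants:
    print(f"{group}: selection rate {selection_rates[group]:.2f}, "
          f"impact ratio {impact_ratios[group]:.2f}")
```

The same logic applies to score-based tools, with subgroup average scores in place of selection rates.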

2. What is an automated employment decision tool?

A computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that produces a simplified output (a score, classification, or recommendation) used to aid or automate decision-making for employment decisions (screening for promotion or employment).

The proposed rules clarify that machine learning, statistical modeling, data analytics, and artificial intelligence are a group of mathematical, computer-based techniques that generate a prediction of a candidate’s fit, likelihood of success, or classification based on skills or aptitude. The inputs, predictor importance, and parameters of the model are identified by a computer to improve model accuracy or performance and are refined through cross-validation or by using a train/test split. They also clarify that a simplified output includes ranking systems.
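To make this definition concrete, here is a minimal, hypothetical sketch of the kind of system it targets: a model whose parameters are identified by the computer and refined using a train/test split, producing a simplified output (a score) for each candidate. The features, synthetic labels, and use of scikit-learn are assumptions made purely for illustration and are not part of the rules.

```python
# Hypothetical sketch of a tool that would fall under the proposed rules' definition.
# The data and labels are synthetic; no real candidate information is involved.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))             # e.g., scores from skills/aptitude assessments
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # 1 = "good fit", 0 = "not a fit" (synthetic)

# Refine the model on held-out data via a train/test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)  # parameters identified by the computer

# Simplified output: a score (probability of being a good fit) per candidate
candidate_scores = model.predict_proba(X_test)[:, 1]
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```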

3. What are some examples of an automated employment decision tool?

Video interviews, game-based or image-based assessments, resume screening tools, and similar systems that are scored or evaluated by an algorithm. Systems that rank candidates on their suitability for a position or on how well they meet certain criteria are also considered automated employment decision tools.

4. What documentation do employers have to provide?

Employers using an automated employment decision tool must provide a summary of a current bias audit (less than one year old) on their website or the employment agency’s website before using the tool.

The proposed rules clarify that this summary should appear in the careers or jobs section of their website in a clear and conspicuous manner and should include the date of the most recent bias audit of the AEDT, the distribution date of the AEDT to which the bias audit applies, and a summary of the results (including selection rates and impact ratios for all categories).
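For illustration only, the snippet below shows one hypothetical way the results portion of such a summary could be laid out; the dates, categories, and figures are invented, and the rules do not prescribe any particular format.

```python
# Hypothetical layout of the results portion of a published audit summary.
# Dates, category names, and figures are invented for illustration only.
import pandas as pd

summary = pd.DataFrame(
    {
        "Category": ["Male", "Female", "Hispanic or Latino", "White", "Asian"],
        "Selection rate": [0.30, 0.27, 0.20, 0.30, 0.28],
        "Impact ratio": [1.00, 0.90, 0.67, 1.00, 0.93],
    }
)
print("Date of most recent bias audit: 2022-10-01 (illustrative)")
print("Distribution date of the AEDT: 2021-06-15 (illustrative)")
print(summary.to_string(index=False))
```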

5. What are the notification requirements of the legislation?

At least ten business days before the tool is used, candidates must be informed that an automated employment decision tool is being used to assess them and must be allowed to request an accommodation or alternative selection process. Candidates must also be informed of the characteristics that are being used to make the judgments. If this information is not available on the website of the employer or employment agency, the source and type of data being used must be disclosed within 30 days of a written request.

The proposed rules clarify that the notice can be given by including it in a job posting or by sending it through U.S. mail or e-mail. For employees specifically, notice can also be given in a written policy or procedure that is provided to employees, while for candidates the notice can be included in the careers or jobs section of the employer’s website.

6. Who does the legislation apply to?

Employers using automated employment decision tools to evaluate candidates or employees who reside in New York City for a position or promotion.

7. Are there penalties for noncompliance?

Up to $500 for a first violation and each additional violation occurring on the same day. Subsequent violations incur penalties of between $500 and $1,500.

8. Does this affect the civil rights of candidates?

The subchapter should not be construed to limit the right of any candidate or employee to bring a civil action over an employment decision. Candidates’ civil rights are therefore not affected, and other relevant equal employment laws must still be followed by the employer.

The proposed rules clarify that nothing in the legislation requires employers to comply with requests for alternative procedures or accommodations, but other legislation, such as the Americans with Disabilities Act (ADA), may cover these practices.

9. When does the legislation come into effect?

From 1st January 2023, it will be unlawful for employers to use an automated employment decision tool without a bias audit to screen candidates or employees residing in New York City.

10. Where can I find out more about the legislation?

You can read the legislation here, and you can find our commentary, in which we discuss some of its shortcomings and areas of ambiguity, here. To find out more about the proposed rules specifically, see this blog post.

This article was originally published at Holistic AI.
