New York: (347) 941-0760

Law Offices Of David S. Rich - Employment lawyer

Text Us: (347) 389-7755

New Jersey: (201) 740-2828



Can My Company In New York City Use Automated Employment Decision Tools To Screen Employees Or Candidates For Employment Or Promotion?

Rapid advancements in artificial intelligence have created opportunities and challenges for employers in New York City. This article explores this evolving reality, highlighting:

  • The restrictions that the New York City AI Bias Law, New York City Local Law 144 of 2021, imposes on employers’ use of automated employment decision tools in screening workers or job seekers for employment or promotion.
  • The substantive duties, notice requirements, and monetary penalties to which employers in New York City are subject when utilizing automated employment determination tools in assessing workers or job seekers for employment or advancement.
  • Means of bringing your company in New York City into compliance with the statute governing the use of artificial intelligence in the workplace.

New York City law regulates your company’s use of automated employment decision tools (“AEDTs”) to screen employees or candidates for employment or promotion. Effective July 5, 2023, Local Law 144 of 2021 (codified as N.Y. City Admin. Code §§ 20-870 – 20-874) (the “New York City AI Bias Law,” the “AI Bias Law,” or “Local Law 144”) prohibits employers and employment agencies in New York City from using automated employment decision tools to screen a candidate or employee for an employment decision unless the tool has been subject to a bias audit no more than one year before the use of that tool.

On April 6, 2023, the New York City Department of Consumer and Worker Protection (the “NYC DCWP”) issued a Final Rule, codified as §§ 5-300 – 5-304 of Title 6 of the Rules of the City of New York (the “Final Rule”), providing further guidance on the New York City AI Bias Law.

In addition, before using an automated employment decision tool, employers and employment agencies in New York City must make publicly available, on their websites, a summary of the results of the most recent bias audit of the tool as well as the distribution date of the tool to which the audit applies.

Further, employers and employment agencies in New York City that use automated employment decision tools to screen workers or job applicants must notify each employee or candidate who has applied for a position and who resides in the City of the following:

  • That an automated employment decision tool will be used in connection with the assessment or evaluation of the worker or job applicant. This notice must be made no less than ten business days before such use and must allow a candidate to request an alternative selection process or a reasonable accommodation under other laws;
  • The job qualifications and characteristics that the automated employment decision tool will use in the assessment of the job applicant or employee. This notice must be made no less than ten business days before such use; and
  • If not disclosed on the employer or employment agency’s website, information about the type of data collected for the AEDT, the source of such data and the employer or employment agency’s data retention policy must be available upon written request by a job applicant or worker. This information must be provided within 30 days of the written request.

As stated above, the New York City AI Bias Law requires that an employer or an employment agency’s advance notice, to employees or candidates, of the use of an AEDT must allow a candidate to request an alternative selection process or a reasonable accommodation. However, the Final Rule clarifies that the AI Bias Law does not “require an employer or employment agency to provide an alternative selection process,” even if requested by a candidate.

The New York City AI Bias Law defines an “automated employment decision tool” as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence [(“AI”)], that issues simplified output, including a score, classification, or recommendation that is used to substantially assist or replace discretionary decision making for making employment decisions that impact” individuals.

In turn, the AI Bias Law’s Final Rule defines “[m]achine learning, statistical modeling, data analytics, or artificial intelligence” as “a group of mathematical, computer-based techniques:”

  1. that generate a prediction, meaning an expected outcome for an observation, such as an assessment of a candidate’s fit or likelihood of success, or that generate a classification, meaning an assignment of an observation to a group, such as categorizations based on skill sets or aptitude; and
  2. for which a computer at least in part identifies the inputs, the relative importance placed on those inputs, and, if applicable, other parameters for the models in order to improve the accuracy of the prediction or classification.

Under the AI Bias Law, to engage in an “employment decision” means “to screen candidates for employment or employees for promotion within [New York] City.”

The accuracy and effectiveness of artificial intelligence depend on the quality of the data fed into it. When human programmers with conscious or unconscious biases supply that data, artificial intelligence systems may operate in a biased manner.

For example, based on the explicit or implicit biases of human programmers who feed data to artificial intelligence, AI may assume that the optimal worker must be able-bodied; that food-insecure families are noncitizens; or that more crime takes place in postal zip codes with majority-minority populations.

What Consequences Do Employers, Employment Agencies, and Other Persons Face For Violating The New York City Statute Regulating Use, By Employers And Employment Agencies, Of Automated Employment Decision Tools To Screen Candidates Or Employees For Employment Decisions?

Any employer, employment agency, or other person who violates the New York City AI Bias Law is liable for a civil penalty of not more than $500 for a first violation and each additional violation occurring on the same day as the first violation, and for a civil penalty of not less than $500 nor more than $1,500 for each subsequent violation.

Further, each day on which an automated employment decision tool is used in violation of the AI Bias Law constitutes a separate violation of the New York City statute.

Likewise, each failure to provide the required notice to a candidate or an employee is a separate violation of the New York City AI Bias Law.
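Because each day of noncompliant use and each missed notice counts as a separate violation, penalties can accumulate quickly. The following sketch illustrates the arithmetic using the statute's penalty amounts; the function name and the assumption that the first violation is capped at $500 while each subsequent violation is at least $500 are illustrative only, and this is not legal advice:

```python
def estimated_penalty(days_of_use: int, notice_failures: int,
                      per_subsequent: int = 500) -> int:
    """Rough lower-bound civil-penalty exposure under Local Law 144.

    Illustrative assumptions:
    - The first violation is capped at $500.
    - Each subsequent violation (each additional day of noncompliant
      AEDT use, and each missed candidate notice) carries a penalty
      of $500 to $1,500; per_subsequent=500 gives the floor.
    """
    violations = days_of_use + notice_failures
    if violations <= 0:
        return 0
    # First violation at the $500 maximum, then the statutory
    # minimum of $500 for every remaining violation.
    return 500 + (violations - 1) * per_subsequent

# e.g., 30 days of noncompliant use plus 10 missed notices:
# lower bound = $500 + 39 * $500 = $20,000
print(estimated_penalty(30, 10))
```

Even at the statutory floor, a single unaudited tool used for a month can produce a five-figure exposure; at $1,500 per subsequent violation the figure triples.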

The AI Bias Law authorizes the City’s chief legal officer, the New York City Corporation Counsel, to bring lawsuits in court to enforce the statute.

The New York City AI Bias Law states that it “shall not be construed to limit” any rights which employees or candidates for employment decisions may have to bring civil actions for alleged violations of the AI Bias Law. Research reveals no judicial decisions addressing whether, in fact, the New York City AI Bias Law confers, on employees or candidates, a private cause of action for violations of the statute.

Under the AI Bias Law’s Final Rule, where an AEDT selects candidates for employment or employees being considered for promotion to move forward in the hiring process, or classifies them into groups, a bias audit must, at a minimum, among other things:

  • Calculate the selection rate for each category; and
  • Calculate the impact ratio for each category.

Further, the bias audit’s selection rate and impact ratio calculations must separately calculate the impact of the AEDT on gender categories, race/ethnicity categories, and intersectional categories of gender, ethnicity, and race.
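The Final Rule computes the selection rate as the fraction of individuals in a category whom the tool selects to move forward, and the impact ratio as a category's selection rate divided by the selection rate of the most-selected category. A minimal sketch of those two calculations, using hypothetical applicant counts and function names:

```python
def selection_rates(selected: dict, applicants: dict) -> dict:
    """Selection rate per category: number selected to move forward
    divided by the number of applicants in that category."""
    return {cat: selected[cat] / applicants[cat]
            for cat in applicants if applicants[cat] > 0}

def impact_ratios(rates: dict) -> dict:
    """Impact ratio per category: that category's selection rate
    divided by the selection rate of the most-selected category."""
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}

# Hypothetical numbers for two gender categories:
applicants = {"female": 200, "male": 250}
selected = {"female": 40, "male": 75}

rates = selection_rates(selected, applicants)  # {'female': 0.2, 'male': 0.3}
ratios = impact_ratios(rates)                  # {'female': ~0.667, 'male': 1.0}
```

An impact ratio well below 1.0 for a category (here, about 0.667 for female applicants) is the kind of disparity a bias audit is designed to surface; the same calculations would be repeated for race/ethnicity and intersectional categories.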

How Can Companies In New York City Come Into Compliance With The AI Bias Law?

To comply with the New York City AI Bias Law, employers and employment agencies in New York City should:

  • Arrange for a bias audit of automated employment decision tools currently in use;
  • Confirm that AEDTs that may be acquired in the future can be appropriately audited for bias before being placed in use;
  • Draw up a compliant notice to candidates and workers that an AEDT is being used and identify where it will be posted; and
  • Train individuals responsible for utilizing AEDTs about critical requirements of the AI Bias Law.

If your company needs assistance or guidance on a labor or employment law issue and your company is located in the New York metro area, call the New York Employment Attorney David S. Rich at (347) 941-0760.

David Rich

Call Now For An Initial Consultation

New York (347) 941-0760 |
New Jersey (201) 740-2828