AI-driven and automated tools are increasingly being used across the employee lifecycle to help evaluate candidates, track performance and more, with businesses hoping to enhance speed and deploy their in-house resources more efficiently.
But along with these applications come risks, many of them already borne out in studies, such as perpetuating and amplifying existing biases. There is also compliance risk in using any technology without adequate transparency about a system’s potential vulnerabilities and limitations.
The New York City Department of Consumer and Worker Protection (DCWP) recently released final rules regarding the city’s Local Law 144, which aims to provide more transparency around the use of AI in employment contexts.
The local law just went into effect on July 5, and it is the first of its kind in the nation.
Shortlisting for jobs
The law provides that employers and employment agencies using an “automated employment decision tool” (AEDT) to evaluate candidates for employment or employees for promotion within New York City must have procured a “bias audit” and established procedures for the required notifications.
It applies only to New York City residents who are either candidates for hire for, or employees up for promotion to, positions in New York City.
Decisions regarding compensation, termination, workforce planning, labor deployment, benefits, workforce monitoring, and likely even performance evaluations are beyond the reach of the law.
Information about the bias audit must be made publicly available, and certain notices must be provided to employees or job candidates.
Local Law 144 defines an AEDT as any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.
The law restricts employers and employment agencies from using such an AEDT in hiring and promotion decisions unless it has been the subject of a bias audit by an independent auditor no more than one year prior to use. An AI tool falls within scope of Local Law 144 only where its output is used to “substantially assist or replace discretionary decision making.”
That means the tool’s output was either the sole criterion in the employment decision, with no other factors considered, or a criterion given more determinative weight than any other.
The DCWP published an initial version of these rules in September 2022. It received comments about that version from the public, including from employers, employment agencies, law firms, AEDT developers, and advocacy organizations.
Various issues raised in the comments resulted in changes to the proposed rules. These included modifying the definition of AEDT to make it more focused and clarifying that an “independent auditor” may not be employed by, or have a financial interest in, the employer or employment agency.
The bias audit must take into account the details that must be reported to the US Equal Employment Opportunity Commission (EEOC): it must calculate the selection rate for each race/ethnicity and sex category that is required to be reported to the EEOC, and compare each category’s selection rate to that of the most selected category to determine an impact ratio.
Or, more simply stated: the audit evaluates whether the AEDT model in use can distinguish between suitable and unsuitable candidates, or whether it uses sensitive attributes (such as gender, ethnicity or race) as a proxy for candidates’ suitability.
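To make the arithmetic concrete, the sketch below shows in Python how selection rates and impact ratios might be computed from hypothetical applicant data. The category names and counts are illustrative assumptions, not part of the law or the DCWP rules, and a real audit would follow the categories and reporting requirements set out in those rules.

```python
# Minimal sketch: computing selection rates and impact ratios for an AEDT
# bias audit. The data below is hypothetical; Local Law 144 and the DCWP
# rules define the required categories and reporting, not this exact code.

# Hypothetical counts of candidates the AEDT "selected" (e.g., advanced to
# interview) out of all candidates, broken out by an EEOC-style category.
results = {
    # category: (selected, total applicants)
    "Hispanic or Latino":        (30, 100),
    "White":                     (80, 160),
    "Black or African American": (25, 90),
    "Asian":                     (40, 95),
}

# Selection rate = selected / total, calculated per category.
selection_rates = {cat: sel / tot for cat, (sel, tot) in results.items()}

# The impact ratio compares each category's selection rate to the rate of
# the most selected category (the category with the highest selection rate).
highest_rate = max(selection_rates.values())
impact_ratios = {cat: rate / highest_rate for cat, rate in selection_rates.items()}

for cat in results:
    print(f"{cat}: selection rate {selection_rates[cat]:.2f}, "
          f"impact ratio {impact_ratios[cat]:.2f}")
```

In a real audit, the categories, intersectional breakdowns, and underlying data sources would be determined by the DCWP rules rather than this simplified dictionary.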
An independent auditor in this context is an objective individual or entity that is not and has not been involved in the use, development, or distribution of the AEDT.
Information about the bias audit must be made publicly available, and the employer must provide notice to applicants and employees of the tool’s use and functioning, as well as notice that affected individuals may request an accommodation or an alternative selection process.