Making Sure AI Complies With Anti-Discrimination Law
9.18.2023
Employers are using artificial intelligence to make hiring decisions – but they must ensure that the AI does not enable discrimination.
A recent Continuing Legal Education course – sponsored by the Labor & Employment Law Section and the Committee on Technology and the Legal Profession – covered new developments in employment law as it relates to such technologies. The panelists were:
- Joseph Lockinger, special counsel at Cooley
- Debbie Reynolds, founder, CEO and chief data privacy officer at Debbie Reynolds Consulting
- Steven A. Zuckerman, associate at Cooley
According to Equal Employment Opportunity Commission Chair Charlotte Burrows, more than 80% of employers use AI in some form to make decisions – be it scanning resumes, interviewing applicants, training employees or conducting performance reviews. Sometimes humans oversee these decisions, but not always.
Often these systems are only tested by software vendors. “While the software vendors claim that AI reduces long-prevailing discrimination in the hiring process, critics are dubious of this,” said Lockinger. “They claim there’s no way to guarantee that software isn’t simply reproducing systemic and institutional bias. Because of this, there is significant risk for employers that are using AI in the employment decision-making process.”
Bias in AI Systems
AI can discriminate in a number of ways. For example, video-interviewing software may filter out a person with a speech impediment because it cannot understand their words. An algorithm-based personality test may pass more white candidates than Black candidates.
The EEOC has released guidance reminding employers that AI tools still need to comply with the Americans with Disabilities Act and Title VII, which includes providing reasonable accommodations for applicants.
The EEOC has already settled a lawsuit based on discrimination through AI. iTutorGroup Inc. programmed its recruitment software to automatically reject older applicants – which was discovered when one applicant resubmitted their resume with a younger birthdate and was offered an interview after having been previously rejected.
“These assumptions that they were making go to the heart of employment discrimination law,” said Lockinger. “Where you’re assuming unfairly and unlawfully that someone who is older simply can’t do the job… The ease of doing that with technology is what’s concerning to the EEOC and the administrative bodies that are attempting to look at these issues.”
As part of the settlement, iTutorGroup paid $365,000 to a group of rejected applicants and invited them to reapply. The company was also required to adopt anti-discrimination policies and conduct trainings with its employees.
“You really need to decide from a business perspective why you’re asking certain questions,” said Zuckerman. “In a lot of cases, there’s really no need to ask an applicant what their date of birth is at the early stage of an application. You may need to know if they’re 18 or older in order to hire them… Although the technology is new, a lot of these laws have been around. So we shouldn’t just forget all the compliance efforts we had prior to the advent of these technologies.”
Anti-Discrimination Enforcement in New York City
New York City Local Law 144, enacted in 2021, requires employers to check their automated systems for bias. Under the law, automated employment decision tools must be audited for bias by an independent auditor before use, with a new bias audit at least once a year thereafter. The results of the audit must also be made publicly available on the employer's or employment agency's website.
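As a rough illustration (not legal guidance), the core metric such a bias audit reports can be sketched as a selection-rate comparison: each category's selection rate is divided by the rate of the most-selected category. The function names and the applicant numbers below are hypothetical:

```python
# Illustrative impact-ratio calculation of the kind a bias audit reports.
# All figures are hypothetical.

def selection_rate(selected, applicants):
    """Fraction of applicants in a category who advanced."""
    return selected / applicants

def impact_ratios(counts):
    """counts: {category: (selected, applicants)} -> {category: impact ratio}.

    Each category's selection rate is divided by the highest
    selection rate among all categories.
    """
    rates = {cat: selection_rate(s, a) for cat, (s, a) in counts.items()}
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}

# Hypothetical screening outcomes for two demographic categories:
# group_a: 48 of 120 advanced (rate 0.40); group_b: 30 of 150 (rate 0.20).
counts = {"group_a": (48, 120), "group_b": (30, 150)}
print(impact_ratios(counts))  # group_b's ratio of 0.5 would flag a disparity
```

A ratio well below 1.0 for any category is the kind of disparity an independent auditor would surface in the published results.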
“Most of it is straightforward; however, it is worth remembering that compliance with the audit requirement will not insulate an employer from liability under other employment laws, such as state, local and federal discrimination laws,” said Lockinger.
Furthermore, a company must provide notice to candidates that it is using such a tool and provide instructions to request reasonable accommodations if needed.
Enforcement of Local Law 144 started in July 2023.
While there are only a few obligations, Zuckerman cautioned against believing that complying with the law is an all-clear for employers. “You may be in compliance, but that does not make it a good idea to continue to use a tool that clearly creates bias,” he said. “Especially because notice has to be provided externally, you are really setting yourself up for potential liability and potential scrutiny from outsiders.”
The audit portion of the law applies to employers based in New York City – which includes remote positions reporting to or associated with offices in the city. However, the notice requirement of the law only applies to New York City residents.
Reynolds said it was important for employers to establish clear processes for handling the data and tools used in employment decisions, to train staff on them, and to ensure that the use of AI aligns with company values and ethics.
“A lot of these systems are trying to automate and bring in some level of efficiency,” she said. “When companies do this, what they’re doing is taking a lot of their manual processes and trying to put them into these tools. But as they’re being put in these tools, companies really need to look out for things that can be biased, things that can be discriminatory, things that aren’t exactly clear or objective.”
The CLE is available on demand.