Employment Law Report

EEOC Releases Technical Assistance Document on Use of Artificial Intelligence in Selection Procedures & the Possibility of Disparate Impact

By: Kaylee Secor, Wyatt Summer Associate

Employers are increasingly using artificial intelligence (AI) to aid their selection procedures, the procedures used to make decisions about hiring, promotion, and firing. AI tools offer the promise of reducing or eliminating selection bias. However, newer algorithmic decision-making tools can also lead to “disparate impact” or “adverse impact” discrimination under Title VII of the Civil Rights Act of 1964.

Title VII prohibits employers from using selection procedures that have a disproportionately negative effect on individuals on the basis of race, color, religion, sex, or national origin. When a selection procedure disproportionately screens out members of a protected class and is not job related for the position in question and consistent with business necessity, the result is “disparate impact” discrimination. The Equal Employment Opportunity Commission (EEOC) recently released a new technical assistance document discussing disparate or adverse impact to help employers prevent their use of AI from leading to discrimination in the workplace. The EEOC’s technical assistance document can be found here.

AI tools produce results based on the information provided to them. Depending on the algorithm or data set being used, an AI tool may replicate past discriminatory practices that favor one group of individuals over another. The EEOC provides that “employers can assess whether a selection procedure has an adverse impact on a particular protected group by checking whether use of the procedure causes a selection rate for individuals in the group that is ‘substantially’ less than the selection rate for individuals in another group.” The four-fifths rule is a general rule of thumb for determining whether the selection rate for one group is “substantially” different from that of another group. The rule provides that one selection rate is substantially different from another if their ratio is less than four-fifths (or 80%). However, courts have agreed that the four-fifths rule is not always an appropriate measure, especially where it is not a reasonable substitute for a test of statistical significance.
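
To illustrate the arithmetic behind the four-fifths rule, the following Python sketch compares two selection rates against the 80% threshold. The applicant counts and function names are hypothetical and are not drawn from the EEOC document; the sketch is offered only as a minimal illustration of the calculation described above.

def selection_rate(selected, applicants):
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_check(rate_a, rate_b):
    """Return the ratio of the lower selection rate to the higher one,
    and whether that ratio falls below the four-fifths (80%) threshold."""
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return ratio, ratio < 0.8

# Hypothetical example: 48 of 80 applicants in one group are selected (60%),
# while 12 of 40 applicants in another group are selected (30%).
rate_a = selection_rate(48, 80)   # 0.60
rate_b = selection_rate(12, 40)   # 0.30
ratio, flagged = four_fifths_check(rate_a, rate_b)
print(f"Ratio of selection rates: {ratio:.0%}")   # 50%
print(f"Below four-fifths threshold: {flagged}")  # True

In this hypothetical, the ratio of 30% to 60% is 50%, well below four-fifths, so the rule of thumb would flag the procedure for closer review; as noted above, a formal analysis may instead rely on a test of statistical significance.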

If the use of an AI tool has a disparate or adverse impact on individuals of a particular protected class, then use of the tool will violate Title VII unless the employer can show that such use is job related and consistent with business necessity. The EEOC encourages employers to conduct routine self-analyses to determine whether their selection procedures have a disproportionately large negative effect, as sketched below. If an employer discovers that its AI tool imposes a disparate impact on a particular class, it should take steps to reduce the impact or select a different tool entirely.
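
As a rough illustration of what such a routine self-analysis might look like, the sketch below (with hypothetical group labels and counts) computes the selection rate for each group and flags any group whose rate falls below four-fifths of the highest group's rate. It is an assumption-laden sketch, not guidance from the EEOC document.

# Hypothetical audit data: applicants and selections per demographic group.
audit = {
    "Group A": {"applicants": 200, "selected": 120},
    "Group B": {"applicants": 150, "selected": 60},
    "Group C": {"applicants": 100, "selected": 55},
}

# Selection rate for each group, and the highest rate as the benchmark.
rates = {group: d["selected"] / d["applicants"] for group, d in audit.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, {ratio:.0%} of highest rate [{flag}]")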