The EEOC on AI in Hiring: Technical Guidelines Released

June 26, 2023

Last month, the US Equal Employment Opportunity Commission (EEOC) released technical guidance on assessing the use of artificial intelligence (AI) in the employment setting. The guidance is the latest publication under the EEOC’s 2021 initiative to ensure that AI tools used in hiring and employment decisions comply with federal law.

Key Findings in the Recent EEOC Guidance 

The EEOC’s guidance provides technical assistance to employers, employees, job applicants, and vendors on how the use of AI and other algorithmic tools in hiring and other employment decisions interacts with federal equal employment opportunity laws. The guidance is presented in an FAQ format, and we’ve summarized some of the main takeaways below:

Algorithmic decision-making tools may constitute a ‘selection procedure’. 

Under Title VII, employers may not use facially neutral tests or selection procedures that disproportionately exclude persons based on race, color, religion, sex, or national origin (unless certain exceptions apply). The EEOC’s guidance clarifies that algorithmic tools, when used to make or inform decisions about hiring, promotion, termination, or similar actions affecting applicants or current employees, are indeed a “selection procedure” and do fall under Title VII.

Employers may be responsible for algorithmic decision-making tools administered by another entity. 

The EEOC’s guidance states that “if an employer administers a selection procedure, it may be responsible under Title VII if the procedure discriminates . . . even if the test was developed by an outside vendor.” This remains true even if the vendor assures the employer that the tool does not result in discrimination and that assessment is later found to be incorrect.

Practically, this means employers must exercise caution when outsourcing employment decisions to third parties. Employers should conduct due diligence, including asking the outside entity which AI tools it uses, how it has tested those tools for disparate-impact discrimination, and whether it applied the four-fifths rule in its fairness testing (more on this below). Ideally, employers should also run their own audits of the fairness (and compliance) of the algorithms used by AI vendors.

Further, employers should negotiate indemnification and liability-allocation provisions in any contracts with relevant third parties.

The four-fifths rule is a rule of thumb, not a guarantee.

The four-fifths rule is a general rule of thumb for determining whether the selection rate for one group is “substantially different” from that of another. Under the rule, one group’s selection rate is substantially different when it is less than four-fifths (80%) of the rate for the most-favored group.

The EEOC’s guidance notes that the rule is a “practical and easy-to-administer” test that can be used to draw initial inferences and to prompt employers to provide additional information, but that it is not the sole determinant. The guidance goes on to note that courts often use (and prefer) tests of statistical significance.
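To make the four-fifths arithmetic concrete, the sketch below (in Python) shows how an employer or auditor might screen a tool’s selection-rate data. The group labels, applicant counts, and the use of Fisher’s exact test as the statistical-significance check are illustrative assumptions on our part, not requirements drawn from the EEOC guidance.

```python
# Minimal sketch of a four-fifths (80%) rule screen on hypothetical data.
# Not legal advice; a real audit should be designed with counsel and a
# qualified analyst.
from scipy.stats import fisher_exact  # optional statistical-significance check

# Hypothetical outcomes of an algorithmic screen, by demographic group.
groups = {
    "Group A": {"applicants": 200, "selected": 120},  # selection rate 0.60
    "Group B": {"applicants": 150, "selected": 60},   # selection rate 0.40
}

# Selection rate = selected / applicants for each group.
rates = {name: g["selected"] / g["applicants"] for name, g in groups.items()}
highest_rate = max(rates.values())

# Four-fifths rule: flag any group whose rate is below 80% of the highest rate.
for name, rate in rates.items():
    ratio = rate / highest_rate
    status = "FLAG: below four-fifths threshold" if ratio < 0.8 else "within threshold"
    print(f"{name}: rate {rate:.2f}, ratio {ratio:.2f} -> {status}")

# Courts often prefer statistical significance over the 80% ratio alone.
# One common option is an exact test on the 2x2 selected / not-selected table.
a, b = groups["Group A"], groups["Group B"]
table = [
    [a["selected"], a["applicants"] - a["selected"]],
    [b["selected"], b["applicants"] - b["selected"]],
]
_, p_value = fisher_exact(table)
print(f"Fisher exact p-value: {p_value:.4f}")
```

With these hypothetical numbers, Group B’s ratio works out to roughly 0.67, which would fall below the four-fifths threshold and warrant a closer look; different counts, or a different significance test, could of course change that picture.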

Some Additional Context: Discriminatory Hiring Practices are an Enforcement Priority 

In January 2023, the EEOC announced in its draft strategic enforcement plan that it would focus enforcement on discriminatory recruitment and hiring practices, including the use of automated systems that incorporate artificial intelligence (AI) and/or machine learning.

Then, in April 2023, the EEOC joined the Department of Justice, Bureau of Consumer Financial Protection, and Federal Trade Commission in a joint statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems.   

Given the regulatory focus on AI, consult legal counsel if you or your third-party vendors use AI or machine learning in hiring or other employment-related decision-making.

Our employment attorneys would be happy to help.  

 

Further Resources 

Algorithmic Fairness and the ADA (Our coverage) 

The EEOC’s Guidance about the ADA and AI used to assess job applicants and employees 

The AI Technical Advisory Committee Report from the Institute for Workplace Equality 

 

