The Equal Employment Opportunity Commission (EEOC) has settled a first-of-its-kind claim against a China-based tutoring provider for using Artificial Intelligence (AI) that allegedly discriminated against protected classes of workers. In this case, the AI allegedly excluded female applicants aged 55 or older and male applicants aged 60 or older. Here’s what you need to know about using AI in hiring and the potential risks this case highlights:
Factual Background and Settlement Details
In 2022, the EEOC announced that it was suing iTutorGroup for automatically rejecting candidates above a certain age. The allegations stated that iTutorGroup discriminated against more than 200 qualified, US-based candidates who applied to provide remote tutoring services.
iTutorGroup has agreed to pay $365,000 to the 200+ candidates who were excluded based on their age; however, it has not admitted any wrongdoing. The settlement requires court approval, which is pending.
“Age discrimination is unjust and unlawful. Even when technology automates the discrimination, the employer is still responsible,” said EEOC Chair Charlotte A. Burrows. “This case is an example of why the EEOC recently launched an Artificial Intelligence and Algorithmic Fairness Initiative. Workers facing discrimination from an employer’s use of technology can count on the EEOC to seek remedies.”
AI-Bias in Hiring is a Current Enforcement Priority
We reported earlier that the EEOC announced its focus on enforcement against discriminatory recruitment and hiring practices, including the use of automated systems with artificial intelligence (AI) and/or machine learning, in January 2023 (in a draft strategic enforcement plan).
Then, in April 2023, the EEOC joined the Department of Justice, Bureau of Consumer Financial Protection, and Federal Trade Commission in a joint statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems. Clearly, the US and its various labor enforcement agencies have focused their efforts on reducing discrimination in novel areas, specifically, those affected by AI.
Expect More Lawsuits for Employer Use of Biased AI
It is widely expected that the outcome of this EEOC matter will drive an increase in lawsuits alleging discriminatory employment practices stemming from employers’ use of AI.
We are already seeing this occur in California, where a class action lawsuit has been filed against Workday. The lawsuit, which has yet to be heard in court, alleges that Workday’s widely used AI hiring process disproportionately disqualifies applicants who are Black, disabled, or older than 40.
Actionable Insights for California Employers Using AI Tools in Hiring Processes
We outlined the following good practices for using AI tools in hiring processes in an earlier blog post:
General Good Practices
- Prioritize AI-enhanced tools that are accessible and contemplate the experience of all applicants, not just certain categories.
- Clearly state why and how the AI tool is being used.
- Provide information about how to request a reasonable accommodation.
- Consider how the AI assesses the skill/qualifications it checks for and prioritize AI tools that measure the skill directly rather than by correlation.
- Ensure the AI only tests for skills/qualifications required for that job.
- Specifically ask the vendor providing the AI tool about how it prevents hiring bias and discrimination.
For Employers Using AI In House
- Train your team to recognize and process requests for reasonable accommodations.
- Train your team to assess applicants fairly and to regularly review the applicants being rejected by AI to determine whether certain groups (such as those over the age of 40, women, or other protected categories) are being disproportionately rejected.
- Regularly screen for discriminatory outcomes.
For Employers Using a Third-Party to Administer an AI-Enhanced Tool
- Include provisions in your contracts that require the third party to either promptly forward any requests from candidates for reasonable accommodation or provide an agreed-upon reasonable accommodation on the employer’s behalf.
- Require reporting that shows the third-party provider is watching for discriminatory outcomes.
- Require data showing the age, ethnicity, gender, and race, if determinable, of the applicants who applied versus the applicants selected and rejected.
If you need help developing compliant hiring practices, reach out. Our employment attorneys would love to help.
The materials available at this website are for informational purposes only and not for the purpose of providing legal advice. You should contact your attorney to obtain advice with respect to any particular issue or problem. Use of and access to this website or any of the e-mail links contained within the site do not create an attorney-client relationship between CGL and the user or browser. The opinions expressed at or through this site are the opinions of the individual author and may not reflect the opinions of the firm or any individual attorney.