Earlier this year, the Equal Employment Opportunity Commission (EEOC) released technical guidance on the use of software and AI in hiring. Specifically, it highlighted the issues US companies may face in achieving fairness while using AI in hiring, and the risk that companies relying on AI may run afoul of the Americans with Disabilities Act (ADA).
Common Risks Arising From AI In Hiring
The EEOC referred to three specific risks arising from the use of AI in hiring:
No Reasonable Accommodation For Applicants Who Can’t Use The AI-Enhanced Tool
Employers are increasingly using AI-enhanced tools to assess candidates in the pre-hiring stages. AI tools are available to help assess:
- Hard skills, such as typing or language proficiency.
- Soft skills, like adaptability and learning agility.
- Job fit, usually through situational judgment tests.
- Personality through a virtual interview or questionnaire.
While these tools help screen out candidates who may not be the best fit for your company, they aren’t always suitable for all qualified candidates – and reasonable accommodations may need to be made to ensure applicants with disabilities aren’t unfairly excluded.
For instance, some of these tools rely on metrics like time spent reading the test or answering questions. As a result, they may be unsuitable for, or produce unfavorable outcomes for, applicants with disabilities, in contravention of the ADA.
The AI Screens Out Individuals With Disabilities
There are two common ways that AI-enhanced tools will screen out individuals with disabilities:
- A design flaw causes the tool to screen out a group of people who may have a disability. You might see this where a screening chatbot rejects a candidate because of a gap in their job history, when an event that resulted in a disability caused that gap; or
- Where the AI ‘learns’ a bias against individuals with disabilities. This issue has plagued AI tools since they ‘learn’ based on the datasets fed to them, and the data they work from often excludes certain groups.
The Program Asks Disability-Related Questions
You need to be sure that the AI-enhanced tool won’t ask candidates “disability-related questions” or subject them to “medical examinations”.
The EEOC notes:
“An assessment includes “disability-related inquiries” if it asks job applicants or employees questions that are likely to elicit information about a disability or directly asks whether an applicant or employee is an individual with disability. It qualifies as a “medical examination” if it seeks information about an individual’s physical or mental impairments or health.”
Overcoming The Risks of AI Tools in Hiring
The following tips help US employers remain compliant while using AI-enhanced tools in hiring. Some are based on the promising practices identified by the EEOC; our employment attorneys have added others:
General Good Practices
- Prioritize AI-enhanced tools that are accessible and contemplate the experience of disabled applicants.
- Clearly state why and how the AI tool is being used.
- Provide information about how to request a reasonable accommodation.
- Consider how the AI assesses the skill/qualifications it checks for and prioritize AI tools that measure the skill directly rather than by correlation.
- Ensure the AI only tests for skills/qualifications required for that job.
- Specifically ask the vendor providing the AI tool about how it prevents hiring bias and discrimination.
For Employers Using AI In House
- Train your team to recognize and process requests for a reasonable accommodation.
- Train your team to assess applicants with disabilities fairly, including by offering alternative test formats.
- Watch for discriminatory outcomes.
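One widely used yardstick for spotting potentially discriminatory outcomes is the EEOC’s “four-fifths rule”: if a protected group’s selection rate is less than 80% of the highest-selected group’s rate, the tool may warrant closer review. Below is a minimal sketch of such a monitoring check; the group labels and applicant counts are illustrative placeholders, not real data, and this arithmetic check is no substitute for legal review:

```python
# Sketch: flag possible adverse impact using the EEOC "four-fifths rule".
# Group labels and counts are hypothetical, for illustration only.

def selection_rates(outcomes):
    """outcomes maps group -> (selected, total applicants)."""
    return {group: selected / total
            for group, (selected, total) in outcomes.items()}

def adverse_impact_flags(outcomes, threshold=0.8):
    """Return each group whose selection rate falls below `threshold`
    (80% by default) of the highest group's rate, with its ratio."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items() if r / top < threshold}

# Example with hypothetical numbers:
outcomes = {
    "group_a": (48, 100),  # 48% selected
    "group_b": (30, 100),  # 30% selected
}
flags = adverse_impact_flags(outcomes)
# group_b's rate is 30/48 = 0.625 of group_a's, below the 0.8 threshold
```

A flagged ratio is a signal to investigate the tool’s design and data, not a legal conclusion on its own.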
For Employers Using A Third-Party To Administer an AI-Enhanced Tool
- Include provisions in your contracts that require the third party to either promptly forward any requests from candidates for a reasonable accommodation or provide an agreed-upon reasonable accommodation on the employer’s behalf.
- Require reporting that shows the third-party provider is watching for discriminatory outcomes.
The materials available at this website are for informational purposes only and not for the purpose of providing legal advice. You should contact your attorney to obtain advice with respect to any particular issue or problem. Use of and access to this website or any of the e-mail links contained within the site do not create an attorney-client relationship between CGL and the user or browser. The opinions expressed at or through this site are the opinions of the individual author and may not reflect the opinions of the firm or any individual attorney.