The U.S. Equal Employment Opportunity Commission (EEOC) and the U.S. Department of Justice (DOJ) each recently released technical assistance documents warning employers that the use of artificial intelligence (AI) and other software tools may result in unlawful discrimination against people with disabilities in violation of the Americans with Disabilities Act (ADA). As employers increase their use of technology in the workplace, this guidance highlights the risks associated with AI and other software tools designed to help select new employees, monitor performance, and determine pay or promotions.
The DOJ’s guidance, “Algorithms, Artificial Intelligence, and Disability Discrimination in Hiring,” provides a plain-language overview of the risks employers should consider when using AI and other technological tools so as not to run afoul of the ADA. The EEOC’s guidance, “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees,” takes a more technical approach, alerting employers to the risks of using AI under Title I of the ADA and offering ADA compliance tips. Both documents provide examples of how an employer’s use of algorithmic decision-making software and other technological tools could violate the ADA, including:
- The employer does not provide a reasonable accommodation that is necessary for a job applicant or employee to be rated fairly and accurately when using the software.
- The employer relies on a “virtual assistant” or “chatbot” that asks candidates about their job qualifications and intentionally or unintentionally “screens out” an individual with a disability, even though that individual is able to do the job with a reasonable accommodation.
- The employer uses video interviewing software that evaluates candidates based on their facial expressions or speech patterns, which could disparately impact individuals with disabilities such as autism or speech impairments, even if they are qualified for the position.
The EEOC also cautions that employers can be held responsible when the tools are designed or administered by a third-party entity, as employers may be liable for the actions of their agents. The DOJ and the EEOC also provide helpful recommendations on how to use these technologies while remaining mindful of their impact on candidates and employees with disabilities:
- Ensure the technology used is measuring only the relevant skills and abilities required for the position.
- Use an accessible test that measures a candidate’s job skills or makes other reasonable accommodations to accurately measure the skills needed for the position.
- Train staff to identify when a reasonable accommodation might be necessary.
- Plan in advance for alternative testing formats or methods in case the current process is inaccessible or unfairly disadvantages individuals with disabilities. Work with third parties who develop or administer the software or tools to have reasonable accommodations or alternative testing options prepared in advance.
- Inform all job applicants and employees who are being tested that reasonable accommodations are available for individuals with disabilities and provide clear and accessible instructions for requesting such accommodations.
- Confirm that the software or tool does not ask job applicants or employees questions that are likely to elicit information about a disability or seek information about an individual’s physical or mental impairments or health unless such inquiries are related to a request for reasonable accommodation.
In addition to its technical assistance, the EEOC released a summary document providing “Tips for Job Applicants and Employees.”
Frost Brown Todd continues to monitor the technological trends and legal risks associated with AI in the workplace. If you have questions about the employment-related risks of using AI in your workplace, please contact the authors of this article or any member of Frost Brown Todd’s Labor and Employment Practice Group.