Whether we call it the Great Resignation, the Great Reevaluation, the Great Reset or something else, finding, attracting and retaining talented workers remains among the top challenges facing HR leaders. With the unemployment rate in the U.S. at a near 50-year low, 4 million-plus workers voluntarily quitting their jobs each month and time-to-fill increasing dramatically (in some cases, over 60 business days, more than double the historical average), HR leaders are being forced to look for more creative ways to address their hiring and retention challenges.
Increasingly, they are turning to HR technology that leverages powerful artificial intelligence capabilities. It’s no secret that there has been a proliferation of new HR technologies in the last several years that use AI across a wide range of HR and talent acquisition processes, such as candidate matching, skills profiling, interview scheduling and screening. These AI technologies are designed to accomplish multiple objectives, including improving speed and efficiency, surfacing talent that might otherwise go unnoticed, freeing up HR and TA professionals for more “high-touch” tasks and (hopefully) removing, or at least mitigating, bias in hiring and talent management processes. Almost every organization I speak with is now either implementing or evaluating the use of AI in HR technology. In fact, recent research from HR technology solution provider Eightfold AI shows that 92% of organizations are planning to increase their use of AI technology in at least one area of HR, including talent acquisition and talent management, employee onboarding, payroll processing and more.
But the widespread adoption of AI technology for HR functions has recently drawn the attention of federal agencies such as the Department of Justice and the Equal Employment Opportunity Commission, which are interested in how these AI technologies could be influencing hiring and other decision-making processes. Specifically, these agencies are examining how AI and other advanced technologies used in hiring and other processes may negatively impact people with disabilities. For instance, in a May 12 Technical Assistance Document titled The Americans with Disabilities Act and the Use of Software, Algorithms and Artificial Intelligence to Assess Job Applicants and Employees, the EEOC cautioned employers that their use of AI and other related technologies for hiring could violate the Americans with Disabilities Act if:
- The employer does not provide a “reasonable accommodation” that is necessary for a job applicant or employee to be rated fairly and accurately by the algorithm.
- The employer relies on an algorithmic decision-making tool that intentionally or unintentionally “screens out” an individual with a disability, even though that individual is able to do the job with a reasonable accommodation.
- The employer adopts an algorithmic decision-making tool for use with its job applicants or employees that violates the ADA’s restrictions on disability-related inquiries and medical examinations.
The document goes on to describe several scenarios in which the use of AI technology in the hiring process could present challenges for people with disabilities and offers guidance to help employers remain compliant with the ADA. There are multiple scenarios included, but the one I want to highlight here specifically addresses the employer’s responsibilities when using third-party or outsourced solutions that incorporate AI technology in the HR decision-making process.
From the EEOC document:
Is an employer responsible for providing reasonable accommodations related to the use of algorithmic decision-making tools, even if the software or application is developed or administered by another entity?
In many cases, yes. As explained above, an employer may be held responsible for the actions of other entities, such as software vendors, that the employer has authorized to act on its behalf. For example, if an employer were to contract with a software vendor to administer and score on its behalf a pre-employment test, the employer likely would be held responsible for actions that the vendor performed—or did not perform. Thus, if an applicant were to tell the vendor that a medical condition was making it difficult to take the test (which qualifies as a request for reasonable accommodation), and the vendor did not provide an accommodation that was required under the ADA, the employer likely would be responsible even if it was unaware that the applicant reported a problem to the vendor.
This is an important passage for employers to keep in mind as they continue with, or extend, their application of AI technology in hiring and HR decision-making processes. Very few employers are going to develop their own in-house AI applications for candidate screening, for example; these are almost always provided by their HR technology solution provider partners. But as the EEOC points out, ceding the development of these technologies to HR software providers does not absolve the employer of its responsibility to remain ADA-compliant. And if the above-mentioned survey from Eightfold AI is reflective of HR leaders’ plans to advance AI technology in HR, more organizations are going to have to address these ADA issues (and potentially others) with their HR technology providers, which are increasingly infusing AI into their solutions.
The EEOC document does offer a few recommendations for employers that are considering the use of AI technology in hiring processes, including providing reasonable accommodations on the employer’s behalf, relying on decision-making tools that are designed to be accessible to individuals of all abilities and ensuring that the algorithmic decision-making tools only measure abilities or qualifications that are truly necessary for the job—even for people who are entitled to an on-the-job reasonable accommodation.
As we continue to see increasing rates of AI incorporation into HR technologies, I would expect the EEOC and DOJ to continue to examine these developments and take steps to ensure the ADA and other relevant regulatory strictures are followed. I’m not a lawyer (I only play one in this column), but my sense is that more statements and possibly formal regulations are coming. It’s best, as an HR leader and employer, to get out in front of these as much as possible, in order to remain compliant and fair, while optimizing and enhancing your talent management processes.
To learn more about artificial intelligence in HR, attend the HR Technology Conference, Sept. 13-16 in Las Vegas, where industry expert Jeanne Meister will moderate a mega-session titled “How CHROs Are Preparing for the New World of Work” and one track will focus on HR Digital Transformation. See the conference program for details.
The post Here’s how to get ahead of potential regulations on AI in HR tech appeared first on HR Executive.