How Your AI-Driven Recruiting Software Could Lead to Legal Trouble Instead of Better Candidates
November 19, 2019
The road to the courtroom can be paved with good intentions, says Jaimi Kerr, a director of employment law for LinkedIn.
Jaimi says that as recruiters and hiring managers navigate the ever-changing topography of AI-driven tools, they need to understand what the law has to say about it.
In a conversation with Brendan Browne, LinkedIn’s global head of talent acquisition, Jaimi notes that the law can be focused on outcomes, not only intentions, when it comes to the vast library of software now available to help companies source and recruit more effectively and, in theory, with less unconscious bias. She says that, similarly, good intentions are not a defense for making workplace decisions based on age, gender, or race.
In the latest episode of Talent on Tap, Jaimi and Brendan explore reasons why, more than ever, your company’s legal and talent acquisition teams need to be working closely together:
Caveat emptor: A product’s claim to being unbiased doesn’t make it unbiased
Jaimi has worked with Brendan’s team enough to know how much talent acquisition professionals are craving to hire candidates quickly, fairly, and efficiently. At the same time, there has been explosive growth in the number of software tools that have been brought to market to help recruiters do just that.
“On LinkedIn, you’ll see a bunch of ads,” Jaimi says, “for a bunch of different vendors that say, ‘Hey, we can use machine learning to help you find the right candidate.’” She notes that often these companies also tout that their products eliminate human bias.
“It sounds great,” Jaimi tells Brendan, “and it’s not to say that it couldn’t be great, but there can be really serious unintended consequences that can cause a legal liability.”
A jury may not be as sold on the hype as your recruiting team was. As an article in the Harvard Business Review asked earlier this year: “Do hiring algorithms prevent bias, or amplify it?”
Recruiters may use AI-driven software for a whole range of tasks — to target certain candidates with job ads; to source passive candidates; to screen applicants; to score resumes; and to assess skills and competencies. The algorithms behind these tools may be well intentioned, but that is not a winning defense in the eyes of U.S. law.
Recruiting teams need to understand what disparate impact is and how to avoid it
“There’s this whole area of discrimination,” Jaimi says, “which is disparate impact discrimination. That’s what it’s called in the U.S. In other places, sometimes it’s called indirect.”
In recruiting, it works like this: “It’s basically the situation where you put people through a screening tool,” Jaimi says, “and one group of people, say, men, passes at a certain rate and perhaps a group of people that are women pass at a different rate. If you compare those two rates and the group that passes at the lower rate is passing at less than 80% of the other group,” that can be called “adverse impact,” which is considered evidence of disparate impact discrimination.
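The 80% comparison Jaimi describes — often called the "four-fifths rule" — is simple arithmetic on the two groups' pass rates. Here is a minimal sketch in Python; the function name and the sample rates are illustrative, not taken from any real hiring data:

```python
# Sketch of the "four-fifths" (80%) comparison described above.
# The function name and sample numbers are hypothetical examples.

def passes_four_fifths_rule(rate_a: float, rate_b: float) -> bool:
    """Return True if the lower pass rate is at least 80% of the higher one."""
    lower, higher = sorted((rate_a, rate_b))
    if higher == 0:
        return True  # neither group passed anyone; nothing to compare
    return lower / higher >= 0.8

# Example: 60 of 100 men pass (60%), 40 of 100 women pass (40%).
men_rate = 60 / 100
women_rate = 40 / 100

ratio = min(men_rate, women_rate) / max(men_rate, women_rate)  # 0.40 / 0.60 ≈ 0.67
print(passes_four_fifths_rule(men_rate, women_rate))  # False: below the 80% threshold
```

In this example the ratio is roughly 67%, well under the 80% threshold, so the screening result would count as evidence of adverse impact under the rule as Jaimi describes it.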
Here are two scenarios where an AI-driven tool could be causing disparate impact on the recruiting process:
- The algorithm excludes anyone who lives more than 10 miles from the main office, which might have a disparate impact on underrepresented groups, depending on the demographics of the surrounding neighborhoods and communities.
- The algorithm only forwards applicants who have a degree in a major that wasn’t offered 20 years ago, which might have a disparate impact on older applicants.
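To make the first scenario concrete, here is a hypothetical sketch of how a distance cutoff can produce very different pass rates across groups. The candidate data, group labels, and numbers are all invented for illustration:

```python
# Hypothetical illustration of the distance-cutoff scenario above.
# All candidates, groups, and numbers are invented for this sketch.

candidates = [
    # (group, miles_from_office)
    ("group_a", 4), ("group_a", 8), ("group_a", 12), ("group_a", 6),
    ("group_b", 15), ("group_b", 22), ("group_b", 9), ("group_b", 18),
]

MAX_MILES = 10  # the screening rule: exclude anyone living farther than this

def pass_rate(group: str) -> float:
    """Fraction of a group's candidates who survive the distance cutoff."""
    members = [miles for g, miles in candidates if g == group]
    passed = [miles for miles in members if miles <= MAX_MILES]
    return len(passed) / len(members)

rate_a, rate_b = pass_rate("group_a"), pass_rate("group_b")  # 0.75 and 0.25
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)            # 0.25 / 0.75 ≈ 0.33
print(ratio < 0.8)  # True: this sample fails the 80% comparison
```

The rule itself says nothing about any protected class, yet in this made-up sample one group passes at a third of the other group's rate — exactly the kind of outcome-based evidence the law looks at.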
Neither of those scenarios is wildly nefarious, but each could be a legal landmine. And companies don’t want to wait until an AI tool has been deployed — and has generated the outcome data showing disparate impact — to find out there’s a problem. By then, it’s too late.
“There are processes that you can put in place on the front end,” Jaimi says, “that can help you potentially avoid that as an outcome. But you have to be thinking about it on the front end. You can’t just be saying, ‘Oh, let’s plug in the tool and use it tomorrow.’”
Test your recruiting software rigorously before you start using it
It’s critically important that TA teams work with their legal teams to make sure their new software tools are going to help find great talent rather than land them in a heap of trouble.
“A compliant validation process,” Jaimi told the Talent Blog, “is one that is aligned with the guidance set out in the Uniform Guidelines on Employee Selection Procedures (UGESP). It typically begins with a robust analysis of the job for which the tool is being considered and proceeds with a technical study that evaluates both how well the selection procedure mirrors or predicts success on the job and whether the selection procedure has an impact on fairness.”
As Jaimi notes, conducting a validation process under UGESP also serves the ultimate business goal: It helps ensure that the tool is actually measuring things that are meaningful indicators of success in the jobs you’re posting.
The big diversity rule: Never make a decision based on someone’s age, race, or gender
Brendan asks Jaimi if she has any broader advice on diversity recruiting. “The most important thing,” she says, “is to remember — this is what I call the big diversity rule — that you don’t make decisions based on someone’s race or gender or age or other protected class.”
Jaimi says that after you assemble a diverse slate of candidates, you want to evaluate their strengths and weaknesses and select the person who will contribute the most.
Sometimes, Jaimi says, people want to find a loophole on this point. They ask her: If I have two equal and identical candidates, can I use race or gender as a tiebreaker?
“My response to that is always,” she says, “there is no such thing as two exactly equal and identical candidates. . . . So, look at those two individuals. What are they bringing to the table that you need? What can they add to your team?”
This is not only the legal way to make the decision, it’s the right way.
“Nobody,” Jaimi says, “wants to be seen as only a woman or only a Hispanic candidate or only an African American candidate. Nobody wants to be chosen or not chosen for anything related to those characteristics. And it also happens to be against the law. Which is important to me, to all of us.”
Partner with your legal team early and often
“Early and often is key,” Brendan says, “when it comes to partnering with your legal department.”
He recommends that recruiting managers train their teams on current legal issues so they understand the reasons behind choices that are being made about platforms and processes, and he suggests making it an ongoing priority.
“There’s not a number of meetings,” Brendan says, “that Jaimi and I have and then we’re at a place where we’re done. This is something that needs to be an ongoing commitment and ongoing partnership for you in your recruiting career, now and in perpetuity.”