Why Unconscious Bias Training Doesn’t Work—5 Ways to Actually Make a Difference
July 11, 2017
On the whole, people like unconscious bias training: it validates their experiences, opens their eyes to new insights, and shows that their company cares about diversity and inclusion. That’s all great.
Unfortunately, unconscious bias training just isn’t that effective. Research shows that educating, training, and providing feedback to managers are the least effective ways to create a more diverse workforce. This poses a very real challenge when it comes to hiring for diversity.
Raising awareness is a good first step, but it’s not enough—you need real action.
The most effective strategies focus on changing processes, not changing minds. That’s according to David Rock, Director of the NeuroLeadership Institute, who recently spoke at the SHRM 2017 Conference.
Unconscious bias training doesn’t work because it’s really, really hard to catch yourself doing something unconsciously—even if you know better.
We overestimate our ability to change our own minds. If you don’t believe me, take a look at this image:
If you haven’t seen this optical illusion before, you might be shocked to learn that Square A and Square B are actually the exact same color. (Look closely at those grey guidelines if you don’t believe me.)
But even when you’ve learned about the illusion—even when you have those grey guidelines helping you—your brain still unconsciously sees one as light and one as dark. In the same way, awareness alone doesn’t prevent bias.
Instead, the most effective strategies to reduce bias are those that don’t require people to catch themselves being biased. David recommends focusing on processes, not people and not awareness. If you’re following the right processes, you’ll reduce bias and increase diversity—whether you’re aware of it or not.
Neuroscience research has revealed over 150 biases. But, since 150 biases isn't really manageable, David's organization groups all those biases into five underlying causes in what it calls the SEEDS Model®, and offers specific strategies to treat each cause.
Let’s take a look at each cause and get into the actionable, process-oriented strategies to mitigate their impact on hiring.
1. Similarity: We think people similar to ourselves are better than others
Research shows that homogeneous, non-diverse teams are less effective than diverse teams. They feel more confident, however, and assume they're much more effective than they truly are.
That’s largely because of the similarity bias, which makes you think people similar to you are better than others.
Whether you realize it or not, your brain almost immediately categorizes everyone you encounter as in-group or out-group—friend or foe—and gives extra brownie points if someone seems familiar and similar to you. Obviously, that makes it hard to be objective during the interview process.
How to mitigate the similarity bias: Find commonalities with candidates—every candidate.
It’s very easy to find things in common with people similar to you; in fact, Airbnb found that recruiters who emphasized commonalities with candidates were more biased. But your brain is going to find and overvalue those commonalities consciously or unconsciously.
The point here is to go the extra mile to find things in common with all candidates. You’re basically manually re-programming your brain to categorize every candidate as part of your group.
2. Expedience: We think our first feeling must be true
Your brain is smart, but it’s also lazy: it tries to take shortcuts wherever it can. That’s exactly what the expedience bias is all about: thinking too fast and skipping steps.
For example, take the famous halo effect: if a candidate is good at one thing, our brain lazily assumes they’re good at everything. In fact, the halo effect can be super superficial: if someone’s attractive, you’ll tend to think they’re more competent and trustworthy.
Another example of an expedience bias is the availability bias: we tend to overvalue things that we can remember quickly and clearly, even if that’s only because they happened more recently. For instance, if you interviewed five candidates last week and one this week, you might go with the one you saw most recently just because you remember more of that interview.
Stereotypes are another form of the expedience bias: to save mental energy, your brain makes assumptions about someone based on what group you think they belong to.
How to mitigate the expedience bias: Slow down and think through everything carefully. This is the one bias where awareness can make a difference—but you still need to follow a set process.
To ensure you’re considering all the info, make a list of pros and cons for each candidate or outline a specific set of criteria that you can systematically judge every candidate by.
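As a sketch of what "a specific set of criteria" could look like in practice, here is a minimal weighted scoring rubric. The criteria names and weights are purely illustrative assumptions, not recommendations from the SEEDS model—the point is only that every candidate gets rated on the same checklist, so a fast first impression can't quietly skip a step.

```python
# Hypothetical rubric: criterion -> weight (weights sum to 1.0).
# These names and numbers are placeholders; use whatever your role requires.
CRITERIA = {
    "technical_skills": 0.4,
    "communication": 0.3,
    "problem_solving": 0.3,
}

def score_candidate(ratings):
    """Combine per-criterion ratings (1-5) into one weighted score.

    Raises ValueError if any criterion is unrated, so no candidate
    can be judged on a partial, first-impression basis.
    """
    missing = set(CRITERIA) - set(ratings)
    if missing:
        raise ValueError(f"Unrated criteria: {sorted(missing)}")
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

# Every candidate is scored against the exact same checklist:
alice = score_candidate(
    {"technical_skills": 5, "communication": 3, "problem_solving": 4}
)
bob = score_candidate(
    {"technical_skills": 3, "communication": 5, "problem_solving": 4}
)
```

The value of a rubric like this isn't the arithmetic; it's that the weights are chosen before you meet anyone, so a charming interview can't retroactively inflate the criteria a candidate happens to be strong on.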
3. Experience: We think our subjective perceptions are objectively true
It’s comforting to believe that we see the world exactly as it really is, but that doesn’t make it true. In reality, our perceptions are super subjective: our impressions are impacted by past experiences, personal preferences, and, of course, biases.
But the experience bias means that we usually mistake our subjective judgments as objective fact.
The experience bias has a huge effect when you’re evaluating soft skills or creative work. For example, the false-consensus effect is a common bias where we assume everyone else thinks just like us—e.g., if you think that a graphic designer’s portfolio is too stuffy, it’s easy to assume everyone will agree with you.
How to mitigate the experience bias: Bring other people into the hiring process. The best way to battle this bias is having diverse opinions in the same room—your subjective impressions will be challenged, forcing everyone to think more critically.
Be sure not to solicit second opinions only from people who agree with you. If you want to cover your blind spots, find that person you’re always arguing with and get their impression of a candidate.
4. Distance: We think people closer to us are better than those far away
Neuroscientists recently discovered a hidden process running in the background of our brain: it’s called the “proximity network” and it basically categorizes everything as either close or far from you in time and space—and gives greater value to things that are closer.
That results in the distance bias: we think people and things that are closer are better. In recruiting, that means you might overvalue the candidate who came in for an in-office interview, even if the candidate who called in remotely from another time zone would have been a better hire.
How to mitigate the distance bias: Level the playing field by putting everyone at the same distance. If half the interviews you’re conducting are over videoconference, do all of them over video instead—even for those candidates who are just down the street.
Of course, distance can be a reasonable factor when you’re considering candidates: but this bias means that we tend to overweight it beyond the actual impact. Following a uniform process for everyone doesn’t mean you totally forget about distance, it just helps you avoid overvaluing it.
5. Safety: We think bad outcomes are much more powerful than good outcomes
Our brains are hardwired to obsess about bad things that might happen and undervalue good things that might happen. We’re a lot more afraid of possible losses than we are excited by possible gains. That’s because the part of your brain that detects threats is three or four times bigger than the part that detects rewards.
That’s the essence of the safety bias: we avoid risk way more than we reasonably should.
A common example of the safety bias is the sunk cost fallacy: if you’ve invested a lot of money in a project that’s not working out, you’ll avoid cutting your losses and may even throw more money into it instead.
Let’s say you’ve flown a candidate in from across the country for a final interview—only to realize that they’re missing a critical skill and aren’t the best hire for that position. After investing in the plane ticket and dedicating all that time recruiting them, you might end up making the bad hire anyways, just so those expenses aren’t in vain. Admitting defeat might feel worse, but it offers better long-term benefits.
To take another example, you might avoid a non-traditional candidate and go with a conservative, low-risk/low-reward hire instead—even if that leads to a less diverse, less effective team.
How to mitigate the safety bias: Trick yourself into creating some psychological distance from the situation. When things are personal and immediate, the safety bias easily warps our decisions, but you can be more logical by taking an imaginary step back.
Imagine that you’re making the decision on behalf of someone else: if another recruiter said they were facing this dilemma, what advice would you give them?
Another trick is to project the situation into the past: if you pretend you already made this decision six months ago, would you be happy that you took a chance with the non-traditional candidate, or would you regret hiring that so-so, low-risk person instead?
According to the NeuroLeadership Institute, “if you have a brain, you’re biased.” It’s not a matter of intent or awareness; it’s just how the brain works: it takes tons of quick-and-dirty shortcuts every day.
However, these biases lead to limited, less diverse teams and worse hires. And while raising awareness feels good and can kickstart important discussions, it doesn’t actually achieve a whole lot of change.
Instead of simply examining your biases, you need to put concrete processes in place that help you mitigate their effects. We’ll never totally eliminate bias, but following these processes inspired by neuroscience can help you make better, less biased decisions every day.