What if you heard that a successful, great hire was as much luck as anything else? And what if you learned that diversity training, no matter how focused, could put your organization on a direct path away from diversity? Would knowing those things change your hiring process?
Unconscious biases have more control over perception and action than you are aware of or might want to admit. There are steps that hiring teams can take to help ensure diversity, but they’re probably not what you’d expect. And looking beyond the resources you have now, maybe it’s time to enable automated and blind hiring decisions.
Luck and the Hiring Game
Luck in hiring might not be an easily digestible idea. After all, if luck does play a role in who your company hires, then it probably played a role when they hired you. Ouch. But Harvard Business Review (HBR) says that it’s true.
HR has long rested on the belief that the most deserving person usually gets the job. But merit isn’t as influential as it’s touted to be. Even when all things are equal, meaning no overt favoritism is shown in selecting interviewees, merit is still only part of the hiring decision.
When we see a successful person, we usually explain the path they took as smarts and plenty of hard work. But while those two factors are important, there’s another one that’s just as relevant: bias.
Acknowledging Bias Might Make Things Worse
This is an era of seeing bias as the enemy, but it’s as persistent as weeds in a garden. There are training courses designed to identify and stamp it out. But bias remains because it’s inherent in every person. And that creates a bigger challenge. If bias can’t be eliminated, how can anyone weaken its influence in the hiring process?
The answer might be automation and blind hiring. A common argument against automation is that computers can’t replace a human’s ability to interpret nuance and see the bigger picture. But maybe that’s the problem. Humans are biased. Computer programs are not, at least not when they’re built to evaluate candidates without regard to ethnicity, gender, and other factors, such as where a person earned a degree.
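To make that idea concrete, here is a minimal sketch of what blind screening might look like in code. The Candidate structure, the field names, and the sample applicant are hypothetical, invented for illustration; the point is simply that demographic and pedigree signals are stripped before a reviewer or scoring model ever sees the application.

```python
from dataclasses import dataclass, asdict

# Fields assumed for illustration; a real applicant-tracking system
# would define its own schema.
REDACTED_FIELDS = {"name", "gender", "ethnicity", "school"}

@dataclass
class Candidate:
    name: str
    gender: str
    ethnicity: str
    school: str
    years_experience: int
    skills: list

def blind(candidate: Candidate) -> dict:
    """Return only job-relevant fields, dropping identity and pedigree signals."""
    return {k: v for k, v in asdict(candidate).items() if k not in REDACTED_FIELDS}

# Hypothetical applicant record.
applicant = Candidate(
    name="Jordan Smith",
    gender="F",
    ethnicity="Hispanic",
    school="State University",
    years_experience=6,
    skills=["Python", "SQL", "data modeling"],
)

# Reviewers (or a scoring model) only ever see the blinded record.
print(blind(applicant))
# {'years_experience': 6, 'skills': ['Python', 'SQL', 'data modeling']}
```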
The human element seems to keep the luck factor in play, even when training has made the hiring team aware of bias. The evidence: how unlikely any candidate is to be hired when the rest of the pool is stacked against them.
Diversity in the Hiring Pool Begets Diversity in Hiring
A woman stands practically no chance of being hired if she’s the only woman who made it to the “finalist” interview level. But you already know about gender bias as it applies to women and minorities. What happens if the outnumbered finalist is a white male? You might be surprised.
In an HBR video about the relationship between hiring pools and hiring decisions, representation in the hiring pool is shown to affect everyone almost equally. So that white male finalist is just as unlikely to be hired if he’s up against a candidate pool of women or minorities.
The issue appears to be the difference between candidates, not the fact that a candidate is any specific ethnicity or gender. Differences are interpreted as risks, says HBR. And risk steers the hiring decision away from the outlier and toward the safer bet: the candidate who looks like the majority of the pool.
Understanding bias is only part of improving diversity. The usual identify-educate-attack approach that gets thrown at other business problems doesn’t work here. That’s because the problem isn’t a lack of awareness; it’s that training has very little effect. It’s like spraying an outdated weed killer on a weed that has become immune to it.
The human element is still important. Without it, there would be no one to care about improving diversity. But instead of putting humans in charge of making critical decisions that affect who does and doesn’t make the candidate pool cut, maybe it’s time for automated candidate matching to take the lead.
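As a sketch of what taking the lead could mean in practice, below is a hypothetical matching function that ranks blinded candidate records purely on skill overlap with a role’s requirements. The scoring rule (a simple overlap ratio), the field names, and the sample pool are assumptions for illustration, not a description of any particular product.

```python
def match_score(candidate_skills, required_skills):
    """Score a candidate as the fraction of required skills they cover (0.0 to 1.0)."""
    required = {s.lower() for s in required_skills}
    covered = required & {s.lower() for s in candidate_skills}
    return len(covered) / len(required) if required else 0.0

# Hypothetical role requirements and blinded candidate pool (skills only,
# no names, gender, ethnicity, or school).
role_requirements = ["Python", "SQL", "ETL", "communication"]
pool = {
    "candidate_17": ["Python", "SQL", "ETL"],
    "candidate_42": ["Java", "SQL", "communication"],
    "candidate_63": ["Python", "SQL", "ETL", "communication"],
}

# Rank the pool by fit with the role, highest score first.
ranked = sorted(pool.items(),
                key=lambda kv: match_score(kv[1], role_requirements),
                reverse=True)
for candidate_id, skills in ranked:
    print(candidate_id, round(match_score(skills, role_requirements), 2))
# candidate_63 1.0
# candidate_17 0.75
# candidate_42 0.5
```

Humans still set the requirements and make the final call; the automation only decides who makes the candidate pool cut, using criteria that carry no demographic signal.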