Diversity & Inclusion

July 18, 2022

Hiring Bias: How HR Analytics is Making Hiring Fairer

Let’s say you look up a candidate on LinkedIn or Facebook and see that they are passionate about football – something that, for you, is negative because a loved one who played football struggled with addiction. So you end up not hiring the candidate, because something about them triggers an unpleasant memory in your subconscious. Many recruiters are guilty of this, says Cristina Imre, the CEO of Aecho.

There was a survey in which a group of HR professionals were asked, ‘Why are you having interviews with people you have no intention of hiring?’ Most of the answers were vague: “You know, I need to have a certain number of candidates to assess.” “I need to show the hiring managers that I went through all of these candidates.” “I was curious to meet that candidate.”

Most recruiters do the hard work and go through every interview before making a decision, but many are unaware that they had already selected a candidate before the interviews began. In those cases, the interview turns into a confirmation exercise for their choice. This is mostly unconscious: they are more inclined to focus on the things that confirm the candidate and to overlook the warning signs. The bias is already in action – they have a perfect profile in mind and a story about why that particular person should be hired.

Instead of focusing entirely on the candidate to truly understand them, you’re filtering – checking whether they conform to your previously set expectations. You’ll automatically reject everything that conflicts with your initial belief.

It is really unfortunate, because recruiters are overwhelmingly people-centric personalities who genuinely love other people and want to get to know them. But when you’re buried in all the other tasks around hiring, and often don’t have time to breathe or clear your head, automatic assumptions come into play.

Recruiters act as a buffer between candidates and hiring managers, and that’s a demanding, delicate game to play. Things can go wrong at any time, and the pressure is constant, so predispositions get activated to release that tension. One characteristic of the brain is that it’s lazy, preferring shortcuts whenever possible.

An important aspect of diversity is for companies to hire people who are not like the people already there. When you hire people you like and feel very comfortable with, you’re hiring them to fail, because you’ve made the decision less about their fit for the role and more about how you feel about them.

Here’s another scenario:

Imagine you’re about to interview a candidate.

On the way to the meeting room, or before opening Zoom, you glance at their resume and notice they went to your school. There’s a photo of them attached – they remind you of someone famous, but you’re not sure who yet. They’ve listed basketball as one of their biggest interests outside of work, and you’re already thinking about talking about it with them. They’ve traveled, too – just like you did; Europe for a year, right after college.

Jared Leto – this person looks just like Jared Leto.

This person seems really cool. You like the same things, so you’re probably going to have plenty to talk about besides work. You haven’t paid much attention to their work experience, though. Wait – their employment history actually looks a little shaky. Maybe that doesn’t matter – maybe they had some personal stuff – maybe they’ll ace the interview and wow you with an answer.

The interview goes great.

You both talk, laugh, and nod approvingly; there’s an instant rapport. They explain away your concerns with solid answers. And you are inclined to believe them. You just know this person is going to be an amazing fit, so you offer them the position, and negotiate a salary you’re both happy with. They’re able to start on Monday. Perfect!

Except now, imagine it’s three months later. That new hire isn’t working out so well. Productivity is low, they’re not engaged at all, and there are questions being asked about how they got hired in the first place. Some members of the team around the new hire are starting to feel unmotivated and undervalued – and so are you.

Then, one of the best performers on the team resigns. You call a meeting and try to find a way to fix things before they get any worse.

What happened here is a classic example of hiring bias – and it’s not all that exaggerated, either. 74% of companies admit to having hired the wrong candidate, some with results like the ones above: attrition, reduced productivity, lowered morale, and poor engagement.

Whether consciously or unconsciously, people make biased judgments based on their own experiences; their likes, dislikes, and affinities. We are prone to judge each other, knowingly or unknowingly, based on our similarities, and our differences.

We all do it. It’s a product of our nurture and of our nature.

The problem

Hiring bias is a major issue that affects businesses, their profitability, and their impact on society at large. At its worst, biased hiring decisions make for less diversity, which leads to less innovation, lower productivity, slower growth – and a grossly unjust, unfair environment to work in.

We know that resumes with certain names are discriminated against. We know that women are less likely to be considered than men.

The problem isn’t so much that bias exists – we know it does. The problem is how to address it, limit it, or at least work toward that goal.

Some companies have tried to solve the problem and reach that goal using AI.

Why current AI solutions for HR are failing

Amazon, FedEx, Target – these are just a handful of giants who’ve experimented with AI in their hiring process. AI-powered HR tools can scan huge volumes of resumes, picking out desirable traits, experience, known success factors – even desirable answers to pre-qualifying questions.

The idea was to remove human bias from the first steps of the hiring process, working on merit alone. This was intended to be the solution to discrimination and bias in hiring.

And maybe it would have been, were it not for the algorithms in charge of these tools being trained and programmed by humans, based on human behaviors, and modeled with the unconscious biases of the creators themselves.

Maybe this is the result of tools being made by the famously non-diverse tech world – but that’s another story.

A takeaway at this point could be that the short history of bias in tech and AI doesn’t bode well for the future. We can do better. 

Amazon scrapped its AI hiring tool after it was found to overwhelmingly favor men – because it had trained itself on a history of biased hiring.

Twitter’s image-cropping algorithm was shown to crop previews to favor younger, whiter faces.

Microsoft quickly abandoned an AI chatbot it created, named Tay, after it began sharing wildly inflammatory content, drawing comparisons with neo-Nazi accounts. It was a PR disaster that highlighted the naivety of its creators – and the infancy of machine learning.

Bias is unknowingly built into the tech that was meant to eradicate it, because that tech relies on previous human behavior to learn. We cannot kill bias altogether – that’s not realistic, or compatible with being human. We need to be able to differentiate and categorize things, but in a fair way, and that’s the keyword: fairness. The goal is to eliminate the biases that cause unfair and unjust results.

Cristina describes how her startup, Aecho, is solving this problem through multi-measure voice analysis: “We take the complexity of human beings and analyze everything that you can imagine, from the data, voice, speech, including emotions. It’s a complex formula that doesn’t rely only on one set of features or parameters. We currently are using over 90 KPIs generated from the voice. And we are adding, removing, correcting things in the mix on a daily basis.”

There’s still so much to do, in this respect – and we still need people to do it. So, how are we supposed to eradicate human bias in hiring, when humans are always going to be part of the equation?

We need to turn to data, with solid, foundational HR analytics.

How good HR analytics can make hiring fairer

Good HR analytics relies on good, clean data.

Clean data, audited for bias, can be relied upon as a base for good hiring decisions. HR analytics can then be used to better inform automated, algorithmic processes – leading to more diverse, qualified hires who benefit their organizations in countless ways.

Your own HR data can work for you, right now – and you can start using HR analytics to work towards DEI goals today. Here’s how:

1. Check your pulse

What’s the DEI status of your organization today? Where do you want to be? How does your company compare to others? Once you know where you are and where you want to go, you can draft KPIs and achievable goals that you can measure.
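As a sketch of what this first step can look like in practice – the headcount figures and the 50/50 target below are made-up placeholders, not recommendations – measuring representation and the gap to a KPI target can start very simply:

```python
def representation(headcount):
    """Share of total headcount per group (e.g. by gender or ethnicity)."""
    total = sum(headcount.values())
    return {group: count / total for group, count in headcount.items()}

def gap_to_target(current, target):
    """Gap, in share of headcount, between a KPI target and where you are now."""
    return {group: round(target.get(group, 0) - share, 3)
            for group, share in current.items()}

# Hypothetical snapshot -- replace with an export from your own HRIS.
headcount = {"women": 30, "men": 70}
current = representation(headcount)          # {'women': 0.3, 'men': 0.7}
gaps = gap_to_target(current, {"women": 0.5, "men": 0.5})
print(gaps)                                  # {'women': 0.2, 'men': -0.2}
```

Once these numbers are computed the same way every quarter, "where we are" and "where we want to be" become measurable KPIs rather than impressions.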

2. Use employee data to predict and counter bias

Your employee feedback, pulse surveys, and engagement surveys may hold a treasure trove of insights into bias in the workplace. Start there, then look deeper: what common factors do you notice in appraisals and one-to-ones? Do themes emerge where performance is the same, but certain groups are reprimanded or praised more often?
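One hedged way to put numbers behind the "praised more often" question is a two-proportion z-test on appraisal outcomes. The counts below are invented for illustration; the function itself is standard statistics, not a product feature:

```python
import math

def praise_rate_z(praised_a, total_a, praised_b, total_b):
    """Two-proportion z-statistic: is group A praised at a different
    rate than group B, beyond what sampling noise would explain?"""
    p_a, p_b = praised_a / total_a, praised_b / total_b
    p_pool = (praised_a + praised_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical appraisal counts -- swap in your own review data.
z = praise_rate_z(praised_a=45, total_a=60, praised_b=30, total_b=60)
print(round(z, 2))  # 2.83; |z| > 1.96 suggests a gap worth investigating
```

A large |z| doesn’t prove bias on its own, but it tells you exactly where to look deeper in the one-to-ones and appraisal notes.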

3. Document, track, and target bias

To effectively target hiring and workplace bias, HR teams must consistently collect data, proactively follow up on issues, and track progress to resolution. This loops back to organizational goal-setting; measuring along the way and at key milestones (be that time elapsed or company size) will be a key indicator of success.

4. Training

Managing DEI is still a fairly new concept, and you can't assume that everyone is on the same page, or has the same experience and understanding. Empower your HR and recruitment teams with the knowledge and tools to combat bias, using internal and industry data, and by developing internal training programs.

5. Invest in tech

The right tech and the right people work hand in hand. Use employee engagement tools, HCM tools, and modernize your internal systems – maybe even consider a decentralized HR department that operates independently, using best-in-class software and SaaS tools, in the interests of your company.

This problem can’t be solved by technology alone, but it absolutely will help facilitate change. Promoting and improving DEI in companies was one of the biggest drivers when we founded eqtble, and our tech was created by pioneers in people analytics, acutely aware of the challenges of hiring bias. 

Smash your company’s DEI goals

Let’s change the way the world works. eqtble is the fastest, most powerful way to set DEI targets and track your company’s progress. Want to know more?

Request access to our platform!