
Unconscious Bias Is Hidden In Your Software

Jan 24, 2019

Back in October, Reuters disclosed that Amazon had abandoned its AI recruiting software a year earlier because it was biased against women. According to the article, the company's recruiters never relied solely on the system's rankings, though they did consider them.

The question is: How does AI matching software become biased? Aren't machines cold, opinionless tools?

The answer evokes the old expression, "Garbage in, garbage out." There is no reliable system that ensures AI software treats all participants fairly, but that's not the machine's fault. It's our own biases that are seeping into the code behind these AI programs.

Let's explore how Amazon's software, and potentially all AI recruitment matching software, developed an unconscious bias.

It's a well-known fact that IT, and the web engineering space in particular, is dominated by men; Google's and Microsoft's IT teams, for instance, are roughly 80% male. And therein lies the problem with technology, specifically with the development of artificial intelligence matching.

When any organization creates matching technology, it develops rules around how the system works. In essence, the system is told that the company wants more candidates like its top performers. It learns to identify the best prospects by comparing a candidate's data (the resume and whatever else the application requires) against the data of people the company has previously hired, giving weight to particular words along the way.
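To make that concrete, here is a minimal, hypothetical sketch in Python (using scikit-learn). This is not Amazon's actual system, and the resumes and hiring labels are invented; it simply shows how a model trained on historically skewed hiring decisions can learn to penalize a gendered word.

# Minimal sketch (assumed setup, not Amazon's system): train a simple
# classifier on past hiring decisions and inspect the word weights it learns.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy historical data: past applicants' resumes, labeled 1 if hired.
# Because past hires skewed male, the word "women" appears only in
# the resumes of rejected applicants.
resumes = [
    "captain of the chess club, java developer",           # hired
    "java developer, led the robotics team",               # hired
    "captain of the women's chess club, java developer",   # not hired
    "java developer, mentor at a women's coding society",  # not hired
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The model assigns a negative weight to "women": the historical bias
# now looks, to the machine, like a signal of candidate quality.
for word, idx in sorted(vectorizer.vocabulary_.items()):
    print(f"{word:10s} {model.coef_[0][idx]:+.3f}")

Notice that nothing in the code mentions gender. The bias arrives entirely through the training labels, which is exactly the "garbage in, garbage out" problem described above.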

Since pretty much all of this programming is developed by humans and, like it or not, we all have our unconscious biases, the technology we create unintentionally inherits those biases.

Tough like mommy?

Men learn from their first years to use a style of speech known as masculine language. A great example of this was shared in an article this year showing two youngsters wearing T-shirts that read "Smart like Mommy" and "Kind like Daddy."

Do those statements seem discordant? Why? Because we've all been exposed to gender (and other) stereotypes for so long that we've developed biases, often unconscious ones.

This is where a can of worms opens when we program our machines. As Amazon discovered, its AI software ranked certain words, phrases, and even activities lower because of the masculine language woven into the DNA of its matching. The system reportedly penalized resumes containing the word "women's," as in "women's chess club captain."

Is all our software biased?

Amazon, as one of the biggest brands in the world, could not possibly use software with an inherent bias in favor of men; the backlash would have been incredibly damaging, so scrapping it only made sense. This leads me to the bigger issue: With the IT industry dominated by men (and AI matching technology companies are no exception), how can anyone be confident that the recruitment software currently on the market is free of bias? Is the problem really limited to Amazon's system? I think not.

What is truly concerning is that Amazon has basically proven that AI can be subjective and biased by inheriting traits from its creators, so we can assume that all AI recruitment matching software is biased in some way. Women, no doubt, have suffered and will continue to suffer at the hands of this unintentional flaw in the hiring process.

The good that comes from this is that the problem has been brought to light in the recruitment industry.

I personally have not seen many other companies in the recruitment technology space making much noise about the Amazon issue, but I am on a mission to make sure that matching technology in the recruitment sector strives to be fair to all.
