Amazon reportedly scraps internal AI recruiting tool that was biased against women
October 10, 2018
Bias in machine learning can be a problem even for companies with plenty of experience with AI, like Amazon. According to a report from Reuters, the e-commerce giant had to scrap an internal project that was trying to use AI to vet job applications after the software consistently downgraded female candidates.
Because AI systems learn to make decisions by looking at historical data, they often perpetuate existing biases. In this case, that bias reflected the male-dominated working environment of the tech world. According to Reuters, Amazon’s program penalized applicants who attended all-women’s colleges, as well as any resumes that contained the word “women’s” (as might appear in the phrase “women’s chess club”).
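The mechanism is easy to reproduce in miniature. The sketch below is purely illustrative: the data, labels, and scoring rule are invented for this example and have nothing to do with Amazon’s actual system. It trains a crude word-score model on “historical” hiring labels that skew against one group, then shows the model penalizing any resume containing the token “women’s”:

```python
from collections import Counter

# Invented toy "historical" hiring data: resumes reduced to bags of
# words, labeled 1 (hired) or 0 (rejected). The labels encode a biased
# history, so the token "women's" appears mostly in rejections.
data = [
    ("software engineer chess club", 1),
    ("software engineer robotics", 1),
    ("developer hackathon winner", 1),
    ("software engineer women's chess club", 0),
    ("developer women's coding society", 0),
]

def word_scores(data):
    """Per-word score: +1 for each hire it appears in, -1 per rejection."""
    scores = Counter()
    for text, label in data:
        for word in set(text.split()):
            scores[word] += 1 if label == 1 else -1
    return scores

def rank(resume, scores):
    """Sum the learned word scores -- a crude resume 'rating'."""
    return sum(scores[w] for w in set(resume.split()))

scores = word_scores(data)
# Two identical resumes, differing only by the word "women's":
print(rank("software engineer chess club", scores))          # higher
print(rank("software engineer women's chess club", scores))  # lower
```

Nothing in the code mentions gender explicitly; the penalty emerges entirely from the biased labels, which is exactly why such patterns can slip through unnoticed.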
The team behind the project reportedly intended to speed up the hiring process. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those,” an unnamed source familiar with the work told Reuters. When the company realized the software was not producing gender-neutral results, it tweaked the program to remove this bias. However, those involved could not be sure other biases had not crept into the program, and as a result it was scrapped entirely last year.
Reuters’ report describes the program as only an “experiment,” and it’s not clear whether it was ever used to vet candidates for real jobs at Amazon, even as part of a trial. We’ve reached out to the company to find out more.
Over the past few years, as artificial intelligence has been deployed in more and more contexts, researchers have become increasingly vocal about the dangers of bias. Prejudices about gender and race can easily creep into a range of AI programs — everything from facial recognition algorithms to systems used by courts and hospitals.
In most cases, these programs are simply perpetuating existing biases. In the case of Amazon’s CV scanner, for example, a human recruiter might have been just as prejudiced against female candidates, if only subconsciously. But by passing these biases on to a computer program, we make them less visible and less open to correction. That’s because we tend to trust decisions made by machines, and because AI programs can’t explain their reasoning.
Despite this, many startups working on AI recruitment tools explicitly sell their services as a way to avoid bias because, they say, preferences for certain candidates can be coded in. Amazon is apparently thinking along these lines too, as Reuters reports that the company is having another go at building an AI recruitment tool, this time “with a focus on diversity.”