Tuesday, October 16, 2018
Amazon has long been known as a high-tech Moneyball employer, striving to make data-driven decisions whenever possible. But this week's news shows that there are limits to that approach. After working since 2014 to develop AI-driven hiring algorithms, Amazon recently abandoned the effort. The reason? The algorithms were biased against women. This is an issue that several folks, including Rick Bales, have been talking about (and is a small part of a larger tech project I'm working on), and it isn't a surprise given the dearth of women in the tech industry. This is the classic garbage-in, garbage-out issue. Amazon was training its algorithms on the resumes it had received, and because men disproportionately applied to the company, the algorithms were spitting out decisions that undervalued women; indeed, they were specifically penalizing resumes that included references to women. If Amazon or other companies want to use AI (really, machine learning) for hiring, they should first use the technology to analyze their current hiring practices and try to root out pre-existing bias. Only once that's addressed does AI have even a hope of being effective.
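To see how garbage-in-garbage-out works here, consider a toy sketch. The data and the scoring rule below are entirely made up (this is not Amazon's actual system, which was far more sophisticated): a deliberately naive model scores resume keywords by how often they appeared in past "hired" versus "rejected" resumes. If the historical pool skews male, a word like "women's" ends up penalized even though gender is never an explicit input.

```python
from collections import Counter

# Hypothetical training data: (resume keywords, hired?) pairs mirroring a
# historically male-dominated applicant pool. "womens" stands in for
# phrases like "women's chess club captain".
training = [
    (["python", "aws", "leadership"], True),
    (["java", "aws"], True),
    (["python", "leadership"], True),
    (["python", "womens", "leadership"], False),
    (["java", "womens"], False),
]

# Count how often each keyword appears in hired vs. rejected resumes.
hired, rejected = Counter(), Counter()
for words, was_hired in training:
    (hired if was_hired else rejected).update(words)

def score(word, smoothing=1):
    """Naive hired-to-rejected ratio: >1 favors the word, <1 penalizes it."""
    return (hired[word] + smoothing) / (rejected[word] + smoothing)

# The model simply reproduces the historical pattern: "womens" scores
# below 1 (penalized) because it mostly appeared in rejected resumes.
print(score("python"))  # favored
print(score("womens"))  # penalized
```

The point of the sketch is that nothing in the code mentions gender; the penalty emerges purely from skewed historical outcomes, which is exactly why auditing the training data comes before any deployment.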
To be clear: Amazon says that it never actually used the algorithms for real hiring decisions. That wasn't for lack of trying, though. Amazon realized what was going on in 2015 but didn't disband the program until the start of last year. In other words, despite working for quite a while to eliminate the bias, it couldn't do so to its satisfaction. That a company like Amazon couldn't pull this off should serve as a strong warning to everyone about the limits of AI. I'm actually more optimistic than many about AI's eventual potential to reduce employment discrimination, but I am still extremely cautious about the technology. There's definitely a right way and a wrong way to use it and, as Amazon shows, the right way can be really hard. As a result, I think the greatest risk of AI in personnel decisions is its misuse by companies that are too lazy, cheap, or blinded by the shiny object that is AI to realize that it is only a tool and, like other tools, can be used the wrong way.