Friday, March 15, 2019

Using AI to Overcome Implicit Gender Bias in Employment Decision-Making in the Tech Industry

Kimberly Houser, Can AI Solve the Diversity Problem in the Tech Industry? Mitigating Noise and Bias in Employment Decision-Making, 22 Stan. Tech. L. Rev. (forthcoming 2019)

After the first diversity report was issued in 2014 revealing the dearth of women in the tech industry, companies rushed to hire consultants to provide unconscious bias training to their employees. Unfortunately, recent diversity reports show no significant improvement, and in some years women actually lost ground. According to a 2016 Human Capital Institute survey, nearly 80% of leaders were still relying on gut feeling and personal opinion to make decisions affecting talent-management practices. By incorporating artificial intelligence (AI) into employment decisions, we can mitigate unconscious bias and variability in human decision-making. While some scholars have warned that using AI in decision-making produces discriminatory results, they downplay the source of those results: humans. The main concerns are the risk of reproducing human bias in an algorithmic outcome ("garbage in, garbage out") and the inability to detect bias because the basis for the algorithmic outcome cannot be explained (the "black box" problem). In this paper, I argue that responsible AI will abate the problems caused by unconscious bias and noise in human decision-making, and in doing so increase the hiring, promotion, and retention of women in the tech industry. I also explore new solutions to the garbage-in, garbage-out and black-box concerns. The question is not whether AI should be incorporated into decisions affecting employment, but why, in 2019, we are still relying on faulty human decision-making.
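To make the bias-detection concern concrete, below is a minimal sketch, in Python, of the kind of outcome audit that detecting "garbage out" presupposes. It applies the four-fifths (80%) rule from the EEOC's Uniform Guidelines on Employee Selection Procedures, under which a group's selection rate below 80% of the highest group's rate is treated as evidence of adverse impact. The applicant counts are hypothetical, and the sketch illustrates the general technique rather than any code from Houser's paper.

# Hypothetical audit of an algorithmic screening tool using the EEOC
# "four-fifths" (80%) rule: a group's selection rate below 80% of the
# highest group's rate is treated as evidence of adverse impact.
# All counts are invented for illustration.

outcomes = {
    # group: (applicants screened, applicants the model advanced)
    "women": (400, 60),
    "men": (600, 150),
}

selection_rates = {
    group: advanced / screened
    for group, (screened, advanced) in outcomes.items()
}

highest_rate = max(selection_rates.values())
for group, rate in selection_rates.items():
    ratio = rate / highest_rate
    flag = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: selected {rate:.1%} ({ratio:.0%} of highest rate) -> {flag}")

An audit of this kind speaks only to detecting biased outcomes; explaining why a model produced them is the separate black-box problem the paper takes up.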

https://lawprofessors.typepad.com/gender_law/2019/03/using-ai-to-overcome-implicit-gender-bias-in-employment-decision-making-in-the-tech-industry.html

Business, Equal Employment, Science, Technology, Workplace
