Friday, April 9, 2021
Mirko Bagaric, Jennifer Svilar, Melissa Bull, Dan Hunter and Nigel Stobbs (Director of the Evidence-Based Sentencing and Criminal Justice Project, Swinburne University Law School; affiliation not provided to SSRN; Queensland University of Technology; Queensland University of Technology; and Queensland University of Technology - Faculty of Law) have posted The Solution to the Pervasive Bias and Discrimination in the Criminal Justice: Transparent Artificial Intelligence (American Criminal Law Review, Vol. 59, No. 1, Forthcoming) on SSRN. Here is the abstract:
Algorithms are increasingly used in the criminal justice system for a range of important matters, including determining the sentence that should be imposed on offenders, whether offenders should be released early from prison, and the locations where police should patrol. The use of algorithms in this domain has been severely criticized on a number of grounds, including that they are inaccurate and discriminate against minority groups. Yet algorithms are used widely in many other social endeavors, including flying planes and assessing eligibility for loans and insurance. In fact, most people regularly use algorithms in their day-to-day lives: Google Maps is an algorithm, as are Siri, weather forecasts, and automatic pilots. The criminal justice system is one of the few human activities that has not embraced the use of algorithms. This Article explains why the criticisms that have been leveled against the use of algorithms in the criminal justice domain are flawed.