CrimProf Blog

Editor: Kevin Cole
Univ. of San Diego School of Law

Tuesday, November 5, 2019

Simmons on Quantifying Criminal Procedure

Ric Simmons (Ohio State University (OSU) - Michael E. Moritz College of Law) has posted Smart Surveillance: Chapter 3 - Quantifying Criminal Procedure (Chapter 3 of "Smart Surveillance" by Cambridge University Press, 2019) on SSRN. Here is the abstract:
Since the inception of our criminal justice system, law enforcement officers and judges have relied primarily on experience, training, intuition, and common sense in making their predictions about whether reasonable suspicion or probable cause exists in a given case. In response, courts have crafted broad standards to accommodate these subjective judgments and allow for flexibility in application. The broad, flexible nature of these standards is no accident: they have been intentionally left imprecise by generations of courts. One reason is the nearly infinite number of different facts that could arise in any criminal case, which makes hard-and-fast rules impractical. But the main reason these rules have been kept ambiguous is that police and courts have historically lacked the tools needed to evaluate the accuracy of their predictions with any precision. Thus, state actors have been forced to rely on their own subjective beliefs and anecdotal evidence in making their predictions.

All of that is now changing. Big data analysis is providing police and judges with tools that can predict future behavior with greater precision than ever before. These tools hold out the promise of increased fairness and greater objectivity at many of the critical decision points in our criminal justice system. They will also be able to quantify the benefits of various forms of surveillance, allowing us to predict with much greater precision the success rate of each of these methods. This chapter argues in favor of adopting precise legal standards for different levels of surveillance in order to harmonize the analytical world of big data with the legal world of criminal justice.

This chapter provides a road map for a transition from intuitive descriptive standards to empirically tested quantitative standards. It first explains why a move to quantified legal standards is an improvement over the descriptive standards now used by courts. It then examines the current set of descriptive standards and attempts to determine what level of certainty courts are applying when they use these standards – in other words, it seeks to translate the current standards of reasonable suspicion and probable cause into quantitative terms. In doing so, it calculates the de facto probability of reasonable suspicion and probable cause that has been applied by judges, and also draws on original empirical research into the views of federal magistrates. The chapter proposes that ultimately these descriptive standards should be jettisoned altogether in favor of a fully quantified system that could adjust more precisely to the productivity of different types of surveillance.
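The chapter's central move, replacing descriptive standards with numeric thresholds that a surveillance method's measured success rate either meets or fails, can be sketched in a few lines. The threshold values below are purely hypothetical illustrations, not figures drawn from the chapter or from Simmons's empirical research:

```python
# Hypothetical numeric translations of the descriptive standards.
# These values are assumptions for illustration only; the chapter's
# empirical work is what would actually supply such numbers.
REASONABLE_SUSPICION = 0.25
PROBABLE_CAUSE = 0.45

def standard_met(predicted_hit_rate: float, threshold: float) -> bool:
    """Return True if a surveillance method's predicted success rate
    meets or exceeds a quantified legal standard."""
    return predicted_hit_rate >= threshold

# Example: a method whose measured success rate is 30% would satisfy
# the (hypothetical) reasonable-suspicion threshold but not the
# (hypothetical) probable-cause threshold.
hit_rate = 0.30
print(standard_met(hit_rate, REASONABLE_SUSPICION))  # True
print(standard_met(hit_rate, PROBABLE_CAUSE))        # False
```

The point of the sketch is only that a quantified system makes the comparison mechanical: once a standard is a number and a surveillance method's productivity is measured, whether the standard is satisfied is no longer a matter of intuition.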
