Tuesday, May 4, 2021
Cognitive Biases, Implicit Biases, and Student Feedback
I have stressed the importance of cognitive (or unconscious) bias training as a part of law students' professional identity development. (See Understanding and Overcoming Cognitive Biases For Lawyers And Law Students: Becoming a Better Lawyer Through Cognitive Science (2018)) I think it is important for law students to understand that all humans--lawyers, clients, judges, and themselves--are susceptible to cognitive errors caused by how their brains evolved.
Cognitive biases (thinking or brain biases) are "a systematic error in thinking that affects the decisions and judgments that people make." (Kendra Cherry, What is a Cognitive Bias? Definitions and Examples, VeryWell (May 26, 2016), https://www.verywell.com/what-is-a-cognitive-bias-2794963) Overcoming cognitive biases can produce amazing results. For example, "Atul Gawande, an accomplished medical professional, recounts the results of an initiative at a major U.S. hospital, in which a test run showed that doctors skipped at least one of only 5 steps in 1/3 of certain surgery cases, after which nurses were given the authority and responsibility to catch doctors missing any steps in a simple checklist aimed at reducing central line infections. In the subsequent 15-month period, infection rates went from 11% to 0%, 8 deaths were avoided and some $2 million in avoidable costs were saved." (Wikipedia: Cognitive Bias Mitigation, https://en.wikipedia.org/wiki/Cognitive_bias_mitigation)
Anne Gordon has written an amazing article (with one major flaw) on biases and effective and fair student feedback.
Better Than Our Biases: Using Psychological Research to Inform our Approach to Inclusive, Effective Feedback. First, the abstract:
"As teaching faculty, we are obligated to create an inclusive learning environment for all students. When we fail to be thoughtful about our own bias, our teaching suffers – and students from under-represented backgrounds are left behind. This paper draws on legal, pedagogical, and psychological research to create a practical guide for clinical teaching faculty in understanding, examining, and mitigating our own biases, so that we may better teach and support our students. First, I discuss two kinds of bias that interfere with our decision-making and behavior: cognitive biases (such as confirmation bias, primacy and recency effects, and the halo effect) and implicit biases (stereotype and attitude-based), that arise from living in our culture. Second, I explain how our biases negatively affect our students: both through the stereotype threat that students experience when interacting with biased teachers, and by our own failure to evaluate and give feedback appropriately, which in turn interferes with our students’ learning and future opportunities. The final section of this paper details practical steps for reducing our bias, including engaging in long-term debiasing, reducing the conditions that make us prone to bias (such as times of cognitive fatigue), and adopting processes that will keep us from falling back on our biases, (such as the use of rubrics). Acknowledging and mitigating our biases is possible, but we must make a concerted effort to do so in order to live up to our obligations to our students and our profession."
This article is thoroughly researched (with one exception), and the author applies critical thinking to her arguments (with the same exception). Not only does she identify the problems, she offers several practical solutions to overcome cognitive biases in feedback and grading.
The one problem is her use of "implicit bias" in her argument. She defines cognitive biases as "errors of intuitive thought - shortcuts to decision-making that actually lead us to the wrong conclusion." The cognitive biases she lists in her article include the Anchoring Bias, the Recency Effect, the Confirmation Bias, the Halo Effect, the Bandwagon Effect, the Attractiveness Effect, the In-Group Bias, the Bias Blindspot, and the Objectivity Illusion. These biases are well-supported by scientific studies, and Professor Gordon presents a detailed discussion of them.
She defines "implicit bias" as "a combination of attitude and stereotype biases," and "cognitive bias" as encompassing all other cognitive biases. It is key that she recognizes this distinction because cognitive biases (unconscious, brain biases) and implicit biases have distinct research lines. In other words, researchers on cognitive biases do not generally study implicit biases.
As I said above, cognitive biases are strongly supported by rigorous scientific studies. The same is not true of implicit biases. Scientists strongly disagree about the basis of implicit bias theory.
Adam Lamparello has written an excellent article on the problems with implicit biases: The Flaws of Implicit Bias -- and the Need for Empirical Research in Legal Scholarship and in Legal Education. Here is the abstract:
"Nowhere is the necessity of using empirical research methods and statistics in formulating legal arguments more obvious than in recent legal scholarship concerning implicit bias.
"By way of background, the concept of implicit, or unconscious, bias has recently enjoyed its ‘fifteen minutes of fame,’ garnering substantial support from many scholars, including some law professors, who contend that implicit biases cause discriminatory behavior, including behaviors that disparately impact traditionally marginalized groups. Indeed, scholars have advocated for programs and policies that instruct incoming law students and faculty regarding the existence of implicit bias and its alleged role in perpetuating overt and subtle racism.
But there is a problem – a very big problem – that plagues legal scholarship in this area and that casts doubt on these policies.
Specifically, recent empirical studies by social psychologists strongly suggest that implicit bias is not predictive of biased behavior. In fact, the science regarding implicit bias’s connection to biased behavior is so flawed that social psychologists doubt its validity and question the utility of policies that attempt to link implicit bias to biased behavior. You wouldn’t know this from reading the many law review articles concerning implicit bias, or from the orientation sessions where law students are taught to believe that implicit bias is the sine qua non of biased behavior."
More specifically, first, "researchers have failed to delineate any meaningful distinction or definition distinguishing explicit (i.e., conscious) bias from implicit bias, and thus cannot explain why implicit, rather than explicit bias is the primary cause of overt and subtle discriminatory conduct. Put simply, the conclusion that implicit bias underlies certain types of discriminatory behavior is predicated on an inference – and nothing more." "Additionally, and relatedly, such studies are based substantially on the theory of disparate impact, which asserts that, where marginalized groups are disproportionately affected by a policy or practice, that impact is likely due to discrimination (and implicit bias). If, for example, an employer hires more Caucasians than African-Americans, such conduct is often attributed to implicit bias, despite the fact that bias or non-discriminatory factors, such as interview performance, 'fit' with an organization, and other intangible factors could be responsible for this disparity. In other words, quantifying the impact of implicit bias on discriminatory behaviors has proven elusive, thus rendering conclusions regarding its effect tenuous."
Second, "What's more, the extant measures of implicit bias are flawed and thus provide no reliable method by which to quantify the effects of implicit bias on biased behavior. For example, the most common measure of implicit bias is the Implicit Association Test (IAT), which purports to measure the strength of associations between concepts (e.g., African-American, Hispanic, or Muslim persons), valuations (e.g., good, bad), and stereotypes (e.g., smart, dumb). (Based on the results of this test, researchers assess the degree to which an individual harbors implicit biases toward particular groups. But recent research has revealed that the IAT is flawed. To begin with, the IAT sets arbitrary cutoff scores to determine whether an individual's responses reveal implicit biases, yet fails to provide any assessments of the differences, if any, between the many individuals who score above or below those cutoffs." "Furthermore, IAT scores are arguably context-dependent, as the IAT produces different results for individuals when they complete the test multiple times. In essence, although results on the IAT are 'not as malleable as mood,' they are not as reliable as a personality trait." Moreover, it is difficult to assess whether the IAT is measuring unconscious attitudes that reflect associations resulting from environmental influences. Finally, and as stated above, the IAT fails to meaningfully distinguish between implicit and explicit bias. As one scholar explains, 'the IAT provides little insight into who will discriminate against whom, and provides no more insight than explicit measures of bias.'"
Finally, "the IAT, and implicit bias generally, is weakly correlated to discriminatory behaviors." "In fact, the evidence shows precisely the opposite: Researchers from the University of Wisconsin at Madison, Harvard, and the University of Virginia examined 499 studies over 20 years involving 80,859 participants that used the IAT and other, similar measures. They discovered two things: One is that the correlation between implicit bias and discriminatory behavior appears weaker than previously thought. They also conclude that there is very little evidence that changes in implicit bias have anything to do with changes in a person’s behavior. These findings, they write, 'produce a challenge for this area of research.'"
In sum, "As one social psychologist explains: Almost everything about implicit bias is controversial in scientific circles. It is not clear, for instance,what most implicit bias methods actually measure; their ability to predict discrimination is modest at best;their reliability is low; early claims about their power and immutability have proven unjustified." Consequently, "This is not to say, of course, that implicit bias does not exist, or that it does not have a material impact on biased behavior. It is to say, however, that the IAT –and evidence supporting a connection between implicit bias and biased behavior –is, at best, premature and, at worst, untenable."
Why have so many scholars uncritically advocated implicit bias theory and implicit bias tests? Cognitive biases--the Confirmation Bias, the Semmelweis Reflex, and the Bandwagon Effect. Professor Gordon defines the confirmation bias as "the tendency to selectively search for information that confirms prior beliefs, hypotheses, or judgments." So, those who support implicit bias theory look only for evidence that supports their theory. The Semmelweis reflex is "the tendency to reject new evidence that contradicts a paradigm." (Wikipedia: List of Cognitive Biases) This is the most important reason that the shaky implicit bias theory is so widely accepted: scholars who support the theory simply ignore all contradictory evidence because of this cognitive bias. Finally, Gordon defines the bandwagon effect as "our tendency to have our attitudes and beliefs shaped by others, due to our innate desire for social harmony." To paraphrase Gordon: "If one clinician has an opinion about a [theory], the other may be unconsciously swayed by that person's opinion and therefore less likely to speak up [about the flaws in the theory], even if her opinion differs."
In sum, one foundation of Gordon's article, cognitive bias theory, has firm scientific support, while the other, implicit bias theory, does not. What does this mean for Gordon's suggestions and conclusions? Not much. Gordon's ideas for avoiding unfairness in feedback and grading are fully supported by cognitive bias theory; she does not need implicit bias theory to support her conclusions. For example, awareness of the primacy effect and the confirmation bias tells us that we should not evaluate a student solely on what she does at the beginning of the semester, but should be prepared to change our opinion of that student as the semester progresses. Similarly, awareness of the in-group bias tells us that we should take care to treat those who are not in our group as fairly as we treat in-group members.
Professor Gordon has written an amazing article. When we eliminate the one flaw in her article, we can follow her suggestions and give students fairer and more effective feedback.
(Scott Fruehwald)
Other sources criticizing implicit bias theory:
The Problem with Explicit Bias Training, Scientific American.
The False Science of Implicit Bias, WSJ.
Fred Oswald, Phillip E. Tetlock, and colleagues, Predicting Ethnic and Racial Discrimination, Journal of Personality and Social Psychology, 2013.
https://lawprofessors.typepad.com/legal_skills/2021/05/cognitive-biases-inplicit-biases-and-student-feedback.html
Comments
Thanks so much for your response. Your close reading of the article and validation of its conclusions are much appreciated. (If only we all read each others’ work with such careful attention!)
Your critique is based on a challenge to implicit bias theory, and is primarily a critique of the IAT. You are correct that some psychologists have challenged the predictive value of the IAT; many have concluded that the results are highly context-dependent, for example. (This is exactly the point of my article, by the way – one’s biased actions may be a result of the cognitive resources at their disposal.)
While there may be a debate in the field, however, it is not the case that social psychologists have abandoned the IAT’s validity; new research continues to be published nearly every day that uses the IAT as a basis for both measuring implicit bias and drawing connections with biased behavior. If the methodology of the IAT is the problem, one could instead use the Affective Priming Task (APT), the Affect Misattribution Procedure (AMP), or the Sorting Paired Features Task, all of which have also been widely used (though not as widely as the IAT) to make the same point: attitudes, opinions, and judgments outside our conscious awareness have very real consequences for our decision-making. But I’ll leave those for the social scientists to debate.
In the end, it matters little whether a given clinician’s bias comes from explicit or implicit stereotypes, prejudices, or attitudes. My goal is to offer clinicians the tools to mitigate that bias when interacting with students. My citations to studies based on implicit bias are meant to put clinicians on notice that even those who disavow prejudice may still be falling back on it in unconscious ways, just as they may be doing with other cognitive biases. IAT or not, implicit or not, we need to do our best to ensure that we are not evaluating and interacting with our students in a biased manner. And I know you and I would agree on that.
Posted by: Anne Gordon | May 5, 2021 7:59:10 PM
Few doubt that implicit biases exist. The relevant social science research, however, has cast doubt on whether implicit bias has any relationship to biased behavior. The meta analysis cited above suggests that the answer is no. Given the flaws in both the IAT and implicit bias theory, law schools and the ABA should hesitate to embrace implicit bias training.
Posted by: Adam Lamparello | May 6, 2021 6:41:44 PM