Thursday, June 14, 2012
Posted by Jeff Lipshaw (this is a re-posting of a PrawfsBlawg post)
I'm privileged to co-blog with Bill Henderson (Indiana-Bloomington, left) at The Legal Whiteboard, with whom I've had a running dialogue about, among other things, how we can actually turn the insights of Nobel Prize laureate Daniel Kahneman (and his co-author, Amos Tversky) into guides for action. This morning, Bill posted another of his delightful observations, this one on the "dangers of being smart" and its application to faculty workshop discussions.
It turns out that this was close to the subject of my talk at Law & Society, where I apologized for sending discussant Sean O'Connor (Washington, right) at least three updated drafts of the paper in the twenty-four hours before the session, and for changing the title at least a couple of times. Since I got home, I've revamped it and changed the title one more time, but now I think I have it: "Reflections on the 'Two-Handed Lawyer': Thinking and Action in Business Lawyering."
I'll include the present abstract below the break, but first I will explain why this is part of my dialogue with Bill. As I said at the start of my talk at LSA, I too worship at the altar of Daniel Kahneman. But I have always wondered about my own experience of actual decision-making, and the regress that has to occur when I reflect on a course of action. I can attribute my first, fast reaction to intuitive "System 1" thinking, and then slow down. I can use "System 2" thinking to reflect on the issue, and perhaps even think about Kahneman's recommendation: "When we can replace human judgment with a formula, we should at least consider it." (Thinking, Fast and Slow, p. 233.)
Well, I've considered it, but what is the formula for replacing the heuristic about deciding whether to go with System 1 or System 2? Because Kahneman also thinks System 1 thinking produces many good results. (TFaS, p. 416.) And so on, all the way down.
The seed of the present paper was to assess whether disclosure remedies or Nudge-like libertarian paternalism actually got to the root of the decision-making problem in view of this seeming irreducibility. Sometime in the last couple of weeks, provoked by a Mike Madison blog post and our subsequent offline dialogue, the seed came out of the larva (ick, mixed metaphors!) and morphed into something else: the difference between two-handed lawyering ("on one hand, but on the other hand") and the commitment to action undertaken by business people, and particularly entrepreneurs. In a nutshell, there's a point at which you stop thinking and you act, and decision is metaphorically closer to action than it is to thought. That's not a natural act if you are "thinking like a lawyer."
Wednesday, June 13, 2012
I just read a short essay at the Big Think, "The Dangers of Being Smart," that reminded me of nearly every faculty workshop I have ever attended. In a nutshell, brilliant people -- and law faculties are full of them -- can wax eloquent on cognitive bias, yet the deft ability to describe and comprehend does little to enable brilliant people to rein in the bias. In fact, being smart can be disadvantageous because we fall in love with the beauty and nuance of our own rationalizations and justifications.
This really hit home for me because I have witnessed hundreds of examples in which data are never sought out or consulted because the brilliant lawyers were so persuaded by their own reasoning. (And I stipulate that I am sure I have done this many times myself.) Or worse, good but not perfect data are dismissed because a lawyer or law professor could theorize a plausible flaw in the sample or methodology. The glee in finding the flaw then short-circuits the right response, which is a simple discussion of probability -- that is, what is more or less likely based on all available evidence.
The Big Think essay reminded me of Dan Kahneman's Thinking, Fast and Slow, especially the section titled "Overconfidence." Kahneman, a psychologist, won the Nobel Prize in economics because he, along with his co-author Amos Tversky, identified several predictable, recurring cognitive biases in human decision-making.
Kahneman later revealed that the basis for their breakthrough research was errors they detected in their own judgments. “People thought we were studying stupidity,” said Kahneman. “But we were not. We were studying ourselves.” For a wonderful primer on Kahneman's unusual worldview, see this Michael Lewis essay.
[posted by Bill Henderson]