Wednesday, February 11, 2015
There have been several discussions in the legal blogosphere about Does Experiential Learning Improve JD Employment Outcomes? by Jason W. Yackee (e.g., here and here). Because I find the article to be limited and flawed, I write further about it today.
Before I begin, here is the abstract:
"This short paper provides an empirical examination of the link between law school experiential (or 'skills') learning opportunities and JD employment outcomes. The current 'law school crisis' poses a number of serious challenges to the legal academy, and how law schools should respond is hotly debated. One common suggestion is that law schools should reform their curriculum to emphasize the development of practical skills through experiential learning, rather than emphasize what is described as the impractical, theory- and doctrine-heavy book learning of the traditional law school curriculum. Employers are said to be more likely to hire those with substantial skills training. This paper provides a simple empirical examination of that basic hypothesis. To summarize the paper's key finding: there is no statistical relationship between law school opportunities for skills training and JD employment outcomes. In contrast, employment outcomes do seem to be strongly related to law school prestige."
To begin, the title, the abstract, and the first sentence of the article are all misleading. The article does not encompass all experiential learning; it is only a limited study of the effect of offering clinics on employment outcomes. The problem is that this framing allows bloggers and others who have not thoroughly read the article to make proclamations about experiential education that are not based on facts.
In addition, those of us who advocate legal education reform are not doing so only to improve job outcomes; we want law schools to graduate better lawyers. As Professor Michael Risch has written (here), "Of course, increasing clinical education because it leads to better learning outcomes may make the investment worthwhile." Concerning his own law school's clinics, he added, "we've ramped up our efforts, opening two new clinics, expanded externship opportunities, and created week-long modules targeted at real-world learning. Why? Because our students asked for it, our employers asked for it, and we think it will help our students be better lawyers earlier (even if I'm skeptical about being ready on day one)."
Professor Jamie Baker Roskie similarly wrote in the comments to Risch's post, "This is why I think it's a bad idea to focus on clinics as a direct cause of better employment numbers. It's shortsighted, although understandable from a budgetary perspective. But to me, as a former clinician, the real issue is that we are failing to train good lawyers. This creates a crisis in access to justice and client service. That's why clinics are important. As a profession we need to move away from this ridiculous outdated model of 'let's try to get all our graduates jobs in prestigious firms.' We're so far past that - our graduates are leaving the practice of law altogether (if they even start there) and meanwhile there's a huge crisis out there in terms of clients being able to access services. The discussion needs to be much much broader."
Professor Risch also criticized the paper: "But can clinics make a marginal difference? I don't think this paper gets us there because it is only a cross-section. While prestige and state unemployment explain a lot, there are obviously many other factors, such as state population. Further, changing enrollment will affect outcomes longitudinally. For example, Villanova has reduced its class size while increasing its clinical offerings. Assuming fixed employment demand, both employment rates and clinical position/JD population rates will increase, but I would be hard pressed to say that it is the clinics alone that cause any employment outcome changes."
Professor Brian Galle also has criticized Yackee's study (here): "The problem with this interpretation is that the causation story could well be backwards. Schools whose graduates are struggling may believe that clinical training will help---or, at least, they believe that applicants will think it helps. That would give us the negative correlation Jason observes. Dealing with this kind of endogeneity problem in data this coarse will be tough. As a first pass, though, it would be nice to see a dynamic model in which we can see the evolution of job outcomes and rankings over time. If lagged clinical offerings predict job outcomes, but not vice-versa, that would support Jason’s story a bit more."
He adds, "I’d also like to see some efforts at refining the data. Jason uses single-year U.S. News rankings, or alternatively peer-reputation rankings, as his measures of the school’s reputation on employment outcomes. We know that rankings are noisy, especially below the T-14, so a rolling average would probably make more sense. And why not use the professional-reputation score, which presumably represents the views of some employers? A more thorough version would also include alternate measures of placement outcomes, lagged placement outcomes, employment conditions (e.g., use employment rates for major MSA's where graduates go, not the rate for where the school is) and alternate measures of clinical positions available.
"There’s also a big omitted variable, which is student ability. U.S. News ranking is of course correlated with that, but not perfectly (some schools in desirable job markets have higher LSAT averages than schools ranked well above them in U.S. News, e.g.). At a minimum, I’d want to see 25/75 percentile LSAT scores and a measure of student body diversity (which I think is usually a positive for employers)."
Finally, Professor Yackee makes a very questionable statement concerning the reason for poor job outcomes at certain law schools. He notes that Northeastern and Washington and Lee both heavily market their skills-based programs but have poor employment outcomes: "Both schools nonetheless under-perform in employment outcomes, Washington and Lee greatly so."
This is a questionable statement to make in a statistical study. Professor Yackee has not eliminated other possible causes of the poor employment outcomes at these law schools; correlation does not equal causation. Equally important, are two instances statistically significant? And has the experiential program at Washington and Lee existed long enough for statisticians to draw conclusions about its effect on employment outcomes?
In sum, while I find Professor Yackee's study to be flawed (and overstated), it is important to undertake such studies. The scientific community frequently conducts multiple studies of the same subject in order to support or refute the original findings. We need multiple studies of legal education reform to see whether it is working and, if so, how. I have great confidence in legal education reform because the learning techniques being used and proposed have been shown to work in other fields of learning. However, it is also important to do studies specifically on legal education.