January 29, 2013
Washington & Lee is Biggest Legal Education Story of 2013
Here it is in a nutshell. There is empirical evidence that Washington & Lee’s experiential 3L curriculum is delivering a significantly better education to 3L students—significantly better than prior graduating classes at W&L, and significantly better than W&L’s primary competitors. Moreover, at a time when total law school applicants are on the decline, W&L’s getting more than its historical share of applicants and getting a much higher yield. When many schools are worried about revenues to survive next year and the year after, W&L is worried about creating the bandwidth needed to educate the surplus of students who enrolled in the fall of 2012, and the backlog of applicants that the school deferred to the fall of 2013.
[This is a long essay. If you want it in PDF format, click here.]
Alas, now we know: there is a market for high-quality legal education. It consists of college graduates who don't want to cast their lot with law schools that cannot guarantee students entry to meaningful practical training. Some might argue that W&L is not objectively better -- that the 3L curriculum is a marketing ploy where the reality falls well short of the promotional materials and that, regardless, prospective students can't judge quality.
Well, in fact, there is substantial evidence that the W&L 3L program delivers comparative value. The evidence is based on several years' worth of data from the Law School Survey of Student Engagement (LSSSE). I received permission from Professor James Moliterno, who took a leadership role in building W&L's third-year program, to share some of the key results (each school controls access to its LSSSE data). They are below.
But before getting into empirical evidence, I want to put squarely on the table the most sobering finding that likely applies to virtually all of legal education. It is this: On several key LSSSE metrics, W&L has made impressive gains vis-à-vis its own historical benchmarks and its primary rival schools. But even for this leader, there remains enormous room for improvement. More on that below.
Here is the bottom line: Traditional legal education, when it is measured, does not fare very well. Yet, as W&L shows, substantial improvement is clearly possible. We law professors can respond to this information in one of two ways:
- Don’t measure, as it may disconfirm our belief that we are delivering a great education.
- Measure—even when it hurts—and improve.
I am in the second camp. Indeed, I don't know if improvement is possible without measurement. Are we judging artwork, or the acquisition of key professional skills needed for the benefit of clients and the advancement of the public good?
Moving the Market
I doubt I will ever forget Jim Moliterno’s September 2012 presentation at the Educating Tomorrow’s Lawyers (ETL) conference at the University of Denver. He presented a single graph (chart below) showing W&L actual applicant volumes since 2008 versus what would have happened at W&L if its applicant volume had followed national trends.
While law school applicants crested a few years ago, W&L enjoyed a large run-up in applicant volume, presumably due to the launch of its new 3L program. This larger applicant pool effectively served as a buffer when applicant declines began in 2011 and 2012. Since 2008, overall law school applicants are down 19%, yet W&L is up 33%.
But much more significantly, after their experiential 3L year was up and running and the overall legal job market continued to stagnate, W&L's yield spiked. Ordinarily the school would enroll 135 students. But for the fall of 2012, it received enrollment commitments from well over 260 students. Indeed, at the ETL conference Jim Moliterno said the school had to offer financially attractive deferments to get the class down to approximately 185 incoming students -- a 50-student bulge.
When Jim Moliterno showed the above graph and explained the corresponding changes in yield, my good friend Gillian Hadfield, a skeptical, tough-minded, evidence-demanding economist who teaches at USC Law, leaned over and said to me, "that is the single most important takeaway from this entire conference." I agreed. The market for a legal education with practical training is, apparently, much more inelastic than the market for traditional JD programs.
Yet, what is perhaps most remarkable is that a large proportion of incoming students at W&L were enrolling based on little more than faith. Nobody knew for sure if W&L had the ability to pull off their ambitious 3L curriculum. The program relies on a large cadre of adjunct professors, after all, and W&L is located in remote Lexington, Virginia. Many law faculty outside of W&L, and perhaps some inside, thought (or perhaps think) that the program could not live up to the hype. Well, as shown below, the program appears to have produced meaningful gains.
The only data-driven critique anyone can muster is that the gains remain significantly short of perfection. But that critique bites harder on the rest of us. To use a simple metaphor, W&L is tooling around in a Model T while the rest of us rely on the horse and buggy. What ought to be plain to all of us, however, is that, just like the automobile industry circa 1910, we are entering a period of staggering transformation that will last decades. And transformation will be roughly equal parts creation and destruction. See Schumpeter.
W&L Data, Internal Historical Benchmark
LSSSE is a phenomenally rich dataset – nearly 100 questions per year on a wide variety of topics related to student classroom experience, faculty interaction, type and quantity of assessments, time allocation, and perceived gains on a variety of dimensions related to personal and professional development. The survey instrument is online here.
Aside from a host of questions related to demographics, career goals, and debt, major sections in the LSSSE include:
- Section 1, Intellectual Experience (20 questions)
- Section 2, Examinations (1 question)
- Section 3, Mental Activities (5 questions)
- Section 4, Writing (3 questions)
- Section 5, Enriching Educational Experiences (9 questions)
- Section 6, Student Satisfaction (7 questions)
- Section 7, Time Usage (11 questions)
- Section 8, Law School Environment (10 questions)
- Section 9, Quality of Relationships (3 questions)
- Section 10, Educational and Personal Growth (16 questions)
W&L deserves to be a detailed case study. But frankly, legal education can't wait. So I will do the best I can to cover the landscape in a blog post. I hope every law faculty member who reads this post makes a strong plea to their dean to enroll in LSSSE. Why? So your school can benchmark itself against the detailed LSSSE case studies that are bound to flow out of W&L and other innovative law schools. Though they don't get much press, there are, in fact, other innovative law schools.

The dataset I have for W&L covers 2004, 2008, and 2012. This is the same data that Jim Moliterno briefly shared at the ETL conference. I have put them into bar charts so that readers can see the scores on several questions at once. Two important interpretive notes:
- LSSSE is especially useful when an entire class (1L, 2L, or 3L cohort) experiences a curricular change. This happened with Indiana Law's 1L Legal Professions class. It is also happening here, as all W&L 3L students had the benefit of the experiential 3L curriculum. Assuming nothing else significant has changed (a safe assumption when it comes to legal education), the classwide change enables a simple "event study" analysis.
- W&L LSSSE scores for 2004 and 2008 are much more alike than they are different. The big differences appear between 2008 and 2012. So that is what I discuss below.
Section 1 differences are displayed below (3L students only). Click on the chart to enlarge.
The big takeaway here is that W&L gained in 17 out of 20 categories. Because Section 1 is scored on a 4-point scale, just like a traditional academic grading system, we can analyze the data using something akin to an LSSSE Section GPA. W&L's Section 1 GPA for 2008 was 2.52, which is essentially on the C+/B- cut point. Only one factor -- communicated with faculty via email -- was meaningfully above a 3.0.
We can contrast that with a 2.85 GPA for 2012, which is in the B-/B territory. W&L's overall average increased by .33 points, and six measures are above 3.0. It experienced the biggest gains on the following:
- +.77, Put together ideas or concepts from different courses when completing assignments or during class discussions.
- +.75, Participated in a clinical or pro bono project as part of a course or for academic credit.
- +.53, Put together ideas or concepts from different courses when completing assignments or during class discussions.
- +.51, Worked with classmates outside of class to prepare class assignments.
- +.49, Prepared two or more drafts of a paper or assignment before turning it in.
- +.47, Discussed assignments with a faculty member.
- +.44, Used email to communicate with a faculty member (now a 3.65).
- +.43, Talked about career plans or job search activities with a faculty member or advisor.
- +.41, Worked with other students on projects during class.
There is still enormous room for improvement, but W&L's 3L experiential program appears to have really moved the needle on the Section 1 Intellectual Experiences factors.
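To make the "Section GPA" arithmetic concrete, here is a minimal sketch. It simply averages the 1-4 scale item means across a section, just as one would average course grades. The item names and 2008/2012 levels are invented for illustration; only the 2012 email score of 3.65 comes from the post, and the per-item deltas are chosen to mirror three of the gains listed above (+.49, +.47, +.44).

```python
# Sketch of the "Section GPA" idea: the section score is the mean of the
# section's item means, each on a 1-4 scale. Values below are illustrative.
from statistics import mean

scores_2008 = {
    "prepared_multiple_drafts": 2.40,  # hypothetical
    "discussed_with_faculty":   2.30,  # hypothetical
    "email_with_faculty":       3.21,  # hypothetical (3.65 minus the +.44 gain)
}
scores_2012 = {
    "prepared_multiple_drafts": 2.89,  # +.49 vs. 2008
    "discussed_with_faculty":   2.77,  # +.47 vs. 2008
    "email_with_faculty":       3.65,  # the one figure taken from the post
}

gpa_2008 = mean(scores_2008.values())
gpa_2012 = mean(scores_2012.values())
print(f"2008 section GPA: {gpa_2008:.2f}")  # 2.64
print(f"2012 section GPA: {gpa_2012:.2f}")  # 3.10
print(f"gain: {gpa_2012 - gpa_2008:+.2f}")  # +0.47
```

The real calculation, of course, runs over all 20 Section 1 items rather than three; the mechanic is the same.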
W&L fares even better on Section 3, which covers the mental activities that ostensibly comprise "thinking like a lawyer." [Click on chart to enlarge]
As shown above, W&L 3Ls drop in only one category -- rote memorization for repeating on an exam. Surely, that pleases the W&L faculty. These are 3Ls after all. The overall Section 3 GPA, which excludes 3a, moves from 3.07 (B) to 3.41 (B+). Questions 3c through 3e are true higher-order lawyering skills. W&L ought to wheel out these data the next time some bar association claims that legal education is not accomplishing anything. At some places, maybe. But good things appear to be happening at W&L.
Washington & Lee shows similar gains in the other key LSSSE sections. If you are curious, you'll have to wait for the detailed W&L case study, which I hope will get written someday by someone at W&L. What is no doubt of greater interest to the broader legal education community, however, is how well W&L is doing against other law schools -- i.e., schools like us.
W&L Data, External Peer Benchmarks
LSSSE data are the property of the law schools who pay for the survey. The survey is designed to improve educational programming rather than create an industrywide ranking. Roughly 50% of law schools participate each year. Since its inception in 2003, 179 law schools have participated for at least one year.
Although the data are reported at the individual school-level, comparative benchmarks are a key part of the LSSSE value proposition. Comparative benchmarks include size, public/private, the total LSSSE sample, and a peer group specified by the school. For example, at Indiana, we might want to look at other Big 10 public law schools. We don't get to see our rivals' scores, individually, but we can get a group average for five or more schools we select that are also participating in that specific year.
I am told that schools typically pick their peer groups based on similar rank, geography, applicant pool, and the like. I thought W&L's peer comparison would be the most relevant to show here.
Below are the 11 (out of 20) factors in LSSSE Section 1 on which W&L is higher than its peer benchmark at statistically significant levels. Again, the sample I am using here includes only 3Ls. [Click on to enlarge]
On these 11 benchmarks, W&L posts a "GPA" of 3.02 (B) versus 2.45 for the peers (C+). Again, W&L has plenty of room to grow, but relatively speaking, it is dramatically outperforming its competition.
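For readers curious what "statistically significant" means mechanically in these benchmarks, the comparison can be sketched as a two-sample test on an item's responses. Everything below is invented, and the use of Welch's t statistic is my assumption for illustration, not LSSSE's documented method.

```python
# Sketch of the kind of two-sample comparison that could underlie a
# "statistically significant" W&L vs. peer-group difference on one item.
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical 1-4 scale responses on a single Section 1 item
wl    = [3, 4, 3, 3, 4, 4, 3, 4, 3, 4]   # W&L 3Ls
peers = [2, 3, 2, 3, 2, 3, 3, 2, 2, 3]   # peer-school 3Ls

t = welch_t(wl, peers)
print(f"W&L mean {mean(wl):.2f} vs peer mean {mean(peers):.2f}, t = {t:.2f}")
# With real data, t would be referred to a t distribution to get a p-value.
```

In a real analysis one would use the full per-student response data and an appropriate reference distribution; the point is only that the p < .001 thresholds reported here come from comparisons of this general shape.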
What about those critical Section 3 Mental Activities that comprise "thinking like a lawyer"? Again, W&L is outdistancing the competition. [Click to enlarge]
Section 4 pertains to writing. Ask any professional development coordinator in a law firm about the biggest weaknesses of incoming associates, and you'll get a near unanimous reply: "writing." Well, the best way to become a better legal writer is to write. How do W&L 3Ls do on that front? 3L students at W&L write a ton. [See chart below, click on to enlarge.]
W&L 3Ls write roughly the same number of 20-page papers as those at peer schools, but in the 1-4 and 5-19 page categories, W&L 3Ls surge ahead of the competition at statistically significant levels. In the above chart, the 3.27 score for papers in the 5-19 page range corresponds to 6-7 medium-length papers during the 3L year. Peers, in contrast, are at roughly 3 medium-length papers. The 3.68 score in the 1-4 page category also equals roughly 7 short papers during the 3L year; peers write roughly half that number, roughly 3-4 short papers.
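The conversion from a mean scale score (like 3.27) to an approximate paper count can be sketched as interpolation between the midpoints of the survey's frequency bins. The bins and midpoints below are invented for illustration; the actual LSSSE response options differ, so these numbers are only meant to show the mechanic, not to reproduce the post's figures exactly.

```python
# Hypothetical sketch: turn a mean score on a 1-5 categorical frequency
# item into an approximate count by interpolating between bin midpoints.
# The bins (e.g., 1 = "none", 2 = "1-2 papers", ...) are INVENTED here.
bin_midpoints = {1: 0, 2: 1.5, 3: 5, 4: 10, 5: 15}

def approx_count(score):
    """Linearly interpolate a mean scale score between adjacent bin midpoints."""
    lo = int(score)
    if lo >= max(bin_midpoints):
        return bin_midpoints[max(bin_midpoints)]
    frac = score - lo
    return bin_midpoints[lo] + frac * (bin_midpoints[lo + 1] - bin_midpoints[lo])

print(round(approx_count(3.27), 1))  # 6.3 -- roughly 6-7 papers under these bins
print(round(approx_count(3.0), 1))   # 5.0
```

Any such conversion is approximate by construction, since a categorical scale only brackets the true count.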
Section 7 covers time usage. Not surprisingly, W&L 3Ls spend more time prepping for classes beyond just reading assigned text -- roughly 7 hours more per week. [See chart below, click on to enlarge.]
Section 9 focuses on the quality of relationships within the school. W&L 3Ls rate their relationships with faculty and administration quite high -- indeed, higher at statistically significant levels than 3Ls at W&L's peer schools. [See chart below, click on to enlarge.]
Finally, Section 10 asks a series of questions related to how well the law school experience has contributed to the student's knowledge, skill and personal development. [See chart below, click on to enlarge.]
On 10 of 15 questions, W&L is posting higher scores than its competition -- all at statistically significant levels. But as I noted above, there remains room for improvement. W&L's Section 10 "GPA" is 2.99 (B). Its competitors' GPA is 2.7 (B-).
There are three takeaways from this blog post:
- A sizeable number of prospective students really do care about practical skills training and are voting with their feet. W&L has therefore become a big winner in the race for applicants.
- W&L's 3L experiential curriculum is a substantial improvement over the curriculum W&L offered in 2004 and 2008; moreover, there is room for even more improvement.
- There is substantial evidence that W&L, with some modest focused energy on the curriculum, is now offering a better educational experience than its peer schools -- albeit, the current grade is a "B" at best for W&L and likely lower for the rest of us. We all, therefore, have a lot of work to do.
The example of the Washington & Lee 3L experiential year ought to be a watershed for legal education. We can no longer afford to ignore data. Through LSSSE, high quality comparative data are cheap and comprehensive. And that information, as we have seen, can significantly improve the value of a legal education.
[Posted by Bill Henderson]
This really should come as no surprise to anyone. The influence of Robert E. Lee continues even until today.
Posted by: willis | Jan 29, 2013 8:19:43 AM
Thank you Bill so much for posting this very important analysis, as well as for all of your work over the past year or two.
Posted by: Ralph Brill | Jan 29, 2013 2:12:23 PM
It is a long post, and I may have missed something, but am I right that the "empirical evidence" referred to is entirely student surveys in which they assess their educations? If so, is that enough evidence to support the quite immodest claim that "Washington & Lee’s experiential 3L curriculum is delivering a significantly better education to 3L students"? I could see it supporting the more modest claim that "students seem to believe that they are doing more of a variety of activities that we suspect are useful to law school education."
Posted by: Stephen | Jan 29, 2013 3:14:42 PM
This is NOT empirical evidence that it is successful. That would require data about success as practicing lawyers. The data only show that people in the legal academy like it.
Posted by: Henry H Perritt, Jr. | Jan 29, 2013 3:56:41 PM
I think you're missing the easiest explanation for the spike in enrollment: increased scholarships. I don't have access to any official data, but from what I can glean W&L is offering significantly more money to admits. Combined with the low cost of living and proximity to a desirable market, I think this is the easiest way to square the enrollment numbers with the national trends.
The best thing I could find to support this is data from lawschoolnumbers.com (a website where students input their numbers and how their admissions cycle played out). This link shows the admission rates for all law schools for current 1Ls who scored between the 25/75s of W&L's GPA and LSAT: http://myLSN.info/7fh3h2. This is the same info for the 25/75 numbers for current 2Ls http://myLSN.info/hrg9iq. I chose not to include underrepresented minorities as well as early decision applicants. If you look at the average scholarship amount along with the admit rate, W&L is offering significantly more money than most other schools.
You may have access to better numbers and I'd be eager to hear what they show.
I doubt it's included in the LSSSE data, but I'm confident that if you asked students whether they'd have $10,000+ in additional scholarships to a school or an innovative 3L curriculum they would take the money and run.
Posted by: Anon | Jan 29, 2013 6:56:17 PM
To Stephen and Henry,
With all due respect, do you really think that people in their 20s with IQs in the Mensa range, who are spending three years and $100K for an education, can't discern whether they are learning something? The LSSSE is built on a huge literature on how people learn -- that is the basis for those questions. Whether it is W&L on an historical basis, or W&L versus its peers, the folks receiving the W&L 3L education appear to be gaining on these indices of learning.
Moreover, smart young people applying to law school don't necessarily need the data to make these decisions -- it just makes sense that experiential education in what lawyers do is better than a 3L year comprised mostly of lectures and seminars. It is, literally, common sense. So, as the data show, students are voting in big numbers with their feet. In this admissions environment, that means W&L will have enough students with good credentials. And many other schools won't.
You are arguing against both science and probability. It is time to give up the ghost. bh.
Posted by: Bill Henderson | Jan 29, 2013 9:47:19 PM
Bill--This is fascinating; thanks for posting. One question: how much of this could just be a version of the Hawthorne effect? That is, it is not so much W&L's new program which is improving student outcomes, as the implementation of a new, significantly different program, period?
Posted by: hw | Jan 30, 2013 3:57:11 AM
What about bar passage rates? I haven't seen anyone discuss this, but the W&L July 2012 Virginia bar passage rate looks unusually low to me: http://valawyersweekly.com/vlwblog/2012/10/16/1136-pass-july-2012-bar-exam/#more-7527
Is there an explanation for this that is unrelated to the new 3d year curriculum?
Posted by: anon | Jan 30, 2013 5:23:48 AM
I thought we had moved to a "science" of outcome assessment. Surveying students and learning that they are writing X number of pages tells us they are writing X number of pages. It does not tell us what that *produces*. And this comes from someone who "just senses," and indeed, believes, that writing more is important. I just have trouble calling my "commonsense" impression "science." I also have trouble extrapolating from my "sense" (or the sense of others) to spectacular statements like "this is evidence of a better education."
Regarding applications and yield, aren't they evidence of something else entirely? None of those students participated in the "better education" W&L purportedly provides, nor the LSSSE surveying. They were simply recipients of marketing material promoting that education (and perhaps other things, as another commenter notes).
I think experimenting with the curriculum is terrific, and I think it's indisputable that we can do better. I'm also a fan, however, of circumspection, and tying evidence to conclusions. It seems that students like the new program, and that's important. I don't think that necessarily means it's "better," which strikes me as a decidedly non-scientific word.
Posted by: Stephen | Jan 30, 2013 10:19:57 AM
hw, re a Hawthorne effect, I think it all washes out, and there is no Hawthorne effect that could account for these results. Specifically, the LSSSE operates the same at every school: a standard survey is sent out every April. The measurement at W&L was no different than the measurement at the peers; likewise, the measurement in 2004, 2008, and 2012 at W&L was administered the same way. So if measurement changes behavior, then it applies to all LSSSE schools in all years. I hope this is responsive to your query. bh.
Posted by: Bill Henderson | Jan 30, 2013 12:03:33 PM
Interesting post, as always. Much food for thought.
However, there is a lot here about "traditional JD curricula" and the declaration that "the rest of us rely on horse-and-buggy." Do you draw that from a rich, focused comparison of curricula at other law schools? If so, do tell. My impression is that a number of law schools are revisiting their 3d-year curricula in some meaningful ways. Horse-and-buggy may be a tad hyperbolic, no?
Posted by: dan rodriguez | Jan 30, 2013 4:20:56 PM
100% fair point. Two responses:
1. I do say there are other innovative law schools, and they have failed to get their full stories told -- NWU Law is in that camp for sure.
2. But there are many law schools -- and I hear and see it all the time -- where major initiatives stagnate in faculty committees. I draw an inference on the proportion from the highly significant results presented in the W&L case study. We just would not see p-values < .001 if 1/3 or 1/4 of law schools were meaningful innovators (the peer comparisons are very, very similar to W&L against the LSSSE population as a whole). So, W&L is just outperforming 90-95% of law schools, at least when it comes to the 3L year.
I would love to write more LSSSE success stories. If a school gives me their LSSSE data, I will write it up.
Best regards, Bill H
Posted by: Bill Henderson | Jan 31, 2013 3:49:00 AM
The Educating Tomorrow's Lawyers Website features innovative law schools at http://educatingtomorrowslawyers.du.edu/schools/. They discuss innovative classes at http://educatingtomorrowslawyers.du.edu/course-portfolios/.
Posted by: Scott Fruehwald | Jan 31, 2013 8:39:53 AM
Just a note about the qualitative versus quantitative nature of the LSSSE questions. I would agree with those who suggest that students have limited capacity to evaluate whether their education is good or bad, effective or ineffective. But with only a couple of exceptions, the LSSSE questions do not ask students to evaluate. They ask students questions that are well within the students' capacity to answer accurately: how many papers of various lengths did you write? how often do you apply law to real world problems? how many hours do you give to school work per week? how often do you work with others on projects? how often do you come to class unprepared? etc. These questions are also unlikely to pose Hawthorne effect problems.
Posted by: Jim Moliterno | Feb 1, 2013 6:34:30 AM
The gist of the numbers seems to be that 3Ls at W&L self-report being busier and more engaged with their 3L year than 3Ls self-reported at W&L four years ago. We also have data that W&L 3Ls self-report better on some metrics than do 3Ls at an unnamed group of schools that W&L has selected for comparison but (as far as I can tell) has not identified. Bill, can you say more about how the data show that W&L's curriculum has delivered on its promise?
Posted by: Orin Kerr | Feb 4, 2013 9:29:28 AM
All of capitalism operates capably on market research that is far thinner than what W&L has done. Why are W&L students' reported scores so much higher than historical benchmarks or peer benchmarks? The p-values are often < .001, and the gains are on many dimensions. If the 3L curriculum did not produce these gains, what did?
Your reference to "self-report" seems to suggest that the data cannot be trusted. A survey goes out in April and students answer it. Yet 3Ls at W&L in April of 2012 see the world far differently than prior W&L 3Ls (2004 and 2008 are very similar) and 3Ls at peer schools. High-ability students in their third year of law school who are investing their own time and money into their educations are more than capable of reporting what they do during their law school experience. Whatever bias might exist in the data is common to all law schools; I cannot think of any bias that would attach only to W&L 3Ls in the year 2012 (W&L's 1L and 2L scores are not nearly as good as the 3L scores).
I am a big believer in Occam's Razor. Students find the 3L curriculum more engaging, interactive, and practical; this provides a basis for giving the school higher LSSSE scores; LSSSE questions are in the survey because these measures have been found to produce learning gains. This explanation needs no further elaboration.
What is the alternative explanation for the jump in 2012 W&L 3L LSSSE scores? Whatever it might be, it is neither simple nor highly probable.
Regarding the disclosure of the reference group, I don't have the names of the schools (moreover, it would be unseemly to publish them, as it would discourage LSSSE participation). At Indiana, our reference group tends to be other Tier 1 schools that are relatively geographically close (east of the Mississippi), ideally schools where we compete for applicants. This very pragmatic approach is common among LSSSE schools. That said, I have worked with LSSSE data long enough to see, over and over again, that what is highly statistically significant against peers or schools of similar size is also highly statistically significant against the entire LSSSE sample.
In summary, W&L's gains are large, rare and, therefore, likely right.
Posted by: William Henderson | Feb 4, 2013 11:32:11 AM
Thanks for the response. Having read your answer, though, I still don't know why you reason from what the data shows to what your conclusions are. The closest you come to an explanation is your statement that "LSSSE questions are in the survey because these measures have been found to produce learning gains. This explanation needs no further elaboration." Maybe I'm missing something that everyone else gets, but I suppose I do want further elaboration: In particular, who found that these measures produce learning gains, and based on what evidence?
Posted by: Orin Kerr | Feb 4, 2013 12:33:43 PM
LSSSE is part of the Center for Postsecondary Research, which houses a wide variety of programs that promote educational effectiveness and institutional improvement through the collection and analysis of data. LSSSE, like many other national surveys conducted at the Center, is based on decades of research in adult learning. That may sound complicated, but a lot of it boils down to simple stuff: more engagement is better than less. Learning by doing is more effective than listening to teachers talk about doing.
At W&L, we see indices of that:
-- a lot more writing
-- a lot more time on task
-- a lot more interaction with faculty
-- a lot more comprehension that the knowledge learned applies to the real world and practical problems, often by doing rather than observing
-- a lot more collaboration with classmates
When a school (law or otherwise) makes a curricular change, it can benchmark the effect by picking LSSSE questions (ideally in advance) that it believes should move up if the curriculum change is effective. W&L more than passes this test because the needle moved by such a large amount, in so many related areas, and only for the 3L year.
I hope this explanation is adequate. Bill H.
Posted by: Bill Henderson | Feb 4, 2013 8:10:23 PM
Bill, I don't want to test your patience, but I'm still stuck on the questions I was asking earlier. In response to my question, "who found that these measures produce learning gains, and based on what evidence," you respond that the survey is "based on decades of research in adult learning." Maybe I should trust you without asking questions rather than get to know why the data proves what you're suggesting. But I guess I'm stuck on wanting to know the data. Anyway, thanks for responding.
Posted by: Orin Kerr | Feb 4, 2013 10:03:42 PM
Orin, I think Bill's answers have been entirely on point and are more than sufficient in themselves, but I plan to weigh in further in the coming days.
For now, I would say that when students say (and we know from our interactions with them) that they are doing (or spending or having) as Bill summarized
"-- a lot more writing
-- a lot more time on task
-- a lot more interaction with faculty
-- a lot more comprehension that the knowledge learned applies to the real world and practical problems, often by doing rather than observing
-- a lot more collaboration with classmates," I conclude they are making gains.
I understand that if we had them writing about irrelevant matters without instruction, if the time on task were spent on macrame drills, if the additional interaction with faculty were with an incompetent faculty, if the application of law to real-world problems were led by inept instructors, and if the collaboration with classmates were collaboration on where the party should be that night, there would be no gain. I assure you and others that at W&L, with its first-rate faculty, the time, the writing, the collaboration, and the application to real-world problems are well-designed and well-spent. Students are spending more time learning from us and engaging in work that the profession regards as the core of what makes a productive lawyer. If that does not produce educational gains, I am not sure what any of us is paid to do.
There is plenty of educational research that says that time spent under expert supervision and guidance doing the tasks that are valuable to a particular profession or occupation produces gains.
Posted by: Jim Moliterno | Feb 5, 2013 4:56:59 PM