Sunday, March 2, 2014
DECLINING ENROLLMENT – Between fall 2012 and fall 2013, the 199 law schools in the 48 contiguous states and Hawaii (excluding the Puerto Rican schools) accredited by the ABA’s Section of Legal Education and Admissions to the Bar experienced the following first-year enrollment changes:
25 schools had a decline in first-year enrollment of 25% or more,
34 schools had a decline in first-year enrollment of 15% to 24.99%,
44 schools had a decline in first-year enrollment of 5% to 14.99%,
62 schools had “flat” first-year enrollment of -4.99% to 4.99%,
19 schools had an increase in first-year enrollment of 5% to 14.99%, and
15 schools had an increase in first-year enrollment of 15% or more.
Overall, more than half (103) had a decrease in first-year enrollment of at least 5%, while roughly 17% (34) had an increase in first-year enrollment of at least 5%.
Across these 199 schools, first-year enrollment declined from 42,590 to 39,109, a decrease of 8.2%. The average decline in first-year enrollment across U.S. News “tiers” of law schools was 2.6% among top 50 schools, 8.2% among schools ranked 51-99, 7.7% among schools ranked 100-144 and 7.9% among schools ranked alphabetically.
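The arithmetic behind these figures is straightforward; the sketch below uses only the two enrollment totals quoted above, and the bucketing function simply mirrors the categories in the list (any per-school inputs would be hypothetical, since this post reports only the bucket counts):

```python
# Sanity check on the overall decline, using the totals quoted in the post.
# The bucket() categories mirror the 2012-2013 list above.

def pct_change(old, new):
    """Percentage change from old to new (negative = decline)."""
    return (new - old) / old * 100

def bucket(change):
    """Classify a school's first-year enrollment change, given in percent."""
    if change <= -25:
        return "decline of 25% or more"
    elif change <= -15:
        return "decline of 15% to 24.99%"
    elif change <= -5:
        return "decline of 5% to 14.99%"
    elif change < 5:
        return "flat (-4.99% to 4.99%)"
    elif change < 15:
        return "increase of 5% to 14.99%"
    else:
        return "increase of 15% or more"

overall = pct_change(42590, 39109)
print(f"Overall change, fall 2012 to fall 2013: {overall:.1f}%")  # -8.2%
```

Note that the overall 8.2% decline lands in the middle "decline of 5% to 14.99%" bucket, consistent with the observation that more than half of schools declined by at least 5%.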
Between fall 2010 and fall 2013, the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (excluding Belmont, La Verne, California-Irvine, and Massachusetts-Dartmouth) experienced the following first-year enrollment changes:
28 schools had a decline in first-year enrollment of 40% or more,
29 schools had a decline in first-year enrollment of 30% to 39.99%,
43 schools had a decline in first-year enrollment of 20% to 29.99%,
43 schools had a decline in first-year enrollment of 10% to 19.99%,
36 schools had a decline in first-year enrollment of 0% to 9.99%,
10 schools had an increase in first-year enrollment of 0.01% to 9.99%, and
6 schools had an increase in first-year enrollment of 10% or more.
Overall, more than half (100) had a decrease in first-year enrollment of at least 20%, while only roughly 8% (16) had any increase in first-year enrollment.
Across these 195 schools, first-year enrollment declined from 50,408 to 38,773, a drop of 23.1%. The average decline in first-year enrollment across U.S. News “tiers” of law schools was 14.7% among top 50 schools, 22.5% among schools ranked 51-99, 22.8% among schools ranked 100-144, and 26.8% among schools ranked alphabetically.
DECLINING PROFILES -- Across the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (thus excluding Belmont, La Verne, California-Irvine, and Massachusetts-Dartmouth), the entering first-year class average LSAT profile fell one point at all three measures between 2012 and 2013, from 159.6/157/153.5 to 158.6/156/152.5. Between 2010 and 2013, the profile fell roughly two points at all three measures, from 160.5/158.1/155.2 to 158.6/156/152.5.
The average decline in median LSAT scores between 2012 and 2013 across U.S. News “tiers” of law schools was 0.98 among top 50 schools, 1.18 among schools ranked 51-99, 0.72 among schools ranked 100-144, and 1.13 among schools ranked alphabetically.
Notably, 133 law schools saw a decline in their median LSAT between 2012 and 2013, with 80 down one point, 38 down two points, 12 down three points, one down four points, one down five points and one down six points, while 54 law schools were flat and 7 saw an increase in their median LSAT.
In terms of schools experiencing “larger” declines in median LSAT scores between 2012 and 2013, five schools in the top 50 saw a three-point decline in their median LSAT, five schools ranked 51-99 saw at least a three-point decline (of which one was down four points), three schools ranked 100-144 saw a three-point decline, and two schools ranked alphabetically saw large declines – one of five points and one of six points.
The average decline in median LSAT scores between 2010 and 2013 across U.S. News “tiers” of law schools was 1.54 among top 50 schools, 2.27 among schools ranked 51-99, 2.11 among schools ranked 100-144, and 2.79 among schools ranked alphabetically. If one were to unpack the top 50 schools a little more, however, one would discover that the top 20 schools saw an average decline in their median LSAT of 1.05 between 2010 and 2013, while the bottom 15 schools in the top 50 saw an average decline in their median LSAT of 2.53.
In terms of schools experiencing “larger” declines in median LSAT scores between 2010 and 2013, three schools in the top 50 have seen declines of four or more points, nine schools ranked 51-99 have seen declines of four or more points, 11 schools ranked 100-144 have seen declines of four or more points and 17 schools ranked alphabetically have seen declines of four or more points.
When looking at the 2012-13 data in comparison with the 2010-2013 data, one sees that lower-ranked schools have faced a more sustained challenge in managing their profiles over the last few years. Schools ranked in the top 50 or top 100, by contrast, had been managing profile fairly well until fall 2013, when the decreased number of high-LSAT applicants really began to impact the LSAT profiles of highly ranked schools.
The overall decline in the LSAT profile of first-year students also can be demonstrated with two other reference points. In 2010, there were 74 law schools with a median LSAT of 160; by 2013, that number had fallen to 56. At the other end of the spectrum, in 2010 there were only 9 schools with a median LSAT of less than 150 and only one with a median LSAT of 145. By 2013, the number of law schools with a median LSAT of less than 150 had more than tripled to 32, while the number of law schools with a median LSAT of 145 or less had grown to 9 (with the low now being 143).
CONCLUDING THOUGHTS – Over the last three years, few schools have had the luxury of being able to hold enrollment (or come close to holding enrollment) and being able to hold profile (or come close to holding profile). Many schools have found themselves in a “pick your poison” scenario. A number of schools have picked profile and made an effort to hold profile or come close to holding profile by absorbing significant declines in first-year enrollment (and the corresponding loss of revenue). By contrast, a number of schools have picked enrollment and made an effort to hold enrollment or come close to holding enrollment (and maintaining revenue) but at the expense of absorbing a significant decline in LSAT profile. Some schools, however, haven’t even been able to pick their poison. For these schools, the last three years have presented something of a double whammy, as the schools have experienced both significant declines in first-year enrollment (and the corresponding loss of revenue) and significant declines in profile.
Tuesday, February 4, 2014
Is Experiential Legal Education Better? And if so, at What Cost?
I think the answer is yes. But, unfortunately, in virtually all of the debate surrounding legal education, there is a tremendous lack of clarity and precision about how we assess improvements in quality. And equally relevant: if a gain is real, was it worth the cost?
The purpose of this essay is to chip away at this serious conceptual gap. Until this gap is filled, experiential education will fall significantly short of its potential.
Many legal educators believe that if we had more clinics, externships, and skills courses in law school, legal education would be better. Why? Because this more diversified curriculum would become more "experiential."
Inside the legal education echo chamber, we often accept this claim as self-evident. The logic runs something like this. A competent lawyer needs domain knowledge + practical skills + a fiduciary disposition (i.e., the lawyer’s needs are subservient to the needs of clients and the rule of law). Since practical skills—and some would argue, a fiduciary disposition—cannot be effectively acquired through traditional Socratic or lecture teaching methods, the ostensible logic is that schools become better by embracing the "learning-by-doing" experiential approach.
That may be true. I would bet on it. But the per-unit cost of legal education is probably going up as well. So have we really created a viable and sustainable long-term improvement to legal education?
In my mind, the questions we should be asking instead are the following: (1) Among experiential teaching methods, which ones are the most effective at accelerating professional development? And (2) among these options, how much does each cost to operate? Quality and cost must be assessed simultaneously. After they are evaluated, then we will be able to make choices and tradeoffs.
Let's start with quality, which I define as moving lawyers toward their peak effectiveness potential as rapidly and cost-effectively as possible. This is an education design problem, as we are trying to find the right combination of education (building domain knowledge) and experience (acquiring and honing skills through practice). There is also likely to be an optimal way to sequence the various educational and experiential steps.
Creating Compelling Evidence of Educational Quality
We legal educators have many ideas on how to improve educational quality, but we make no real progress if employers and students remain unconvinced. Can it be shown that because of a specific type of experiential curriculum at School X, its graduates are, during the first few years of practice, more capable lawyers than graduates of School Y?
[Side bar: If you are skeptical of this market test, it is worth noting that it was the preferences of law firm employers that gave rise to the existing national law school hierarchy. It happened about 100 years ago, when a handful of law schools adopted the case method, required undergraduate education as a prerequisite to admission, and hired scholars as teachers. As a general matter, this was a far better education than a practitioner reading lecture notes at the local YMCA. See William Henderson, "Successful Lawyer Skills and Behaviors," in Essential Qualities of the Professional Lawyer ch. 5 (P. Haskins ed., 2013).]
If a law school can produce, on balance, a better caliber of graduates than its competitors, then we are getting somewhere. As this information diffuses, employers (who want lawyers who make their lives easier) will preference law schools with the better graduates, and law students (who want more and better career options) will follow suit. Until we have this level of conceptual and empirical clarity, we might as well be debating art or literature.
If students and employers are responding to particular curricula, it is reasonable to assume they are responding to perceived value (i.e., quality as a function of price). I believe there are three steps needed to create a legal education curriculum that truly moves the market.
1. Clarity on Goals. We need to understand the knowledge, skills, and behaviors that are highly prized by legal and non-legal employers. Truth be told, this is tacit knowledge in most workplaces. It is hard intellectual work to translate tacit knowledge into something explicit that can be communicated and taught. But we are educators -- that is our job! If we think employers are missing something essential, we can add in additional factors. That's our job, too.
2. Designing and Building the Program. Working backwards from our goals, let's design and build curricula that will, overall, accelerate development toward those goals. This is harder and more rigorous than lesson planning from a casebook.
3. Communicating Value to the Market. If our program is indeed better, employers and students need to know it. This requires a crisp, accurate message, a receptive audience, and sustained planning and effort. That said, if our program truly is producing more effective lawyers, it logically follows that our graduates (i.e., the more effective lawyers) will be the most effective way to communicate that message.
Regarding point #3, in simple, practical terms, how would this work?
During the 1L year, we show our law students the roadmap we have developed (step #2) and spend the next two years filling in the knowledge, skills, and behaviors needed to achieve their career goals. This professional development process would be documented through a portfolio of work. This would enable students to communicate specific examples of initiative, collaborative learning, problem-solving, or a fiduciary disposition, etc., developed during law school. Students would also know their weaknesses and have a clear plan for their future professional development. In short, they'd stand out from other law graduates because, as a group, they would be much more intentional and self-directed (i.e., they'd know where they are going and how to get there).
With such a curriculum in place, our law school would collaborate with employers to assess the performance of our graduates. By implication, the reference point for assessing quality would be graduates from other law schools. When our graduates fare better, future graduates will be more heavily recruited. Why? Because when employers hire from our school, they would be more likely to get a lawyer who helps peers and clients while adding immediate enterprise value.
I suspect that many of my legal academic colleagues would argue the best law schools are not trade schools -- I 100% agree. But I am not talking about a trade school model. Rather, a world-class law school creates skilled problem-solvers who combine theory with practice and a fiduciary disposition. Graduates of a world-class law school would be reliably smart, competent, and trustworthy. This is a very difficult endeavor. It takes time, planning, collaboration, creativity and hard work. But the benefits are personal, organizational, and societal.
At a practical level, I think few law schools have targeted this goal with a full, unbridled institutional commitment. But the opportunity exists.
When I got tenure in 2009, I decided that I was going to spend the next several years doing applied research. I am a fact guy. Rather than argue that something is, or is not, better, I prefer to spend my time and effort gathering evidence and following the data. I am also a practical guy. The world is headed in this direction, thanks to the ubiquity of data in the digital age. And, on balance, that is a good thing because it has the potential to reduce conflict.
I have pursued applied work in two ways: (1) building stuff (curricula, selection systems, lawyer development tools, datasets for making strategic decisions, etc.) and assessing how well it works, and (2) observing and measuring the work of others.
A Law School Curriculum Worth Measuring
A couple of years ago, a unique applied research opportunity fell into my lap. I had a series of lengthy discussions on the future of legal education with Emily Spieler, who was then serving as dean of Northeastern University School of Law in Boston, a position she held for over a decade. One of the raps on legal education is that it is more alike than it is different. In fact, this very point was just made by the ABA Task Force on Legal Education. See ABA Task Force on the Future of Legal Education, Report and Recommendations (Jan. 2014) at 2.
Emily, in contrast, said her school was unique -- that the curriculum better prepared students for practice and enabled them to make better career planning decisions. Also, Emily stated that Northeastern students were more sensitized to the needs of clients and the privilege and burden of being a lawyer--specifically, that Northeastern grads become aware, before graduation, that their own lack of competency and diligence has real-world consequences for real-world people. And that reality weighed on students' minds.
Tall claims. But if Northeastern could deliver those outcomes more effectively than the traditional unstructured law school curriculum, I wanted to know about it.
On a purely structural level, Northeastern Law is definitely unique. Most law schools are organized around either quarters (University of Chicago, my alma mater) or semesters (Indiana University, where I teach). Northeastern, however, has both. The 1L curriculum at Northeastern follows the traditional two-semester model. But after that, the school flips to quarters, alternating one quarter in law school with one quarter in a cooperative placement with a legal employer, such as a judge, a prosecutor’s office, a law firm, a corporate legal department, or a public interest organization.
This classroom/coop sequence occurs four times over eight quarters. Because the cooperative placement is not counted as part of Northeastern's ABA-required course work -- all the contact hours are packed into the two 1L semesters and four 2L/3L classroom quarters -- students can be paid during cooperative placements. And in any given coop quarter, roughly 30 to 40% of students are getting paid.
This system has been up and running for 45 years--over 5,000 students have become lawyers through this program. What an amazing research opportunity!
Now imagine the faculty meeting where law professors get together to deliberate over whether to adopt the Northeastern model. At Northeastern, "summer" means summer quarter, not summer vacation.
How did this unique curricular structure come into being? It is quite an interesting story. During the 1950s, the law school at Northeastern was shuttered. Later, reflecting the zeitgeist of the era, a group of Northeastern law alumni and young lawyers who were skeptical of their own legal education (at elite national law schools) petitioned Northeastern to reopen the law school and feature a more progressive, forward-looking curriculum. The university administration agreed to reopen the law school on the condition that it adopt the signature cooperative education model. So this crucial decision was essentially made at the rebirth of the law school over four decades ago. Once up and running, Northeastern Law implemented other innovations, such as its narrative grading policy -- no letter grades and no GPA -- adopted in order to mitigate competition and encourage a focus on collaboration and skills development.
The Outcomes Assessment Project
Back in 2011, my conversations with Emily Spieler eventually led me to make a two-day pilgrimage to Boston to talk with Northeastern Law faculty, students, administrators, and coop employers. Suffice it to say, I was surprised by what I witnessed -- a truly differentiated legal education with a substantial alumni/ae base spanning 45 years.
That pilgrimage eventually led to my involvement in Northeastern Law's Outcomes Assessment Project (OAP), which is something akin to The After the JD Project, but limited in scope to Northeastern -- although Northeastern will provide all of the project tools and templates to other law schools interested in studying their own alumni. From the outset, the OAP has been set up to scale to other law schools.
There are lots of tricky methodological issues with Northeastern. For example,
- It has a longstanding public interest tradition; Northeastern Law is overrepresented in government service, public interest, and non-profit sectors (including a sizeable contingent of law professors and legal clinicians). See Research Bulletin No 1.
- Its student body was over 50% female almost from the outset, nearly 20 years before legal education as a whole reached that mark.
- Because of its progressive roots, GLBT law students have long been drawn to Northeastern Law -- again, nearly two decades before it was deemed safe to be out.
Because of this distinctive profile, we have to worry that any differences in graduates are primarily due to a selection effect (who applied and enrolled) versus a treatment effect (they got a different type of education). That said, the admissions data show that Northeastern Law students are, like other law students, strongly influenced by the US News rankings. If a student gets admitted to Northeastern Law and BC, BU, or Harvard Law, Northeastern seldom wins.
Over the coming months, I am going to use OAP data to attempt to develop some analytical and empirical clarity to some of the questions surrounding experiential education. Preliminary data from our Research Bulletin No 3 suggest that the coop program does remarkably well in developing the three apprenticeships identified by the Carnegie Report. More on that later.
Print version of this essay at JD Supra.
Sunday, November 24, 2013
Why the Difference in Response to Market Signals?
In Part One, I showed how an analysis of changes in applicants from LSAC’s Top 240 Feeder Schools demonstrates that graduates of more elite colleges and universities have abandoned legal education at a greater rate than graduates of less elite colleges and universities.
In Part Two, I showed how the pool of applicants to law school has shifted, with a greater decrease among applicants with high LSATs than among applicants with low LSATs, resulting in a corresponding increase in the number and percentage of matriculants with LSATs of <150.
What might explain why applicants to law school are down more significantly among graduates of more elite colleges and universities than among graduates of less elite colleges and universities? What might explain why applicants to law school are down more significantly among those with LSATs of 165+ than among those with LSATs of <150? Is there some relationship between these data points?
There likely is some relationship between these data points. Many of the more elite schools in the LSAC’s list of the Top 240 Feeder Schools have historically been schools whose graduates on average have higher LSAT scores compared with graduates from less elite schools. The LSAC’s 1995 publication, Legal Education at the Close of the Twentieth Century: Descriptions and Analyses of Students, Financing, and Professional Expectations and Attitudes, authored by Linda F. Wightman, discusses the characteristics of the population of students who entered law school in the fall of 1991. Roughly 31% of the students scoring in the top quarter in terms of LSAT came from very highly selective undergraduate schools, roughly 31% from highly selective undergraduate schools, and only 17% from the least selective undergraduate schools. Id. at page 38, Table 20. Thus, it is very likely that these two data points are related – that the greater decline among applicants from more elite colleges and universities is correlated directly with the greater decline among applicants with LSAT scores of 165+.
I want to offer three possible explanations for this differential response to market signals among different populations of prospective law students. The first two focus on the possibility that market signals are communicated differently to different populations. The third focuses on how different populations of prospective law students simply might respond to the same market signals in markedly different ways.
Different Pre-Law Advising Resources May Mean Market Signals Penetrate Some Populations of Prospective Law Students More Deeply Than Others. Focusing first on the nature of the feeder schools, one possibility is that access to pre-law advising resources differs across these categories of feeder schools, resulting in different messages about the cost of legal education and the diminished employment prospects for recent law school graduates being communicated to applicants from less elite colleges and universities than to applicants from more elite colleges and universities. Perhaps there are more robust pre-law advising programs at the elite colleges and universities, with pre-law advisors who really have their finger on the pulse of what is happening in legal education and the legal employment market. Perhaps these more robust programs are engaging in programming and advising that communicates more effectively to prospective law students the significant costs of legal education and the ways in which the challenging employment reality for law graduates makes that cost problematic. As a result, perhaps larger percentages of prospective law students at more elite colleges and universities are getting more information about the increasing costs and diminished employment prospects for law graduates and are deciding to wait to apply to law school or to pursue a different career entirely.
Alternatively, pre-law advisors may have different responses to market signals in thinking about their role in advising students. Perhaps pre-law advisors at more elite colleges and universities are more directive about discouraging students from considering law school while pre-law advisors at less elite colleges and universities are more inclined simply to support student interest in pursuing law school.
There clearly are disparate allocations of resources to pre-law advising across various colleges and universities, different levels of engagement among pre-law advisors and different perspectives on how directive one should be in advising students considering law school. That said, I am not sure these differences necessarily can be delineated in relation to the extent to which a college or university is considered an elite college or university or a less elite college or university. Moreover, with so much information now available on the internet, it is not clear that pre-law advisors are the primary source of information for prospective law students.
These hypotheses would benefit from being explored empirically. What are the relative pre-law advising resources at the schools down more than 30% in applicants between 2010 and 2012 relative to the pre-law advising resources at the schools down less than 10%? Are pre-law advisors at the colleges and universities down more than 30% in applicants between 2010 and 2012 more inclined to affirmatively discourage students from considering law school than pre-law advisors at colleges and universities down less than 10%? Were prospective students at these two categories of schools really receiving different messages about the employment situation for law graduates and the cost of law school?
Different Social Network Signals and Influences -- Another possibility might involve social network signals and influences. Significant empirical data indicate that, on average, different socio-economic populations attend different types of colleges and universities. Among those entering law school in fall 1991 from very highly selective undergraduate schools, nearly three times as many were from families of upper socio-economic status as from lower-middle socio-economic status. Legal Education at the Close of the Twentieth Century: Descriptions and Analyses of Students, Financing, and Professional Expectations and Attitudes, at page 38, Table 20. By contrast, among those entering law school in fall 1991 from the least selective undergraduate schools, nearly twice as many were from lower-middle socio-economic status as from upper socio-economic status. Id. Similarly, there is fairly significant empirical data indicating that different socio-economic populations generally attend different tiers of law schools, with more of the socio-economically elite at higher-ranked law schools and fewer of the socio-economically elite at lower-ranked law schools. Id. at pages 30-31, Table 15 and Figure 7; Richard H. Sander and Jane R. Bambauer, The Secret of My Success: How Status, Eliteness and School Performance Shape Legal Careers, 9 J. Empirical Legal Stud. 893, Table 2 (2012) (analyzing the After the JD dataset, a representative sample of law school graduates who took the bar in 2000).
Given this background, it would seem plausible that graduates of more elite colleges and universities on average represent more of an upper-income socio-economic population who may know more lawyers than graduates of less elite colleges and universities who may on average represent more of a middle class socio-economic population. The parents of graduates of more elite colleges and universities may be more likely to be lawyers and/or have friends who are lawyers. Thus, it is possible that graduates of more elite colleges and universities may be more likely to have received negative signals about the rising cost of legal education and the diminished employment prospects for law school graduates in recent years from family and friends than did their peers from less elite colleges and universities. This hypothesis also would benefit from being explored empirically.
Different Decision Matrices Based on Socio-Economic Status and Opportunity – Another possibility is that regardless of whether students across different types of feeder schools really are getting different messages about the costs of legal education and the challenging employment prospects for law school graduates, they simply may be making different decisions in response to that information. This hypothesis builds on the possibility that different populations of prospective law students may have different motivations for considering law school or may evaluate the value of a legal education using different parameters given different sets of options that might be available to them. It is possible that the market signals regarding employment of law graduates are more nuanced than we might generally appreciate.
For example, it may be that graduates of elite colleges and universities, who also tend to be among the socio-economic elite, have a variety of employment options coming out of college that are more attractive than law school at the moment given the diminished job prospects for law graduates in recent years. If these students generally value a law degree primarily because of the status associated with acquiring a “prestigious” job in a big firm upon graduating from law school, then the significant decline in big firm jobs might frame their analysis of the value proposition of law school. Changes in the legal employment marketplace, particularly significant declines in the number of positions with “prestigious” big firms, may have made the legal profession less attractive to the socio-economic elite, who may be able to pursue job opportunities in finance, investment banking, consulting, or technology, or meaningful public interest opportunities such as Teach for America, that are viewed favorably within their social network.
By contrast, for graduates of less elite colleges and universities, who generally are not from the socio-economic elite, fewer opportunities may be available in finance, investment banking, consulting, and technology. In addition, they may lack the financial flexibility to make Teach for America or other public interest opportunities viable. Moreover, this set of prospective law students may be motivated simply by becoming a lawyer and acquiring the status that comes with being a lawyer (even if they are not going to become a big firm lawyer, but are instead going to be a family law attorney, a public defender, or a workers’ comp attorney). This population may be less focused on big firm options, less concerned about the lack of jobs in that niche within the market, and may see any position within the legal profession as a path toward financial security and social status, despite the increasing costs of legal education and the diminished employment prospects of law graduates.
These hypotheses also may merit more empirical assessment. What are the graduates of more elite colleges and universities choosing to do in greater numbers as significantly smaller numbers apply to law school? Are there different motivations for pursuing law school among different socio-economic populations?
Regardless of the explanation for the current changes in application patterns, it would appear that the population of law students not only is shrinking, but may be going through a modest demographic transformation, with a somewhat smaller percentage of law students representing the socio-economic elite and a somewhat larger percentage of law students from lower on the socio-economic scale. First-year students in 2013 may be slightly less “blue blood” and slightly more “blue collar” than they were in 1991. Whether this is a short-term trend or a longer term reality remains to be seen. What it might mean for legal education and the legal profession over time also remains to be seen.
Thursday, October 17, 2013
Trends in LSAT Profiles of Applicants and Matriculants
In looking at trends over the last 12 years, there are two relevant time frames due to changes in how LSAC reported data. Between 2002 and 2009, the LSAC’s annual National Decision Profiles were based on the average LSAT scores of applicants and matriculants. From 2010 to the present, the National Decision Profiles were based on the highest LSAT scores of applicants and matriculants. This post compares trends in LSAT profiles between 2002 and 2009 with trends between 2010 and 2013, noting that the latter period not only has seen a decline in enrollment but also has seen a significant weakening of the overall LSAT profile of first-years.
Changes in LSAT Profiles from 2002-2009 Using Average LSAT
The following chart shows the difference in LSAT composition of first-years in three cycles between 2001-02 and 2008-09.
Matriculants by LSAT Category (Reflecting Average LSAT) 2002-2009
             165+      150-164    <150      Total
2001-02      5,889     30,100     9,097     45,086
2004-05      7,447     32,007     6,036     45,490
2008-09      7,652     31,991     8,943     48,586
In the three years between 2002 and 2005, applications grew by roughly 5,000, to roughly 95,000, with growth among those with an average LSAT of 165+ and an average LSAT of 150-164, and a modest decline among those with an average LSAT of <150. Law schools matriculated only 400 more first-years in 2005 than in 2002, but there were roughly 3,050 fewer first-year students with average LSATs of <150, 1,900 more first-years with average LSATs of 150-164, and roughly 1,550 more with average LSATs of 165+. This three-year period thus saw a strengthening of the LSAT profile of first-year students.
Four years later, with an applicant pool that had declined to nearly 87,000, however, law schools enrolled over 3,000 additional first-year students, 2,900 of whom had average LSATs of <150. Virtually all of the growth in first-years between 2005 and 2009, therefore, was comprised of students at the lower end of the LSAT profile.
Nonetheless, in comparison with the 2002 first-years, the 2009 first-years included slightly fewer students with an average LSAT of <150 (down 154 – 1.7%) and larger populations of students with average LSATs of 165+ (up 1,763 – nearly 30% more) and with average LSATs of 150-164 (up 1,891 – or roughly 6.3% more). In 2009, therefore, the average LSAT profile of all first-years, while less robust than in 2005, was still more robust than in 2002.
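For readers who want to check the arithmetic, a short Python sketch (using the matriculant counts from the chart above) reproduces these band-by-band changes:

```python
# Matriculant counts by average-LSAT band, from the chart above.
matriculants = {
    "2001-02": {"165+": 5889, "150-164": 30100, "<150": 9097},
    "2004-05": {"165+": 7447, "150-164": 32007, "<150": 6036},
    "2008-09": {"165+": 7652, "150-164": 31991, "<150": 8943},
}

def change(start, end, band):
    """Return (absolute change, percent change) for a band between two cycles."""
    before = matriculants[start][band]
    after = matriculants[end][band]
    return after - before, 100 * (after - before) / before

for band in ("165+", "150-164", "<150"):
    diff, pct = change("2001-02", "2008-09", band)
    print(f"{band}: {diff:+,} ({pct:+.1f}%)")
# 165+:    +1,763 (+29.9%)
# 150-164: +1,891 (+6.3%)
# <150:      -154 (-1.7%)
```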
Between 2004 and 2008, the ABA approved nine new law schools (with fall 2009 first-year enrollment in parentheses) – Atlanta’s John Marshall (211) and Western State (188) in 2005, Liberty (119), Faulkner (150) and Charleston (241) in 2006, Phoenix (272) in 2007, and Elon (121), Drexel (156) and Charlotte (276) in 2008. The first-year enrollment of these nine schools in Fall 2009 totaled 1,734, roughly 60% of the growth in matriculants with average LSATs of < 150 between 2005 and 2009. While many of the first-year students at these schools had LSATs of greater than 150, these schools took students who might have gone to other schools and increased the overall demand for applicants with average LSATs of <150.
Changes in LSAT Profiles from 2010-2013
The following chart focuses on the last three admissions cycles and the current admission cycle, covering the period in which the LSAC National Decision Profiles were based on each applicant’s highest LSAT score.
Applicants and Matriculants Across Three LSAT Categories Based on Highest LSAT from 2010 to 2013
Adm. Cycle   Total Apps.  Total Mat.*  165+ Apps.  165+ Mat.  150-164 Apps.  150-164 Mat.  <150 Apps.  <150 Mat.
Fall 2010    87,912       49,719       12,177      9,477      47,722         32,862        26,548      7,013
Fall 2011    78,474       45,616       11,190      8,952      41,435         29,220        24,396      7,101
Fall 2012    67,925       41,422       9,196       7,571      34,653         25,425        22,089      7,906
Fall 2013**  59,426       38,900       7,496       6,300      30,263         24,000        20,569      8,200
*Note that the total matriculants number is greater than the sum of the matriculants across the three categories in any given year because the total matriculants number includes non-standard test-takers and those without an LSAT.
**The Fall 2013 numbers represent estimates based on the number of applicants in each category and an assumption that 2013 saw another slight increase in the percentage of applicants from each LSAT category who matriculated (consistent with increases in the two previous years in response to the decreasing applicant pool).
During this period, the number of applicants declined by more than 28,000, or over 32%, but the number of applicants with a highest LSAT of 165+ declined by 38%, and the number with a highest LSAT of 150-164 declined by 36.5%, while the number with a highest LSAT of <150 declined by only 22.5%. Thus, the pool of applicants in the 2012-13 admissions cycle is not only smaller than in 2009-10, but also "weaker" in terms of LSAT profile.
The number of matriculants in the top two LSAT categories also declined significantly between Fall 2010 and Fall 2012, while the number of matriculants in the bottom LSAT category actually grew.
The number of matriculants whose highest LSAT score was 165+ fell from 9,477 in 2010 to 7,571 in 2012, a decline of over 20%, even as the percentage of applicants in this category who became matriculants increased from 78% to 80% to 82% over that period. If we estimate that 84% of the 2013 applicants with a highest LSAT of 165+ matriculate, then we can anticipate roughly 6,300 matriculants for Fall 2013 with a highest LSAT of 165+, a drop of roughly 33% since 2010.
The number of matriculants whose highest LSAT score was 150-164 fell from 32,862 in 2010 to 25,425 in 2012, a decline of nearly 23%, while the percentage of applicants in this category who became matriculants increased from 69% to 70.5% to 73% over that period. If we estimate that roughly 79% of the applicants with a highest LSAT of 150-164 matriculate, then we can anticipate roughly 24,000 matriculants for Fall 2013 with an LSAT of 150-164, a decline of roughly 27% since Fall 2010.
Meanwhile, the number of matriculants whose highest LSAT score was <150 grew from roughly 7,000 to over 7,900, an increase of roughly 13%, while the percentage of applicants in this category who became matriculants increased from 26% to 29% to 36% over that period. If we estimate that roughly 40% of the applicants with a highest LSAT of <150 matriculate, then we can anticipate roughly 8,200 matriculants with an LSAT of <150 for Fall 2013, an increase of roughly 17% since Fall 2010.
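The estimation approach in the three paragraphs above can be sketched in a few lines: apply an assumed Fall 2013 matriculation rate to the known Fall 2013 applicant counts in each band. The rates (84%, 79%, 40%) are the post's assumptions, extrapolated from the rising 2010-2012 rates, not observed data:

```python
# Known Fall 2013 applicant counts by highest-LSAT band (from the table above).
applicants_2013 = {"165+": 7496, "150-164": 30263, "<150": 20569}

# Assumed Fall 2013 matriculation rates, extrapolated from 2010-2012 trends.
assumed_rate = {"165+": 0.84, "150-164": 0.79, "<150": 0.40}

# Estimated matriculants, rounded to the nearest hundred.
estimated = {band: round(n * assumed_rate[band], -2)
             for band, n in applicants_2013.items()}
print(estimated)  # values land near the post's rounded figures of 6,300 / ~24,000 / 8,200
```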
Percentage of First-Years from Each LSAT Category Using Highest LSAT -- 2010-2013*
        165+     150-164    <150
2010    19.1%    66.1%      14.1%
2011    19.6%    64.1%      15.6%
2012    18.3%    61.4%      19.1%
2013    16.2%    61.7%      21.1%
*The sum of the percentages in any given year will be slightly less than 100% because the denominator -- total matriculants -- includes matriculants with non-standard LSAT scores and those with no LSAT.
This table shows that, if my estimates for 2013 are roughly accurate, the percentage of matriculants whose highest LSAT score was 165+ declined between Fall 2010 and Fall 2013 by roughly 15% in relative terms (from about 19% to about 16%), and the percentage whose highest LSAT was 150-164 declined by roughly 7% (from about 66% to about 62%), while the percentage whose highest LSAT was <150 increased by roughly 50% (from about 14% to about 21%).
Adjusting from Highest LSAT to Average LSAT to Compare 2002 and 2013
The switch in the 2009-10 admissions cycle from average LSAT to highest LSAT resulted in an increase of roughly 1,800 matriculants with scores of 165+ between Fall 2009 and Fall 2010. Given that there had been a modest increase in the number of matriculants with an average LSAT of 165+ between 2008 and 2009 (roughly 600, from 7,023 to 7,652), it might be fair to assume another modest increase between 2009 and 2010, given the challenging economic environment at the time and the continued growth in applications. Assume, then, that of the 1,800 additional matriculants with scores of 165+, 400 would still have fallen in the category had it been defined by average LSAT rather than highest LSAT. To estimate the number of matriculants with an average LSAT of 165+ in 2010, it therefore makes sense to subtract 1,400 matriculants from the number with a highest LSAT of 165+ in 2010, and then, for the next three years, apply the same percentage change reflected in the highest-LSAT 165+ counts over those three years.
The change to highest LSAT rather than average LSAT also resulted in a drop of roughly 1,900 in the number of matriculants with an LSAT <150 between 2009 and 2010. Notably, the number of applicants and matriculants with an average LSAT <150 had grown slightly between 2007 and 2009 (applicants from 29,123 to 29,926, matriculants from 7,013 to 7,906). Nonetheless, to err on the conservative side, assume that the number of matriculants with an average LSAT <150 actually declined from Fall 2009 to Fall 2010 rather than continuing to increase modestly, say by roughly 5%, or 400 (rather than 1,900). To estimate the number of matriculants with an average LSAT of <150 in Fall 2010, we would then add roughly 1,500 matriculants to the number with a highest LSAT of <150, and, for the next three years, apply the same percentage change reflected in the highest-LSAT <150 counts over those three years.
Using these assumptions, the estimated number of first-years with an average LSAT of 165+ would fall to roughly 5,400 as of Fall 2013, while the estimated number of first-years with an average LSAT of <150 would rise to over 9,800 in Fall 2013.
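The adjustment described in the preceding paragraphs can be sketched as follows. The offsets of -1,400 and +1,500 are the assumptions stated above, not observed data:

```python
# Fall 2010 and (estimated) Fall 2013 matriculant counts by highest-LSAT band.
highest_2010 = {"165+": 9477, "<150": 7013}
highest_2013 = {"165+": 6300, "<150": 8200}

# Assumed offsets to convert the 2010 highest-LSAT counts to average-LSAT counts.
adjustment = {"165+": -1400, "<150": +1500}

results = {}
for band in highest_2010:
    avg_2010 = highest_2010[band] + adjustment[band]       # adjusted 2010 base
    ratio = highest_2013[band] / highest_2010[band]        # 2010-to-2013 change
    results[band] = round(avg_2010 * ratio)
print(results)
# 165+ comes out near 5,400 and <150 near 9,950, consistent with the
# "roughly 5,400" and "over 9,800" figures above.
```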
If the estimates above are close to accurate, then the number of Fall 2013 matriculants with an average LSAT score of 165+ represents roughly 14% of Fall 2013 matriculants (a slightly higher percentage than in Fall 2002), while the number of Fall 2013 matriculants with an average LSAT of <150 represents over 25% of Fall 2013 matriculants (a much higher percentage than in Fall 2002). The following chart shows the percentage of matriculants for the period from 2002-2013 taking into account the estimates set forth in the preceding paragraph regarding the number of matriculants with an average LSAT in each range over the period from 2010-2013.
This graph shows that the percentage of matriculants with an average LSAT of 165+ has varied between roughly 13% and roughly 17% over the period from 2002-2013, and appears to have returned in Fall 2013 to a percentage only slightly higher than in Fall 2002. By contrast, the chart also shows that the percentage of matriculants with an average LSAT of <150 had varied between roughly 13% and roughly 19% until the Fall 2012 and Fall 2013 groups of matriculants, when it increased to roughly 22% (in 2012) and over 25% (in 2013). While the graph does not include the percentage of matriculants with average LSATs of 150-164, one can infer that percentage as the difference between 100% and the sum of the 165+ and <150 percentages. For the period between 2002 and 2011, it generally hovered between 65% and 70%, but in the last two years it has fallen closer to 60%.
This shift in LSAT profile is further evidenced by changes in LSAT profiles among first-year entering classes between 2010 and 2013. For Fall 2010, there were only nine law schools with a median LSAT of 149 or lower (using highest LSAT for reporting purposes). For Fall 2011, there were 14 law schools with a median LSAT of 149 or lower. For Fall 2012, there were 21 law schools with a median LSAT of 149 or lower. That number may grow to nearly 30 when official data is published next spring on the Fall 2013 entering class.
If one uses the LSAT profile as an indicator of the “strength” of a given class of first-year students, and uses the framework set forth above for looking at the LSAT profile, then in the last three years we not only have seen first-year enrollment shrink by roughly 10,000 students, but also have seen a significant “weakening” of the LSAT profile. In terms of LSAT profile, the Fall 2013 entering class is almost certainly the weakest of any class going back to Fall 2002. This may impact the classroom experience at some law schools and may impact bar passage results when the Fall 2013 entering class graduates in 2016.
Why the Differential Response to Market Signals by Different Populations of Prospective Law Students?
What might explain the extent to which different populations of prospective law students have responded to market signals in such different ways, with those from elite college and universities and those with higher LSATs turning away from law school more than those from less elite colleges and universities and those with lower LSATs? In Part Three I will explore some possible explanations.
Friday, October 11, 2013
Analysis of Differential Declines in Law School Applicants Among Top-240 Feeder Schools
Some people recently have noted the decline in applications to law school from graduates of relatively elite colleges and universities - here and here. This suggests that different populations of potential applicants to law school are responding differently to market signals about the cost of legal education and the diminished employment prospects for law school graduates in recent years. In this blog posting, I analyze the changes in applications among the LSAC's Top 240 Feeder Schools between 2010 and 2012, documenting the extent to which the response to market signals about legal education has differed between graduates of elite colleges and universities and graduates of less elite colleges and universities. In Part Two, I will look at a different set of data regarding changes in LSAT profiles of applicants. In Part Three, I will offer some possible explanations for the different responses to market signals among different groups of applicants.
Between 2010 and 2012, the total number of applicants from the Top 240 Feeder Schools fell from 55,818 to 42,825. In both years, the Top 240 Feeder Schools were responsible for roughly 63% of the total pool of applicants (63.5% of 87,900 in 2010 and 63.1% of 67,900 in 2012). But the decline in applications was not uniform across all of the Top 240 Feeder Schools. There are a few different ways one can look at this information to get a sense of the different responses among different populations of potential applicants.
Differential Declines Among Feeder Schools with Law Schools Ranked in Different Tiers
First, one can look at declines across the Top 240 Feeder Schools that have law schools.
One might surmise that potential applicants who are graduates of colleges and universities with a law school might be particularly well aware of the increasing costs of legal education and the challenging employment environment for recent law school graduates and assume that feeder schools with law schools would generally see similar declines in applications. In fact, however, the percentage decline in applications between 2010 and 2012 varied significantly by the ranking of the law school at the feeder school.
Among feeder schools with law schools ranked 1-50 in the most recent U.S. News rankings, the average percentage decline in applicants between Fall 2010 and Fall 2012 was 28.08%. Among feeder schools with law schools ranked 51-100, the average decline in applicants was 20.27%. Among feeder schools with law schools ranked 101-146, the average decline in applicants was 18.14%. But among feeder schools with law schools that are ranked alphabetically, the average decline in applicants was only 3.31%.
Given that most of the top ranked law schools are at colleges and universities that also are considered elite colleges and universities, and most of the alphabetically ranked law schools are at colleges and universities that are not considered elite, this analysis suggests that graduates of elite colleges and universities are responding to the market signals regarding legal education differently than graduates of less elite colleges and universities. (This may seem particularly paradoxical, given that the percentage decline in applicants generally is greater at colleges and universities with more highly ranked law schools (whose graduates generally experience more promising employment outcomes), while the percentage decline in applicants is lowest at colleges and universities with less highly ranked law schools (whose graduates generally experience less promising employment outcomes).)
Comparisons of Outlier Schools – Those Schools More than One Standard Deviation from the Mean
Second, one can look at “outlier” schools and see how negative outliers compare to positive outliers. The average percentage decline in applicants across the Top 240 Feeder Schools between 2010 and 2012 was 19.76%. The standard deviation was 18.67%. How do those schools more than one standard deviation from the mean compare with each other?
There are a total of 13 schools that saw a decline in applicants between 2010 and 2012 putting them below the mean by more than one standard deviation – schools with a decline in applications greater than 38.44%. There are a total of 26 schools that saw an increase in applications or such a modest decline in applications that their increase/decline was more than one standard deviation above the mean – a decline of less than 1.09% or an increase. How do these schools compare?
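The cutoff arithmetic above is straightforward; a minimal sketch using the reported mean and standard deviation (the post's 38.44% presumably reflects rounding of the unrounded underlying values):

```python
# Reported statistics across the Top 240 Feeder Schools, 2010-2012.
mean_decline = 19.76   # average percentage decline in applicants
std_dev = 18.67        # standard deviation of the declines

# Schools more than one SD *worse* than the mean: decline above this cutoff.
lower_cutoff = mean_decline + std_dev   # ~38.43% (post reports 38.44%)

# Schools more than one SD *better* than the mean: decline below this cutoff
# (i.e., a decline of less than ~1.09%, or any increase).
upper_cutoff = mean_decline - std_dev   # ~1.09%

print(round(lower_cutoff, 2), round(upper_cutoff, 2))
```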
Eight of the 13 feeder schools that saw the most significant declines in applications had a law school; the average rank of those eight law schools was 69. (These schools include NYU (6), Virginia (7), Cornell (13), George Mason (41), Marquette (94), Akron (119), Loyola (New Orleans) (126), and Univ. of San Fran. (144). Four of the eight were top-50 law schools, while none were ranked alphabetically.)
Thirteen of the 26 feeder schools that saw the least significant declines in applications (or saw increases in applications) had a law school, including four that were ranked alphabetically. Among just the nine law schools in this category that are ranked, the average rank is 104. (These schools include Denver (64), UNLV (68), Loyola (Chicago) (76), Rutgers (91), Florida International (105), Wyoming (113), CUNY (132), Southern Illinois (140), and Suffolk (144), along with Florida A & M, North Carolina Central, Nova Southeastern, and Southern (all alphabetical). Notably, only four of the thirteen were ranked in the top-100 law schools (none in the top-50).)
Again, in this analysis, with a few exceptions, those feeder schools that saw significant declines in applicants generally represent a more elite slice of American colleges and universities, while those with the most nominal declines in applicants (or increases in applicants) generally represent a less elite slice of American colleges and universities.
Outliers More Broadly – Comparing Schools with Declines Greater than 30% and Less than 10%
Third, if one wanted to look at a broader pool of feeder schools at the bottom and the top, one could look at all schools down 30% or more in applicants and all schools that were down 10% or less in applicants between 2010 and 2012 (roughly 10% above and below the mean), two sets that account for nearly half of the Top 240 Feeder Schools.
There were 68 schools down 30% or more in applicants, 46 of which had a law school, of which 29 were ranked in the top-50, with only one school ranked alphabetically. The average rank of the 45 numerically ranked law schools was 48. The other 22 feeder schools in this category include several highly regarded schools – including, for example, Rice, Vassar, Miami University, Brown, Amherst, Johns Hopkins and Princeton.
There were 51 schools with a decrease in applicants of 10% or less, 25 of which had law schools; only two of those were ranked in the top-50, while six were ranked alphabetically. The average rank of the 19 numerically ranked law schools was 94. The other 26 feeder schools in this category are mostly less elite colleges and universities – including, for example, Kennesaw State University, University of Texas at San Antonio, and Florida Gulf Coast University, along with University of Phoenix and Kaplan University.
All three approaches to analyzing the changes in applicants among the Top-240 Feeder Schools point in the same direction. Graduates of elite colleges and universities are opting not to apply to law school at a greater rate than graduates of less elite colleges and universities. One might suppose that this translates to a greater decline in the number of applicants and matriculants with really high LSATs (165 or above) as compared to those with relatively low LSATs (149 and below). In Part 2, I explore whether this supposition is accurate.
Posted by Jerry Organ
Wednesday, October 2, 2013
Because the U.S. News & World Report ranking era has been associated with so much turmoil and bad behavior, many of us in legal education tend to think of the magazine as the source of our woes. In fact, the evidence compiled in a new paper on SSRN, "Enduring Hierarchies in American Legal Education," suggests that our desire (or propensity) to establish a legal education pecking order predates the U.S. News rankings by a century or so. Vanity of vanities, all is vanity -- at least that is what the data seem to suggest.
My brilliant and industrious colleagues, Funmi Arewa and Andy Morriss, led the charge on this. For many, a major contribution of this research will be the detailed 40+ tables compiled at the end of the article. Now that all that fact-gathering work is done, others can use it. Below is the paper's abstract:
Although much attention has been paid to U.S. News & World Report’s rankings of U.S. law schools, the hierarchy it describes is a long-standing one rather than a recent innovation. In this Article, we show the presence of a consistent hierarchy of U.S. law schools from the 1930s to the present, provide a categorization of law schools for use in research on trends in legal education, and examine the impact of U.S. News’s introduction of a national, ordinal ranking on this established hierarchy. The Article examines the impact of such hierarchies for a range of decision-making in law school contexts, including the role of hierarchies in promotion, tenure, publication, and admissions, for employers in hiring, and for prospective law students in choosing a law school. This Article concludes with suggestions for ways the legal academy can move beyond existing hierarchies and at the same time address issues of pressing concern in the legal education sector. Finally, the Article provides a categorization of law schools across time that can serve as a basis for future empirical work on trends in legal education and scholarship.
Posted by Bill Henderson
Wednesday, July 3, 2013
As a result of the ABA’s revisions to Standard 509, Consumer Information, there is now a much greater universe of publicly available information about law school scholarship programs, specifically conditional scholarship programs and scholarship retention. Based on a review of law school websites conducted between March 19 and May 29, 2013, I have compiled a complete list of schools with conditional scholarship programs, with only one-year scholarships, with good standing (or guaranteed) scholarships and with only need-based scholarships.
The availability of this data now gives each admitted scholarship recipient some meaningful basis for assessing the likelihood that any given scholarship will be renewed. (That said, within a given cohort of conditional scholarship recipients at a given school, those at the top end of the entering class profile likely retain their scholarships at a higher percentage than reflected in the law school's overall data while those further down the class profile likely retain their scholarships at a lower percentage than reflected in the law school's overall data.)
What do we know about the conditional scholarship programs in place for students entering law school in 2011-12? There were 140 schools with conditional scholarship programs. The average retention rate across all law schools was 69%. In total, 12,735 students who entered law school in the fall of 2011 and continued into their second year of law school at the same school entered with conditional scholarships and 4,387 students lost those scholarships, a retention rate across individual students of 66%. Across the 194 law schools on which I compiled data, the Fall 2011 entering first-year class totaled 46,233, so roughly 27.5% of the students in the Fall 2011 entering first-year class were on conditional scholarships and roughly 9.5% of the students in the Fall 2011 entering first-year class failed to retain their conditional scholarship as they moved into the second year of law school.
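The retention figures above follow directly from the reported counts; a short sketch of the arithmetic:

```python
# Reported Fall 2011 figures from the conditional-scholarship review above.
conditional = 12735   # students who entered with conditional scholarships
lost = 4387           # students who lost those scholarships after year one
first_years = 46233   # total Fall 2011 first-years across 194 schools

retention = (conditional - lost) / conditional
print(f"{retention:.0%}")                  # share of recipients who kept aid, ~66%
print(f"{conditional / first_years:.1%}")  # share of the class on conditional aid, ~27.5%
print(f"{lost / first_years:.1%}")         # share of the class that lost aid, ~9.5%
```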
The distribution of scholarship retention rates by deciles across all 140 schools reporting conditional scholarship programs is set forth in Table 1. Table 1 shows the largest number of law schools grouped around the overall average retention rate, with 30 law schools in the 60-69% range and 24 law schools in the 70-79% range; nearly 40 percent of law schools with conditional scholarships fall in these two ranges. Interestingly, the decile range of 90% or better is the second largest decile range, with 26 law schools (nearly half of which are ranked 50 or better in the USNEWS ranking). Notably, 23 law schools had scholarship retention rates of less than 50%.
Table 1: Number of Law Schools Reporting Retention Rates by Decile Range
Less than 40% (8 schools): four were law schools ranked alphabetically
40%-49% (15 schools): eight were law schools ranked between 50 and 99
50%-59% (20 schools): 16 were law schools ranked 100 or lower, while only two were in the top 50
60%-69% (30 schools): 23 were law schools ranked 100 or lower, while only one was in the top 50
70%-79% (24 schools): 13 were law schools ranked in the top 100, but only three of those were in the top 50
80%-89% (17 schools): 12 were law schools ranked between 50 and 145
90% or better (26 schools): 12 were law schools ranked in the top 50
As shown in Table 2, law schools ranked in the top-50 in the U.S. News 2012 rankings had the smallest percentage of law schools with conditional scholarship programs: only 20 law schools – 40% – had conditional scholarship programs, directly impacting only 1,674 students who had conditional scholarships (12.8% of the 13,109 first-year students at these law schools) and only 192 who failed to retain their scholarships (11.5% of the 1,674 conditional scholarship recipients and only 1.5% of the 13,109 first-year students). By contrast, across the balance of law schools, over 80% had conditional scholarship programs, with 11,061 of the 33,124 first-year students (33.4%) having conditional scholarships and 4,195 (37.9% of those on scholarship and 12.7% of first-years at the balance of law schools) losing their scholarships after their first year of law school.
Table 2: Number and Percentage of First-Year Students in 2011 Having Conditional Scholarships and Losing Conditional Scholarships by US News Rankings Categories
Top 50 Law Schools: 1,674 first-years with conditional scholarships (12.8% of all first-year students in top-50 schools); 192 did not retain them (11.5% of conditional scholarship recipients and 1.5% of first-years)
Law Schools Ranked 51-100: 4,176 first-years with conditional scholarships (36% of all first-year students in schools 51-100); 1,454 did not retain them (34.8% of conditional scholarship recipients and 12.5% of first-years)
Law Schools Ranked 101-146: 2,754 first-years with conditional scholarships (29.6% of all first-year students in schools 101-146); 1,044 did not retain them (37.9% of conditional scholarship recipients and 11.2% of first-years)
Law Schools Ranked Alphabetically: 4,131 first-years with conditional scholarships (33.6% of all first-year students at alphabetically-ranked schools); 1,697 did not retain them (41% of conditional scholarship recipients and 13.7% of first-years)
A number of law schools switched to non-conditional scholarship programs for 2012-13 or will be switching to non-conditional scholarship programs for the 2013-14 academic year. As a result, for the 2013-14 academic year, there will be 131 law schools with conditional scholarship programs, five law schools with non-renewable one-year scholarships, four that only offer need-based scholarships, and 54 law schools with good standing (or guaranteed) scholarships. Of the 194 schools on which I was gathering information, therefore, as of the 2013-14 academic year, 70% will have conditional or one-year scholarship programs (136/194), while nearly 28% will have good standing (or guaranteed) scholarships (54/194), with 2% (4/194) having only need based scholarship assistance. (Note that some law schools with conditional scholarship programs also offer some scholarships on a non-conditional basis and/or offer some need-based assistance.)
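The 2013-14 program-mix percentages above can be checked directly from the counts:

```python
# Scholarship program counts for 2013-14 across the 194 schools surveyed.
programs = {"conditional": 131, "one-year": 5, "need-based only": 4, "good standing": 54}
total_schools = 194

conditional_or_one_year = programs["conditional"] + programs["one-year"]
print(f"{conditional_or_one_year / total_schools:.0%}")      # conditional or one-year, ~70%
print(f"{programs['good standing'] / total_schools:.0%}")    # good standing/guaranteed, ~28%
print(f"{programs['need-based only'] / total_schools:.0%}")  # need-based only, ~2%
```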
Those who might be interested in a more detailed analysis of conditional scholarship programs may want to look at the draft article I have posted on SSRN – Better Understanding the Scope of Conditional Scholarship Programs in American Law Schools.
[posted by Jerry Organ]
Wednesday, June 5, 2013
For those trying to better understand how legal education can better prepare law students for the world that awaits them, I would encourage you to take a look at the draft article my colleague, Neil Hamilton, Director of the Holloran Center for Ethical Leadership in the Professions at the University of St. Thomas School of Law, recently posted on SSRN. The article is entitled Law-Firm Competency Models and Student Professional Success: Building on a Foundation of Professional Formation/Professionalism. Here is some of the description from the abstract:
A law student who understands legal employer competency models can differentiate him or herself from other graduates by using the three years of law school to develop (and to create supporting evidence to demonstrate) specific competencies beyond just knowledge of doctrinal law, legal analysis, and some written and oral communication skills. . . .
In Part I below, this essay analyzes all available empirical research on the values, virtues, capacities and skills in law firm competency models that define the competencies of the most effective and successful lawyers. Part II examines empirical evidence on the competencies that clients evaluate. Part III evaluates the competencies that make the most difference in fast-track associate and partnership promotions. These data and analyses lead to several bold propositions developed in Part IV:
1. Law students and legal educators should identify and understand the values, virtues, capacities and skills (the competencies) of highly effective and successful lawyers in different types of practice (one major example is law firm competency models analyzed below in Part I);
2. Each student should use all three years of experiences both inside and outside of law school (including the required and elective curriculum, extracurricular activities, and paid or pro bono work experiences) to develop and be able to demonstrate evidence of the competencies that legal employers and clients want in the student’s area of employment interest;
3. Law schools should develop a competency-based curriculum that helps each student develop and be able to demonstrate the competencies that legal employers and clients want; and
4. Both law students and law schools should understand that the values, virtues, capacities and skills of professional formation (professionalism) are the foundation for excellence at all of the competencies of an effective and successful lawyer.
The article presents far more useful information than can be summarized here, and different readers may be struck by different things discussed in the article. One of the most significant takeaways for me, however, is the convergence around an array of competencies frequently not taught in law school. The article analyzes competency models used to assess associate development at 14 medium to large law firms in the Twin Cities and compares that with some other literature on competencies clients look for in attorneys. The analysis demonstrates that in addition to traditionally understood technical skills – legal analysis, oral and written communication, and knowledge of the law – there is significant convergence around several competencies frequently not taught in law school – 1) Ability to initiate and maintain strong work and team relationships; 2) Good judgment/common sense/problem-solving; 3) Business development/marketing/client retention; 4) Project management including high quality, efficiency, and timeliness; 5) Dedication to client service/responsive to client; and 6) Initiative/ambition/drive/strong work ethic.
Whether or not law schools can find efficient ways to offer students opportunities to develop these competencies, it is imperative that we make our students aware that they need to develop them to give themselves the greatest likelihood of professional success.
[posted by Jerry Organ]
June 5, 2013 in Data on legal education, Data on the profession, Important research, Innovations in legal education, Law Firms, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (0)
Wednesday, February 13, 2013
My previous post on Washington & Lee's 3L Program stirred a lot of interest and commentary, including some disbelieving critics. Fortunately, Professor Jim Moliterno agreed to write a reply essay, below, that completes the cycle. [Bill Henderson]
Jim Moliterno Replies [This is a long reply, so a PDF version online here]
A number of comments to Bill’s January 28 post and posts regarding it on other blogs cause me to enter this conversation.
Are students really coming to W&L because of the new curriculum? Yes, to a significant extent. How do we know? Because the entering students say so. As do many law schools, we administer a questionnaire to our enrolling students. Among the questions asked is the obvious one: why are you here?
In the most recent such survey the students were asked to rank the strengths of the law school. Here are the top ten, in order, according to the entering students:
- Third Year Curriculum
- Ranking / Prestige
- Quality of Life
- National Reputation
- Job Placement
- General Curriculum
- Clinical Program
- Financial Aid Award
- Size of Lexington
The curriculum reform was first. Financial aid awards were 9th, just ahead of the “size of Lexington.” The data does not support the unsubstantiated claims of some bloggers that students are choosing W&L because of the generosity of financial aid awards.
The curriculum reform has steadily moved higher on the “strength” rankings given by enrolled students since 2009. The 2011 and 2012 surveys are nearly identical, and the written comments of students about their reasons for coming to W&L (none reprinted here) are more striking than the numbers themselves.
I don’t know of any better data on this proposition than the statements of those whose reasons are under study. If that data is unsatisfying to some, then they will continue to be unsatisfied.
Are there other reasons students come to W&L? Of course. W&L has a highly productive, highly visible faculty engaged in scholarship and projects at the highest levels. Some students undoubtedly value W&L’s faculty prowess. W&L is highly ranked. Some students undoubtedly are affected by a top 25 ranking. It has an excellent reputation as a small, closely-knit academic community. Some students select for the sense of community and size. No reason will ever be the only reason for prospective students to choose a law school. Changes made by law schools will affect student choices for or against a particular law school. The W&L curriculum reform is positively affecting a significant number of students’ calculus about choosing W&L.
And some do come because of the financial aid package they were offered. But the financial aid reason is unlikely to explain the increase in applications since 2008. Some students, the recipients of aid, undoubtedly come in part because of the aid. That is no different than the students who choose [insert name of any school] because of the financial aid they were awarded. In 2012, about the same number of offers of admission were made as in previous years, but instead of the usual 130 or 135 admittees choosing to attend, more than 260 made deposits. Some were asked to defer their attendance until 2013 and once the dust settled we had a class of 187 instead of the usual 130 to 135. This same class entering in 2012 listed the curriculum reform first and financial aid ninth as strengths of the law school.
What else was happening in 2008 and 09 when the applications increased by nearly 33% per year?
In 2009 and 10, while W&L applications were on the rise, the US News ranking fell from 25 to 34 (while its reputation rank among academics stayed steady). It has now recovered to 24. If anything, that should have led to a drop in applications during 2008-2011 rather than the sharp increases that actually occurred.
Can we exclude all other possible explanations than those previously mentioned? Of course not. It could be that being in a small, beautiful mountain town is all the rage among young adults and 33% more students want that now than wanted it in 2007. I know of no data to prove or disprove that proposition, so it remains one that could be true. The reality is that the students who have come in recent years rate the curriculum reform among the top reasons (often the most important reason) for their attendance at W&L. That matters.
There is empirical evidence that the W&L curriculum reform is engaging students more than in the traditional “no plan” third year curriculum. Is it perfect evidence? Of course not. Is it definitive evidence that has no flaw? Of course not. Is anything ever supported by perfect, definite evidence that has no flaw? Not to my knowledge. We make all of our most important decisions in life based on the best available evidence. As long as the evidence is empirically sound and statistically significant, it is worthy of respect. The evidence of W&L 3L engagement increases is sound and statistically significant and marks a path toward further research and verification.
One commenter suggested that the data is suspect because the peer schools have not been identified. Their data belongs to them, not W&L. LSSSE does not make specific school data available to other schools. So W&L has only a composite score for those peer schools. And it would be unseemly for W&L to reveal the specific schools. I will not do so here. But to be sure, W&L asked LSSSE to calculate the data from a list of schools because they are the schools with whom W&L competes for students and competes in the rankings. It would not have served W&L’s research interests to learn how it compares with a list of schools that it does not compete with in the marketplace. No one at W&L has the data for any specific school.
Nonetheless, do not be mistaken, the schools with whom W&L is compared in LSSSE data are the schools anyone would expect them to be: schools that by their geography, rank and quality compete with W&L in the relevant markets for students and placement.
One observation: in the legal profession and legal education in particular, the status quo never seems to need empirical justification. Only change is suspect and wrong until proven definitively to be otherwise. Is there any empirical evidence that the status quo third year is the best possible third year except that it has been done that way for a long time? None that I know of. The old adage, “if it ain’t broke don’t fix it” does not apply here. The third year of legal education is “broke”.
Amid calls for its abandonment by some, dating back at least to the early 1970s report by Paul Carrington, the third year is widely acknowledged to be of the least value among the three years. (See below on W&L’s largely unchanged approach to years 1 and 2.) The Roman Legions (and more than a few other military powers) have found out that the mere fact that something has been successfully done before is not sufficient evidence that it will prevail in the present or future. Arguing in favor of the status quo based on no empirical evidence, . . . based only on instinct and the argument that it is the way things are currently done, is an approach doomed to failure. Just ask Kodak. (And see my forthcoming book: “The American Legal Profession In Crisis,” Oxford, March 2013.)
How about the claim that “[W&L’s LSAT has] gone down every year since [the new curriculum was announced], while its GPA rank has, after a plunge, more or less returned to where it was.” The blogger made that claim, once again without any data, let alone empirically credible data. Actually the W&L median LSAT was steady at 166 from 2005-2010, dropped 2 points to 164 in 2011 and stayed at 164 for 2012. It has not “gone down every year since [the new curriculum was announced in 2008].” Meanwhile, the GPA of entering classes, which was in the 3.5 and 3.4 range in 2008-2010, has gone up to the 3.6 range (3.65 and 3.62) in 2011 and 2012. The two modest changes in LSAT and GPA have essentially offset one another in US News points. Hardly the reason for pause suggested by the blogger.
It seems that as long as someone is arguing against change, no rules apply to the arguments’ underpinnings.
Here is what the empirical evidence from the LSSSE surveys shows and what it does not show: students are more engaged in their work, and their work includes more writing, more collaboration and more problem solving. Here are a few charts even more striking than those Bill used in his post. Together they say that significantly more than their peers or their predecessors at W&L, current third year students are working more, writing more, collaborating more, applying law to real world problems more, and preparing for class more often. Overall, they describe a harder-working, more engaged student body. And they are working harder to acquire the skills that matter to success as a lawyer.
February 13, 2013 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, Innovations in legal education, New and Noteworthy, Scholarship on legal education, Structural change | Permalink | Comments (6)
Tuesday, January 29, 2013
Here it is in a nutshell. There is empirical evidence that Washington & Lee’s experiential 3L curriculum is delivering a significantly better education to 3L students—significantly better than prior graduating classes at W&L, and significantly better than W&L’s primary competitors. Moreover, at a time when total law school applicants are on the decline, W&L’s getting more than its historical share of applicants and getting a much higher yield. When many schools are worried about revenues to survive next year and the year after, W&L is worried about creating the bandwidth needed to educate the surplus of students who enrolled in the fall of 2012, and the backlog of applicants that the school deferred to the fall of 2013.
[This is a long essay. If you want it in PDF format, click here.]
Alas, now we know: There is a market for high quality legal education. It consists of college graduates who don’t want to cast their lot with law schools that cannot guarantee students entree to meaningful practical training. Some might argue that W&L is not objectively better -- that the 3L curriculum is a marketing ploy where the reality falls well short of promotional materials and that, regardless, prospective students can't judge quality.
Well, in fact there is substantial evidence that the W&L 3L program delivers comparative value. The evidence is based on several years' worth of data from the Law School Survey of Student Engagement (LSSSE). I received permission from Professor James Moliterno, someone who took a leadership role in building W&L’s third year program, to share some of the key results (each school controls access to its LSSSE data). They are below.
But before getting into empirical evidence, I want to put squarely on the table the most sobering finding that likely applies to virtually all of legal education. It is this: On several key LSSSE metrics, W&L has made impressive gains vis-à-vis its own historical benchmarks and its primary rival schools. But even for this leader, there remains enormous room for improvement. More on that below.
Here is the bottom line: Traditional legal education, when it is measured, does not fare very well. Yet, as W&L shows, substantial improvement is clearly possible. We law professors can respond to this information in one of two ways:
- Don’t measure, as it may disconfirm our belief that we are delivering a great education.
- Measure—even when it hurts—and improve.
I am in the second camp. Indeed, I don’t know if improvement is possible without measurement. Are we judging art work or the acquisition of key professional skills needed for the benefit of clients and the advancement of the public good?
Moving the Market
I doubt I will ever forget Jim Moliterno’s September 2012 presentation at the Educating Tomorrow’s Lawyers (ETL) conference at the University of Denver. He presented a single graph (chart below) showing W&L actual applicant volumes since 2008 versus what would have happened at W&L if its applicant volume had followed national trends.
While law school applicants crested a few years ago, W&L enjoyed a large run-up in volume of applicants, presumably due to the launching of their new 3L program. This larger applicant pool effectively served as a buffer when applicant declines began in 2011 and 2012. Since 2008, overall law school applicants are down -19%, yet W&L is up overall +33%.
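To make the divergence concrete, here is a back-of-the-envelope comparison. The 2008 baseline here is a hypothetical round number chosen only for illustration; the -19% and +33% figures are the ones reported above:

```python
# Illustrative comparison of W&L applicant volume against the national
# trend since 2008. The baseline count is hypothetical; only the
# percentage changes (-19% national, +33% W&L) come from the post.
baseline_2008 = 2000  # hypothetical applicant count in 2008

national_trend = baseline_2008 * (1 - 0.19)  # if W&L had tracked the market
actual_wl      = baseline_2008 * (1 + 0.33)  # W&L's reported trajectory

print(int(national_trend))  # 1620
print(int(actual_wl))       # 2660
# Relative to where the national trend would have put it, W&L's
# applicant pool is roughly 64% larger:
print(round(actual_wl / national_trend - 1, 2))  # 0.64
```

Whatever the true baseline, the gap between the two trajectories (about 64% in relative terms) is independent of the number chosen.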
But much more significantly, after their experiential 3L year was up and running and the overall legal job market continued to stagnate, W&L yields spiked. Ordinarily they would enroll 135 students. But for the fall of 2012, they received enrollment commitments from well over 260 students. Indeed, at the ETL conference Jim Moliterno said the school had to offer financially attractive deferments to get the class to approximately 185 incoming students -- a 50 student bulge.
When Jim Moliterno showed the above graph and explained the corresponding changes in yield, my good friend Gillian Hadfield, a skeptical, tough-minded, evidence-demanding economist who teaches at USC Law, leaned over and said to me, “that is the single most important takeaway from this entire conference.” I agreed. The market for a legal education with practical training is, apparently, much more inelastic than the market for traditional JD programs.
Yet, what is perhaps most remarkable is that a large proportion of incoming students at W&L were enrolling based on little more than faith. Nobody knew for sure if W&L had the ability to pull off their ambitious 3L curriculum. The program relies on a large cadre of adjunct professors, after all, and W&L is located in remote Lexington, Virginia. Many law faculty outside of W&L, and perhaps some inside, thought (or perhaps think) that the program could not live up to the hype. Well, as shown below, the program appears to have produced meaningful gains.
The only data-driven critique anyone can muster is that the gains remain significantly short of perfection. But that critique bites harder on the rest of us. To use a simple metaphor, W&L is tooling around in a Model-T while the rest of us rely on horse and buggy. What ought to be plain to all of us, however, is that, just like the automobile industry circa 1910, we are entering a period of staggering transformation that will last decades. And transformation will be roughly equal parts creation and destruction. See Schumpeter.
W&L Data, Internal Historical Benchmark
LSSSE is a phenomenally rich dataset – nearly 100 questions per year on a wide variety of topics related to student classroom experience, faculty interaction, type and quantity of assessments, time allocation, and perceived gains on a variety of dimensions related to personal and professional development. The survey instrument is online here.
Aside from a host of questions related to demographics, career goals, and debt, major sections in the LSSSE include:
- Section 1, Intellectual Experience (20 questions)
- Section 2, Examinations (1 question)
- Section 3, Mental Activities (5 questions)
- Section 4, Writing (3 questions)
- Section 5, Enriching Educational Experiences (9 questions)
- Section 6, Student Satisfaction (7 questions)
- Section 7, Time Usage (11 questions)
- Section 8, Law School Environment (10 questions)
- Section 9, Quality of Relationships (3 questions)
- Section 10, Educational and Personal Growth (16 questions)
W&L deserves to be a detailed case study. But frankly, legal education can’t wait. So I will do the best I can to cover the landscape in a blog post. I hope every law faculty member who reads this post makes a strong plea to their dean to enroll in LSSSE. Why? So your school can benchmark itself against the detailed LSSSE case studies that are bound to flow out of W&L and other innovative law schools. Though they don’t get much press, there are, in fact, other innovative law schools.
Friday, January 18, 2013
Brian discusses the bleak employment prospects of law schools, but (through no fault of his own) understates the nature of the structural change that is occurring in the U.S. and global market for legal services. In Part II, I will write about some logical next steps for law schools looking to get ahead of the coming tsunami.
I tried to write Part II, but a blog post just was not up to the task. Further, I sensed that my colleagues were in no mood for half-baked solutions. There has been enormous criticism of legal education on the blogs and in the media, but very little in the way of detailed prescriptions to improve the situation. I felt an obligation to back off on the criticism and focus on solutions. So, in essence, Part II of my Tamanaha review became an article.
I just posted to SSRN an article entitled "A Blueprint for Change" forthcoming in the Pepperdine Law Review. It is both a diagnosis and a proposed solution -- a solution I am actively pursuing. Here is the abstract:
This Article discusses the financial viability of law schools in the face of massive structural changes now occurring within the legal industry. It then offers a blueprint for change – a realistic way for law schools to retool themselves in an attempt to provide our students with high quality professional employment in a rapidly changing world. Because no institution can instantaneously reinvent itself, a key element of my proposal is the “12% solution.” Approximately 12% of faculty members take the lead on building a competency-based curriculum that is designed to accelerate the development of valuable skills and behaviors prized by both legal and nonlegal employers. For a variety of practical reasons, successful implementation of the blueprint requires law schools to band together in consortia. The goal of these initiatives needs to be the creation and implementation of a world-class professional education in which our graduates consistently and measurably outperform graduates from traditional J.D. programs.
I have a large backlog of shorter articles and analyses that I have not posted because I wanted my own detailed solution in the public domain. I hope to tie all of these ideas together over the coming weeks.
Thank you, Brian Tamanaha, for writing a book that required me to think in terms of solutions.
[posted by Bill Henderson]
January 18, 2013 in Current events, Data on legal education, Data on the profession, Innovations in legal education, Scholarship on legal education, Scholarship on the legal profession, Structural change | Permalink | Comments (2)
Monday, November 19, 2012
Law schools care deeply about their academic reputation. If this were not true, my Indiana Law mailbox would not be stuffed full of glossy brochures sharing the news of faculty publications, impressive new hires, areas of concentration, and sundry distinguished speaker series.
Because of the timing of these mailings -- I got nearly 100 in September and October -- I am guessing that the senders hoped to influence the annual U.S. News & World Report Academic Reputation survey. Cf. Michael Sauder & Wendy Espeland, Fear of Falling: The Effects of U.S. News & World Report Rankings on U.S. Law Schools 1 (Oct 2007) (reporting "increases in marketing expenditures aimed toward raising reputation scores in the USN survey"). But does it work? A recent study by Larry Cunningham (St. John's Law) suggests that the effect is, at best, decimal dust.
Glossy brochures may not reliably affect Academic Reputation, but I have uncovered four factors that are associated with statistically significant increases and decreases in USN Academic Reputation. To illustrate, consider the scatterplot below, which plots the 1993 ordinal rank of USN Academic Reputation against the 2012 ordinal rank.
Four sets of dots (Red, Blue, Orange, and Green), each representing distinctive shared features of law schools, tend to be above or below the regression line. These patterns suggest that changes in USN Academic Reputation over time are probably not the result of random chance. But we will get to the significance of the Red, Blue, Orange, and Green dots soon enough.
The primary takeaway from the above scatterplot is that 2012 USN Academic Reputation is overwhelmingly a function of 1993 USN Academic Reputation. Over 88% of the variation is explained by a school's starting point 20 years earlier. Part of this lock-in effect may be lateral mobility. That is, there are perks at higher ranked schools: they tend to pay more; the teaching loads are lighter; and the prestige is greater, etc. So school-level reputations rarely change, just the work addresses of the most productive scholars. This is, perhaps, the most charitable way to explain the enormous stickiness of USN Academic Reputation.
That said, the scatterplot does not show a perfect correlation; slightly less than 12% of the variation is still in play to be explained by influences other than starting position. A small handful of schools have made progress over these 20 years (these are the schools above the regression line), and a handful have fallen backwards (those below the line).
The Red circles, Blue rectangles, Orange diamonds, and Green circles represent four law school-level attributes. The Reds have been big gainers in reputation, and so have the Blues. In contrast, the Oranges have all experienced big declines; and, as a group, so have the Greens. When the attributes of the Red, Blue, Orange, and Green schools are factored into the regression, all four are statistically significant (Red, p = .000; Blue, p = .001; Orange, p = .012; Green, p = .000) and the explained variation increases by roughly four percentage points, to 92.3%. As far as linear models go, this is quite an impressive result.
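For readers curious about the mechanics, the kind of regression described above -- an ordinal rank regressed on its 20-year-old value plus four group indicators -- can be sketched in a few lines. The data below are entirely synthetic; only the modeling approach mirrors the analysis:

```python
import numpy as np

# Synthetic version of the regression described above: 2012 reputation
# rank regressed on 1993 rank alone, then with four school-attribute
# indicators ("Red", "Blue", "Orange", "Green") added as dummies.
rng = np.random.default_rng(0)
n = 190  # roughly the number of ranked schools

rank_1993 = rng.uniform(1, 190, n)
red    = (rng.random(n) < 0.05).astype(float)  # big reputation gainers
blue   = (rng.random(n) < 0.05).astype(float)  # gainers
orange = (rng.random(n) < 0.05).astype(float)  # big decliners
green  = (rng.random(n) < 0.05).astype(float)  # decliners

# 2012 rank: mostly the 1993 rank, shifted by group membership plus noise
# (lower rank number = better reputation).
rank_2012 = (rank_1993 - 25 * red - 20 * blue + 20 * orange
             + 25 * green + rng.normal(0, 8, n))

def r_squared(predictors, y):
    """Ordinary least squares R^2, fit via numpy's lstsq with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared([rank_1993], rank_2012)
r2_full = r_squared([rank_1993, red, blue, orange, green], rank_2012)
print(round(r2_base, 3), round(r2_full, 3))  # the dummies raise R^2
assert r2_full >= r2_base
```

In-sample R² can only rise when regressors are added, so the interesting finding in the post is not the increase itself but that all four indicators are individually significant.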
Before you look below the fold for answers, any guesses on what is driving the Red and Blue successes and Orange and Green setbacks?
Monday, May 14, 2012
[Update: I edited the review below to remove three paragraphs from my analysis. It was a metaphor that was not key to my review of Brian's book yet could be fairly viewed as insulting to readers I both respect and hoped to persuade. I am sorry about that. It was a substantial change, so I am acknowledging it here. wdh.]
Many legal academics are going to dismiss Brian Tamanaha's book, Failing Law Schools, without ever reading a page. A larger number may simply ignore it. That is ironic, because this is the response one would expect if Tamanaha's account of a corrupt, self-indulgent academic culture were true.
I have lived inside this culture since I joined the academy in 2002. And I can attest that very few people inside the academy believe that we are living the high life on the backs of our students. But in the year 2012, that perception does not matter very much. Rather, the perception that matters is the one from the outside looking in.
Over the last eighteen months or so, The New York Times, The Wall Street Journal, The Washington Post, The Atlantic, the legal press and countless blogs (many written by unhappy students) have relentlessly hammered away at law schools.
The lay public, including most practicing lawyers, is looking for a definitive account that can explain legal education's maelstrom. Tamanaha's account is a veritable Brandeis Brief on what went wrong, chock-full of facts, history, and persuasive analysis.
It begins with a deal between the ABA and AALS to join forces to persuade the state bars to restrict entry to ABA-accredited law schools (the ABA's goal) and thereby to elevate the stature of the legal professoriate (the AALS's goal). Once this deal was struck -- in the early 20th century -- pretty much every change accrued to the benefit of the law faculties: higher salaries, lower teaching loads, the advent of administrators to lighten the burden of governance, and more freedom to pursue scholarly interests. When the U.S. News & World Report rankings appear in the early 1990s, law schools are forced to make choices. And our collective behavior suggests that vanity and prestige are all-too-likely to trump important principles like student diversity or honesty in reporting data.
For us law professors, here is our conundrum. From the outside looking in, things look bad, even corrupt. Yet we don't feel we have done anything wrong. We are certain that we lack the intent to cheat or defraud. But that, unfortunately, is error #1. As we all know, establishing intent is always a matter of circumstantial evidence. So let's review that evidence from the perspective of the neutral fact finder.
Life is objectively good for us: We have high salaries, social prestige, lots of travel, job security, and near absolute freedom to organize our time outside the three to six hours a week we teach, 30 weeks a year. Against this backdrop, there is consensus among legal employers that we are not very good at practical training including, in the eyes of many, basic legal writing. Moreover, the overproduction of lawyers creates problems for the legal profession as a whole. Similarly, our students are saddled with enormous debt and nothing we are doing curricularly seems geared to solving their burgeoning unemployment or underemployment problem. The federal government finances this "system." And through Income-Based Repayment programs, the U.S. taxpayers are backstopping our high costs.
Because law faculty seems to be getting the long end of the bargain here, our subjective feelings of honesty and rectitude are unlikely to be viewed by many students, practicing lawyers, or the broader public as credible. In fact, they may be viewed as insincere or out of touch. How did things get so badly out of kilter?
But for Tamanaha, some pesky journalists, angry students, and the ticking time-bomb of law student debt, I am confident that we law professors could coast along on our present track for another several decades. As an insider, I can honestly testify that we believe--sincerely believe--that we care about our students, the quality of their education, their debt loads, and their future job prospects. But looking at the same set of facts, history will draw its own conclusions. And Tamanaha, akin to a lawyer building a case, offers up a very compelling narrative that the dispassionate observer is likely to find convincing.
Other bloggers and news outlets have commented on Tamanaha's book, often drawing very different conclusions. Compare Brian Leiter's Law School Updates and Orin Kerr at Volokh Conspiracy (Tamanaha's argument has merit, particularly when he suggests that lower ranked law schools should consider changing their models), with Scott Greenfield at Simple Justice (here and here) (Tamanaha describes an insular, out-of-touch professoriate from the top down that disdains the input of practicing lawyers) and the Chronicle of Higher Education (subscription req'd) (describing Tamanaha's thesis, "Law schools are bloated with too many underworked, overpaid professors whose salaries are supported by tuition increases that are making law school a losing bet for many students").
What are the proper inferences to draw?
In late 2011, I reviewed a copy of Tamanaha's book as part of the peer-review process for University of Chicago Press. My primary advice to Brian, communicated directly to him as well as his editors, was "to condemn the sin, not the sinner." Legal academics may seem culpable for privileging their interests ahead of students, I said, but these are the same folks who need to be relied upon to fix the problem. (The alternative is that nearly all of U.S. legal education will collapse under the weight of high costs and fewer entry level legal jobs; and on many days, I think the latter is just as likely as the former.)
Frankly, I don't know if my "condemn the sin, not the sinner" recommendation was good advice. In order to change, the legal academy may need more pressure brought to bear from outside forces. This may happen if the legal academy is painted as more selfish, insular, elitist and out of touch than we already look now. Congress and the Department of Education hold the ultimate trump card, and Tamanaha's book provides the essential supporting evidence for radical action. If and when this happens, law faculties will be forced to pick sides.
History is now playing out right before our eyes. I believe there is a good chance that Brian Tamanaha's book will be viewed--by history at least--as a great act of courage. The implication, of course, is that the rest of us will look foolish.
Brian discusses the bleak employment prospects of law schools, but (through no fault of his own) understates the nature of the structural change that is occurring in the U.S. and global market for legal services. In Part II, I will write about some logical next steps for law schools looking to get ahead of the coming tsunami.
[posted by Bill Henderson]
Saturday, March 17, 2012
My blog post from last week, "Too Good for BigLaw: The Statistician Edition" has resulted in a minor kerfuffle with some of the distinguished empiricists at Northwestern Law. See Dan Rodriguez, Law School Sorting and the Partnership Track: Northwestern Empiricists Weigh In, Word on the Streeterville [The Blog of NWU Law Dean]. NWU Law folks were not impressed with my analysis. Dan Rodriguez was gracious enough to send me the link at the same time his post, quoting the views of his colleagues, went live. He has also encouraged me to reply publicly.
I am happy to do that. Let me start with big picture issues. Then, for those folks with the curiosity and stamina to wade through arcane details--and experience tells me this is a small group--I will directly address, point by point, the issues raised by Kate Litvak and Max Schanzenbach. But at the outset, I will say that I am not conceding any ground.
It all started with a provocative blog post by Vivia Chen, the columnist for The Careerist. Vivia reviewed hiring and promotion data from the NLJ 250 Law School Hiring Survey and noted that elite law school graduates were becoming partner in very low numbers when compared to the hiring pipeline. Vivia editorialized on the numbers in a way that played into readers' fragile egos and insecurities. Of course, that is her job, which she does very well.
In a nutshell, here is why people care -- or more precisely, get anxious -- about this topic: it is conventional wisdom that graduation from elite law schools produces better career outcomes. When that expectation is countered by actual marketplace data, people are surprised. See, e.g., Bruce MacEwen, "The Best & The Brightest" at Adam Smith Esq. (leading blog on law firm economics). Surprise is the first reason this issue got so much play. Emotion is the second.
Emotion matters because very few lawyers and law professors are dispassionate on this topic. When it comes to conventional wisdom on law school pedigree, we all have horses in the race. Because we are human beings, we lawyers and law professors don't wait for balanced market data to develop our own entrenched worldviews. When the conventional wisdom favors us, we go with it -- though we aren't really conscious this is happening. So when data upset the apple cart and potentially make us look complacent, our passions get aroused.
The folks at Above the Law have built an entire business model around such predictable lawyer foibles. The more chum thrown in the water, the higher the ad revenues. It's just that simple.
Vivia's primary point, stated through metaphor, is that regional schools (such as Chicago Loyola) seem to be making partner at higher rates than the elite schools (such as Chicago). This is a reasonable inference because the ratio of associates hired to partners promoted appears to be consistently high for elite law schools and very low for a large number of regional law schools. This very point was made independently by Bruce MacEwen, who is a very sophisticated guy who advises law firms on strategy.
That said, there was ample opportunity for readers to draw spurious inferences from Vivia's metaphor-driven blog post. Thus, to avoid any school-specific claims (a one-year cross-sectional sample is not suitable for such a purpose), I pooled the schools by U.S. News ranking, drawing a line between elite and non-elite at the T14 mark. Why T14? Because these schools have played a closed loop of musical chairs for 20 years in the U.S. News rankings. These schools would be viewed by most employers as "national" law schools.
Here is what the data showed:
- Pipeline in: 53.7% T14, 46.3% non-T14
- Partners Promoted: 29.4% T14, 70.6% non-T14.
That is, well, an enormous skew. In 2011, for every 5.43 elite grads hired, one senior associate from an elite school made partner; for non-elite schools, the corresponding figure is 1.95. Vivia found these numbers surprising and somewhat counter-intuitive. So did Bruce MacEwen, Above the Law, ABA Journal, etc.
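For readers who want to see how the 5.43 and 1.95 figures relate to the pipeline shares, a minimal sketch follows. The absolute totals below are hypothetical (the NLJ survey's raw counts are not reproduced in this post); they are chosen only to be consistent with the published shares (53.7%/29.4% T14) and the stated hired-per-promoted ratios:

```python
# Hypothetical totals, NOT the actual NLJ 250 counts; chosen so the
# published shares reproduce the stated ratios of 5.43 and 1.95.
total_hired = 2973      # assumed total associates hired
total_promoted = 1000   # assumed total partners promoted

hired = {"T14": 0.537 * total_hired, "non-T14": 0.463 * total_hired}
promoted = {"T14": 0.294 * total_promoted, "non-T14": 0.706 * total_promoted}

# Associates hired per partner promoted, by group.
for group in ("T14", "non-T14"):
    ratio = hired[group] / promoted[group]
    print(f"{group}: {ratio:.2f} associates hired per partner promoted")
# → T14: 5.43 ..., non-T14: 1.95 ...
```

The arithmetic makes the key point visible: once the overall hire-to-promotion ratio is fixed, the gap between the two groups is driven entirely by the skew in the shares (53.7% of hires but only 29.4% of promotions).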
There are ways to break down these numbers to gain additional insights, but the key point here is one of magnitude. Elite law graduates are supposed to be smarter and more capable -- no one expects these folks to be on the short side of any race, tournament, or desired outcome. The magnitude of the hiring/promotion gap is the surprising fact that needs to be explained.
I had observed roughly the same skew several years ago (pooled 2007 and 2008 data) and alluded to it in this article, "Why is the Job Market Changing," Nat'l Jurist (Nov. 2010). I also follow other relevant studies, such as The After the JD, which have noted differences between elite and non-elite graduates. So I had a head start in thinking through possible explanations. I thus offered five theories, all of which could work in concert, to explain the large skew in the data:
- Selection effects
- Differences in first jobs
- Intergenerational privilege
- Influence of admission criteria on the associate pipeline
- "A Better Plan B" for elite grads
So, to be very clear, I am not using the NLJ 250 data to support the above theories. It is the reverse: I am offering the above theories as a likely explanation for the very large skew between elite and non-elite grads. Framed as an open-ended research question, it might be written: "Why are elite grads not becoming BigLaw partners in numbers commensurate with hiring patterns and general presumptions of their higher ability?" That is a mystery and a puzzle.
Statistics Minutiae [After the jump ...]
Tuesday, February 7, 2012
A new paper by Prof. Richard Bourne of the University of Baltimore. Bourne has been teaching for over 30 years, following five years in practice and time as a teaching fellow at Harvard. I find the reflections of long-time participants in legal education to be useful inputs in evaluating how things have changed. This is an interesting paper with much to offer.
This paper will first track the ways in which the legal services market has grown and changed over the past forty years. It will then track the major changes that have attended legal education during the same period and the increasing dependence of the legal education industry on student debt. The paper will then explore why, at long last, the boom-times may have run their course and why, at some point, painful changes will likely occur. Though they cannot be described in detail, the author will attempt to outline the likely nature of the changes that will occur. Finally, the paper will briefly explore how the predicted reckoning may yet lead to an improvement in the marketing of legal services and an enhanced role for law schools in preparing new attorneys for the new bar they will be joining.
There's quite a bit of provocative stuff in here (on the chopping block are clinics, faculty scholarship, "law and ..." courses, merit scholarships, and light course loads!) but also the traditional laments about U.S. News:
If law schools could somehow eliminate or seriously weaken the impact of U.S. News rankings they could begin to cut back in a big way on many of the marketing costs that currently burden legal education.
It isn't a "ruinous U.S. News sweepstakes" that drives the cost structure of legal education. Brochures and other marketing measures are a tiny fraction of even a single entry level professor's salary. The law school cost structure is largely salaries. I do agree with Bourne (especially since he cites Bill's and my work on this point) that competition for rankings has shifted financial aid to merit aid and away from need based aid, with deleterious consequences for the profession, legal education, and general social mobility.
But, as Bill and I have written elsewhere, if law schools released more useful data for students to use and facilitated such comparisons, U.S. News would be less important. It is important precisely because law schools don't make it easy to compare across schools while applicants making massive investments in education desperately want to compare their options. Fill that need with something better than the current rankings and the U.S. News problem will solve itself.
Bourne notes in his conclusion, "The time has come to stop pursuing the ephemeral benefits of prestige, simply for the sake of prestige, and to deliver more in the way of value." That's much more important than fussing about U.S. News. But how to do that? A footnote at the end of Bourne's paper notes Alfred Z. Reed's thoughtful and provocative 1921 report, Training for the Public Profession of the Law, and its argument for a wide range of legal training, producing different kinds of lawyers licensed to do different things. As Bourne notes, Reed's report "was rejected as heretical by the organized bar." Maybe that's where to start the discussion.
[Posted by Andy Morriss]
Wednesday, February 1, 2012
Brent E. Newton, an adjunct professor at Georgetown University Law Center, has posted a legal education reform piece on SSRN, entitled The Ninety-Five Theses: Systemic Reforms of American Legal Education and Licensure [Hat-tip TaxProf]. Judging by his title, Newton is hoping to spur a Reformation of legal education, akin to what Martin Luther did for Christianity in the 16th century. If that is his agenda, I will not stand in his way.
According to his GULC web bio, Newton is Deputy Staff Director of the U.S. Sentencing Commission; prior to that, he had a distinguished career as a public defender. Newton is not the only adjunct-practitioner who has forcefully challenged U.S. legal education. In 2008, Jason Dolin (solo practitioner, adjunct at Capital), published Opportunity Lost: How Law School Disappoints Law Students, the Public, and the Legal Profession. In 2010, Steve Bennett (partner at Jones Day, adjunct at Fordham) published a law review article entitled, When Will Law Schools Change?
Law professors rarely engage with these critiques; to acknowledge them, some might argue, is to give them oxygen and legitimacy. I think this approach is a huge mistake. Any enterprise interested in long-term success cares about the perceptions held by its stakeholders -- and adjuncts are definitely in that group. In times of crisis, we need friends, not enemies. Further, Newton, Dolin and Bennett are serious people and very capable lawyers. If you leaf through these articles, you'll see that they read like Brandeis Briefs against the legal education establishment. The authors present thoughtful, fact-based, and occasionally trenchant arguments on why we legal education insiders should change.
Simple question: Can any of us identify a single historical example in which the establishment reformed itself because a critic effectively marshaled facts and logic to reveal the errors of its ways? Institutional change doesn't happen that way -- facts and logic are no match for a few thousand egos and pious rationalizations for why others should change, but not me.
The common storyline for institutional change is failure, with the rise of other institutions that better address the social, political and economic needs of stakeholders and broader society. A less common narrative is institutional adaptation, thanks in part to (1) self-interest and survival instincts, and (2) the serendipity of timely, brilliant leadership. (Does the legal academy have a few hundred great leaders?)
That said, Newton, Dolin and Bennett may be on the right side of history. Because of the overproduction of law school graduates and their high levels of debt, we are now at a point when survival for a large proportion of law schools can no longer be taken for granted. As the economist Herbert Stein observed, "What cannot go on forever, won't."
Prediction: In the next few years, some law schools will change and thrive. Others won't, and they will fail. There will be nasty recriminations and gnashing of teeth. A few at the very top will roll the dice and decide not to change. They will survive, but the innovations taking root in the rest of the law school hierarchy will make them look like anachronisms. It will be a slow decay. In the meantime, some aspects of the Post-Langdellian paradigm will look a lot like the suggestions made by Newton, Dolin and Bennett. In twenty years, maybe sooner, the revolution will be over. Finally, Newton et al. will get a much deserved footnote.
[Posted by Bill Henderson]