Wednesday, February 13, 2013
My previous post on Washington & Lee's 3L Program stirred a lot of interest and commentary, including some disbelieving critics. Fortunately, Professor Jim Moliterno agreed to write a reply essay, below, that completes the cycle. [Bill Henderson]
Jim Moliterno Replies [This is a long reply, so a PDF version is available online here]
A number of comments on Bill's January 28 post, and posts about it on other blogs, prompt me to enter this conversation.
Are students really coming to W&L because of the new curriculum? Yes, to a significant extent. How do we know? Because the entering students say so. As do many law schools, we administer a questionnaire to our enrolling students. Among the questions asked is the obvious one: why are you here?
In the most recent such survey the students were asked to rank the strengths of the law school. Here are the top ten, in order, according to the entering students:
- Third Year Curriculum
- Ranking / Prestige
- Quality of Life
- National Reputation
- Job Placement
- General Curriculum
- Clinical Program
- Financial Aid Award
- Size of Lexington
The curriculum reform was first. Financial aid awards were 9th, just ahead of the “size of Lexington.” The data does not support the unsubstantiated claims of some bloggers that students are choosing W&L because of the generosity of financial aid awards.
The curriculum reform has steadily moved higher in the "strength" rankings given by enrolled students since 2009. The 2011 and 2012 surveys are nearly identical, and the students' written comments about their reasons for coming to W&L (not reprinted here) are more striking than the numbers themselves.
I don't know of any better data on this proposition than the statements of those whose reasons are under study. If that data is unsatisfying to some, then they will continue to be unsatisfied.
Are there other reasons students come to W&L? Of course. W&L has a highly productive, highly visible faculty engaged in scholarship and projects at the highest levels. Some students undoubtedly value W&L's faculty prowess. W&L is highly ranked. Some students undoubtedly are affected by a top-25 ranking. It has an excellent reputation as a small, closely knit academic community. Some students choose it for that sense of community and size. No single reason will ever be the only reason prospective students choose a law school, and changes made by law schools will affect student choices for or against a particular school. The W&L curriculum reform is positively affecting a significant number of students' calculus about choosing W&L.
And some do come because of the financial aid package they were offered. But financial aid is unlikely to explain the increase in applications since 2008. Some students, the recipients of aid, undoubtedly come in part because of the aid. That is no different from the students who choose [insert name of any school] because of the financial aid they were awarded. In 2012, about the same number of offers of admission were made as in previous years, but instead of the usual 130 or 135 admittees choosing to attend, more than 260 made deposits. Some were asked to defer their attendance until 2013, and once the dust settled we had a class of 187 instead of the usual 130 to 135. This same class, entering in 2012, listed the curriculum reform first and financial aid ninth among the strengths of the law school.
What else was happening in 2008 and 2009, when applications increased by nearly 33% per year?
In 2009 and 2010, while W&L applications were on the rise, the US News ranking fell from 25 to 34 (while its reputation rank among academics stayed steady). It has now recovered to 24. If anything, that should have led to a drop in applications during 2008-2011 rather than the sharp increases that actually occurred.
Can we exclude all possible explanations other than those previously mentioned? Of course not. It could be that being in a small, beautiful mountain town is all the rage among young adults and 33% more students want that now than wanted it in 2007. I know of no data to prove or disprove that proposition, so it remains one that could be true. The reality is that the students who have come in recent years rate the curriculum reform among the top reasons (often the most important reason) for their attendance at W&L. That matters.
There is empirical evidence that the W&L curriculum reform is engaging students more than the traditional "no plan" third-year curriculum did. Is it perfect evidence? Of course not. Is it definitive evidence that has no flaw? Of course not. Is anything ever supported by perfect, definitive evidence that has no flaw? Not to my knowledge. We make all of our most important decisions in life based on the best available evidence. As long as the evidence is empirically sound and statistically significant, it is worthy of respect. The evidence of W&L 3L engagement gains is sound and statistically significant, and it marks a path toward further research and verification.
One commenter suggested that the data is suspect because the peer schools have not been identified. Their data belongs to them, not W&L. LSSSE does not make specific school data available to other schools. So W&L has only a composite score for those peer schools. And it would be unseemly for W&L to reveal the specific schools. I will not do so here. But to be sure, W&L asked LSSSE to calculate the data from a list of schools because they are the schools with whom W&L competes for students and competes in the rankings. It would not have served W&L’s research interests to learn how it compares with a list of schools that it does not compete with in the marketplace. No one at W&L has the data for any specific school.
Nonetheless, do not be mistaken, the schools with whom W&L is compared in LSSSE data are the schools anyone would expect them to be: schools that by their geography, rank and quality compete with W&L in the relevant markets for students and placement.
One observation: in the legal profession and legal education in particular, the status quo never seems to need empirical justification. Only change is suspect and wrong until proven definitively to be otherwise. Is there any empirical evidence that the status quo third year is the best possible third year except that it has been done that way for a long time? None that I know of. The old adage, “if it ain’t broke don’t fix it” does not apply here. The third year of legal education is “broke”.
Amid calls for its abandonment by some, dating back at least to Paul Carrington's report in the early 1970s, the third year is widely acknowledged to be the least valuable of the three years. (See below on W&L's largely unchanged approach to years 1 and 2.) The Roman Legions (and more than a few other military powers) found out that the mere fact that something has been successfully done before is not sufficient evidence that it will prevail in the present or future. Arguing in favor of the status quo based on no empirical evidence, based only on instinct and the claim that it is the way things are currently done, is an approach doomed to failure. Just ask Kodak. (And see my forthcoming book, The American Legal Profession in Crisis, Oxford, March 2013.)
How about the claim that "[W&L's LSAT has] gone down every year since [the new curriculum was announced], while its GPA rank has, after a plunge, more or less returned to where it was"? The blogger made that claim, once again, without any data, let alone empirically credible data. In fact, the W&L median LSAT was steady at 166 from 2005-2010, dropped 2 points to 164 in 2011, and stayed at 164 for 2012. It has not "gone down every year since [the new curriculum was announced in 2008]." Meanwhile, the GPA of entering classes, which was in the 3.4-3.5 range in 2008-2010, rose to the 3.6 range (3.65 and 3.62) in 2011 and 2012. The two modest changes in LSAT and GPA have essentially offset one another in US News points. Hardly the reason for pause suggested by the blogger.
It seems that as long as someone is arguing against change, no rules apply to the arguments’ underpinnings.
Here is what the empirical evidence from the LSSSE surveys shows and what it does not show: students are more engaged in their work, and their work includes more writing, more collaboration, and more problem solving. Here are a few charts even more striking than those Bill used in his post. Together they say that, significantly more than their peers or their predecessors at W&L, current third-year students are working more, writing more, collaborating more, applying law to real-world problems more, and preparing for class more often. Overall, they describe a harder-working, more engaged student body. And they are working harder at acquiring the skills that matter to success as a lawyer.
February 13, 2013 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, Innovations in legal education, New and Noteworthy, Scholarship on legal education, Structural change
Tuesday, January 29, 2013
Here it is in a nutshell. There is empirical evidence that Washington & Lee's experiential 3L curriculum is delivering a significantly better education to 3L students—significantly better than prior graduating classes at W&L received, and significantly better than W&L's primary competitors deliver. Moreover, at a time when total law school applicants are in decline, W&L is getting more than its historical share of applicants and a much higher yield. When many schools are worried about the revenue to survive next year and the year after, W&L is worried about creating the bandwidth needed to educate the surplus of students who enrolled in the fall of 2012 and the backlog of applicants the school deferred to the fall of 2013.
[This is a long essay. If you want it in PDF format, click here.]
Alas, now we know: there is a market for high-quality legal education. It consists of college graduates who don't want to cast their lot with law schools that cannot guarantee students entree to meaningful practical training. Some might argue that W&L is not objectively better -- that the 3L curriculum is a marketing ploy where the reality falls well short of the promotional materials and that, regardless, prospective students can't judge quality.
Well, in fact there is substantial evidence that the W&L 3L program delivers comparative value. The evidence is based on several years' worth of data from the Law School Survey of Student Engagement (LSSSE). I received permission from Professor James Moliterno, who took a leadership role in building W&L's third-year program, to share some of the key results (each school controls access to its LSSSE data). They are below.
But before getting into empirical evidence, I want to put squarely on the table the most sobering finding that likely applies to virtually all of legal education. It is this: On several key LSSSE metrics, W&L has made impressive gains vis-à-vis its own historical benchmarks and its primary rival schools. But even for this leader, there remains enormous room for improvement. More on that below.
Here is the bottom line: Traditional legal education, when it is measured, does not fare very well. Yet, as W&L shows, substantial improvement is clearly possible. We law professors can respond to this information in one of two ways:
- Don’t measure, as it may disconfirm our belief that we are delivering a great education.
- Measure—even when it hurts—and improve.
I am in the second camp. Indeed, I don't know if improvement is possible without measurement. Are we judging artwork, or the acquisition of the key professional skills needed for the benefit of clients and the advancement of the public good?
Moving the Market
I doubt I will ever forget Jim Moliterno's September 2012 presentation at the Educating Tomorrow's Lawyers (ETL) conference at the University of Denver. He presented a single graph (chart below) showing W&L's actual applicant volume since 2008 versus what that volume would have been had it followed national trends.
While law school applications crested nationally a few years ago, W&L enjoyed a large run-up in applicant volume, presumably due to the launch of its new 3L program. This larger applicant pool effectively served as a buffer when applicant declines began in 2011 and 2012. Since 2008, overall law school applicants are down 19%, yet W&L's are up 33%.
But much more significantly, after the experiential 3L year was up and running and the overall legal job market continued to stagnate, W&L's yield spiked. Ordinarily the school would enroll 135 students. But for the fall of 2012, it received enrollment commitments from well over 260 students. Indeed, at the ETL conference Jim Moliterno said the school had to offer financially attractive deferments to get the class down to approximately 185 incoming students -- a 50-student bulge.
When Jim Moliterno showed the above graph and explained the corresponding changes in yield, my good friend Gillian Hadfield, a skeptical, tough-minded, evidence-demanding economist who teaches at USC Law, leaned over and said to me, "That is the single most important takeaway from this entire conference." I agreed. The demand for a legal education with practical training is, apparently, much more inelastic than the demand for traditional JD programs.
Yet, what is perhaps most remarkable is that a large proportion of incoming students at W&L were enrolling based on little more than faith. Nobody knew for sure if W&L had the ability to pull off their ambitious 3L curriculum. The program relies on a large cadre of adjunct professors, after all, and W&L is located in remote Lexington, Virginia. Many law faculty outside of W&L, and perhaps some inside, thought (or perhaps think) that the program could not live up to the hype. Well, as shown below, the program appears to have produced meaningful gains.
The only data-driven critique anyone can muster is that the gains remain significantly short of perfection. But that critique bites harder on the rest of us. To use a simple metaphor, W&L is tooling around in a Model T while the rest of us rely on horse and buggy. What ought to be plain to all of us, however, is that, just like the automobile industry circa 1910, we are entering a period of staggering transformation that will last decades. And the transformation will be roughly equal parts creation and destruction. See Schumpeter.
W&L Data, Internal Historical Benchmark
LSSSE is a phenomenally rich dataset – nearly 100 questions per year on a wide variety of topics related to student classroom experience, faculty interaction, type and quantity of assessments, time allocation, and perceived gains on a variety of dimensions related to personal and professional development. The survey instrument is online here.
Aside from a host of questions related to demographics, career goals, and debt, major sections in the LSSSE include:
- Section 1, Intellectual Experience (20 questions)
- Section 2, Examinations (1 question)
- Section 3, Mental Activities (5 questions)
- Section 4, Writing (3 questions)
- Section 5, Enriching Educational Experiences (9 questions)
- Section 6, Student Satisfaction (7 questions)
- Section 7, Time Usage (11 questions)
- Section 8, Law School Environment (10 questions)
- Section 9, Quality of Relationships (3 questions)
- Section 10, Educational and Personal Growth (16 questions)
W&L deserves to be a detailed case study. But frankly, legal education can’t wait. So I will do the best I can to cover the landscape in a blog post. I hope every law faculty member who reads this post makes a strong plea to their dean to enroll in LSSSE. Why? So your school can benchmark itself against the detailed LSSSE case studies that are bound to flow out of W&L and other innovative law schools. Though they don’t get much press, there are, in fact, other innovative law schools.
Friday, January 18, 2013
Brian discusses the bleak employment prospects of law schools, but (through no fault of his own) understates the nature of the structural change that is occurring in the U.S. and global market for legal services. In Part II, I will write about some logical next steps for law schools looking to get ahead of the coming tsunami.
I tried to write Part II, but a blog post just was not up to the task. Further, I sensed that my colleagues were in no mood for half-baked solutions. There has been enormous criticism of legal education on the blogs and in the media, but very little in the way of detailed prescriptions to improve the situation. I felt an obligation to back off on the criticism and focus on solutions. So, in essence, Part II of my Tamanaha review became an article.
I just posted to SSRN an article entitled "A Blueprint for Change" forthcoming in the Pepperdine Law Review. It is both a diagnosis and a proposed solution -- a solution I am actively pursuing. Here is the abstract:
This Article discusses the financial viability of law schools in the face of massive structural changes now occurring within the legal industry. It then offers a blueprint for change – a realistic way for law schools to retool themselves in an attempt to provide our students with high quality professional employment in a rapidly changing world. Because no institution can instantaneously reinvent itself, a key element of my proposal is the “12% solution.” Approximately 12% of faculty members take the lead on building a competency-based curriculum that is designed to accelerate the development of valuable skills and behaviors prized by both legal and nonlegal employers. For a variety of practical reasons, successful implementation of the blueprint requires law schools to band together in consortia. The goal of these initiatives needs to be the creation and implementation of a world-class professional education in which our graduates consistently and measurably outperform graduates from traditional J.D. programs.
I have a large backlog of shorter articles and analyses that I have not posted because I wanted my own detailed solution in the public domain. I hope to tie all of these ideas together over the coming weeks.
Thank you, Brian Tamanaha, for writing a book that required me to think in terms of solutions.
[posted by Bill Henderson]
January 18, 2013 in Current events, Data on legal education, Data on the profession, Innovations in legal education, Scholarship on legal education, Scholarship on the legal profession, Structural change
Wednesday, November 28, 2012
In August, I posted to this blog a narrative analysis comparing 2010 and 2011 enrollment and profile data among law schools, based on the data published in the 2012 ABA-LSAC Guide and the 2013 ABA-LSAC Guide. In response to recent comments on the 2012 enrollment situation (see the ABA Journal Weekly Newsletter and the discussion at The Faculty Lounge), and to the further drop in LSAT test-takers in June/October 2012 recently discussed at TaxProf Blog, I thought it might make sense to update the enrollment and profile analysis to account for 2012 enrollment and profile data, to the extent it is available, and to offer some thoughts on 2013.
As of November 15, only 140 law schools had published enrollment data on their webpages and only 128 had published sufficient profile data on which to make meaningful year-to-year comparisons. Please note that this analysis is based on "unofficial data," having been taken from law school webpages, not from any ABA publication, and having been taken from law school webpages prior to the LSAC certification of enrollment and profile data which the LSAC is undertaking this year for the first time.
ENROLLMENT IN DECLINE – Between 2010 and 2012, only 12 schools were flat (a change between -1% and +1%) or had an increase in enrollment; 128 of the 140 law schools had a decline in enrollment (a decrease greater than 1%), of which:
- 89 had a decline of 10% or more, of which
- 59 had a decline in enrollment of 20% or more, and of which
- 15 had a decline in enrollment of 30% or more.
This means over 90% of law schools for which 2012 enrollment information is available had a decline in enrollment and that more than 40% had a decline in enrollment of 20% or more.
Based on the data published in the 2012 ABA-LSAC Guide, in 2010, these 140 law schools had 33,952 first-years (68.3% of the 49,700 total 1L enrollment (LSAC matriculants)). Based on the data published in the 2013 ABA-LSAC Guide, in 2011, these 140 law schools had 31,082 first-years (68.2% of the 45,600 total 1L enrollment (LSAC matriculants)). In 2012, based on data from law school webpages, these 140 law schools had 28,380 first-years.
The decline in first-year enrollment was roughly 8.45 percent across these 140 schools between 2010 and 2011 (slightly more than the national decline of 8.25 percent), while the decline in first-year enrollment was roughly 8.69 percent across these 140 schools between 2011 and 2012.
If enrollment at these 140 schools represents 68.25% of total first-year enrollment for 2012 (the average of 2010 and 2011), that would suggest that total first-year enrollment (LSAC matriculants) for fall 2012 may be as low as 41,500-41,600, a decline of roughly 8.8% from 2011 and a decline of roughly 16% since 2010. (The LSAC certification of enrollment and profile information may come in even slightly lower than this estimate as it is going to be based on snapshots of enrollment on October 5, 2012, which would exclude students who began classes but withdrew prior to October 5, 2012. This group of students might number a few hundred if there were one to three such students at each law school.)
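As a rough sanity check, the extrapolation above can be reproduced in a few lines of Python. This is only a sketch using the figures quoted in this post; the 68.25% sample share is simply the average of the 2010 and 2011 shares.

```python
# Estimating total fall 2012 1L enrollment from the 140-school sample.
# All inputs are figures quoted in the post.
sample_1l_2012 = 28_380            # 1Ls at the 140 schools with posted data
share_2010 = 33_952 / 49_700       # sample's share of all 1Ls in 2010 (~68.3%)
share_2011 = 31_082 / 45_600       # sample's share of all 1Ls in 2011 (~68.2%)
assumed_share = (share_2010 + share_2011) / 2    # ~68.25%

est_total_2012 = sample_1l_2012 / assumed_share  # ~41,590, in the 41,500-41,600 range
decline_from_2011 = 1 - est_total_2012 / 45_600  # ~8.8%
decline_from_2010 = 1 - est_total_2012 / 49_700  # ~16%
```

The estimate is sensitive to the assumed share: each tenth of a percentage point in the sample share moves the total by roughly 60 students, which is why the post quotes a range rather than a point value.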
PROFILES IN DECLINE – Between 2010 and 2012, 93 of the 128 law schools with available profile information had a decline in their LSAT/GPA profile (more indicators down than up), 23 had an increase in profile (more indicators up than down), and 12 had a mixed profile (the same number of indicators up and down).
ENROLLMENT AND PROFILES IN DECLINE – Most significantly, of the 128 law schools with both enrollment and profile information available for fall 2012, 85 law schools (nearly two-thirds) saw declines in enrollment and in their LSAT/GPA profiles between 2010 and 2012.
Of these 85 law schools, 38 law schools saw declines in enrollment of greater than 20% and saw declines in their LSAT/GPA profiles. That means nearly 30% of law schools with available enrollment and profile information for 2012 had declines in enrollment of 20% or more and saw their LSAT/GPA profile decline. It also means that over 75% of the 50 law schools with declines in enrollment greater than 20% and for which 2012 profile information is available had declines in profile for 2012.
Notably, five of these 38 law schools were in the USNews top-50, 10 were ranked between 51-100, 10 were ranked between 101-145 and 13 were in the alphabetical listing of schools. The declining interest in law school, therefore, is impacting law schools across the rankings, but is more dramatically impacting alphabetical schools than top-ranked schools.
FURTHER THOUGHTS ON 2012 – According to the LSAC Volume Summary, applications to law school slid from 87,900 in 2010 to 78,500 in 2011 to approximately 68,000 for 2012 (although the 2012 numbers have not been finalized). Over the last nine years, law schools, on average, have admitted roughly 56,800 students per year, with a low of 55,500 in 2007 and in 2008. The "admit" rate – which was only 56% for fall 2004 – had climbed to 71% for fall 2011. For the last several years, however, matriculants have averaged roughly 82% of admitted students. So if we did have 41,600 matriculants this fall (as estimated above), and if matriculants represented roughly 82% of admitted students, that would mean we had roughly 50,700 admitted students, the lowest number this millennium, with an admit rate of nearly 75%, the highest this millennium. (Alternatively, if matriculants declined as a percentage of admitted students, it is possible that a larger number of applicants were admitted.)
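The admit-rate arithmetic in that paragraph is a simple back-calculation; here is a minimal sketch using only the estimates quoted above:

```python
# Back-calculating fall 2012 admitted students and the admit rate.
est_matriculants = 41_600      # estimated fall 2012 1Ls (from the analysis above)
yield_rate = 0.82              # matriculants as a share of admitted students
applicants_2012 = 68_000       # approximate 2012 applicant count

est_admitted = est_matriculants / yield_rate   # ~50,700 admitted students
admit_rate = est_admitted / applicants_2012    # ~0.75, i.e. "nearly 75%"
```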
PROJECTIONS FOR 2013 -- June and October LSAT administrations suggest that there may be fewer than 60,000 applicants for fall 2013. There were 93,341 June/October test-takers in 2009 (for the 2010 admissions cycle) (resulting in 87,900 applicants – 94.2% of tests administered in June/October). There were 87,318 June/October test-takers in 2010 (for the 2011 admissions cycle) (resulting in 78,500 applicants – 89.9% of tests administered in June/October). There were 71,981 June/October test-takers in 2011 (for the 2012 admissions cycle) (resulting in roughly 68,000 applicants – 94.5% of tests administered in June/October).
That is a three-year average in which the number of applicants in a cycle represented roughly 92.9% of the tests administered in June/October. There were 63,003 June/October test-takers in 2012 (for the 2013 admissions cycle). If the 2013 cycle results in a number of applicants representing 92.9% of June/October test-takers, law schools can anticipate there being only roughly 58,530 applicants to law schools for fall 2013. (Notably, in the admissions cycles from 2007-2009, the number of applicants in a cycle represented, on average, roughly 111% of the June/October test-takers, so the estimate of 58,530 may understate the number of possible applicants.)
If there are only 58,530 applicants for fall 2013 (which would represent nearly a 14% decline from fall 2012 -- the third consecutive double-digit decline in applications), and if law schools admit only 50,700 of these applicants, the same as the estimate above for fall 2012, across all law schools over 86% of all applicants to law school would receive offers of admission. If 82% of these admitted students were to matriculate, that would mean a first-year enrollment for fall 2013 that once again would be around 41,500-41,600. Alternatively, if law schools remain somewhat selective and were to admit only 48,000 of the 58,530 estimated applicants, that still would be an admit rate of 82%. If 82% of those 48,000 matriculated, the first-year enrollment would decline to roughly 39,400, a decline of about 5.3% from the fall 2012 estimate set forth above.
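The two 2013 scenarios reduce to a few multiplications. A sketch of that arithmetic, again using only the post's numbers:

```python
# Projecting fall 2013 applicants and 1L enrollment under two scenarios.
testers_2012 = 63_003        # June/October 2012 LSAT takers
applicant_ratio = 0.929      # 3-year average applicants per June/Oct test-taker
yield_rate = 0.82            # matriculants as a share of admitted students

est_applicants = testers_2012 * applicant_ratio   # ~58,530 applicants

# Scenario 1: schools admit ~50,700, the same as the fall 2012 estimate.
admit_rate_1 = 50_700 / est_applicants            # ~0.866, i.e. over 86%
matriculants_1 = 50_700 * yield_rate              # ~41,600 1Ls

# Scenario 2: schools stay selective and admit only 48,000.
admit_rate_2 = 48_000 / est_applicants            # ~0.82
matriculants_2 = 48_000 * yield_rate              # ~39,400 1Ls
```

Note how the yield assumption does the heavy lifting: if the 82% yield were to slip, either enrollment would fall further or admit rates would climb even higher.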
There are two competing tensions law schools must weigh in making admissions decisions in a declining market – revenue and LSAT/GPA profile. Do you take the number of students you need to meet revenue projections (even if that means profile slides) or do you take a smaller number of students (and take a revenue hit) in an effort to maintain LSAT/GPA profile?
What the 2011 and 2012 classes demonstrate is that, in the current market, even taking significantly fewer students did not allow a large number of schools to maintain their profiles. Given that many schools already have lost significant revenue due to shrinking enrollments in 2011 and/or 2012 (for just one example, see the recent discussion of Vermont Law School in the National Law Journal), they will be hard-pressed to shrink enrollment further to maintain profiles. As a result, I think that when enrollment and profile data are evaluated in fall 2013, we will see even more widespread declines in profile than were manifested in 2011 and 2012, possibly along with some ongoing declines in enrollment. It seems likely that several more schools may experience significant declines in both enrollment and profile.
[posted by Jerry Organ]
Monday, November 19, 2012
Law schools care deeply about their academic reputation. If this were not true, my Indiana Law mailbox would not be stuffed full of glossy brochures sharing the news of faculty publications, impressive new hires, areas of concentration, sundry distinguished speaker series, etc.
Because of the timing of these mailings – I got nearly 100 in September and October – I am guessing that the senders hoped to influence the annual U.S. News & World Report Academic Reputation survey. Cf. Michael Sauder & Wendy Espeland, Fear of Falling: The Effects of U.S. News & World Report Rankings on U.S. Law Schools 1 (Oct. 2007) (reporting "increases in marketing expenditures aimed toward raising reputation scores in the USN survey"). But does it work? A recent study by Larry Cunningham (St. John's Law) suggests that the effect is, at best, decimal dust.
Glossy brochures may not reliably affect Academic Reputation, but I have uncovered four factors that are associated with statistically significant increases and decreases in USN Academic Reputation. To illustrate, consider the scatterplot below, which plots the 1993 ordinal rank of USN Academic Reputation against the 2012 ordinal rank [click to enlarge].
Four sets of dots (Red, Blue, Orange, and Green), each representing a distinctive shared feature of law schools, tend to fall above or below the regression line. These patterns suggest that changes in USN Academic Reputation over time are probably not the result of random chance. But we will get to the significance of the Red, Blue, Orange, and Green dots soon enough.
The primary takeaway from the above scatterplot is that 2012 USN Academic Reputation is overwhelmingly a function of 1993 USN Academic Reputation. Over 88% of the variation is explained by a school's starting point 20 years earlier. Part of this lock-in effect may be lateral mobility. That is, there are perks at higher-ranked schools: they tend to pay more; the teaching loads are lighter; and the prestige is greater. So school-level reputations rarely change -- just the work addresses of the most productive scholars do. This is, perhaps, the most charitable way to explain the enormous stickiness of USN Academic Reputation.
That said, the scatterplot does not show a perfect correlation; slightly less than 12% of the variation is still in play to be explained by influences other than starting position. A small handful of schools have made progress over these 20 years (these are the schools above the regression line), and a handful have fallen backwards (those below the line).
The Red circles, Blue rectangles, Orange diamonds, and Green circles represent four school-level attributes. The Reds have been big gainers in reputation, and so have the Blues. In contrast, the Oranges have all experienced big declines, and, as a group, so have the Greens. When the attributes of the Red, Blue, Orange, and Green schools are factored into the regression, all four are statistically significant (Red, p = .000; Blue, p = .001; Orange, p = .012; Green, p = .000) and the explained variation increases by four percentage points to 92.3%. As far as linear models go, this is quite an impressive result.
Before you look below the fold for answers, any guesses on what is driving the Red and Blue successes and Orange and Green setbacks?
Thursday, September 20, 2012
NALP just announced that the median salary for first-year associates in Big Law has dropped from $160K to $145K. I think that is very significant. We are now back to the entry-level price point of 2007.
But to my mind, there is a much bigger story here. In 2011, firms of 500+ attorneys hired 2,856 entry-level lawyers. In 2007, that figure was 4,745. So, after five years, Big Law is paying the same wage but hiring 40% fewer lawyers. Compare 2007 NALP Nat'l Summary with 2011 NALP Nat'l Summary.
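The 40% figure follows directly from the two NALP hiring counts; a one-line check:

```python
# Decline in entry-level Big Law hiring, 2007 to 2011 (NALP figures above).
hires_2007, hires_2011 = 4_745, 2_856
decline = 1 - hires_2011 / hires_2007   # ~0.40, i.e. about 40% fewer hires
```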
Here is another important piece of NALP data, generated from the print versions of the July 2012 NALP Bulletin. It shows the percentage of entry level law jobs that are private practice.
Two takeaways here: (1) there is a long-term trend line showing a declining number of private practice jobs -- and that is the economic engine that enables law schools to exist at current tuition levels; and (2) the cliff-like drop-off in 2010 and 2011 is likely Big Law, and that hurts.
[posted by Bill Henderson]
Monday, September 3, 2012
NALP notes that for the Class of 2010 -- and the Class of 2011 -- two-thirds of all employed graduates were employed in the state in which their law school was located. This suggests location matters.
Is location important to employment results at a large number of schools? Are some law schools more national than others? Are some states more “local” in hiring than other states? The answers are yes and yes and yes.
ANALYZING SCHOOL SPECIFIC DATA -- This analysis is based on the Class of 2010 and Class of 2011 employment outcome data reported on the ABA Section of Legal Education website, excluding the law schools in Puerto Rico. This means there are 195 law schools in this analysis (if the two Widener campuses are combined).
The law schools were asked to report the three states with the most employed graduates and the number of employed graduates in each of those three states. Taking those totals as a percentage of employed graduates, and paying attention to the states identified, one can get some idea of which schools are “regional” and which schools might actually have a more “national” footprint. The simple result of the analysis is that the vast majority of schools are “regional” rather than “national.”
- For both the Class of 2010 and the Class of 2011, there were 117 law schools for which more than 67 percent of their employed graduates are employed in the state in which the law school is located.
- For the Classes of 2010 and 2011, there were 144 and 145 law schools, respectively, for which more than 67 percent of their employed graduates are located in the state in which the law school is located or an adjacent state, and 104 law schools for which more than 80 percent of their employed graduates are located in the state in which the law school is located or an adjacent state.
- There were only 46 law schools for which less than 67 percent of their employed graduates were employed in the state in which the law school is located or an adjacent state for both the Classes of 2010 and 2011.
Notably, 28 of these 46 law schools are in the USNews top-50, for which it is easily imaginable that the employment geography is much more national than regional. For many of these 46 law schools, two of the three states with the most employed graduates generally are not adjacent to the state in which the law school is located, suggesting some national reach. The three non-adjacent jurisdictions reflected most frequently should not be surprising -- California, the District of Columbia, and New York. Of the 18 other law schools, nine are in the alphabetical list of schools -- schools one generally would consider regional -- while nine are ranked between 51 and 145 in USNews.
Perhaps most significantly, due to the incomplete nature of some of the data sets, this summary probably understates the number of law schools for which the employment outcome data suggests the law school is more regional than national. Several of these 46 law schools come in with 60% or more of their employed graduates employed in the state of the law school or an adjacent state for both years -- Boston College, Minnesota, NYU, Ohio State and Penn State – and if the data were to include graduates employed in all adjacent states, the total for these schools well might exceed 67 percent.
In sum, then, more than 76% of all law schools and more than 87% of law schools outside the USNews top-50 had more than 67% of their employed graduates in the state in which the law school is located or an adjacent state for either the Class of 2010 or the Class of 2011.
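The school-level classification above reduces to a simple threshold rule. Here is a minimal sketch, assuming a hypothetical per-school record of employed graduates and those placed in the home state or an adjacent state (summed over the top-3 reported states); the field names are mine, not the ABA's:

```python
def classify(school):
    """Label a school 'regional' or 'national' from its placement share.

    `school` is a dict with the total number of employed graduates and the
    number employed in the home state or an adjacent state. The 67 percent
    cutoff is the one used in the analysis above.
    """
    share = school["home_or_adjacent"] / school["employed_total"]
    return "regional" if share > 0.67 else "national"

# Illustrative (made-up) numbers
schools = [
    {"name": "School A", "employed_total": 200, "home_or_adjacent": 170},
    {"name": "School B", "employed_total": 250, "home_or_adjacent": 90},
]
for s in schools:
    print(s["name"], classify(s))  # School A → regional, School B → national
```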
LOOKING AT STATE SPECIFIC DATA -- NALP also notes that for the Class of 2010, there are 30 states in which two-thirds or more of the jobs were taken by graduates from law schools in those states. (Jobs & JDs, Class of 2010, p. 69) Taking NALP’s state-specific data for the Class of 2010 in conjunction with the ABA’s data for the Class of 2010, there actually are 35 states in which two thirds or more of the jobs were taken by graduates of law schools in those states or an adjacent state and 30 states in which three-quarters or more of the jobs within the state were taken by graduates of the law schools in the state or in an adjacent state.
Again, this data likely understates the results. For example, in Arizona, Colorado, Connecticut, Maryland, Tennessee, and Virginia, roughly 65-75 percent of jobs within the state were taken by graduates from law schools within the state or an adjacent state. But with several adjacent-state schools not counted in the tallies because those states were not among the top three states for employed graduates from those schools, one could infer that, were graduates from all adjacent-state schools included, the percentage might exceed 75 percent. (Notably, 13 of the 15 states with less than 67 percent of jobs taken by graduates of law schools in the state or in adjacent states are states with modest populations and only one law school (or no law school) -- Alaska, Delaware, Hawai’i, Idaho, Maine, Montana, Nevada, New Hampshire, New Mexico, Rhode Island, South Dakota, Vermont, and West Virginia. The other two states are Utah and Virginia. The District of Columbia also falls into this category.)
LOCATION MATTERS -- In sum then, location matters. For the vast majority of law students at the vast majority of law schools, the vast majority of reasonable employment prospects associated with going to a given law school are going to be in the state in which the law school is located or an adjacent state. In the absence of a unique or specific aspect of a law school's program that might make a particular law school very appealing, this suggests that location should matter when considering a law school, perhaps more than ranking.
For example, if a prospective student has a choice between going to a higher ranked regional law school in a state in which the student does not anticipate practicing or living (and perhaps paying more in tuition), or a lower ranked regional law school in the location in which he or she hopes to live and work professionally (and perhaps paying less in tuition), the prospective law student should give serious consideration to attending the lower-ranked regional law school in the location in which he or she hopes to live and work professionally. This will make it easier to begin networking while in law school and to facilitate employment opportunities in the region in which the student is interested in practicing law and living. (And it may help the prospective student save money if the lower-ranked regional school happens to cost less (if it is a public school, for example), or if the prospective student has a more competitive LSAT/GPA profile at the lower-ranked regional school such that the student may be eligible for a scholarship.)
[Posted by Jerry Organ]
Thursday, August 16, 2012
The initial posting I made on August 9 was based on a “composite” database consisting of information gleaned over several months from different sources – initially from law school webpages, supplemented with information from U.S.News (when LSAT or GPA datapoints were not available on webpages) supplemented more recently with information from the ABA-LSAC Guide 2013 to fill in any remaining gaps (enrollment data and some medians). At the time of posting, I had not gone back through all the data for all the schools to cross-check against the data in the ABA-LSAC Guide 2013 and eliminate any data discrepancies (although I thought I had done so for the schools listed in the chart).
A number of people have asked for the complete spreadsheet. I have now gone back and compiled the complete spreadsheet using data solely from the ABA-LSAC Guides for 2012 and 2013. I have provided the complete spreadsheet, organized alphabetically, to the folks at Law School Transparency where it is now or will shortly be available for viewing.
The macro points remain fairly consistent with a couple of small changes. Working with the 194 schools in the contiguous 48 states and Hawai’i originally included in the U.S. News and World Report database (excluding the three Puerto Rico schools), the new database using only data from the ABA-LSAC Guides for 2012 and 2013 shows the following:
PROFILES IN DECLINE -- Between 2010 and 2011, 114 law schools had a decline in their LSAT/GPA profile, 55 had an increase in profile, and 25 had a mixed profile.
ENROLLMENT IN DECLINE – Between 2010 and 2011, 142 law schools had a decline in enrollment (of which 65 had a decline of 10% or more), 29 had an increase in enrollment (of which 8 had an increase of 10% or more), and 23 had flat enrollment (within +/- 1% of 2010 enrollment). This means over 70% of schools had a decline in enrollment and that one-third had a decline in enrollment of 10% or more. The decline in enrollment totaled roughly 4100 students or roughly 8 percent.
ENROLLMENT AND PROFILES IN DECLINE – Most significantly, 81 schools (slightly over 40%) saw declines in enrollment and in their LSAT/GPA profiles, of which 39 schools saw declines in enrollment of greater than 10% along with declines in their LSAT/GPA profiles. These 39 schools are highlighted in the downloadable spreadsheet (2011-2010 comparison, August version, ABA-LSAC dataset, detailed LSAT and GPA, 39 schools).
(This updated chart reflects one subtraction and two additions from what was originally posted. Charleston was incorrectly included in the initial chart (resulting in 38 schools being listed) and has now been removed. Its enrollment was down only 5.4%. (In the composite dataset with which I had been working, its 2011 enrollment and profile were initially based only on full-time students, overstating the percentage decline.) Baylor and Willamette were not included in the initial chart, but are included here. Baylor’s total first-year enrollment is hard to estimate from its webpage because of three admissions cycles (fall, spring, and summer) and uncertainty about which three “count” for a given year. Willamette had a slight change in enrollment from 146 (listed on its webpage) to 141 in the ABA-LSAC Guide. This change shifted it from a decline of less than 10% to a decline of slightly more than 10%. I have apologized to Dean Abrams at Charleston for my error in including Charleston in the initial chart.)
[posted by Jerry Organ]
Wednesday, August 8, 2012
A recent posting by Paul McGreal at The Faculty Lounge and an article in the National Law Journal by Matt Leichter (discussed in July here on the Legal Whiteboard) raise issues about the enrollment challenges law schools began facing last year, are facing now, and likely will face next year. This post summarizes the comparative data for the 2010 and 2011 entering classes covering the 197 schools ranked by USNews.
PROFILES IN DECLINE -- Between 2010 and 2011, 111 law schools had a decline in their LSAT/GPA profile, 59 had an increase in profile, and 27 had a mixed profile. (A decline means that, across six possible data points -- the 75th percentile, median, and 25th percentile for both LSAT and GPA -- more scores went down than up; an increase means more scores went up than down; a mixed profile means the same number of scores went up as went down. For example, if a school had an LSAT/GPA profile in 2010 of 160/156/153 and 3.82/3.65/3.45 and an LSAT/GPA profile in 2011 of 160/156/152 and 3.83/3.64/3.43, this would be a decline in profile -- down on three data points and up on one.) The average 75th-percentile LSAT dropped from 160.2 to 159.9, while the average 25th-percentile LSAT dropped from 155.2 to 154.3. The median 75th- and 25th-percentile LSAT scores fell from 160 and 155 to 159 and 153.
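The decline/increase/mixed rule can be stated precisely as a count of movements across the six data points. A small sketch, using the example from the text:

```python
def profile_change(p2010, p2011):
    """Classify the year-over-year change across the six LSAT/GPA data points.

    Each profile is (LSAT 75th, median, 25th, GPA 75th, median, 25th).
    """
    ups = sum(new > old for old, new in zip(p2010, p2011))
    downs = sum(new < old for old, new in zip(p2010, p2011))
    if downs > ups:
        return "decline"
    if ups > downs:
        return "increase"
    return "mixed"

# The example from the text: down on three data points, up on one
p2010 = (160, 156, 153, 3.82, 3.65, 3.45)
p2011 = (160, 156, 152, 3.83, 3.64, 3.43)
print(profile_change(p2010, p2011))  # → decline
```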
ENROLLMENT IN DECLINE – Between 2010 and 2011, 141 law schools had a decline in enrollment (of which 63 had a decline of 10% or more), 30 had an increase in enrollment (of which 6 had an increase of 10% or more), and 26 had flat enrollment (within +/- 1% of 2010 enrollment). This means over 70% of schools had a decline in enrollment and that nearly one-third had a decline in enrollment of 10% or more. The decline in enrollment totaled roughly 4000 students or roughly 8 percent.
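The enrollment buckets above (flat within +/- 1% of 2010 enrollment, with a 10% threshold for large moves) can be sketched the same way; the function and the figures below are illustrative:

```python
def enrollment_change(e2010, e2011):
    """Bucket year-over-year enrollment change as described in the text."""
    pct = (e2011 - e2010) / e2010
    if abs(pct) <= 0.01:            # within +/- 1% counts as flat
        return "flat"
    label = "decline" if pct < 0 else "increase"
    if abs(pct) >= 0.10:            # flag moves of 10% or more
        label += " of 10% or more"
    return label

print(enrollment_change(200, 178))  # → decline of 10% or more
print(enrollment_change(200, 199))  # → flat
```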
ENROLLMENT AND PROFILES IN DECLINE – Most significantly, 75 schools (roughly 38%) saw declines in enrollment and in their LSAT/GPA profiles, of which 37 schools saw declines in enrollment of greater than 10% and saw declines in their LSAT/GPA profiles. These 37 schools are highlighted here -- (original chart has been deleted and replaced by an updated chart reflecting 39 schools as described in post on August 16). Four of the schools are ranked in the top-50, while the other 33 schools are relatively evenly divided between the second-50, the third-45 and the alphabetical schools. There is some geographic concentration, with five Ohio schools (plus Northern Kentucky), three Illinois schools and four of the six Missouri and Kansas schools on the list. Notably, 16 of the 37 are state law schools, several of which are relatively low-tuition schools that should conceivably fare better in the current climate in which prospective students are increasingly concerned about the cost of legal education.
FORECAST FOR 2012-- Given that LSAC has estimated a decline of roughly 14.4% in the number of applicants for fall 2012, from 78500 to roughly 67000, and given that the decline has been greatest among those with higher LSAT scores, one should anticipate further declines in enrollment and further erosion of entering class LSAT/GPA profiles for fall 2012. The admit rate will be the highest it has been this millennium, probably exceeding 75% and possibly exceeding 80% (after increasing from 55% to 71% between 2004 and 2011).
IMPACT FELT ACROSS THE RANKINGS CONTINUUM, BUT WORSE FOR LOWER-RANKED SCHOOLS -- While the decline in enrollment and in profiles was experienced across the board, it was more pronounced among lower ranked schools.
-Among the top 100 schools, 55 (over one-half) had a decline in profile, while 67 (two-thirds) had a decline in enrollment, with 27 experiencing a decline in enrollment of 10% or more. Notably, 35 schools (over one-third) saw declines in both enrollment and profile, of which 15 saw declines in enrollment of 10% or more. Overall enrollment was down roughly 6%.
-Across the bottom 97 schools, 56 saw a decline in profile, while 74 (more than three-quarters) saw a decline in enrollment, of which 36 (nearly 40%) saw a decline in enrollment of 10% or more. Notably, 40 schools saw declines in both enrollment and profile, of which 22 saw a decline in enrollment of 10% or more. Overall, enrollment was down nearly 10%.
[Posted by Jerry Organ]
Friday, July 27, 2012
A really compelling way to convey a lot of important information. I continue to be blown away by the volume of innovation I am seeing, mostly around interconnectivity. (H/T: Greg Voakes at Business Insider)
[posted by Bill Henderson]
Monday, July 16, 2012
That is the title of a just-posted essay by Catherine Rampell at the NY Times Economix Blog. She studies several years of the bi-modal distribution. It is refreshing to have a capable journalist review the data and marvel at the strange ways of our industry.
[posted by Bill Henderson]
Sunday, July 15, 2012
I created the graphic below to depict the shrinking right mode of the bi-modal distribution since its 2007 high water mark (measured in February 2008).
[Note: The difference between the mean and adjusted mean in the 2011 distribution is due to the fact that law grads who fail to report their salaries tend to have less lucrative employment; so NALP makes a prudent statistical correction -- basically a weighted average based on practice settings.]
From a labor market perspective, the class of 2007 entry level salary distribution was extraordinary and anomalous. Why? Because we can safely assume that legal ability, however it might be defined, is normally distributed, not bi-modal. So when such a distribution appears in a real labor market, something is significantly out of kilter.
Why did the entry level market become bi-modal? As the legal economy boomed from the mid-90s through the mid-00s, many large law firms (NLJ 250, AmLaw 200) were trying to make the jump from dominant regional brands to national law firms. For decades, going back to the early to mid-20th century, these firms followed a simple formula: hire the best and brightest from the nation's elite law schools. As they continued to enjoy growth, they reflexively followed that same formula. Yet, by the 2000s, the demand for elite law graduates finally outstripped supply.
This micro-level logic ("let's not tinker with our business model") produced a macro-level bidding war. This is how the right mode came to be. Yet, because it was a macro-level phenomenon, clients, led by industry groups such as the Association of Corporate Counsel (ACC), reacted by saying, "Don't put any junior level lawyers on my matters --they are overpriced." Outsourcing and e-discovery vendors have also eaten into the work that used to go to entry level lawyers. So the volume of BigLaw hiring has collapsed, hence the melting of the right mode. For a more detailed overview, see NALP, Salary Distribution Curve.
Long Term Structural Change in Big Law
That said, it is not just the entry level market that is under stress -- the fundamental economics of Big Law are also changing. Consider the chart below (from Henderson, Rise and Fall, Am Law June 2012), which shows that revenue per lawyer at AmLaw 100 firms has gone flat and moved sideways since 2007, breaking a pattern of steady growth that dates back to the pre-Am Law 100 days.
Stagnant revenue is a source of enormous worry for law firm managers. Without higher profits to distribute--and growing the top line is the usual profitability formula--their biggest producers might leave, causing a run on the bank à la Dewey, Howrey, Wolf Block, etc. So the dominant strategy now has nothing to do with entry level hiring. Rather, the goal is to keep and acquire lateral partners with portable books of business. After all, clients aren't protesting the value of most senior level lawyers. And senior lawyers are plentiful, thanks to the excellent health of baby boom lawyers and the poor health of their retirement accounts.
This strategy may work fine for this fiscal year, but over the middle to long term, BigLaw is going to get older and dumber. Further, this dynamic produces substantial ripple effects on legal education -- albeit ripple effects that feel like tremors.
The long term solution -- for both law firms and law schools -- is for the price of entry level talent to come down to the point where young lawyers are cost-effective to train. And that price point is not $160,000. This inflated pay scale (which has supported ever higher tuitions at law schools) only persists because large firms are deathly afraid of adjusting their salary scales and being labeled second rate. So the solution is to keep entry pay high but hire very few law school graduates. This is not a farsighted or innovative business strategy.
It's been 100 years since law firms last engaged in sophisticated business thinking. That last great idea was the Cravath System, a method of workplace organization that performed expert client work while simultaneously developing more and better human capital. See Henderson, Three Generations of Lawyers: Generalists, Specialists, Project Managers. According to the Cravath Swaine & Moore firm history, published in 1948, the whole point of the Cravath System was to make "a better lawyer faster."
I think the next great model for a legal service organization (law firm may not be the right term) likewise will be based on the idea that there is a large return to be had by investing in young lawyers. As my friend Paul Lippe likes to say, "When it appears, it will look obvious."
[posted by Bill Henderson]
Thursday, July 5, 2012
This is a simple question of great practical importance to many law schools, yet very few law school administrators understand how to answer it. Who would have thought that clarity would be supplied free-of-charge by an underemployed recent law school graduate?
But that is what is happening now, in "Tough Choices Ahead for Some High-Ranked Law Schools," an Am Law Daily essay written by Matt Leichter, one of the silver linings of the declining legal job market -- and there aren't too many. Matt holds a J.D.-M.A. in law and international affairs from Marquette University; he passed the New York bar in 2008, finished his master's work in 2009, and then moved to the Big Apple as the bottom was falling out of the entry level market. Unable to find conventional legal employment, Matt started doing freelance writing on law-related topics.
With plenty of time on his hands, Matt turned his graduate-level quantitative skills to the task of analyzing a law school education market that seemed unsustainable. Matt first put his analyses on display at the Law School Tuition Bubble. His writings eventually attracted the attention of The American Lawyer, which has now published several of his data-driven essays.
Here is what sets Matt apart.
- He digs very deep for facts and, in turn, uses one of his biggest assets -- time -- to build datasets that answer important and relevant questions.
- He is non-ideological. Just facts and factual analysis.
- He writes about complex technical stuff in an accessible, credible way
Matt has all the core skills of a truly great lawyer. Because the conventional market had no takers, the entire legal education establishment now benefits from Matt channeling his time, energy, and considerable intellect into relevant topics crying out for dispassionate analysis.
His "Tough Choices" essay is a real gem. Here is the bottom line: this year's applicant cycle likely will deliver its greatest blow to US News Tier 1 schools that generally admit students who were angling to get into even higher ranked schools. This inference can be teased out of the ratio of applicants to offers (selectivity) and offers to matriculants (yield).
To conduct this analysis, Matt had to cull data, school-by-school, from several years of the ABA-LSAC Official Guide to Law Schools (aka "the Phonebook"). But it enables him to produce the chart below:
What this chart says is that admissions officers have to read more applications and make more offers to fill their entering classes. Based on the data in Matt's chart, in 2004, for all ABA-accredited law schools, there was a 24% acceptance rate, and a 31% yield from those offers. In 2010, the acceptance rate went up to 31% (schools were being less selective) and the yield went down to 25% (fewer showed up to enroll).
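Matt's two ratios are straightforward to compute. The aggregate counts below are made up to reproduce the rates quoted above; only the formulas are meaningful:

```python
def selectivity_and_yield(applications, offers, matriculants):
    """Acceptance rate = offers / applications; yield = matriculants / offers."""
    return offers / applications, matriculants / offers

# Invented aggregate counts chosen to match the quoted 2004 and 2010 rates
for year, apps, offers, matrics in [
    (2004, 100_000, 24_000, 7_440),
    (2010, 100_000, 31_000, 7_750),
]:
    accept, yld = selectivity_and_yield(apps, offers, matrics)
    print(f"{year}: accept {accept:.0%}, yield {yld:.0%}")
```

The two ratios move in opposite directions between 2004 and 2010: schools accept a larger share of applicants, and a smaller share of those accepted actually enroll.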
Applicant volume may be declining, but the trends above suggest that there is a lot more "competitive shopping" going on. Why? Because information costs are going down and prospective students are adapting. And this year is bound to be the most aggressive year ever. According to this NLJ story, It's a Buyers' Market for Law School, virtually every student is now negotiating for scholarship money.
Declining applicant volume, shifting yields, and highly informed consumers make it very difficult for law school administrators to lock in their LSAT and UGPA numbers, which schools generally fixate on because of the U.S. News rankings. This produces pain in one of three ways:
- The school shrinks the entering class (announced by at least 10 schools), which severely tightens the budget
- The school buys its class through financial aid, which blows a hole in the budget (happening here)
- The school significantly relaxes the LSAT and UGPA and braces for a drop in the rankings because its peers are pursuing strategies #1 or #2.
#1 and #2 may seem like the prudent course, but a central university won't (more likely can't) provide a financial backstop for more than a year or two, if that. If the admissions environment does not change dramatically, which seems unlikely, some combination of layoffs, rankings drop, or closures will have to be put on the table.
Matt's ingenuity is on full display when he demonstrates, with data, the profile of the most vulnerable schools -- and it's a far cry from the bottom portion of the U.S. News rankings.
- Low accept/high yield (think Yale and Stanford) are safe.
- High accept/high yield are also fine. They are nonprestigious but have strong regional niches or missions. Tier 3 or 4 designation means nothing.
- Low accept/low yield crowd -- a bunch of Tier 1 schools -- are vulnerable to significant rankings volatility. If they drop, next year's applicant volume will be affected, making it very difficult to rebound.
- High accept/low yield are the most likely to close.
Until August and September, when the wait lists finally clear, nobody really knows the depth of the market shift. Only then can the budget holes be finalized. Deans will then have candid conversations with their central administrations to answer the question, "Is this downward trend permanent?"
[posted by Bill Henderson]
Monday, July 2, 2012
[by Bill Henderson, originally published in The National Jurist, January 2011 (PDF)]
Over the last three years, the majority of my research has focused on lawyer competencies, or what I prefer to call lawyer success factors. This research has fundamentally changed my perceptions of legal education, primarily because the majority of success factors are not taught, assessed, or measured during law school. It is not that we law professors are deliberately ignoring something important. Rather, we are not even aware that something beyond legal knowledge and technical skills is necessary for success.
Based upon my own observation, and a fair amount of time sifting through data, I think the single best predictor of both success and satisfaction as a lawyer is the ability to become truly client focused. Unfortunately, this client-focused mindset is completely absent in the large law school classes that are the core of the law school curriculum.
[by Bill Henderson, originally published in The National Jurist, September 2011 (PDF)]
Every year as the on-campus interview process gears up, another class of high performing law students fret over their chances of getting an offer from a cadre of firms that, a year earlier, they had never heard of. The thought process goes something like this. “Oh, these types of firms pay a lot of money. And among these firms, some are harder to get hired at than others – they are more prestigious. If I can land a job at one of these firms, the entire legal world will know that I am smart. That would feel great. And I can quickly pay off my student loans and keep my options open.”
Money and peer pressure are a potent mix. They make it virtually impossible to remember the original reason for applying to law school.
During the dot.com bubble of the late 1990s, I was a student at the University of Chicago Law School. In the classroom, I was deeply intimidated by my classmates. But as we ploughed through the OCI process, I was astonished to see my fellow students anguishing over Skadden versus Latham. Or gnashing their teeth that they might have missed the Chicago grade cutoff for Gibson Dunn. Weren’t these firms more alike than they were different? And what made them so great beside the relative difficulty of securing a callback?
The prevailing analysis during OCI seemed shallow and bereft of reliable facts. We were taking our cues from each other. Yet, I could sense my own irrational desire to compete and win. I wish I could say that I was smarter than my classmates. But that’s not true. I was just older, and life had already thrown water on my face.
[By Bill Henderson, originally published in The National Jurist, January 2012 (PDF)]
Many law students spend their 1L year fearing that they might be the admissions mistake. I was one of them. The only feedback is what can be gleaned from the professor-student dialogue. In turn, everyone uses this information (if you can call it that) to handicap their likelihood of making law review or otherwise getting the grades needed to get the most coveted jobs. The whole process seems very binary: Am I smart enough to be a successful lawyer, yes or no?
When I became a law professor, my research on law firms and legal education eventually brought me to the topic of lawyer success. I started collecting examples of lawyers with sterling credentials who failed to develop a significant practice, and those with less impressive pedigrees who ended up becoming go-to experts and indispensable lynchpins of their organizations. What explained these divergent outcomes?
The research of Carol Dweck, a cognitive psychologist at Stanford University, provides some important insights into this question. Before delving into these insights, however, ask yourself whether the following statement is true: “A lawyer’s skill set is determined primarily by innate ability—you either have enough or you don’t.”
[by Bill Henderson, originally published in The National Jurist, March 2012 (PDF)]
For over a century, law schools have suffered from an inferiority complex. We have masked it well, but its consequences are finally coming home to roost. Like most psychological conditions, our lives will be much better and healthier when we deal with its root cause. Further, when law students understand this history, they will better understand the changing nature of the legal economy. They can even help law schools with the cure.
In 1918, the renowned economist Thorstein Veblen famously quipped, “the law school belongs in the modern university no more than a school of fencing or dancing.”
Veblen, like many of his academic colleagues, believed that universities should be citadels for science-based learning and the production of knowledge. Law, in contrast, was a trade. Indeed, in the early 1900s, a substantial portion of the practicing bar had obtained their skill and knowledge through office apprenticeships. When law schools did begin to appear, they were just as likely to be proprietary law schools operating out of a local YMCA as to be part of an established university.
Despite the skepticism of the academic class, there were a host of practical reasons for universities to create (or, in some cases, acquire) a law school. First, the law was the primary occupation for many elected officials, which held out the prospect of reflected glory. (Veblen recognized this motivation, which compounded his worry.) Second, a handful of law schools at prestigious universities had begun adopting the so-called case method, which purported to find objective legal rules and principles akin to a scientist working in a laboratory. The perceived rigor of the case method provided at least a veneer of science. Third, with their large lecture halls filled with tuition-paying law students, law schools made money.
Tuesday, June 26, 2012
Just a few days ago, the ABA Section of Legal Education and Admissions to the Bar posted on its website, in downloadable spreadsheet format, the employment outcomes for the Class of 2011 (here). In a few short years, these data are going to fundamentally reshape our industry. The changes will make the industry better and stronger, but the journey to this better place is going to be painful and disorienting for all law schools—that’s right, even the elite national law schools will be affected.
This is worth explaining in very simple and concrete terms. The Class of 2011 employment data consists of 134 variables on 200 ABA-accredited law schools -- 26,800 discrete data points, which is enough to fill a phonebook. For a long time, the policy of the ABA was to do just that -- publish a phonebook of data in the form of the ABA-LSAC Official Guide to Law Schools. Well, decisions on where to attend law school are not free. They require time and effort. When a prospective student has to wade through a phonebook to assemble the data relevant to an important decision, many (most) will forgo the exercise altogether.
This has two very important effects:
- The quality of enrollment decisions goes down because, from the student perspective, the costs are too high. That’s error #1. But it is forgivable—decisionmaking is a skill taught in a top-notch legal program, not a prerequisite for applying.
- To simplify their decisions, students gravitate to the U.S. News rankings--a compact 4-page table that contains easy-to-understand comparative data.
Yes, the U.S. News rankings have serious flaws; and every year, overreliance on them produces tragic consequences in the form of excessive student debt. Now, with the ABA employment "phonebook" in spreadsheet format, those with a modicum of statistical skills and an internet connection can analyze, simplify and publish relevant statistics that will better inform the decision to attend law school.
Consider the chart below:
When I created this simple pie chart, I started with a simple premise: If I am applying to law school, my minimum hope is that nine months after graduation I will be able to obtain a full-time, permanent professional job. The phonebook has three columns of data that speak to this hope:
- Bar Passage Required Jobs, FTLT (i.e., Full-time, Long-Term)
- JD-Advantage Jobs, FTLT
- Professional Jobs, FTLT.
All the other myriad data columns, parsing things by part-time, short-term, non-professional, unemployed, unknown, etc., do not meet the minimum hope. So they are lumped together as “Other Outcomes.” Clearly, for 1/3 of the Class of 2011, their full-time, permanent professional ambitions have not yet materialized.
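For readers who want to reproduce this split themselves, here is a minimal Python sketch. The column names and figures are hypothetical stand-ins--the actual headers in the ABA spreadsheet may differ:

```python
# Hypothetical column names; the actual ABA spreadsheet headers may differ.
FTLT_COLUMNS = [
    "Bar Passage Required FTLT",
    "JD Advantage FTLT",
    "Professional FTLT",
]

def outcome_split(row):
    """Split one school's graduates into jobs meeting the 'minimum hope'
    (full-time, long-term professional work) versus 'Other Outcomes'."""
    total = int(row["Total Graduates"])
    ftlt = sum(int(row[col]) for col in FTLT_COLUMNS)
    return ftlt, total - ftlt

# Example with made-up numbers for a single school:
ftlt, other = outcome_split({
    "Total Graduates": "300",
    "Bar Passage Required FTLT": "160",
    "JD Advantage FTLT": "25",
    "Professional FTLT": "15",
})
# 200 graduates meet the minimum hope; 100 fall into "Other Outcomes".
```

The same function, applied to each school's row of the spreadsheet, generates the figures behind both charts.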
A reasonable next question is how these figures vary by U.S. News rank. The answer is reflected in the chart below.
- Least surprising. The outcomes are better at “national law schools”--almost 90% are FTLT professional jobs. I used the T14 cutoff because the composition of this group has not changed in two decades of U.S. News rankings. Few people would disagree that these schools have strong pull among legal employers.
- Most surprising. There is a whole lot of Purple--i.e., “Other Outcomes”--throughout Tiers 1, 2, 3, and 4. For over 90% of law schools, this is a very challenging legal market.
- Biggest reality check. The JD Advantage and Professional jobs appear to be, on balance, less desirable than those requiring bar passage. Their relative proportion nearly triples as we move from the T14 to Tier 1 to Tier 2. Many are likely compromise jobs—not as good as practicing law, but better than non-professional alternatives.
With these relative benchmarks in place, a prospective law student can look for law schools that are outperforming their U.S. News rankings. And there are quite a few.
For example, in Tier 4, St. Mary’s (Texas) has 78.3% Bar Passage Required placement; Mississippi College’s figure is 75.3%; and Campbell (NC) is 71.4%. These schools aren’t feeding BigLaw, but their graduates appear to be full-time practicing lawyers nine months after graduation. What accounts for their success? Most of their graduates are probably in cities and towns far away from corporate practice. Nonetheless, these schools have a clear niche they are filling.
In contrast, there are 20 schools in Tiers 1 and 2 that have less than 50% Bar Passage Required jobs. What do they have in common? Many are in big cities in the Northeast and Mid-Atlantic or California--large urban markets that are attractive for young professionals. It is likely that too many young lawyers are chasing after a finite set of legal jobs. It is worth noting, however, that these same schools also have rates of placements in JD Advantage and Professional jobs that are higher than other law schools at statistically significant levels.
So many young law graduates are voting with their feet. Better to stay in the city as a non-lawyer professional than to move to the South to be a country lawyer doing small-firm practice. Although I suspect a large proportion of these grads will fare quite well, it is important to keep in mind that JD Advantage and Professional jobs are not a panacea—they are also in short supply. At most schools in Tiers 1, 2, 3, and 4, between 30% and 42% of graduates are unemployed or are underemployed in jobs that are nonprofessional, part-time, or short-term. Indeed, 4.3% (1,874) of all jobs for the Class of 2011 were funded by the law schools themselves!
So what is going to happen? Notwithstanding the heady optimism of the “Kaplan kids”, the ABA employment data, thanks to the blogosphere, is going to reduce information costs, making it easier for prospective law students to determine whether law school is a good investment. The needle is going to move, just not as fast as a Chicago School economist might predict.
Further, expect students to aggressively negotiate for scholarship money. Whether schools become more generous in merit aid, admit fewer students, or both, all signs point to shrinking budgets for law schools.
The utter transparency of a changing and stagnant legal market has potentially more dire consequences for law schools. The lifeblood of the entire legal education establishment, including elite law schools, is federal student loans. Our students get the same generous terms as graduates of medical and dental schools, who are not struggling to make six-figure incomes. The graphs above suggest that a large proportion of our students will be on Income-Based Repayment (IBR), which is – functionally – insurance in the event a high income fails to materialize in the years following graduation. The downside risk of that insurance – the lack of repayment of expected principal and interest – is borne by U.S. taxpayers.
Right now, it is possible to estimate the size and probability of this downside risk. All the Federal Government has to do is add up the shortfall between the principal and interest due under normal repayment and the monies actually being collected. What percentage of graduates are on IBR? What portion of their current principal and interest are they able to pay? These are simple numbers that some enterprising journalist will eventually request. Further, they are legitimate public policy questions that we, the legal academy, should face long before the journalists get there.
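A back-of-the-envelope version of that shortfall estimate can be sketched in a few lines of Python. The debt load, income, and poverty-line figures below are purely illustrative assumptions, not actual federal data:

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortized monthly payment on a fixed-rate loan."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Illustrative assumptions only: $120,000 of debt at 7.9% on a
# 10-year standard repayment plan, versus an IBR payment of 15% of
# income above 150% of a hypothetical poverty line.
scheduled = monthly_payment(120_000, 0.079, 10)            # roughly $1,450/month
income, poverty_line = 55_000, 11_170
ibr = max(0.0, 0.15 * (income - 1.5 * poverty_line) / 12)  # roughly $478/month
annual_shortfall = 12 * (scheduled - ibr)                  # borne by taxpayers
```

Multiply a per-borrower gap like this by the number of graduates on IBR, and the aggregate taxpayer exposure is a straightforward, answerable question.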
Lawyers and law schools are not a favored interest group on Capitol Hill. We need to plan for the extremely high probability that the financing of law schools will be dramatically altered in the years to come. The longer we wait, the more painful and disastrous the transition. Every law school will need a damn good story to justify continued federal loans. And right now, many of us lack that story – being in Tier 1 or T14 (where debt loads tend to be the highest) won't mean anything if the math falls short.
In summary, our ivory tower is crumbling. With the ABA putting the employment data in downloadable format on its website, law schools will have to do something completely new and scary to us—we are going to have to compete to keep our jobs and stay in business. The litmus test is going to be the ability of our graduates to obtain remunerative professional work in a highly competitive global economy. This is very serious work.
[Posted by Bill Henderson]
Saturday, May 19, 2012
That's the headline of a story in the Sunday edition of The Age, one of Australia's leading daily newspapers. Here is the nut:
ALMOST two-thirds of Australia's law graduates are not working as lawyers four months after they have completed their degrees, according to a study.
The Graduate Careers Australia survey of 1313 recent graduates from all over the country found that 64 per cent were not practising law between 2010 and 2011.
There was ''no way'' law firms could accommodate all the graduates from Australia's 31 law schools, La Trobe University's director of undergraduate studies, Heather King, said. ''It's a well-acknowledged fact that 40-50 per cent will not end up in a traditional law practice.''
These statistics may seem even bleaker than those that describe the U.S. legal market. Yet, for two key reasons, these Australian students are far better off. First, the Australians follow the LLB model, which has some substantial advantages. According to my Australian colleagues, a law undergraduate degree is often combined with a major in another field or discipline, such as business, accounting, sociology, or literature. So a student's commitment to law as a career is often tentative and, in many cases, hedged by another career interest. Second, higher education in Australia enjoys a large national subsidy. So law graduates typically finish with little or no debt.
Ironically, as the story reports, some Australian universities are moving toward the J.D. model, essentially concluding that law is best taught as a graduate discipline to more mature students.
I agree that students ought to have a cost-effective way to opt out of law. I also agree that law is best taught as a graduate discipline to students with some substantial life experience. I don't, however, see an easy way to cost-effectively achieve both. The fact that the solution is not easy will, conversely, make the solution quite valuable.
[posted by Bill Henderson]
Monday, May 14, 2012
[Update: I edited the review below to remove three paragraphs from my analysis. It was a metaphor that was not key to my review of Brian's book yet could be fairly viewed as insulting to readers I both respect and hoped to persuade. I am sorry about that. It was a substantial change, so I am acknowledging it here. wdh.]
Many legal academics are going to dismiss Brian Tamanaha's book, Failing Law Schools, without ever reading a page. A larger number may simply ignore it. That is ironic, because this is the response one would expect if Tamanaha's account of a corrupt, self-indulgent academic culture were true.
I have lived inside this culture since I joined the academy in 2002. And I can attest that very few people inside the academy believe that we are living the high life on the backs of our students. But in the year 2012, that perception does not matter very much. Rather, the perception that matters is the one from the outside looking in.
Over the last eighteen months or so, The New York Times, The Wall Street Journal, The Washington Post, The Atlantic, the legal press and countless blogs (many written by unhappy students) have relentlessly hammered away at law schools.
The lay public, including most practicing lawyers, are looking for a definitive account that can explain legal education's maelstrom. Tamanaha's account is a veritable Brandeis Brief on what went wrong, chock-full of facts, history, and persuasive analysis.
It begins with a deal between the ABA and AALS to join forces to persuade the state bars to restrict bar admission to graduates of ABA-accredited law schools (the ABA's goal) and thereby to elevate the stature of the legal professoriate (the AALS's goal). Once this deal was struck -- in the early 20th century -- pretty much every change accrued to the benefit of the law faculties: higher salaries, lower teaching loads, the advent of administrators to lighten the burden of governance, and more freedom to pursue scholarly interests. When the U.S. News & World Report rankings appeared in the early 1990s, law schools were forced to make choices. And our collective behavior suggests that vanity and prestige are all too likely to trump important principles like student diversity or honesty in reporting data.
For us law professors, here is our conundrum. From the outside looking in, things look bad, even corrupt. Yet we don't feel we have done anything wrong. We are certain that we lack the intent to cheat or defraud. But that, unfortunately, is error #1. As we all know, establishing intent is always a matter of circumstantial evidence. So let's review that evidence from the perspective of the neutral fact finder.
Life is objectively good for us: We have high salaries, social prestige, lots of travel, job security, and near absolute freedom to organize our time outside the three to six hours a week we teach, 30 weeks a year. Against this backdrop, there is consensus among legal employers that we are not very good at practical training including, in the eyes of many, basic legal writing. Moreover, the overproduction of lawyers creates problems for the legal profession as a whole. Similarly, our students are saddled with enormous debt and nothing we are doing curricularly seems geared to solving their burgeoning unemployment or underemployment problem. The federal government finances this "system." And through Income-Based Repayment programs, the U.S. taxpayers are backstopping our high costs.
Because law faculty seems to be getting the long end of the bargain here, our subjective feelings of honesty and rectitude are unlikely to be viewed by many students, practicing lawyers, or the broader public as credible. In fact, they may be viewed as insincere or out of touch. How did things get so badly out of kilter?
But for Tamanaha, some pesky journalists, angry students, and the ticking time-bomb of law student debt, I am confident that we law professors could coast along on our present track for another several decades. As an insider, I can honestly testify that we believe--sincerely believe--that we care about our students, the quality of their education, their debt loads, and their future job prospects. But looking at the same set of facts, history will draw its own conclusions. And Tamanaha, akin to a lawyer building a case, offers up a very compelling narrative that the dispassionate observer is likely to find convincing.
Other bloggers and news outlets have commented on Tamanaha's book, often drawing very different conclusions. Compare Brian Leiter's Law School Updates and Orin Kerr at Volokh Conspiracy (Tamanaha's argument has merit, particularly when he suggests that lower ranked law schools should consider changing their models), with Scott Greenfield at Simple Justice (here and here) (Tamanaha describes an insular, out-of-touch professoriate from the top down that disdains the input of practicing lawyers) and the Chronicle of Higher Education (subscription req'd) (describing Tamanaha's thesis, "Law schools are bloated with too many underworked, overpaid professors whose salaries are supported by tuition increases that are making law school a losing bet for many students").
What are the proper inferences to draw?
In late 2011, I reviewed a copy of Tamanaha's book as part of the peer-review process for University of Chicago Press. My primary advice to Brian, communicated directly to him as well as his editors, was "to condemn the sin, not the sinner." Legal academics may seem culpable for privileging their interests ahead of students, I said, but these are the same folks who need to be relied upon to fix the problem. (The alternative is that nearly all of U.S. legal education will collapse under the weight of high costs and fewer entry level legal jobs; and on many days, I think the latter is just as likely as the former.)
Frankly, I don't know if my "condemn the sin, not the sinner" recommendation was good advice. In order to change, the legal academy may need more pressure brought to bear from outside forces. This may happen if the legal academy is painted as more selfish, insular, elitist and out of touch than we already look now. Congress and the Department of Education hold the ultimate trump card, and Tamanaha's book provides the essential supporting evidence for radical action. If and when this happens, law faculties will be forced to pick sides.
History is now playing out right before our eyes. I believe there is a good chance that Brian Tamanaha's book will be viewed--by history at least--as a great act of courage. The implication, of course, is that the rest of us will look foolish.
Brian discusses the bleak employment prospects of law graduates, but (through no fault of his own) understates the nature of the structural change that is occurring in the U.S. and global market for legal services. In Part II, I will write about some logical next steps for law schools looking to get ahead of the coming tsunami.
[posted by Bill Henderson]