Sunday, June 30, 2013
As noted in Part I of this post, the competitive dynamics among law schools are about to change due to a combination of two factors: (1) the ABA's collection and publication of more granular data on school-level employment outcomes, and (2) the decision by U.S. News to make JD Bar Passage Required and JD Advantage jobs the primary measures for the employed-at-9-months input to its rankings formula.
The histogram below reveals a near-perfect bell curve for this revamped U.S. News input [click to enlarge]. This is a huge change from prior years, when schools were all bunched at the 95% level because employment of any kind was all that mattered. Under the old methodology, any law school that counted only full-time, professional law-related jobs would have plummeted 10 to 50 spots in the rankings.
Because spring 2013 was the first year with the new methodology, the impact of the change is not well understood. The starkest fact of the new environment is that full-time, professional law-related jobs are in short supply. Among the class of 2011 (the stats used for the 2013 rankings), this desirable outcome was achieved by only 63.0% of graduates. When we subtract out full-time, long-term law-related professional jobs funded by law schools -- a luxury that only a small number of mostly first-tier law schools can afford -- the total drops to 61.9%.
Digging deeper, some other significant patterns emerge.
The vast majority of law schools feed into the regional labor markets where they are located. In places like California, those markets are saturated.
Among the ABA-accredited law schools in California, 46.5% of the class of 2011 obtained full-time JD Bar Passage Required jobs. The comparable figure for the remaining ABA-accredited law schools was 56.0%. Likewise, there is also a disparity for JD Advantage jobs: 6.2% in California versus 8.3% for schools in all other states. In fact, among the 19 ranked California law schools, only four -- Stanford, UC Berkeley, USC, UCLA -- are above the 63.0% average for full-time, professional law-related jobs.
Based on these data, it should come as no surprise that no law school located in California went up in the 2013 U.S. News rankings. Stanford, USC, and Santa Clara hung onto their rankings, but 11 California law schools dropped, with an average decline of 11 spots. Five other California schools remained in the unranked fourth-tier category.
In contrast, some of the biggest winners in the methodology change were flagship public law schools that are relatively big fish in smaller regional markets. Students at these schools tend to stay in-state and get JD Bar Passage Required jobs at rates far higher than the 54.9% average for the class of 2011.
Below are the top 15 non-national public law schools based on the proportion of FT Bar Passage Required jobs.
Between 2012 and 2013, the average rankings gain for the above schools was +9 spots. Among this group, the only school to go down in the rankings was ASU Law (-3). And that decline was largely due to the fact that ASU reported a 98% employed-at-nine-months figure for the class of 2010 -- a figure that drew suggestions of aggressive gaming. See Brian Tamanaha, When True Numbers Mislead, Balkinization, April 2, 2012.
The heavier weighting for JD Bar Passage Required jobs also benefits a handful of lower-ranked private law schools that are practice-oriented and tend to feed smaller firms within their regional areas.
- Campbell (71.4% FT bar passage jobs) went from unranked to #126.
- South Texas (64.4% FT bar passage jobs) went from unranked to #144.
- St. Mary's (78.3% FT bar passage jobs) went from unranked to #140.
Part-Time Law Schools Dominate JD Advantage Jobs
JD Advantaged Jobs count the same as JD Bar Passage Required Jobs. But what, exactly, is included in this category? According to the ABA,
A position in this category is one for which the employer sought an individual with a J.D., and perhaps even required a J.D., or for which the J.D. provided a demonstrable advantage in obtaining or performing the job, but which does not itself require bar passage or an active law license or involve practicing law.
See ABA Class of 2012 (definitions). Many professionals enroll in law school on a part-time basis to improve their career prospects. It should be no surprise, then, that schools with part-time programs tend to be the largest producers of graduates with full-time JD Advantage jobs. In many cases, it is the full-time job that the student held during law school -- and presumably retains upon graduation -- that confers the advantage.
Of the top 10 schools based on the percentage of full-time JD Advantage jobs, eight had part-time programs and the other two were located in state capitals, which tend to increase the number of opportunities related to government and public policy.
The schools listed above gained an average of 3.5 spots in the rankings, though the average is pulled down by the inclusion of Southwestern, which had to weather the brutal California legal market.
It is worth noting that the percentage of JD Advantage jobs is negatively correlated with the percentage of JD Bar Passage Required jobs (-.33). The table below summarizes the differences between schools with part-time versus full-time-only programs.
The higher percentage of JD Advantage jobs (10.1% versus 6.9%) for schools with part-time programs is unlikely to be the result of chance, as the difference in means is statistically significant at p < .001. But what does this inverse relationship mean?
Part-time programs tend to be affiliated with lower-ranked law schools, which in turn would produce a lower average percentage of JD Bar Passage Required jobs. Yet part-time programs are also in larger, urban locations. Thus, in addition to the continued employment of part-time students with their current employers, the sheer proximity to large, specialized regional economies probably increases the proportion of JD Advantage jobs. Indeed, any school in a large metro area would be foolish to ignore the human capital needs of non-legal employers, as knowledge of the law is very helpful in navigating an ever more complex, regulated, and interconnected world.
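For readers who want to replicate this kind of analysis, the two statistics cited above -- a Pearson correlation and a difference-in-means test -- can be computed directly. The per-school percentages in this sketch are hypothetical placeholders, not the actual ABA figures:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical JD Advantage percentages for two groups of schools,
# for illustration only -- not the actual school-level data.
jd_advantage_part_time = [10.5, 9.8, 11.2, 10.0, 9.1]
jd_advantage_ft_only = [7.0, 6.5, 7.3, 6.8, 6.9]
t_stat = welch_t(jd_advantage_part_time, jd_advantage_ft_only)
```

Converting the t statistic to a p-value requires the t distribution (e.g., `scipy.stats.t.sf`); the pure-Python functions above are enough to reproduce the direction and size of the gap.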
What is the Best Strategy for Maximizing Full-Time, Professional Law-Related Jobs?
Largely through happenstance, the ABA and U.S. News have created an environment where law schools have to ask this basic but very important question. Part-time jobs will no longer cut it. And few law schools have the cash to hire their own grads full-time for a year past graduation -- and if they do, there are probably better uses for the millions of dollars needed annually to prop up a school's ranking.
The new gold standard employment outcome is full-time, long-term professional law-related jobs. The issue of how to maximize this outcome is so pressing and intricate that it may warrant trade-offs in the admissions process, favoring students with lower credentials but more rock-solid employment prospects at graduation. This is the topic I will take up in Part III.
[posted by Bill Henderson]
Friday, June 28, 2013
NALP recently released the employment outcome data for the class of 2012. The good news is that the absolute number of JD Bar Passage Required jobs went up from the prior year. The bad news is that a significantly larger class of entry-level lawyers was competing for those jobs. The class of 2011 totaled 41,623 graduates, versus 44,339 in 2012 (+2,716, or +6.5%). And note, the class of 2013 is likely to be even bigger -- roughly +1.6% based on the size of the entering 1L classes in the fall of 2010 (see ABA enrollment data).
Setting aside the year-over-year fluctuations, the trendlines suggest a relatively large and persistent shortfall in the number of full-time, professional law-related jobs. I assembled the graph below from NALP data [click to enlarge].
[Methodological notes: NALP used the JD-Preferred category until the class of 2011, when NALP and the ABA collaborated on the creation of the JD Advantage category. According to NALP, the jobs in the two categories are "largely the same." See NALP, Detailed Analysis of JD Advantage Jobs (April 2013). The figures for 2012 are estimates of full-time employment calculated from (a) NALP's just released figures for 2012 class size and the percentage breakdowns by job category, and (b) the percentage breakdowns of full-time versus part-time from the prior year, which also relied on the new JD Advantage definition. In short, basic algebra.]
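That "basic algebra" can be sketched in a few lines. The function below is my own restatement, and the category shares shown are placeholders for illustration -- only the 44,339 class size comes from the post:

```python
# Estimate 2012 full-time jobs in a category from (a) the 2012 class size
# and category share, and (b) the prior year's full-time share for that
# category, used as a proxy for 2012.
def estimate_full_time(class_size, category_share_2012, full_time_share_2011):
    return round(class_size * category_share_2012 * full_time_share_2011)

# Placeholder shares for illustration; 44,339 is the class of 2012 size.
estimate = estimate_full_time(44_339, 0.58, 0.90)
```

Applying this to each job category and summing would yield the estimated 2012 full-time totals.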
A reasonable expectation of a 3-year, $100,000+ financial commitment is that nine months after graduation, the entry-level lawyer has secured a full-time professional job. See Legal Whiteboard, June 26, 2007. Those outcomes are reflected in the blue-red-green bars above. Since 2007 (the first year that NALP collected data on full-time versus part-time employment), the percentage of jobs fitting these criteria has fallen from 85.0% to 73.9%. So the overall size of the purple bar -- part-time jobs, nonprofessional, unemployment, etc. -- has grown from 15% to 26.1%.
Unfortunately, the pain does not end there. With a limited pool of full-time professional jobs and the number of graduates trending upward, the law of supply and demand kicks in. Consider this arc of median entry-level salaries of employed graduates: $65,748 for the class of 2007, $72,000 for 2008, $72,000 for 2009, $63,000 for 2010, $60,000 for 2011, $61,245 for 2012. So, in short, the odds of landing a full-time professional job have gone down, and so has the starting pay. Yet, tuition and student debt continue to edge up. These unsustainable trends have made law schools fair game for criticism by the media and law student bloggers.
That said, a market correction is clearly underway. A considerable number of prospective law students are deciding (rationally) not to apply to law school -- from 98,700 when the class of 2007 enrolled in the fall of 2004 to an estimated 58,424 for the fall of 2013. Likewise, law schools, to the extent they can afford it, are enrolling fewer students. From the high-water mark in the fall of 2010 (49,700), law schools enrolled only 41,400 1Ls in the fall of 2012, and the numbers are sure to be even lower this fall. See Jerry Organ's estimates, Legal Whiteboard, May 20, 2013. To weather this storm, law schools are running significant deficits or drawing down their endowments.
So, can we conclude that the market correction will be complete when the relatively small class of 2017 enters the job market four years from now? I certainly think the smaller number of graduates will help. But I would argue that two things have fundamentally changed:
1. Revenues versus credentials. Law schools are struggling with the need to balance their desire to hang onto respectable LSAT/UGPA medians with a need to generate sufficient revenue to cover their operating costs. If a law school favors revenues this year, its US News rankings could drop, affecting its applicant pool in future years. On the other hand, the combination of shrinking 1L classes and lavish scholarships -- a strategy being pursued by dozens of law schools -- is unsustainable over the medium to long term. A decision to enroll fewer students this year is a three-year commitment to lower revenue. If the smaller entering class is repeated next fall, the budget pain doubles. Do it three years running, and the revenue shortfall triples. Many law schools are not trying to outrun the bear; they are trying to outrun other law schools in their regional market. Some law schools may not make it out of this trough.
2. Competition over full-time, professional law-related jobs. If there is one silver lining that has emerged from this troubled period in U.S. legal education, it is the willingness of the ABA to collect and publish more granular employment outcome data at the law school level. In turn, U.S. News has incorporated these data into its rankings formula. Law schools can no longer prop up their rankings by hiring their own students or by counting graduates who, nine months out, are working as retail managers or cab drivers: under the new 2013 U.S. News rankings formula, only full-time, long-term jobs that are JD Bar Passage Required or JD Advantage are given "full weight."
It is this second point that is going to push change in how law schools do business -- the ranking payoff is now fully in alignment with what law students want: full-time, professional law-related jobs.
Specifically, the employed-at-nine-months input to the U.S. News rankings formula is currently given 14% weight. According to the U.S. News law school rankings methodology, the magazine weights 22 of the 35 employment outcomes collected and published by the ABA. Among these 22 factors, we don't know the internal weighting. What we do know, based on the "full weight" given to JD Bar Passage Required and JD Advantage jobs, is that the highest employed-at-nine-months scores will go to law schools with the highest percentages in these two categories. This is a completely new world for law schools -- one that incentivizes what law students care about when they make the decision to enroll.
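Because the internal weights are unpublished, any reconstruction of the formula is speculative. The sketch below shows only the general shape of such a scoring scheme: the 1.0 "full weight" for the two favored categories follows U.S. News's description, while every other weight (and the sample school's shares) is invented for illustration:

```python
# Hypothetical employment-score calculation. Only the 1.0 "full weight"
# for FT/LT Bar Passage Required and JD Advantage jobs is per U.S. News;
# the discounted weights below are assumptions, not the actual formula.
def employment_score(outcome_shares, weights):
    return sum(share * weights.get(cat, 0.0)
               for cat, share in outcome_shares.items())

weights = {
    "ft_lt_bar_passage_required": 1.00,  # full weight
    "ft_lt_jd_advantage": 1.00,          # full weight
    "part_time": 0.25,                   # assumed discount
    "school_funded": 0.25,               # assumed discount
}
school = {  # hypothetical outcome shares for one school
    "ft_lt_bar_passage_required": 0.55,
    "ft_lt_jd_advantage": 0.08,
    "part_time": 0.05,
    "school_funded": 0.02,
}
score = employment_score(school, weights)
```

Whatever the real discounts are, the structure makes the point: schools maximize this input by maximizing the two full-weight categories.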
Part II to follow ...
[Posted by Bill Henderson]
Wednesday, June 5, 2013
For those trying to understand how legal education can better prepare law students for the world that awaits them, I would encourage you to take a look at the draft article that my colleague Neil Hamilton, Director of the Holloran Center for Ethical Leadership in the Professions at the University of St. Thomas School of Law, recently posted on SSRN. The article is entitled Law-Firm Competency Models and Student Professional Success: Building on a Foundation of Professional Formation/Professionalism. Here is some of the description from the abstract:
A law student who understands legal employer competency models can differentiate him or herself from other graduates by using the three years of law school to develop (and to create supporting evidence to demonstrate) specific competencies beyond just knowledge of doctrinal law, legal analysis, and some written and oral communication skills. . . .
In Part I below, this essay analyzes all available empirical research on the values, virtues, capacities and skills in law firm competency models that define the competencies of the most effective and successful lawyers. Part II examines empirical evidence on the competencies that clients evaluate. Part III evaluates the competencies that make the most difference in fast-track associate and partnership promotions. These data and analyses lead to several bold propositions developed in Part IV:
1. Law students and legal educators should identify and understand the values, virtues, capacities and skills (the competencies) of highly effective and successful lawyers in different types of practice (one major example is law firm competency models analyzed below in Part I);
2. Each student should use all three years of experiences both inside and outside of law school (including the required and elective curriculum, extracurricular activities, and paid or pro bono work experiences) to develop and be able to demonstrate evidence of the competencies that legal employers and clients want in the student’s area of employment interest;
3. Law schools should develop a competency-based curriculum that helps each student develop and be able to demonstrate the competencies that legal employers and clients want; and
4. Both law students and law schools should understand that the values, virtues, capacities and skills of professional formation (professionalism) are the foundation for excellence at all of the competencies of an effective and successful lawyer.
The article presents far more useful information than can be summarized here, and different readers may be struck by different things discussed in the article. One of the most significant takeaways for me, however, is the convergence around an array of competencies frequently not taught in law school. The article analyzes competency models used to assess associate development at 14 medium to large law firms in the Twin Cities and compares that with some other literature on competencies clients look for in attorneys. The analysis demonstrates that in addition to traditionally understood technical skills -- legal analysis, oral and written communication, and knowledge of the law -- there is significant convergence around several competencies frequently not taught in law school:
- Ability to initiate and maintain strong work and team relationships;
- Good judgment/common sense/problem-solving;
- Business development/marketing/client retention;
- Project management, including high quality, efficiency, and timeliness;
- Dedication to client service/responsiveness to the client; and
- Initiative/ambition/drive/strong work ethic.
Whether or not law schools are able to find efficient ways to offer students opportunities to develop these competencies, it is imperative that we make our students aware that they need to develop them to give themselves the greatest likelihood of professional success.
[posted by Jerry Organ]
Wednesday, May 29, 2013
There has been a bit of a flutter recently regarding law school admissions in light of data from the LSAC Current Volume Summary for May 17, 2013, suggesting that the size of the applicant pool will be larger than earlier projections had suggested. It appears that a larger number of applicants are showing up later in the application cycle than last year. This has generated blog postings on TaxProf Blog, The Faculty Lounge and Lawyers, Guns & Money. While I will be posting my projections for the fall 2013 entering class on this blog in the next couple of days, I first wanted to recap (to the extent available data allows) the situation in which law schools have found themselves as of the fall 2012 entering class.
In November, I posted a preliminary, unofficial comparison of enrollment data for 140 law schools and profile data for 128 law schools that had such information posted on their websites as of November 15, 2012. Now, several months later, I have an updated analysis based on enrollment data from 188 law schools and profile data from 173 law schools that had published on their websites sufficient profile data on which to make meaningful year-to-year comparisons as of May 28, 2013. Please note that this data remains unofficial, having been taken from law school websites, not from any ABA publication. When the ABA posts the digital version of the Official Guide in the coming weeks, I will be able to run an official comparison across all schools.
DECLINING ENROLLMENT – Between 2010 and 2012, 147 of the 188 law schools with available enrollment information (roughly 78%) had a decline in enrollment of at least 5%. Of these 147 law schools down at least 5% in enrollment, nearly half – 73 --- were down 20% or more:
- 52 of the 188 law schools with available enrollment information (nearly 28%) had a decline in enrollment of between 20% and 30%.
- 21 of the 188 law schools with available enrollment information (roughly 11%) had a decline in enrollment of 30% or more, with 11 seeing a decline in enrollment between 30% and 40% and 10 seeing a decline in enrollment of more than 40%.
Notably, only 16 schools declined between 2% and 5%, only 16 schools were flat (a change between -2% and +2%), and only 9 schools had an increase in enrollment of at least 2%. Across these 188 schools, first-year enrollment declined from 47,854 in 2010, to 44,141 in 2011, to 40,297 in 2012 -- an overall decline of 7,557, or 15.8%, between 2010 and 2012.
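A quick arithmetic check of the overall decline reported above, using the enrollment figures from the post:

```python
# First-year enrollment across the 188 schools (figures from the post)
enrollment = {2010: 47_854, 2011: 44_141, 2012: 40_297}
total_decline = enrollment[2010] - enrollment[2012]
pct_decline = 100 * total_decline / enrollment[2010]
```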
DECLINING PROFILES -- Among the 173 law schools with complete profile information available for their fall 2012 entering first-year class, the average LSAT profile has declined over the last two years, from a 160.6/158.3/155.4 to 159.8/157.2/153.8. The average GPA profile also has declined, from a 3.64/3.43/3.15 to 3.62/3.40/3.13. In addition, the number of law schools with a median LSAT in the 140s has more than doubled from 9 to 19 between 2010 and 2012.
DECLINING ENROLLMENT WITH DECLINING PROFILES – Perhaps most significantly, of the 73 law schools with declines in enrollment of 20% or more, 52 of those schools also saw a decline in their LSAT/GPA profiles between 2010 and 2012. That means roughly 30% of law schools with available enrollment and profile information for 2012 (52/173) had declines in enrollment of 20% or more and saw their LSAT/GPA profile decline. Notably, seven of these 52 law schools were in the 2012 USNews top-50, 13 were ranked between 51-100, 13 were ranked between 101-145 and 19 were in the alphabetical listing of schools. The declining interest in law school, therefore, is impacting law schools across the rankings, but is more dramatically impacting alphabetical schools than top-ranked schools.
As noted above, I am planning on posting a projection on fall 2013 first-year enrollment in the coming days. I also am planning on posting an analysis of scholarship retention information across all law schools sometime in the coming days.
Monday, May 20, 2013
This week's National Law Journal has a Special Report section on the challenges facing law schools. Karen Sloan has several stories on how law schools are finding alternative sources of revenues beyond tuition dollars for JD degrees (masters's degrees for nonlawyers, online LLMs, and lawyer executive education).
I contributed an essay entitled "The Calculus of University Presidents." Although the essay is posed as the letter I would write to a university president seeking advice on how to handle a significant, unexpected shortfall in law school revenues, the intended audience is lawyers and legal educators seeking to get a handle on the brutal economics that are now threatening the survival of a large swath of law schools.
From the perspective of many, it would be nice if things would go back to the way they used to be. But that is not going to happen. Good lawyers understand that we gain no long-term advantage from hiding from these facts. Instead, we need to confront them honestly and proactively.
[posted by Bill Henderson]
Sunday, March 24, 2013
Each year, the instructors in Indiana Law's 1L Legal Professions class coordinate with Indiana Law's Office on Career and Professional Development (OCPD) to run the Career Choices Speakers Series -- 16 lunchtime forums on Thursdays and Fridays throughout the second semester. It has been an enormous hit with students. Although our 1Ls are required to attend at least three, a huge proportion of the 1Ls attend over ten.
Below is a photo of this Thursday's pizza run for the session on Direct Service Public Interest Lawyers -- 22 pizzas and the laptop/scanner used for attendance. Over the course of the semester, we will purchase well over 300 pizzas. Who pays for all of this food and equipment (plus about a dozen dinners for students and alums that occur before and after these events)? An Indiana Law alumnus who profoundly believes in the role of ethics and integrity in achieving personal and professional success in life. And he has done so quietly, behind the scenes, every year for the last five.
I thought our alum would enjoy seeing the pizza gurney. Thank you! You are opening students' eyes and helping them make better decisions, all through relationships with other lawyers.
[photo credit, 1L Dakota Scheu, via iPhone]. For additional information on this highly effective program, see my prior post, A New Tool for Lawyer Professional Development.
[posted by Bill Henderson]
Wednesday, March 13, 2013
I was at the ReInvent Law Silicon Valley event last week. Following up on Jerry's thorough remarks, I can honestly say it was unlike any legal education and lawyer conference I have ever attended (the only thing close is Law Without Walls). There is a new guard in the legal academy taking shape, and it is led -- truly led -- by Dan Katz and Renee Knake at Michigan State.
Admittedly, Dan and Renee's approach plays to my bias. Most of us law professors talk. Dan and Renee, in contrast, are doers. Shortly after becoming assistant professors, they each moved quickly from ideas to action to actually having the audacity to attempt to build new and relevant institutions. Moreover, they both did it untenured -- Dan is only in his second year of teaching and Renee just cleared the tenure hurdle earlier this year. They did all of this without a net. To my mind, they are winning the "Game of Life." If other junior faculty follow their example, the legal academy is going to truly change. And right now, that is what we need.
One of my favorite Paul Lippe quotes is this: "In hindsight, the new solutions are all going to look obvious." ReInvent Law was 40 speakers tied together by a common interest in experimentation. Were all the ideas good? If history is any guide, and the criterion is moving from concept to implementation to financial and institutional sustainability, the answer is surely no. But it was invigorating to be in a room of doers who are all willing to risk failure. That is the courage and leadership we need right now. To me, it looked obvious that we need a place like ReInvent Law where insurgent ideas can be expressed with enthusiasm, even if only a handful or fewer will transform the legal landscape.
I was fortunate to be one of the presenters. Dan Katz was kind enough to take my picture when I gave my TED-style talk (all the talks were TED-style or "Ignite"). If you zoom in on me, I look ridiculous. I am no showman. But you have to admit that the lighting is pretty spectacular. The green screen, by the way, is the running Twitter feed, an idea that I can assure you was not stolen from the ABA or the AALS.
Amidst all these "revolutionary" ideas, I think my presentation was probably the most conservative. My central claim is that 100 years ago, as the nation struggled to find enough specialized lawyers to deal with the rise of the industrial and administrative state, some brilliant lawyers in cities throughout the U.S. created a "clockworks" approach to lawyer development. These clockworks filled the enormous skills and knowledge gap. Firms like Cravath, Swaine & Moore, through their "Cravath System," finished what legal educators started. (I use the Cravath System as my exemplar because its elegant business logic was written out so meticulously in the firm's 3-volume history.)
The whole purpose of the clockworks was to create a "better lawyer faster." This is a quote from volume II. The company I co-founded, Lawyer Metrics, incorporated it into our trademark -- the value promise is that compelling. See the slides below.
Here is the Slideshare description:
The original Cravath System circa 1920 demonstrated the power of a "clockworks" approach to lawyer development. The system was a meticulously designed and mechanized way to create specialized lawyers who could service the needs of America's rapidly growing industrial and financial enterprises -- lawyers who were in perennial short supply because the requisite skill set could only be learned by doing. The System endured for a century because it solved the specialized lawyer shortage by making every stakeholder better off -- junior lawyers (received training), partner-owners (large, stable profits), and clients (world class service and value).
Today's legal employers and legal educators would benefit by revisiting this system's powerful business logic. The clockworks approach to lawyer development still works. The only difference is that the specifications for a great lawyer have changed. Like the original Cravath System, a new clockworks would create a "better lawyer faster."
[posted by Bill Henderson]
Wednesday, February 13, 2013
My previous post on Washington & Lee's 3L Program stirred a lot of interest and commentary, including some disbelieving critics. Fortunately, Professor Jim Moliterno agreed to write a reply essay, below, that completes the cycle. [Bill Henderson]
Jim Moliterno Replies [This is a long reply, so a PDF version online here]
A number of comments to Bill’s January 28 post and posts regarding it on other blogs cause me to enter this conversation.
Are students really coming to W&L because of the new curriculum? Yes, to a significant extent. How do we know? Because the entering students say so. As do many law schools, we administer a questionnaire to our enrolling students. Among the questions asked is the obvious one: why are you here?
In the most recent such survey the students were asked to rank the strengths of the law school. Here are the top ten, in order, according to the entering students:
- Third Year Curriculum
- Ranking / Prestige
- Quality of Life
- National Reputation
- Job Placement
- General Curriculum
- Clinical Program
- Financial Aid Award
- Size of Lexington
The curriculum reform was first. Financial aid awards were 9th, just ahead of the “size of Lexington.” The data does not support the unsubstantiated claims of some bloggers that students are choosing W&L because of the generosity of financial aid awards.
The curriculum reform has steadily moved higher on the “strength” rankings given by enrolled students since 2009. The 2011 and 2012 surveys are nearly identical, and the written comments of students about their reasons for coming to W&L (none reprinted here), are more striking than the numbers themselves.
I don't know of any better data on this proposition than the statements of those whose reasons are under study. If that data is unsatisfying to some, then they will continue to be unsatisfied.
Are there other reasons students come to W&L? Of course. W&L has a highly productive, highly visible faculty engaged in scholarship and projects at the highest levels. Some students undoubtedly value W&L’s faculty prowess. W&L is highly ranked. Some students undoubtedly are affected by a top 25 ranking. It has an excellent reputation as a small, closely-knit academic community. Some students select for the sense of community and size. No reason will ever be the only reason for prospective students to choose a law school. Changes made by law schools will affect student choices for or against a particular law school. The W&L curriculum reform is positively affecting a significant number of students’ calculus about choosing W&L.
And some do come because of the financial aid package they were offered. But the financial aid reason is unlikely to explain the increase in applications since 2008. Some students, the recipients of aid, undoubtedly come in part because of the aid. That is no different than the students who choose [insert name of any school] because of the financial aid they were awarded. In 2012, about the same number of offers of admission were made as in previous years, but instead of the usual 130 or 135 admittees choosing to attend, more than 260 made deposits. Some were asked to defer their attendance until 2013 and once the dust settled we had a class of 187 instead of the usual 130 to 135. This same class entering in 2012 listed the curriculum reform first and financial aid ninth as strengths of the law school.
What else was happening in 2008 and 09 when the applications increased by nearly 33% per year?
In 2009 and 10, while W&L applications were on the rise, the US News ranking fell from 25 to 34 (while its reputation rank among academics stayed steady). It has now recovered to 24. If anything, that should have led to a drop in applications during 2008-2011 rather than the sharp increases that actually occurred.
Can we exclude all other possible explanations than those previously mentioned? Of course not. It could be that being in a small, beautiful mountain town is all the rage among young adults and 33% more students want that now than wanted it in 2007. I know of no data to prove or disprove that proposition, so it remains one that could be true. The reality is that the students who have come in recent years rate the curriculum reform among the top reasons (often the most important reason) for their attendance at W&L. That matters.
There is empirical evidence that the W&L curriculum reform is engaging students more than the traditional “no plan” third-year curriculum did. Is it perfect evidence? Of course not. Is it definitive evidence that has no flaw? Of course not. Is anything ever supported by perfect, definitive evidence that has no flaw? Not to my knowledge. We make all of our most important decisions in life based on the best available evidence. As long as the evidence is empirically sound and statistically significant, it is worthy of respect. The evidence of W&L 3L engagement increases is sound and statistically significant and marks a path toward further research and verification.
One commenter suggested that the data is suspect because the peer schools have not been identified. Their data belongs to them, not W&L. LSSSE does not make specific school data available to other schools. So W&L has only a composite score for those peer schools. And it would be unseemly for W&L to reveal the specific schools. I will not do so here. But to be sure, W&L asked LSSSE to calculate the data from a list of schools because they are the schools with whom W&L competes for students and competes in the rankings. It would not have served W&L’s research interests to learn how it compares with a list of schools that it does not compete with in the marketplace. No one at W&L has the data for any specific school.
Nonetheless, do not be mistaken, the schools with whom W&L is compared in LSSSE data are the schools anyone would expect them to be: schools that by their geography, rank and quality compete with W&L in the relevant markets for students and placement.
One observation: in the legal profession and legal education in particular, the status quo never seems to need empirical justification. Only change is suspect and wrong until proven definitively to be otherwise. Is there any empirical evidence that the status quo third year is the best possible third year except that it has been done that way for a long time? None that I know of. The old adage, “if it ain’t broke don’t fix it” does not apply here. The third year of legal education is “broke”.
Amid calls for its abandonment by some, dating back at least to Paul Carrington's report in the early 1970s, the third year is widely acknowledged to be the least valuable of the three years. (See below on W&L’s largely unchanged approach to years 1 and 2.) The Roman Legions (and more than a few other military powers) discovered that the mere fact that something has been done successfully before is not sufficient evidence that it will prevail in the present or future. Arguing in favor of the status quo with no empirical evidence -- based only on instinct and the claim that this is the way things are currently done -- is an approach doomed to failure. Just ask Kodak. (And see my forthcoming book: “The American Legal Profession In Crisis,” Oxford, March 2013.)
How about the claim that “[W&L’s LSAT has] gone down every year since [the new curriculum was announced], while its GPA rank has, after a plunge, more or less returned to where it was”? The blogger made that claim, once again, without any data, let alone empirically credible data. In fact, the W&L median LSAT held steady at 166 from 2005-2010, dropped two points to 164 in 2011, and stayed at 164 for 2012. It has not “gone down every year since [the new curriculum was announced in 2008].” Meanwhile, the GPA of entering classes, which was in the 3.4-3.5 range in 2008-2010, rose to the 3.6 range (3.65 and 3.62) in 2011 and 2012. The two modest changes in LSAT and GPA have essentially offset one another in US News points. Hardly the reason for pause suggested by the blogger.
It seems that as long as someone is arguing against change, no rules apply to the arguments’ underpinnings.
Here is what the empirical evidence from the LSSSE surveys shows and what it does not show: students are more engaged in their work, and their work includes more writing, more collaboration, and more problem solving. Here are a few charts even more striking than those Bill used in his post. Together they say that, significantly more than their peers or their predecessors at W&L, current third-year students are working more, writing more, collaborating more, applying law to real-world problems more, and preparing for class more often. Overall, they describe a harder-working, more engaged student body. And they are working harder to acquire the skills that matter to success as a lawyer.
February 13, 2013 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, Innovations in legal education, New and Noteworthy, Scholarship on legal education, Structural change
Tuesday, January 29, 2013
Here it is in a nutshell. There is empirical evidence that Washington & Lee’s experiential 3L curriculum is delivering a significantly better education to 3L students—significantly better than prior graduating classes at W&L, and significantly better than W&L’s primary competitors. Moreover, at a time when total law school applicants are on the decline, W&L is getting more than its historical share of applicants and getting a much higher yield. When many schools are worried about revenues to survive next year and the year after, W&L is worried about creating the bandwidth needed to educate the surplus of students who enrolled in the fall of 2012, and the backlog of applicants that the school deferred to the fall of 2013.
[This is a long essay. If you want it in PDF format, click here.]
Alas, now we know: there is a market for high-quality legal education. It consists of college graduates who don’t want to cast their lot with law schools that cannot guarantee students entrée to meaningful practical training. Some might argue that W&L is not objectively better -- that the 3L curriculum is a marketing ploy where the reality falls well short of the promotional materials and that, regardless, prospective students can't judge quality.
Well, in fact, there is substantial evidence that the W&L 3L program delivers comparative value. The evidence is based on several years' worth of data from the Law School Survey of Student Engagement (LSSSE). I received permission from Professor James Moliterno, who took a leadership role in building W&L’s third-year program, to share some of the key results (each school controls access to its LSSSE data). They appear below.
But before getting into empirical evidence, I want to put squarely on the table the most sobering finding that likely applies to virtually all of legal education. It is this: On several key LSSSE metrics, W&L has made impressive gains vis-à-vis its own historical benchmarks and its primary rival schools. But even for this leader, there remains enormous room for improvement. More on that below.
Here is the bottom line: Traditional legal education, when it is measured, does not fare very well. Yet, as W&L shows, substantial improvement is clearly possible. We law professors can respond to this information in one of two ways:
- Don’t measure, as it may disconfirm our belief that we are delivering a great education.
- Measure—even when it hurts—and improve.
I am in the second camp. Indeed, I don’t know if improvement is possible without measurement. Are we judging art work or the acquisition of key professional skills needed for the benefit of clients and the advancement of the public good?
Moving the Market
I doubt I will ever forget Jim Moliterno’s September 2012 presentation at the Educating Tomorrow’s Lawyers (ETL) conference at the University of Denver. He presented a single graph (chart below) showing W&L actual applicant volumes since 2008 versus what would have happened at W&L if its applicant volume had followed national trends.
While law school applications crested nationally a few years ago, W&L enjoyed a large run-up in applicant volume, presumably due to the launch of its new 3L program. This larger applicant pool effectively served as a buffer when applicant declines began in 2011 and 2012. Since 2008, overall law school applicants are down 19%, yet W&L is up 33%.
But much more significantly, after the experiential 3L year was up and running and the overall legal job market continued to stagnate, W&L's yield spiked. Ordinarily the school would enroll 135 students. But for the fall of 2012, it received enrollment commitments from well over 260 students. Indeed, at the ETL conference Jim Moliterno said the school had to offer financially attractive deferments to get the class down to approximately 185 incoming students -- a 50-student bulge.
When Jim Moliterno showed the above graph and explained the corresponding changes in yield, my good friend Gillian Hadfield, a skeptical, tough-minded, evidence-demanding economist who teaches at USC Law, leaned over and said to me, “that is the single most important takeaway from this entire conference.” I agreed. The market for a legal education with practical training is, apparently, much more inelastic than the market for traditional JD programs.
Yet, what is perhaps most remarkable is that a large proportion of incoming students at W&L were enrolling based on little more than faith. Nobody knew for sure if W&L had the ability to pull off their ambitious 3L curriculum. The program relies on a large cadre of adjunct professors, after all, and W&L is located in remote Lexington, Virginia. Many law faculty outside of W&L, and perhaps some inside, thought (or perhaps think) that the program could not live up to the hype. Well, as shown below, the program appears to have produced meaningful gains.
The only data-driven critique anyone can muster is that the gains remain significantly short of perfection. But that critique bites harder on the rest of us. To use a simple metaphor, W&L is tooling around in a Model T while the rest of us rely on horse and buggy. What ought to be plain to all of us, however, is that, just like the automobile industry circa 1910, we are entering a period of staggering transformation that will last decades. And the transformation will be roughly equal parts creation and destruction. See Schumpeter.
W&L Data, Internal Historical Benchmark
LSSSE is a phenomenally rich dataset – nearly 100 questions per year on a wide variety of topics related to student classroom experience, faculty interaction, type and quantity of assessments, time allocation, and perceived gains on a variety of dimensions related to personal and professional development. The survey instrument is online here.
Aside from a host of questions related to demographics, career goals, and debt, major sections in the LSSSE include:
- Section 1, Intellectual Experience (20 questions)
- Section 2, Examinations (1 question)
- Section 3, Mental Activities (5 questions)
- Section 4, Writing (3 questions)
- Section 5, Enriching Educational Experiences (9 questions)
- Section 6, Student Satisfaction (7 questions)
- Section 7, Time Usage (11 questions)
- Section 8, Law School Environment (10 questions)
- Section 9, Quality of Relationships (3 questions)
- Section 10, Educational and Personal Growth (16 questions)
W&L deserves to be a detailed case study. But frankly, legal education can’t wait. So I will do the best I can to cover the landscape in a blog post. I hope every law faculty member who reads this post makes a strong plea to their dean to enroll in LSSSE. Why? So your school can benchmark itself against the detailed LSSSE case studies that are bound to flow out of W&L and other innovative law schools. Though they don’t get much press, there are, in fact, other innovative law schools.
Friday, January 18, 2013
Brian discusses the bleak employment prospects of law schools, but (through no fault of his own) understates the nature of the structural change that is occurring in the U.S. and global market for legal services. In Part II, I will write about some logical next steps for law schools looking to get ahead of the coming tsunami.
I tried to write Part II, but a blog post just was not up to the task. Further, I sensed that my colleagues were in no mood for half-baked solutions. There has been enormous criticism of legal education on the blogs and in the media, but very little in the way of detailed prescriptions to improve the situation. I felt an obligation to back off on the criticism and focus on solutions. So, in essence, Part II of my Tamanaha review became an article.
I just posted to SSRN an article entitled "A Blueprint for Change," forthcoming in the Pepperdine Law Review. It is both a diagnosis and a proposed solution -- a solution I am actively pursuing. Here is the abstract:
This Article discusses the financial viability of law schools in the face of massive structural changes now occurring within the legal industry. It then offers a blueprint for change – a realistic way for law schools to retool themselves in an attempt to provide our students with high quality professional employment in a rapidly changing world. Because no institution can instantaneously reinvent itself, a key element of my proposal is the “12% solution.” Approximately 12% of faculty members take the lead on building a competency-based curriculum that is designed to accelerate the development of valuable skills and behaviors prized by both legal and nonlegal employers. For a variety of practical reasons, successful implementation of the blueprint requires law schools to band together in consortia. The goal of these initiatives needs to be the creation and implementation of a world-class professional education in which our graduates consistently and measurably outperform graduates from traditional J.D. programs.
I have a large backlog of shorter articles and analyses that I have not posted because I wanted my own detailed solution in the public domain. I hope to tie all of these ideas together over the coming weeks.
Thank you, Brian Tamanaha, for writing a book that required me to think in terms of solutions.
[posted by Bill Henderson]
January 18, 2013 in Current events, Data on legal education, Data on the profession, Innovations in legal education, Scholarship on legal education, Scholarship on the legal profession, Structural change
Wednesday, November 28, 2012
In August, I posted to this blog a narrative analysis comparing the 2010 and 2011 enrollment and profile data among law schools based on the data published in the 2012 ABA-LSAC Guide and the 2013 ABA-LSAC Guide. In response to recent comments on the 2012 enrollment situation, see ABA Journal Weekly Newsletter and the discussion at The Faculty Lounge, and the further drop in LSAT test-takers in June/October 2012 recently discussed at Tax Prof Blog, I thought it might make sense to update the enrollment and profile analysis to account for 2012 enrollment and profile data, to the extent that it is available, and to offer some thoughts on 2013.
As of November 15, only 140 law schools had published enrollment data on their webpages and only 128 had published sufficient profile data on which to make meaningful year-to-year comparisons. Please note that this analysis is based on "unofficial data," having been taken from law school webpages, not from any ABA publication, and having been taken from law school webpages prior to the LSAC certification of enrollment and profile data which the LSAC is undertaking this year for the first time.
ENROLLMENT IN DECLINE – Between 2010 and 2012, only 12 of the 140 law schools were flat (a change between -1% and +1%) or had an increase in enrollment; 128 had a decline in enrollment (a decrease greater than 1%), of which:
- 89 had a decline in enrollment of 10% or more,
- 59 had a decline in enrollment of 20% or more, and
- 15 had a decline in enrollment of 30% or more.
This means over 90% of law schools for which 2012 enrollment information is available had a decline in enrollment and that more than 40% had a decline in enrollment of 20% or more.
Based on the data published in the 2012 ABA-LSAC Guide, in 2010, these 140 law schools had 33,952 first-years (68.3% of the 49,700 total 1L enrollment (LSAC matriculants)). Based on the data published in the 2013 ABA-LSAC Guide, in 2011, these 140 law schools had 31,082 first-years (68.2% of the 45,600 total 1L enrollment (LSAC matriculants)). In 2012, based on data from law school webpages, these 140 law schools had 28,380 first-years.
The decline in first-year enrollment across these 140 schools was roughly 8.45 percent between 2010 and 2011 (slightly more than the national decline of 8.25 percent), and roughly 8.69 percent between 2011 and 2012.
If enrollment at these 140 schools represents 68.25% of total first-year enrollment for 2012 (the average of 2010 and 2011), that would suggest that total first-year enrollment (LSAC matriculants) for fall 2012 may be as low as 41,500-41,600, a decline of roughly 8.8% from 2011 and a decline of roughly 16% since 2010. (The LSAC certification of enrollment and profile information may come in even slightly lower than this estimate as it is going to be based on snapshots of enrollment on October 5, 2012, which would exclude students who began classes but withdrew prior to October 5, 2012. This group of students might number a few hundred if there were one to three such students at each law school.)
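The extrapolation in the preceding paragraph reduces to a few lines of arithmetic. Here is a minimal sketch using the figures reported above (the 68.25% sample share is the stated average of the 2010 and 2011 shares):

```python
# Estimate total fall 2012 1L enrollment from the 140-school sample.
sample_2010, total_2010 = 33952, 49700   # 1Ls at the 140 schools vs. nationally
sample_2011, total_2011 = 31082, 45600
sample_2012 = 28380                      # from law school webpages

# Assume the 140 schools' share of national enrollment held at its 2010-2011 average.
share_2010 = sample_2010 / total_2010            # ~68.3%
share_2011 = sample_2011 / total_2011            # ~68.2%
assumed_share = (share_2010 + share_2011) / 2    # ~68.25%

est_total_2012 = sample_2012 / assumed_share          # ~41,600 matriculants
decline_from_2011 = 1 - est_total_2012 / total_2011   # ~8.8%
decline_from_2010 = 1 - est_total_2012 / total_2010   # ~16%
```

The estimate lands in the 41,500-41,600 range cited in the text; as noted there, the certified October 5 snapshot may come in slightly lower.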
PROFILES IN DECLINE – Between 2010 and 2012, 93 of the 128 law schools with available profile information had a decline in their LSAT/GPA profile (more indicators down than up), 23 had an increase in profile (more indicators up than down), and 12 had a mixed profile (same number of indicators up and down).
ENROLLMENT AND PROFILES IN DECLINE – Most significantly, of the 128 law schools with both enrollment and profile information available for fall 2012, 85 law schools (nearly two-thirds) saw declines in enrollment and in their LSAT/GPA profiles between 2010 and 2012.
Of these 85 law schools, 38 law schools saw declines in enrollment of greater than 20% and saw declines in their LSAT/GPA profiles. That means nearly 30% of law schools with available enrollment and profile information for 2012 had declines in enrollment of 20% or more and saw their LSAT/GPA profile decline. It also means that over 75% of the 50 law schools with declines in enrollment greater than 20% and for which 2012 profile information is available had declines in profile for 2012.
Notably, five of these 38 law schools were in the USNews top-50, 10 were ranked between 51-100, 10 were ranked between 101-145 and 13 were in the alphabetical listing of schools. The declining interest in law school, therefore, is impacting law schools across the rankings, but is more dramatically impacting alphabetical schools than top-ranked schools.
FURTHER THOUGHTS ON 2012 – According to the LSAC Volume Summary, applications to law school slid from 87,900 in 2010 to 78,500 in 2011 to approximately 68,000 for 2012 (although the 2012 numbers have not been finalized). Over the last nine years, law schools, on average, have admitted roughly 56,800 students per year, with a low of 55,500 in 2007 and 2008. The “admit” rate – which was only 56% for fall 2004 – had climbed to 71% for fall 2011. For the last several years, however, matriculants have averaged roughly 82% of admitted students. So if we did have 41,600 matriculants this fall (as estimated above), and if matriculants represented roughly 82% of admitted students, that would mean we had roughly 50,700 admitted students, the lowest number this millennium, with an admit rate of nearly 75%, the highest this millennium. (Alternatively, if matriculants declined as a percentage of admitted students, it is possible that a larger number of applicants were admitted.)
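The admit-rate arithmetic above can be checked with a short sketch (figures are those used in the post; the 82% figure is the stated recent matriculation rate):

```python
# Back out admitted students and the admit rate from estimated matriculants.
matriculants = 41600        # estimated fall 2012 1Ls
yield_rate = 0.82           # matriculants as a share of admitted students
applicants = 68000          # approximate 2012 applicant count

admits = matriculants / yield_rate       # ~50,700 admitted students
admit_rate = admits / applicants         # ~0.746, i.e. nearly 75%
```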
PROJECTIONS FOR 2013 -- June and October LSAT administrations suggest that there may be fewer than 60,000 applicants for fall 2013. There were 93,341 June/October test-takers in 2009 (for the 2010 admissions cycle) (resulting in 87,900 applicants – 94.2% of tests administered in June/October). There were 87,318 June/October test-takers in 2010 (for the 2011 admissions cycle) (resulting in 78,500 applicants – 89.9% of tests administered in June/October). There were 71,981 June/October test-takers in 2011 (for the 2012 admissions cycle) (resulting in roughly 68,000 applicants – 94.5% of tests administered in June/October).
That is a three-year average in which the number of applicants in a cycle represented roughly 92.9% of the tests administered in June/October. There were 63,003 June/October test-takers in 2012 (for the 2013 admissions cycle). If the 2013 cycle results in a number of applicants representing 92.9% of June/October test-takers, law schools can anticipate there being only roughly 58,530 applicants to law schools for fall 2013. (Notably, in the admissions cycles from 2007-2009, the number of applicants in a cycle represented, on average, roughly 111% of the June/October test-takers, so the estimate of 58,530 may understate the number of possible applicants.)
If there are only 58,530 applicants for fall 2013 (which would represent nearly a 14% decline from fall 2012 -- the third consecutive double-digit decline in applications), and if law schools admit only 50,700 of these applicants, the same as the estimate above for fall 2012, across all law schools over 86% of all applicants to law school would receive offers of admission. If 82% of these admitted students were to matriculate, that would mean a first-year enrollment for fall 2013 that once again would be around 41,500-41,600. Alternatively, if law schools remain somewhat selective and were to admit only 48,000 of the 58,530 estimated applicants, that still would be an admit rate of 82%. If 82% of those 48,000 matriculated, the first-year enrollment would decline to roughly 39,400, a decline of about 5.3% from the fall 2012 estimate set forth above.
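The two fall 2013 scenarios described above reduce to the following arithmetic (a sketch using the post's figures and its 82%-yield assumption):

```python
# Project fall 2013 applicants from June/October 2012 LSAT test-takers,
# then run the two admissions scenarios described in the text.
testers = 63003
applicant_ratio = 0.929     # three-year average: applicants per June/Oct test-taker
yield_rate = 0.82           # matriculants as a share of admits

applicants_2013 = testers * applicant_ratio        # ~58,530

# Scenario 1: admit the same ~50,700 as the fall 2012 estimate.
admits_a = 50700
admit_rate_a = admits_a / applicants_2013          # over 86%
enrollment_a = admits_a * yield_rate               # ~41,600 again

# Scenario 2: stay more selective and admit only 48,000.
admits_b = 48000
admit_rate_b = admits_b / applicants_2013          # ~82%
enrollment_b = admits_b * yield_rate               # ~39,400
decline_b = 1 - enrollment_b / enrollment_a        # ~5.3% drop vs. scenario 1
```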
There are two competing tensions law schools must weigh in making admissions decisions in a declining market – revenue and LSAT/GPA profile. Do you take the number of students you need to meet revenue projections (even if that means profile slides) or do you take a smaller number of students (and take a revenue hit) in an effort to maintain LSAT/GPA profile?
What the 2011 and 2012 classes demonstrate is that in the current market, for a large number of schools, even taking significantly fewer students did not allow them to maintain their profiles. Given that many schools already have lost significant revenue due to shrinking enrollments in 2011 and/or 2012 (for just one example, see the recent discussion of Vermont Law School in the National Law Journal), they will be hard-pressed to shrink enrollment further to maintain profiles. As a result, I think that when enrollment and profile data are evaluated in fall 2013, we will see even more widespread declines in profile than were manifested in 2011 and 2012, possibly along with some ongoing declines in enrollment. It seems likely that several more schools may experience significant declines in both enrollment and profile.
[posted by Jerry Organ]
Monday, November 19, 2012
Law schools care deeply about their academic reputation. If this were not true, my Indiana Law mailbox would not be stuffed full of glossy brochures sharing the news of faculty publications, impressive new hires, areas of concentration, sundry distinguished speaker series, etc.
Because of the timing of these mailings – I got nearly 100 in September and October – I am guessing that the senders hoped to influence the annual U.S. News & World Report Academic Reputation survey. Cf. Michael Sauder & Wendy Espeland, Fear of Falling: The Effects of U.S. News & World Report Rankings on U.S. Law Schools 1 (Oct. 2007) (reporting "increases in marketing expenditures aimed toward raising reputation scores in the USN survey"). But does it work? A recent study by Larry Cunningham (St. John's Law) suggests that the effect is, at best, decimal dust.
Glossy brochures may not reliably affect Academic Reputation, but I have uncovered four factors that are associated with statistically significant increases and decreases of USN Academic Reputation. To illustrate, consider the scatterplot below, which plots the 1993 ordinal rank of USN Academic Reputation against the 2012 ordinal rank [click on to enlarge].
Four sets of dots (Red, Blue, Orange, and Green), each representing a distinctive shared feature of the law schools in the set, tend to fall above or below the regression line. These patterns suggest that changes in USN Academic Reputation over time are probably not the result of random chance. But we will get to the significance of the Red, Blue, Orange, and Green dots soon enough.
The primary takeaway from the above scatterplot is that 2012 USN Academic Reputation is overwhelmingly a function of 1993 USN Academic Reputation. Over 88% of the variation is explained by a school's starting point 20 years earlier. Part of this lock-in effect may be lateral mobility. That is, there are perks at higher ranked schools: they tend to pay more; the teaching loads are lighter; and the prestige is greater, etc. So school-level reputations rarely change, just the work addresses of the most productive scholars. This is, perhaps, the most charitable way to explain the enormous stickiness of USN Academic Reputation.
That said, the scatterplot does not show a perfect correlation; slightly less than 12% of the variation is still in play to be explained by influences other than starting position. A small handful of schools have made progress over these 20 years (these are the schools above the regression line), and a handful have fallen backwards (those below the line).
The Red circles, Blue rectangles, Orange diamonds, and Green circles represent four law school-level attributes. The Reds have been big gainers in reputation, and so have the Blues. In contrast, the Oranges have all experienced big declines; and, as a group, so have the Greens. When the attributes of the Red, Blue, Orange, and Green schools are factored into the regression, all four are statistically significant (Red, p = .000; Blue, p = .001; Orange, p = .012; Green, p = .000) and the explained variation increases by four percentage points to 92.3%. As far as linear models go, this is quite an impressive result.
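For readers curious about the mechanics, here is a rough sketch of the kind of model described: ordinary least squares on starting rank plus four group indicator variables. The actual school attributes and data are not disclosed here, so the group memberships, effect sizes, and noise below are entirely invented; the point is only to show how adding indicators for systematically gaining or slipping groups raises the explained variation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 190                                # roughly the number of ranked schools

# Synthetic data: 2012 rank = 1993 rank + a group effect + noise.
rank_1993 = rng.permutation(n) + 1.0
group = rng.integers(0, 5, size=n)     # 0 = none; 1-4 = Red/Blue/Orange/Green
effects = np.array([0.0, -15.0, -10.0, 12.0, 18.0])   # hypothetical rank shifts
rank_2012 = rank_1993 + effects[group] + rng.normal(0, 8, size=n)

def r_squared(X, y):
    """Explained variation from an OLS fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

ones = np.ones(n)
base = np.column_stack([ones, rank_1993])                 # starting rank only
dummies = (group[:, None] == np.arange(1, 5)).astype(float)
full = np.column_stack([base, dummies])                   # plus group indicators

r2_base = r_squared(base, rank_2012)   # high: starting rank dominates
r2_full = r_squared(full, rank_2012)   # higher: the indicators add a few points
```

As in the post's actual regression, the starting rank does most of the work, and the four indicators pick up a few additional points of explained variation.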
Before you look below the fold for answers, any guesses on what is driving the Red and Blue successes and Orange and Green setbacks?
Thursday, September 20, 2012
NALP just announced that the median salary for first-year associates in Big Law has dropped from $160K to $145K. I think that is very significant. We are now back to the entry level price point of 2007.
But to my mind, there is much bigger story here. In 2011, firms of 500+ attorneys hired 2,856 entry level lawyers. In 2007, that figure was 4,745. So, after five years, Big Law is paying the same wage but hiring 40% fewer lawyers. Compare 2007 NALP Nat'l Summary with 2011 NALP Nat'l Summary.
Here is another important piece of NALP data, generated from the print versions of the July 2012 NALP Bulletin. It shows the percentage of entry level law jobs that are private practice.
Two takeaways here: (1) there is a long-term trend line showing a declining share of entry level jobs in private practice -- and private practice is the economic engine that enables law schools to exist at current tuition levels, and (2) the cliff-like dropoff in 2010 and 2011 is likely Big Law, and that hurts.
[posted by Bill Henderson]
Monday, September 3, 2012
NALP notes that for the Class of 2010 -- and the Class of 2011 -- two-thirds of all employed graduates were employed in the state in which their law school was located. This suggests location matters.
Is location important to employment results at a large number of schools? Are some law schools more national than others? Are some states more “local” in hiring than other states? The answers are yes and yes and yes.
ANALYZING SCHOOL SPECIFIC DATA -- This analysis is based on the Class of 2010 and Class of 2011 employment outcome data reported on the ABA Section of Legal Education website, excluding the law schools in Puerto Rico. This means there are 195 law schools in this analysis (if the two Widener campuses are combined).
The law schools were asked to report the three states with the most employed graduates and the number of employed graduates in each of those three states. Taking those totals as a percentage of employed graduates, and paying attention to the states identified, one can get some idea of which schools are “regional” and which schools might actually have a more “national” footprint. The simple result of the analysis is that the vast majority of schools are “regional” rather than “national.”
- For both the Class of 2010 and the Class of 2011, there were 117 law schools for which more than 67 percent of their employed graduates were employed in the state in which the law school is located.
- For the Classes of 2010 and 2011, there were 144 and 145 law schools, respectively, for which more than 67 percent of their employed graduates are located in the state in which the law school is located or an adjacent state, and 104 law schools for which more than 80 percent of their employed graduates are located in the state in which the law school is located or an adjacent state.
- There were only 46 law schools for which less than 67 percent of their employed graduates were employed in the state in which the law school is located or an adjacent state for both the Classes of 2010 and 2011.
Notably, 28 of these 46 law schools are in the USNews top-50, for which it is easily imaginable that the employment geography is much more national than regional. For many of these 46 law schools, two of the three states with the most employed graduates generally are not adjacent to the state in which the law school is located, suggesting some national reach. The three non-adjacent jurisdictions reflected most frequently should not be surprising – California, the District of Columbia, and New York. Of the other 18 law schools, nine are in the alphabetical list of schools – schools one generally would consider regional – while nine are ranked between 51 and 145 in USNews.
Perhaps most significantly, due to the incomplete nature of some of the data sets, this summary probably understates the number of law schools for which the employment outcome data suggests the law school is more regional than national. Several of these 46 law schools come in with 60% or more of their employed graduates employed in the state of the law school or an adjacent state for both years -- Boston College, Minnesota, NYU, Ohio State and Penn State – and if the data were to include graduates employed in all adjacent states, the total for these schools well might exceed 67 percent.
In sum, then, more than 76% of all law schools and more than 87% of law schools outside the USNews top-50 had more than 67% of their employed graduates in the state in which the law school is located or an adjacent state for either the Class of 2010 or the Class of 2011.
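The regional-versus-national classification used above can be sketched as a simple rule of thumb. The school records below are invented for illustration (the real inputs are each school's reported employment totals by state):

```python
# Classify a school as "regional" when the share of its employed graduates
# working in its home state (or home plus adjacent states) exceeds 67 percent.
def geographic_share(employed_total, state_counts):
    """Share of employed graduates accounted for by the listed states."""
    return sum(state_counts) / employed_total

def classify(share, threshold=0.67):
    return "regional" if share > threshold else "possibly national"

# Hypothetical schools: (total employed, counts in home + adjacent states)
school_a = geographic_share(200, [150, 20, 10])   # 0.90
school_b = geographic_share(200, [60, 40, 30])    # 0.65

print(classify(school_a))   # regional
print(classify(school_b))   # possibly national
```

Note that the ABA data only reports each school's top three states, so a share computed this way is a floor, which is why the text says the regional counts are likely understated.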
LOOKING AT STATE SPECIFIC DATA -- NALP also notes that for the Class of 2010, there are 30 states in which two-thirds or more of the jobs were taken by graduates from law schools in those states. (Jobs & JDs, Class of 2010, p. 69) Taking NALP’s state-specific data for the Class of 2010 in conjunction with the ABA’s data for the Class of 2010, there actually are 35 states in which two-thirds or more of the jobs were taken by graduates of law schools in those states or an adjacent state, and 30 states in which three-quarters or more of the jobs within the state were taken by graduates of the law schools in the state or in an adjacent state.
Again, this data likely understates the results. For example, in Arizona, Colorado, Connecticut, Maryland, Tennessee, and Virginia, roughly 65-75 percent of jobs within the state were taken by graduates of law schools within the state or an adjacent state. But with several schools in adjacent states not counted in the tallies -- because those states were not among the top three states for employed graduates from those schools -- one could infer that, were graduates of all schools in adjacent states included, the percentage might exceed 75 percent. (Notably, 13 of the 15 states with less than 67 percent of jobs taken by graduates of law schools in the state or an adjacent state are states with modest populations and only one law school or none -- Alaska, Delaware, Hawai'i, Idaho, Maine, Montana, Nevada, New Hampshire, New Mexico, Rhode Island, South Dakota, Vermont, and West Virginia. The other two states are Utah and Virginia. The District of Columbia also falls into this category.)
LOCATION MATTERS -- In sum, then, location matters. For the vast majority of law students at the vast majority of law schools, most of the reasonable employment prospects associated with attending a given law school will be in the state in which the law school is located or an adjacent state. Absent a unique or specific aspect of a law school's program that makes it especially appealing, this suggests that location should matter when choosing a law school, perhaps more than ranking.
For example, suppose a prospective student must choose between a higher-ranked regional law school in a state in which the student does not anticipate practicing or living (perhaps at higher tuition) and a lower-ranked regional law school in the location in which he or she hopes to live and work professionally (perhaps at lower tuition). The student should give serious consideration to the lower-ranked school in the preferred location. Attending it will make it easier to begin networking while in law school and to develop employment opportunities in the region in which the student hopes to practice law and live. (It may also save the student money if the lower-ranked regional school happens to cost less -- if it is a public school, for example -- or if the student has a more competitive LSAT/GPA profile there, making the student eligible for a scholarship.)
[Posted by Jerry Organ]
Thursday, August 16, 2012
The initial posting I made on August 9 was based on a "composite" database consisting of information gleaned over several months from different sources: initially from law school webpages, supplemented with information from U.S. News (when LSAT or GPA datapoints were not available on webpages), and supplemented more recently with information from the ABA-LSAC Guide 2013 to fill in any remaining gaps (enrollment data and some medians). At the time of posting, I had not gone back through all the data for all the schools to cross-check against the data in the ABA-LSAC Guide 2013 and eliminate any data discrepancies (although I thought I had done so for the schools listed in the chart).
A number of people have asked for the complete spreadsheet. I have now gone back and compiled the complete spreadsheet using data solely from the ABA-LSAC Guides for 2012 and 2013. I have provided the complete spreadsheet, organized alphabetically, to the folks at Law School Transparency where it is now or will shortly be available for viewing.
The macro points remain fairly consistent with a couple of small changes. Working with the 194 schools in the contiguous 48 states and Hawai’i originally included in the U.S. News and World Report database (excluding the three Puerto Rico schools), the new database using only data from the ABA-LSAC Guides for 2012 and 2013 shows the following:
PROFILES IN DECLINE -- Between 2010 and 2011, 114 law schools had a decline in their LSAT/GPA profile, 55 had an increase in profile, and 25 had a mixed profile.
ENROLLMENT IN DECLINE -- Between 2010 and 2011, 142 law schools had a decline in enrollment (of which 65 had a decline of 10% or more), 29 had an increase in enrollment (of which 8 had an increase of 10% or more), and 23 had flat enrollment (within +/- 1% of 2010 enrollment). This means over 70% of schools had a decline in enrollment and that one-third had a decline in enrollment of 10% or more. Total enrollment declined by roughly 4,100 students, or roughly 8 percent.
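The bucketing behind these enrollment counts can be made concrete with a short sketch. This is an illustrative Python function implementing the thresholds described above (flat means within +/- 1% of 2010 enrollment); it is my own hypothetical illustration, not the author's actual workflow, and the example enrollments are invented.

```python
def classify_enrollment(e_2010, e_2011):
    """Bucket a school's year-over-year entering-class enrollment change.

    Thresholds follow the post: "flat" means within +/- 1% of the
    2010 figure; declines/increases of 10% or more are broken out.
    Illustrative sketch only -- not the author's actual code.
    """
    change = (e_2011 - e_2010) / e_2010
    if abs(change) <= 0.01:
        return "flat"
    if change <= -0.10:
        return "decline of 10% or more"
    if change < 0:
        return "decline"
    if change >= 0.10:
        return "increase of 10% or more"
    return "increase"

# Hypothetical enrollments for illustration:
print(classify_enrollment(200, 178))  # -> decline of 10% or more (-11%)
print(classify_enrollment(300, 299))  # -> flat (-0.3%)
```

Applying a rule like this to each of the 194 schools yields the 142/29/23 split reported above.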
ENROLLMENT AND PROFILES IN DECLINE -- Most significantly, 81 schools (slightly over 40%) saw declines both in enrollment and in their LSAT/GPA profiles, of which 39 schools saw declines in enrollment of greater than 10% along with declines in their LSAT/GPA profiles. These 39 schools are highlighted here: Download 2011-2010 Comparison (August version, ABA-LSAC dataset, detailed LSAT and GPA, 39 schools).
(This updated chart reflects one subtraction and two additions relative to what was originally posted. Charleston was incorrectly included in the initial chart (resulting in 38 schools being listed) and has now been removed; its enrollment was down only 5.4%. (In the composite dataset I had been working with, its 2011 enrollment and profile were initially based only on full-time students, overstating the percentage decline.) Baylor and Willamette were not included in the initial chart but are included here. Baylor's total first-year enrollment is hard to estimate from its webpage because of its three admissions cycles (fall, spring, and summer) and uncertainty about which three "count" for a given year. Willamette had a slight change in enrollment from 146 (listed on its webpage) to 141 in the ABA-LSAC Guide, which shifted it from a decline of less than 10% to a decline of slightly more than 10%. I have apologized to Dean Abrams at Charleston for my error in including Charleston in the initial chart.)
[posted by Jerry Organ]
Wednesday, August 8, 2012
A recent posting by Paul McGreal at The Faculty Lounge and an article in the National Law Journal by Matt Leichter (discussed in July here on the Legal Whiteboard) raise issues about the enrollment challenges law schools began facing last year, are facing now, and likely will face next year. This post summarizes the comparative data for the 2010 and 2011 entering classes covering the 197 schools ranked by USNews.
PROFILES IN DECLINE -- Between 2010 and 2011, 111 law schools had a decline in their LSAT/GPA profile, 59 had an increase in profile, and 27 had a mixed profile. (Across six possible data points -- the 75th percentile, median, and 25th percentile for both LSAT and GPA -- a decline means more scores went down than up; an increase means more scores went up than down; a mixed profile means the same number of scores went up as went down. For example, if a school had an LSAT/GPA profile in 2010 of 160/156/153 and 3.82/3.65/3.45 and an LSAT/GPA profile in 2011 of 160/156/152 and 3.83/3.64/3.43, this would be a decline in profile -- down on three parameters and up on one.) The average 75th-percentile LSAT dropped from 160.2 to 159.9, while the average 25th-percentile LSAT dropped from 155.2 to 154.3. The median 75th- and 25th-percentile LSAT scores fell from 160 and 155 to 159 and 153, respectively.
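The decline/increase/mixed rule in the parenthetical above can be sketched in a few lines of Python. This is a hypothetical illustration of the stated rule, not the code actually used to build the dataset; the tuple layout (LSAT 75th/median/25th, then GPA 75th/median/25th) is my own convention.

```python
def classify_profile(old, new):
    """Classify a year-over-year change in a school's LSAT/GPA profile.

    `old` and `new` are six-tuples of (LSAT 75th, median, 25th,
    GPA 75th, median, 25th). Per the post's rule: "decline" if more
    of the six data points went down than up, "increase" if more
    went up than down, "mixed" if the counts are equal.
    """
    ups = sum(n > o for o, n in zip(old, new))
    downs = sum(n < o for o, n in zip(old, new))
    if downs > ups:
        return "decline"
    if ups > downs:
        return "increase"
    return "mixed"

# The example from the text: down on three parameters, up on one.
old = (160, 156, 153, 3.82, 3.65, 3.45)
new = (160, 156, 152, 3.83, 3.64, 3.43)
print(classify_profile(old, new))  # -> decline
```

Note that unchanged data points count toward neither tally, which is why the worked example (three down, one up, two unchanged) comes out as a decline.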
ENROLLMENT IN DECLINE -- Between 2010 and 2011, 141 law schools had a decline in enrollment (of which 63 had a decline of 10% or more), 30 had an increase in enrollment (of which 6 had an increase of 10% or more), and 26 had flat enrollment (within +/- 1% of 2010 enrollment). This means over 70% of schools had a decline in enrollment and that nearly one-third had a decline in enrollment of 10% or more. Total enrollment declined by roughly 4,000 students, or roughly 8 percent.
ENROLLMENT AND PROFILES IN DECLINE – Most significantly, 75 schools (roughly 38%) saw declines in enrollment and in their LSAT/GPA profiles, of which 37 schools saw declines in enrollment of greater than 10% and saw declines in their LSAT/GPA profiles. These 37 schools are highlighted here -- (original chart has been deleted and replaced by an updated chart reflecting 39 schools as described in post on August 16). Four of the schools are ranked in the top-50, while the other 33 schools are relatively evenly divided between the second-50, the third-45 and the alphabetical schools. There is some geographic concentration, with five Ohio schools (plus Northern Kentucky), three Illinois schools and four of the six Missouri and Kansas schools on the list. Notably, 16 of the 37 are state law schools, several of which are relatively low-tuition schools that should conceivably fare better in the current climate in which prospective students are increasingly concerned about the cost of legal education.
FORECAST FOR 2012 -- Given that LSAC has estimated a decline of roughly 14.4% in the number of applicants for fall 2012, from 78,500 to roughly 67,000, and given that the decline has been greatest among those with higher LSAT scores, one should anticipate further declines in enrollment and further erosion of entering-class LSAT/GPA profiles for fall 2012. The admit rate will be the highest it has been this millennium, probably exceeding 75% and possibly exceeding 80% (after having risen from 55% to 71% between 2004 and 2011).
IMPACT FELT ACROSS THE RANKINGS CONTINUUM, BUT WORSE FOR LOWER-RANKED SCHOOLS -- While the decline in enrollment and in profiles was experienced across the board, it was more pronounced among lower ranked schools.
- Among the top 100 schools, 55 (over one-half) had a decline in profile, while 67 (two-thirds) had a decline in enrollment, with 27 experiencing a decline in enrollment of 10% or more. Notably, 35 schools (over one-third) saw declines in both enrollment and profile, of which 15 saw declines in enrollment of 10% or more. Overall enrollment was down roughly 6%.
- Among the bottom 97 schools, 56 saw a decline in profile, while 74 (more than three-quarters) saw a decline in enrollment, of which 36 (nearly 40%) saw a decline of 10% or more. Notably, 40 schools saw declines in both enrollment and profile, of which 22 saw declines in enrollment of 10% or more. Overall, enrollment was down nearly 10%.
[Posted by Jerry Organ]
Friday, July 27, 2012
A really compelling way to convey a lot of important information. I continue to be blown away by the volume of innovation I am seeing, mostly around interconnectivity. (H/T: Greg Voakes at Business Insider)
[posted by Bill Henderson]
Monday, July 16, 2012
That is the title of a just-posted essay by Catherine Rampell at the NY Times Economix Blog. She reviews several years of the bi-modal salary distribution. It is refreshing to have a capable journalist review the data and marvel at the strange ways of our industry.
[posted by Bill Henderson]
Sunday, July 15, 2012
I created the graphic below to depict the shrinking right mode of the bi-modal distribution since its 2007 high water mark (measured in February 2008).
[Note: The difference between the mean and the adjusted mean in the 2011 distribution is due to the fact that law grads who fail to report their salaries tend to have less lucrative employment; NALP therefore makes a prudent statistical correction -- basically a weighted average based on practice settings.]
From a labor market perspective, the class of 2007 entry level salary distribution was extraordinary and anomalous. Why? Because we can safely assume that legal ability, however it might be defined, is normally distributed, not bi-modal. So when such a distribution appears in a real labor market, something is significantly out of kilter.
Why did the entry level market become bi-modal? As the legal economy boomed from the mid-90s through the mid-00s, many large law firms (NLJ 250, AmLaw 200) were trying to make the jump from dominant regional brands to national law firms. For decades, going back to the early to mid-20th century, these firms had followed a simple formula: hire the best and brightest from the nation's elite law schools. As they continued to enjoy growth, they reflexively followed that same formula. Yet, by the 2000s, the demand for elite law graduates finally outstripped supply.
This micro-level logic ("let's not tinker with our business model") produced a macro-level bidding war. This is how the right mode came to be. Yet, because it was a macro-level phenomenon, clients, led by industry groups such as the Association of Corporate Counsel (ACC), reacted by saying, "Don't put any junior level lawyers on my matters -- they are overpriced." Outsourcing and e-discovery vendors have also eaten into the work that used to go to entry level lawyers. So the volume of BigLaw hiring has collapsed, hence the melting of the right mode. For a more detailed overview, see NALP, Salary Distribution Curve.
Long Term Structural Change in Big Law
That said, it is not just the entry level market that is under stress -- the fundamental economics of Big Law are also changing. Consider the chart below (from Henderson, Rise and Fall, Am Law June 2012), which shows that revenue per lawyer at Am Law 100 firms has gone flat and moved sideways since 2007, breaking a pattern of steady growth that dates back to the pre-Am Law 100 days.
Stagnant revenue is a source of enormous worry for law firm managers. Without higher profits to distribute -- and growing the top line is the usual profitability formula -- their biggest producers might leave, causing a run on the bank a la Dewey, Howrey, Wolf Block, etc. So the dominant strategy now has nothing to do with entry level hiring. Rather, the goal is to keep and acquire lateral partners with portable books of business. After all, clients aren't protesting the value of most senior level lawyers. And senior lawyers are plentiful, thanks to the excellent health of baby boom lawyers and the poor health of their retirement accounts.
This strategy may work fine for this fiscal year, but over the middle to long term, BigLaw is going to get older and dumber. Further, this dynamic produces substantial ripple effects on legal education -- albeit ripple effects that feel like tremors.
The long term solution -- for both law firms and law schools -- is for the price of entry level talent to come down to the point where young lawyers are more cost-effective to train. And that price point is not $160,000. This inflated pay scale (which has supported ever higher tuitions at law schools) persists only because large firms are deathly afraid of adjusting their salary scales and being labeled second rate. So the solution is to keep entry pay high but hire very few law school graduates. This is not a farsighted or innovative business strategy.
It's been 100 years since law firms engaged in sophisticated business thinking. And that last great idea was the Cravath System, a method of workplace organization that performed expert client work while simultaneously developing more and better human capital. See Henderson, Three Generations of Lawyers: Generalists, Specialists, Project Managers. According to the Cravath Swaine & Moore firm history, published in 1948, the whole point of the Cravath System was to make "a better lawyer faster."
I think the next great model for a legal service organization (law firm may not be the right term) likewise will be based on the idea that there is a large return to be had by investing in young lawyers. As my friend Paul Lippe likes to say, "When it appears, it will look obvious."
[posted by Bill Henderson]
Thursday, July 5, 2012
This is a simple question of great practical importance to many law schools, yet very few law school administrators understand how to answer it. Who would have thought that clarity would be supplied free-of-charge by an underemployed recent law school graduate?
But that is what is happening now, in "Tough Choices Ahead for Some High-Ranked Law Schools," an Am Law Daily essay written by Matt Leichter, one of the silver linings of the declining legal job market -- and there aren't too many. Matt is a J.D.-M.A. in law and international affairs from Marquette University who passed the New York bar in 2008, finished his masters work in 2009, and then moved to the Big Apple as the bottom was falling out of the entry level market. Unable to find conventional legal employment, Matt started doing freelance writing on law-related topics.
With plenty of time on his hands, Matt turned his graduate-level quantitative skills to the task of analyzing a law school education market that seemed unsustainable. Matt first put his analyses on display at the Law School Tuition Bubble. His writings eventually attracted the attention of The American Lawyer, which has now published several of his data-driven essays.
Here is what sets Matt apart.
- He digs very deep for facts and, in turn, uses one of his biggest assets -- time -- to build datasets that answer important and relevant questions.
- He is non-ideological. Just facts and factual analysis.
- He writes about complex technical stuff in an accessible, credible way
Matt has all the core skills of a truly great lawyer. Though the conventional market offered him no takers, the entire legal education establishment benefits from Matt channeling his time, energy, and considerable intellect into relevant topics crying out for dispassionate analysis.
His "Tough Choices" essay is a real gem. Here is the bottom line: this year's applicant cycle likely will deliver its greatest blow to US News Tier 1 schools that generally admit students who were angling to get into even higher ranked schools. This inference can be teased out of the ratio of applicants to offers (selectivity), and offers to matriculants (yield).
To conduct this analysis, Matt had to cull data, school-by-school, from several years of the ABA-LSAC Official Guide to Law Schools (aka "the Phonebook"). But it enables him to produce the chart below:
What this chart says is that admissions officers have to read more applications and make more offers to fill their entering classes. Based on the data in Matt's chart, in 2004, for all ABA-accredited law schools, there was a 24% acceptance rate and a 31% yield from those offers. In 2010, the acceptance rate rose to 31% (schools were being less selective) and the yield fell to 25% (fewer admitted students enrolled).
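The two ratios are simple to compute. Here is a minimal sketch with hypothetical aggregate totals chosen only to reproduce the 2004 rates quoted above; the actual counts sit in Matt's school-by-school dataset and are not shown here.

```python
# Hypothetical totals for illustration -- chosen to reproduce the
# 2004 rates in the text, not the actual aggregate figures.
applications = 100_000
offers = 24_000        # admission offers extended
matriculants = 7_440   # admitted students who actually enrolled

acceptance_rate = offers / applications  # 0.24 -> 24% (selectivity)
yield_rate = matriculants / offers       # 0.31 -> 31% (yield)
print(f"acceptance {acceptance_rate:.0%}, yield {yield_rate:.0%}")
```

Under this arithmetic, a rising acceptance rate with a falling yield means schools must work through ever more applications (and extend ever more offers) to seat the same class.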
Applicant volume may be declining, but the trends above suggest that there is a lot more "competitive shopping" going on. Why? Because information costs are going down and prospective students are adapting. And this year is bound to be the most aggressive year ever. According to the NLJ story "It's a Buyers' Market for Law School," virtually every student is now negotiating for scholarship money.
Declining applicant volume, shifting yields, and highly informed consumers make it very difficult for law school administrators to lock in their LSAT and UGPA numbers, which schools generally fixate on because of U.S. News ranking. This produces pain in one of three ways:
- The school shrinks the entering class (announced by at least 10 schools), which severely tightens the budget
- The school buys its class through financial aid, which blows a hole in the budget (happening here)
- The school significantly relaxes its LSAT and UGPA standards and braces for a drop in the rankings because its peers are pursuing strategies #1 or #2.
Options #1 and #2 may seem like the prudent course, but a central university won't (more likely can't) provide a financial backstop for more than a year or two, if that. If the admissions environment does not change dramatically -- and a dramatic change seems unlikely -- some combination of layoffs, rankings drops, or closures will have to be put on the table.
Matt's ingenuity is on full display when he demonstrates, with data, the profile of the most vulnerable schools -- and it's a far cry from the bottom portion of the U.S. News rankings.
- Low accept/high yield schools (think Yale and Stanford) are safe.
- High accept/high yield schools are also fine. They are nonprestigious but have strong regional niches or missions. A Tier 3 or 4 designation means nothing.
- The low accept/low yield crowd -- a bunch of Tier 1 schools -- is vulnerable to significant rankings volatility. If these schools drop, next year's applicant volume will be affected, making it very difficult to rebound.
- High accept/low yield schools are the most likely to close.
Until August and September, when the wait lists finally clear, nobody will really know the depth of the market shift. Only then can the budget holes be finalized. Deans will then have candid conversations with their central administrations to answer the question, "Is this downward trend permanent?"
[posted by Bill Henderson]