Monday, April 17, 2017
Legal Whiteboard Ceasing Publication
On April 30th, The Legal Whiteboard will cease publication. During our 5+ years of operation, we generated some very good content, focusing on facts, trends and ideas affecting the legal industry. We made the ABA Blawg 100 in 2012 (year 1) and 2016 (year 5). In particular, some of the most widely read posts were written by Jerry Organ, who focused on legal education. Jerry painstakingly built numerous datasets to answer important questions related to conditional scholarships, the transfer market, and bar passage. It was a privilege to be associated with this work. I am personally grateful to Blog Emperor Paul Caron for providing us this platform and graciously agreeing to continue to archive our content on the Law Professor Blogs Network.
I was the primary editor who launched The Legal Whiteboard. It was also my decision to shut it down. The reason is not lack of interest in the blogging medium; in fact, the opposite is true. For the last several years, outlets that started as blogs have been siphoning off readership, and thus power and influence, from traditional media. Online publication also facilitates connections with people outside one's academic silo. For over a decade now, my online writing has connected me with numerous professionals in law firms, legal departments, bar associations, and legal start-ups. In most cases, I am trading information with my readers, collecting local experience in exchange for macro-level observations. These connections have produced countless friendships, enriched my teaching and research, and changed how I view the world.
There is a tension between what counts as serious work inside the academy (e.g., placement and citations in prestigious journals; mentions in the New York Times, etc.) and how serious people in and outside of the academy are accessing information to help them do their work. This is an observation, not a complaint. Professional and social norms evolve slowly, often to the point of feeling static. But they do evolve, and generally in the right direction.
I am shutting down The Legal Whiteboard so I can make a more ambitious investment in online publishing. For the next year or so, and perhaps longer if the experiment works, virtually all of my professional efforts outside of teaching will be focused on building an online publication for my core research on the legal industry. The publication will be called Legal Evolution.
At this point in my career, I am very interested in doing applied research, i.e., research targeted at solving practical, real-world problems. Examples of applied research include rural sociology (agricultural production), industrial/organizational psychology (worker productivity), and public health (health outcomes). Online publication drops the cost of doing this type of work while increasing its potential impact; that is a powerful reason to give it a try.
Legal Evolution will be focused on the practical problem of lagging legal productivity in a world of rapidly increasing complexity. Lagging productivity among lawyers is a serious industry-level issue because it means that solving legal problems is becoming, in a relative sense, more expensive over time. In the individual client market, more citizens are going without access to legal services. In the corporate market, heavy reliance on fee discounts is straining client-lawyer relations, as both clients and firms have yet to see that the only long-term solution is to improve productivity through better systems and more sophisticated sourcing. The second-order effects of lagging legal productivity are now impacting legal education through stagnant entry-level salaries and historically low enrollment levels. I don't think the law professoriate fully appreciates this linkage.
We lawyers and law professors lack the skills and expertise to solve the legal productivity problem by ourselves. Whatever form the solutions take, we can be 100% certain that the inputs will be multidisciplinary. Lawyers and law professors who collaborate with professionals from other disciplines will move a lot faster than those trying to stay at the top of the food chain. The ultimate goal of Legal Evolution is to accelerate this transition by curating examples of what is working in the field, including contextual knowledge to help readers make better decisions within their own organizations.
Applied research needs to be driven by theory. Legal Evolution’s editorial strategy will be grounded in the research of the late sociologist Everett Rogers, whose seminal book, Diffusion of Innovations, is one of the most cited books in all of the social sciences. The first edition of Rogers' book was published in 1962. He then spent much of his 40+ year career updating subsequent editions with ever richer examples that (a) supported a general theory of innovation diffusion, and (b) demonstrated how knowledge of diffusion theory could be used to accelerate the adoption of innovation, often for important, socially beneficial ends.
In my own career, shutting down The Legal Whiteboard feels like the end of an era, albeit a necessary one to make room for something new. In the fall of 2008, as I assembled my tenure file at Indiana Law, I remember creating a final attachment ("Attachment 7") that summarized my “internet writings.” It was a list of 216 blog posts I had published between April 2006 (when I joined the Empirical Legal Studies Blog) and Labor Day 2008. For visual effect, I created a hyperlink for all 216 posts. I can remember one of my advisors telling me that I didn’t need the summary and besides, it wouldn’t count toward tenure. I replied, “I know I don’t need it. I know it won’t count. But I am putting it in because I think this work is valuable. At some point in the future, it ought to count.”
I wrote that nearly 10 years ago. I have learned a lot since then. With some luck, maybe I can nudge legal academic norms in a positive direction.
Over the next couple of weeks, we will be reposting some of our favorite LWB stuff. After April 30th, I hope to see you on the other side. Many thanks for your readership.
April 17, 2017 in Blog posts worth reading, Current events, Innovations in law, Scholarship on legal education, Structural change | Permalink | Comments (2)
Wednesday, March 29, 2017
New Learning Outcomes Database -- A Searchable Clearinghouse of Law School Learning Outcomes
The Holloran Center for Ethical Leadership in the Professions at the University of St. Thomas School of Law (Minnesota) is pleased to announce the availability of a new, searchable, web-based clearinghouse of information regarding law school learning outcomes – the Learning Outcomes Database: https://www.stthomas.edu/hollorancenter/resourcesforlegaleducators/learningoutcomesdatabase/
The Holloran Center has compiled all law school learning outcomes that have been published and are accessible on law school websites and is making them all available in one location.
The Learning Outcomes Database is organized in three categories structured around the language of ABA Standard 302. To the extent that law schools have identified learning outcomes more robust than the minimum required by Standard 302, each category lists the full array of learning outcomes, identifies the law schools that have adopted each outcome, and notes where, within each law school’s published learning outcomes, the specific language can be found.
The database of learning outcomes also is searchable by law school.
The Holloran Center plans on doing quarterly updates. The Center will “sweep” law school websites looking for more law schools with learning outcomes and checking to see whether law schools change their learning outcomes. The Center anticipates updating the Learning Outcomes Database in May, August, November, and February. To the extent that law schools change their learning outcomes, the Center will be maintaining an archive that will allow those interested to see how law school learning outcomes evolve over time.
March 29, 2017 in Current events, Data on legal education, Innovations in legal education, Scholarship on legal education | Permalink | Comments (0)
Saturday, March 18, 2017
Revisiting the Market for Transfer Students Based upon the 2016 Standard 509 Reports
This blog posting updates my posts of December 2014 and December 2015 regarding what we know about the transfer market. With the release of the 2016 Standard 509 Reports, we now have three years of more detailed transfer data from which to glean insights about the transfer market among law schools.
NUMBERS AND PERCENTAGES OF TRANSFERS – 2011-2016
The number of transfers dropped to 1,749 in 2016, down from 1,979 in 2015, 2,187 in 2014, and 2,501 in 2013. The percentage of the previous fall’s entering class that engaged in the transfer market also dropped to 4.6%, tying 2011 for the lowest rate in the last six years. In other words, there is no reason to believe the transfer market is “growing” as a general matter. It has consistently been in the 4.6% to 5.6% range for the last six years.
| | 2011 | 2012 | 2013 | 2014 | 2015 | 2016 |
|---|---|---|---|---|---|---|
| Number of Transfers | 2,427 | 2,438 | 2,501 | 2,187 | 1,979 | 1,749 |
| Previous Year First-Year Enrollment | 52,500 | 48,700 | 44,500 | 39,700 | 38,600 | 37,900 |
| % of Previous First-Year Total | 4.6% | 5.0% | 5.6% | 5.5% | 5.2% | 4.6% |
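The transfer rate in each column is simply the number of transfers divided by the previous fall's first-year enrollment. A minimal sketch of that arithmetic, using the figures from the table above (the enrollment numbers are rounded as reported, so computed rates can differ slightly in the last digit):

```python
# Transfer rate = transfers / previous fall's first-year enrollment.
# Figures are taken from the table above; enrollment is rounded as reported.
transfers = {2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187, 2015: 1979, 2016: 1749}
enrollment = {2011: 52500, 2012: 48700, 2013: 44500, 2014: 39700, 2015: 38600, 2016: 37900}

for year in transfers:
    rate = transfers[year] / enrollment[year]
    print(f"{year}: {rate:.1%}")
```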
SOME LAW SCHOOLS CONTINUE TO DOMINATE THE TRANSFER MARKET
The following two charts list the top 15 law schools participating in the transfer market, in descending order, for Summer 2014 (fall 2013 entering class), Summer 2015 (fall 2014 entering class), and Summer 2016 (fall 2015 entering class). One chart is based on the number of transfers in; the other is based on the number of transfer students as a percentage of the prior year’s first-year class.
Note the “repeat players” in these two charts: ten of the top 15 schools have been on the list all three years, and the top six for 2016 have had fairly consistent transfer numbers in each of the last three years.
Largest Law Schools by Number of Transfers, 2014-2016
| School | Number in 2014 | School | Number in 2015 | School | Number in 2016 |
|---|---|---|---|---|---|
| Georgetown | 113 | Georgetown | 110 | Georgetown | 111 |
| George Wash. | 97 | George Wash. | 109 | George Wash. | 106 |
| Arizona St. | 66 | Arizona St. | 65 | Arizona St. | 66 |
| Idaho | 57 | Harvard | 55 | Columbia | 50 |
| Cal. Berkeley | 55 | Emory | 51 | Emory | 49 |
| NYU | 53 | NYU | 51 | UCLA | 43 |
| Emory | 50 | Cal. Berkeley | 49 | Loyola Marymount | 43 |
| Columbia | 46 | Rutgers | 45 | NYU | 43 |
| American | 44 | Columbia | 44 | Florida | 36 |
| UCLA | 44 | Miami | 44 | Houston | 36 |
| Wash. Univ. | 44 | UCLA | 43 | Harvard | 35 |
| Texas | 43 | Texas | 37 | Cal. Berkeley | 33 |
| Minnesota | 37 | American | 33 | Miami | 31 |
| Northwestern | 35 | Florida St. | 32 | American | 30 |
| Harvard | 33 | Minnesota | 31 | Florida St. | 30 |
| Total (top 15) | 817 | | 799 | | 741 |
| As % of all transfers | 37.4% | | 40.4% | | 42.3% |
Largest Law Schools by Transfers as a Percentage of Previous First Year Class - 2014-2016
| School | % 2014 | School | % 2015 | School | % 2016 |
|---|---|---|---|---|---|
| Arizona State | 51.6 | Arizona State | 45.5 | Arizona State | 30.3 |
| Idaho | 51.4 | Emory | 22.9 | George Wash. | 21.6 |
| Washington Univ. | 23.3 | George Wash. | 20.2 | Emory | 20.9 |
| Emory | 22.9 | Miami | 19.2 | Georgetown | 19.3 |
| Georgetown | 20.8 | Georgetown | 19.0 | Florida St. | 17.1 |
| George Wash. | 20.2 | Cal. Berkeley | 17.9 | Houston | 16.7 |
| Cal. Berkeley | 19.4 | Florida St. | 17.0 | Loyola Marymount | 16.0 |
| Florida St. | 18.2 | Florida Int’l | 16.7 | Southern Cal. | 14.7 |
| Rutgers – Camden | 17.1 | Minnesota | 16.1 | UCLA | 14.7 |
| Southern Cal. | 17.1 | Utah | 16.0 | UNLV | 14.2 |
| Minnesota | 16.7 | UNLV | 14.3 | Columbia | 12.9 |
| Utah | 15.9 | UCLA | 13.7 | SMU | 12.0 |
| Northwestern | 15.3 | Texas | 12.3 | Northwestern | 11.8 |
| UCLA | 15.0 | Chicago | 12.1 | Florida Int’l | 11.8 |
| Seton Hall | 14.5 | Rutgers | 12.1 | Florida | 11.6 |
Interestingly, the number of law schools welcoming transfers representing more than 20% of their first-year class has fallen from nine in 2013 (not shown), to six in 2014, then to only three in 2015 and 2016.
Nonetheless, as shown in the following chart, we are continuing to see a modest increase in concentration in the transfer market between 2011 and 2016 as the ten law schools with the most students transferring in captured an increasing share of the transfer market, from 23.5% in 2011 to 33.3% in 2016.
Top Ten Law Schools as a Percentage of All Transfers
| | 2011 | 2012 | 2013 | 2014 | 2015 | 2016 |
|---|---|---|---|---|---|---|
| Total Transfers | 2,427 | 2,438 | 2,501 | 2,187 | 1,979 | 1,749 |
| Transfers to 10 Law Schools with Most Transfers | 570 | 587 | 724 | 625 | 623 | 583 |
| Top 10 as % of Total Transfers | 23.5% | 24.1% | 28.9% | 28.6% | 31.5% | 33.3% |
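The concentration figure in the bottom row is just the top-ten count divided by total transfers for each year. A quick sketch, using the values from the table above:

```python
# Share of all transfers captured by the ten schools with the most transfers in.
total = {2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187, 2015: 1979, 2016: 1749}
top10 = {2011: 570, 2012: 587, 2013: 724, 2014: 625, 2015: 623, 2016: 583}

for year in total:
    share = top10[year] / total[year]
    print(f"{year}: {share:.1%}")
# → 23.5%, 24.1%, 28.9%, 28.6%, 31.5%, 33.3% (matching the table)
```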
NATIONAL AND REGIONAL MARKETS
Starting in 2014, the ABA Section of Legal Education and Admissions to the Bar began requiring law schools with more than twelve transfers in to report not only the number of students who transferred in, but also the law schools from which they came (with the number from each) along with the 75th, 50th and 25th percentile first-year law school GPAs of the students who transferred in. This allows us to look at where transfer students are coming from and going to, and at the first-year GPA profile of students transferring in to different law schools. The following chart focuses on the ten law schools that have been among the top 15 in transfers in for each of the last three years, presented in descending USNews rank. It indicates the extent to which these law schools were attracting transfers from their geographic region and also identifies the law school(s) that provided the largest number of transfers to each listed law school in 2016, as well as the percentage of transfers that came from that school.
Percentage of Transfers from Within Geographic Region 2014-2016 and Top Feeder School(s) for 2016 at the Ten Law Schools Among the Top-15 for Transfers in 2014, 2015 and 2016
| School | # of Transfers 2014/2015/2016 | Region | Regional # of Transfers 14/15/16 | Regional % of Transfers 14/15/16 | Top Feeder School (2016) | #/% of Transfers from Top Feeder (2016) |
|---|---|---|---|---|---|---|
| Harvard | 33/55/35 | NE | 6/15/13 | 18/27/37 | GWU | 3/9% |
| Columbia | 46/44/50 | NE | 19/19/24 | 41/43/48 | Fordham | 6/13% |
| NYU | 50/51/43 | NE | 20/15/16 | 40/29/37 | Fordham/GWU | 6/14% |
| Berkeley | 55/49/33 | CA | 43/29/22 | 78/59/67 | Santa Clara | 5/15% |
| Georgetown | 113/110/111 | Mid-Atl | 49/43/36 | 43/39/32 | John Marshall | 10/9% |
| UCLA | 44/43/43 | CA | 31/26/25 | 70/60/58 | Pepperdine/GWU | 7/16% |
| Emory | 53/51/49 | SE | 40/31/25 | 75/61/51 | Atlanta’s John Marshall | 11/22% |
| GWU | 97/109/106 | Mid-Atl | 78/70/77 | 80/64/73 | American | 51/48% |
| Arizona St. | 66/65/66 | SW | 51/48/57 | 77/74/86 | Arizona Summit | 48/73% |
| American | 44/33/30 | Mid-Atl | 14/6/10 | 32/18/33 | Univ. Dist. Col. | 6/20% |
Of these top ten law schools for transfer students in 2016, five (Berkeley, UCLA, Emory, George Washington and Arizona State) obtained most of their transfers (51% or more) from within their own geographic region in each of the last three years. The other five (Harvard, Columbia, NYU, Georgetown and American) drew fewer than 49% of their transfers from within their region in each of the last three years.
Moreover, two of the ten law schools had a significant percentage of their transfers from one particular feeder school. For George Washington, roughly 48% of its transfers came from American University, while for Arizona State, 73% of its transfers came from Arizona Summit.
The chart below shows the tiers of law schools from which these 10 law schools in the transfer market received their transfer students. Five of the ten law schools that consistently have high numbers of transfers are ranked in the top 15 in USNews, while nine of the ten are ranked in the top 25. Only five had 75% or more of their transfers from schools ranked between 1 and 99 in the USNews rankings – Harvard, Columbia, NYU, UCLA and George Washington. Two additional schools, Berkeley and Georgetown, had at least 50% of their transfers from law schools ranked between 1 and 99. The remaining two law schools ranked in the top 25 in USNews (Emory and Arizona State), along with American, had at least half of their transfer students from law schools ranked 100 or lower, with two of those law schools (Arizona State and American) having 85% or more of their transfers from law schools ranked 100 or lower.
Percentage of Transfers from Different Tiers of School(s) for 2014, 2015 and 2016 at the Ten Law Schools Among the Top-15 for Transfers in 2014, 2015 and 2016
| School | # of Transfers 14/15/16 | Top 50 # (14/15/16) | Top 50 % (14/15/16) | 51-99 # (14/15/16) | 51-99 % (14/15/16) | 100-200 # (14/15/16) | 100-200 % (14/15/16) |
|---|---|---|---|---|---|---|---|
| Harvard | 33/55/35 | 23/41/28 | 70/75/80 | 10/13/7 | 30/24/20 | 0/1/0 | 0/2/0 |
| Columbia | 46/44/50 | 29/30/33 | 63/68/67 | 14/10/16 | 30/23/33 | 3/4/1 | 7/9/2 |
| NYU | 50/51/43 | 41/40/35 | 82/78/81 | 7/10/8 | 14/20 | 2/1/0 | 4/2/0 |
| Berkeley | 55/49/33 | 17/15/11 | 31/31/33 | 27/26/8 | 49/53/24 | 11/8/14 | 20/16/42 |
| Georgetown | 113/110/111 | 27/30/32 | 24/27/29 | 38/30/41 | 34/27/37 | 48/50/38 | 42/45/34 |
| UCLA | 44/43/43 | 15/15/18 | 34/35/41 | 23/23/21 | 52/53/49 | 6/5/4 | 14/12/10 |
| Emory | 53/51/49 | 3/5/3 | 6/10/6 | 7/8/17 | 13/16/35 | 43/38/29 | 81/75/59 |
| GWU | 97/109/106 | 13/21/15 | 13/19/14 | 73/63/68 | 75/58/64 | 11/25/23 | 11/23/22 |
| Arizona St. | 66/65/66 | 4/0/3 | 6/0/5 | 5/6/7 | 8/9/11 | 57/59/56 | 86/91/85 |
| American | 44/33/30 | 2/0/0 | 5/0/0 | 14/1/2 | 32/3/7 | 28/32/28 | 64/97/93 |
If one focuses just on the reported GPAs from these ten schools, one quickly sees that the six law schools ranked in the USNews top 20 have a 50th percentile GPA for transfers in 2016 of 3.6 or above (except for UCLA at 3.56), and a 25th percentile GPA of 3.52 or above (except for NYU at 3.41). Once you drop out of the top 20, however, the other four law schools have a 75th percentile GPA that falls below 3.5, a 50th percentile GPA below 3.3, and, for three of the four, a 25th percentile GPA below 3.05. Harvard clearly is accepting transfers who could have been admitted to Harvard in the first instance. While the case is less compelling, Columbia, NYU, Berkeley, Georgetown and UCLA likely are accepting transfers whose entering credentials largely would have made them possible candidates for acceptance at those law schools. By contrast, Emory, George Washington, Arizona State and American are welcoming transfer students whose entering credentials likely are sufficiently distinct from those law schools’ entering-class credentials that the transfers they are admitting would not have been admitted as first-year students the prior year.
First-Year Law School 75th/50th/25th GPA of Transfers at the Ten Law Schools Among the Top-15 for Transfers in 2014, 2015 and 2016
| School | 75th GPA (14/15/16) | 50th GPA (14/15/16) | 25th GPA (14/15/16) |
|---|---|---|---|
| Harvard | 3.95/3.98/4.0 | 3.9/3.92/3.94 | 3.83/3.85/3.88 |
| Columbia | 3.81/3.82/3.84 | 3.75/3.76/3.71 | 3.69/3.66/3.6 |
| NYU | 3.74/3.76/3.72 | 3.62/3.68/3 | 3.47/3.52/3.41 |
| Berkeley | 3.9/3.87/3.92 | 3.75/3.81/3.8 | 3.68/3.69/3.75 |
| Georgetown | 3.77/3.77/3.76 | 3.67/3.66/3.63 | 3.55/3.59/3.54 |
| UCLA | 3.73/3.7/3.67 | 3.58/3.58/3.56 | 3.44/3.46/3.52 |
| Emory | 3.42/3.45/3.41 | 3.27/3.3/3.16 | 2.93/3.06/3.02 |
| GWU | 3.53/3.46/3.45 | 3.35/3.32/3.26 | 3.21/3.15/3.14 |
| Arizona St. | 3.51/3.5/3.4 | 3.23/3.17/3.09 | 2.97/2.95/2.96 |
| American | 3.25/3.04/3.17 | 2.94/2.89/2.99 | 2.78/2.74/2.81 |
STILL MANY UNKNOWNS
As I noted in each of the last two years, the more detailed transfer data that law schools are now required to publish should be very helpful to prospective law students and pre-law advisors, and to current law students who are considering transferring. The more detailed data give them a better idea of what transfer opportunities might be available depending upon where they go to law school (or are presently enrolled as a first-year student).
Even with this more granular data now available, however, a significant number of unknowns remain: the gender and ethnicity of transfer students, and how transfer students perform at their new law schools, both academically and in terms of bar passage and employment.
March 18, 2017 in Data on legal education, Scholarship on legal education | Permalink | Comments (1)
Monday, May 2, 2016
Changes in Reporting and Classifying of Law-School-Funded Positions Result in Decline in Number of Graduates in Full-Time, Long-Term Law-School-Funded Bar-Passage-Required Positions
This blog posting summarizes how recent changes in the definition and reporting of law-school-funded positions have impacted the number of law-school-funded positions classified as full-time, long-term or full-time, short-term bar-passage-required positions for graduates in the Class of 2015. Comparisons between results for the Class of 2014 and the Class of 2015 show a significant decline in the number of full-time, long-term bar-passage-required positions that are law-school-funded (from 831 to 398) and a significant increase in the number of full-time, short-term bar-passage-required positions that are law-school-funded (from 184 to 277). Overall, the number of law-school-funded bar-passage-required positions declined by roughly one-third, from 1,015 to 675, as a result of these changes.
Changes in Reporting Framework and Definition
In March 2015, the Council for the Section of Legal Education and Admissions to the Bar approved a change in the reporting of law-school-funded positions to take effect this spring with reporting of employment outcomes for the Class of 2015. Previously, law schools included law-school-funded positions within all other employment categories “above the line” and then delineated “below the line” the number of law-school-funded positions in each category. Under this approach, between the Class of 2012 and the Class of 2014, the number of full-time, long-term bar-passage-required positions that were law-school-funded increased from 517 to 831, an increase of more than 60%.
With the change, however, the Council added “Employed – Law School Funded” as a separate “above the line” category such that law-school-funded positions no longer are included in other categories (e.g., Employed – Bar Passage Required or Employed – JD Advantage), although law schools are still required to provide more detailed information about the different categories of law-school-funded jobs “below the line” on the employment summary report.
In July 2015, the Council also approved a change in the definition of when a law-school-funded position may be classified as a “long-term” position, requiring that it be a position the employer expects to last at least one year with a salary of at least $40,000 per year.
Long Term. (OLD DEFINITION) A long-term position is one that does not have a definite or indefinite term of less than one year. It may have a definite length of time as long as the time is one year or longer. It may also have an indefinite length of time as long as it is expected to last one year or more.
Long-term. (NEW DEFINITION) A long-term position is one that the employer expects to last one year or more. A law school/university funded position that the law school expects to last one year or more may be considered long-term for purposes of this definition only if the graduate is paid at least $40,000 per year. . . .
This change also took effect with the reporting of employment outcomes this spring for the Class of 2015.
An example might help explain how these changes might impact classification of a given position. Assume you have a graduate of Law School A in 2014 who took a one-year position as a lawyer with a public interest law firm as part of a “bridge-to-practice” program, working on a full-time basis and receiving a stipend of $24,000 paid partly by the law school. Law School A agreed to subsidize a portion of the stipend for a year but the law school continued to support the graduate’s ongoing effort to seek other gainful employment during the year.
In the Class of 2014 reporting format, this graduate could have been classified and probably would have been classified “above the line” in the full-time, long-term Employed – Bar Passage Required category because the job had a defined duration of one year even though the student might not be planning on staying in the position for the entire year. (This graduate also would have been listed separately “below the line” in the law-school-funded category as having a full-time, long-term bar-passage-required position).
Following the March 2015 changes, a Class of 2015 graduate in the same job, working as a lawyer with a public interest law firm on a full-time basis and receiving a stipend of $24,000 paid partly by the law school, would have been classified “above the line” in the full-time, long-term Employed – Law School Funded category and not in full-time, long-term Employed -- Bar Passage Required. (As was the case with the Class of 2014 graduates, this graduate also likely would have been listed separately “below the line” in the law-school-funded category as having a full-time, long-term bar-passage-required position).
Following the July 2015 changes, a Class of 2015 graduate in the same job, working as a lawyer with a public interest law firm on a full-time basis and receiving a stipend of $24,000 paid partly by the law school, would be classified “above the line” in the full-time, short-term Employed – Law School Funded category: under the new definition of “long-term,” the lack of an employer expectation that the job would last for one year or more, the lack of a salary of at least $40,000, or both, would mean the job does not qualify as “long-term” and therefore is classified as “short-term.” (This graduate also would be listed separately “below the line” in the law-school-funded category as having a full-time, short-term bar-passage-required position.)
Consequences of the Change in Reporting Framework and Definition
With the ABA’s release of its Employment Summary report covering employment for graduates of the Class of 2015 ten months after graduation, we can compare law-school-funded positions for the Class of 2015 with those for the Class of 2014. The following table includes results from all law schools listed in the ABA’s Employment Summary spreadsheets for the Class of 2014 and the Class of 2015.
Law School Funded Bar Passage Required, Full-Time, Long-Term and Full-Time, Short-Term Positions for the Class of 2014 and Class of 2015
| Year | FTLT BPR LSF | FTST BPR LSF | Total BPR LSF |
|---|---|---|---|
| Class of 2014 | 831 | 184 | 1,015 |
| Class of 2015 | 398 | 277 | 675 |
Full-time, long-term bar-passage-required positions that were law-school-funded declined by more than 50%, from 831 to 398. Meanwhile, full-time, short-term bar-passage-required positions that were law-school-funded increased by roughly 50%, from 184 to 277. Overall, however, law-school-funded positions in one of these two categories declined by 340, or roughly 33%, from 1,015 to 675.
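The percentage changes cited above follow directly from the table; a minimal sketch of the arithmetic (a simple relative-change helper, not anything drawn from the ABA reports themselves):

```python
# Relative change between the Class of 2014 and Class of 2015 figures
# reported in the table above.
def pct_change(old, new):
    """Return relative change from old to new as a fraction."""
    return (new - old) / old

print(f"FTLT LSF positions: {pct_change(831, 398):.1%}")    # -52.1% (a drop of more than 50%)
print(f"FTST LSF positions: {pct_change(184, 277):.1%}")    # 50.5% (an increase of roughly 50%)
print(f"Total LSF positions: {pct_change(1015, 675):.1%}")  # -33.5% (roughly one-third)
```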
Although it is not easy to know for sure, the most plausible explanation for these changes is that some of the jobs previously classified as full-time, long-term bar-passage-required positions had a stipend or salary lower than $40,000 per year and that law schools offering such positions could not increase the salary sufficiently to continue to have such positions classified as full-time, long-term bar-passage-required positions under the new regime. Alternatively, or additionally, some positions may not have been classified as full-time, long-term bar-passage-required positions if the employers with graduates with law-school-funded positions did not expect that the position would last for at least one year. These possibilities would explain the shift of some positions from full-time, long-term to full-time, short-term, but they would not necessarily explain the complete loss of so many law-school-funded bar-passage-required positions.
The loss of roughly one-third of the law-school-funded bar-passage-required positions might be explained partly by the decline in the number of graduates passing the July 2015 bar exam compared with July 2014.
Additionally, a portion of the loss of roughly one-third of the law-school-funded bar-passage-required positions also might be explained by the reality that there was more perceived “value” in a law school being able to claim a law-school-funded position as a full-time, long-term bar-passage-required position than as a full-time, short-term one. With the change in reporting framework and definition, some law schools may have concluded that further investment in law-school-funded positions was not justifiable, particularly given how USNews accounts for these positions in its rankings (a point highlighted by Derek Muller in his post about these changes in law-school-funded positions).
Different Responses across Different Law Schools
- The Top-25 Law Schools for Full-Time, Long-Term Law-School-Funded Bar-Passage-Required Positions for the Class of 2014
The decline in law-school-funded bar-passage-required positions was concentrated at a relatively small number of law schools. The top-25 law schools for full-time, long-term bar-passage-required positions that were law-school-funded for the Class of 2014 (those with 10 or more such positions) are responsible for the vast majority of the decline in such positions for the Class of 2015. Across these 25 law schools, the number of graduates in full-time, long-term bar-passage-required positions that were law-school-funded fell from 676 to 295, a drop of 381 out of the total decline of 440, or nearly 87% of the total decline in such positions. Across these same 25 law schools, the number of graduates in full-time, short-term bar-passage-required positions that were law-school-funded increased from 11 to 213, far exceeding the overall increase in such positions (which was counterbalanced by several schools greatly reducing their number of full-time, short-term bar-passage-required positions that were law-school-funded).
- 14 Law Schools in the Top-25 for Law-School-Funded Positions that Saw Significant Changes in Law-School-Funded Bar-Passage-Required Positions Between the Class of 2014 and the Class of 2015
A subset of 14 law schools within this group saw the most significant changes between the Class of 2014 and the Class of 2015, accounting for 359 of the 440-position decline in full-time, long-term bar-passage-required positions that were law-school-funded and for an increase from 8 to 202 in full-time, short-term bar-passage-required positions that were law-school-funded. These 14 law schools are set forth in the following table in descending order of full-time, long-term bar-passage-required law-school-funded positions in the Class of 2014.
| School | 2014 LSF FTLT BPR | 2015 LSF FTLT BPR | 2014 LSF FTST BPR | 2015 LSF FTST BPR | 2014 Graduates | 2015 Graduates |
|---|---|---|---|---|---|---|
| George Washington | 78 | 6 | 0 | 19 | 584 | 465 |
| Georgetown | 64 | 35 | 0 | 53 | 626 | 678 |
| Emory | 52 | 0 | 0 | 20 | 268 | 308 |
| American | 44 | 4 | 0 | 40 | 460 | 464 |
| Michigan | 33 | 2 | 5 | 21 | 390 | 354 |
| Southern California | 31 | 7 | 0 | 20 | 217 | 213 |
| Texas | 23 | 11 | 1 | 1 | 351 | 354 |
| Vanderbilt | 22 | 0 | 0 | 12 | 194 | 185 |
| Notre Dame | 22 | 4 | 1 | 0 | 179 | 179 |
| California-Berkeley | 20 | 11 | 1 | 2 | 287 | 278 |
| William and Mary | 19 | 0 | 0 | 3 | 215 | 178 |
| California-Davis | 19 | 9 | 0 | 0 | 169 | 185 |
| Washington Univ. | 14 | 2 | 0 | 10 | 258 | 228 |
| Cornell | 11 | 2 | 0 | 1 | 191 | 183 |
| TOTAL | 452 | 93 | 8 | 202 | 4,389 | 4,252 |
Notably, across these 14 law schools, the total number of law-school-funded, bar-passage-required positions declined from 460 (of which only eight were short-term) for the Class of 2014 to 295 (of which 202 were short-term) for the Class of 2015. At these 14 law schools, then, not only did the number of full-time, law-school-funded, bar-passage-required positions decline by 165, over one-third, but the mix of those positions also shifted dramatically: the share that were full-time and long-term (rather than short-term) fell from over 98% to less than 33%.
- 11 Law Schools in the Top-25 for Law-School-Funded Positions that Did Not See Significant Changes in Law-School-Funded Bar-Passage-Required Positions Between the Class of 2014 and Class of 2015
The other 11 law schools among the top-25 for law-school-funded, bar-passage-required positions in the Class of 2014 did not see a significant decline in such positions for the Class of 2015. These 11 law schools are set forth in the following table in descending order of law-school-funded, full-time, long-term, bar-passage-required positions for the Class of 2014.
| School | 2014 LSF FTLT BPR | 2015 LSF FTLT BPR | 2014 LSF FTST BPR | 2015 LSF FTST BPR | 2014 Graduates | 2015 Graduates |
| --- | --- | --- | --- | --- | --- | --- |
| New York Univ. | 36 | 30 | 0 | 2 | 479 | 485 |
| Virginia | 33 | 30 | 0 | 0 | 349 | 367 |
| UCLA | 31 | 31 | 2 | 0 | 336 | 335 |
| Columbia | 31 | 28 | 0 | 0 | 468 | 413 |
| Harvard | 24 | 20 | 1 | 0 | 586 | 589 |
| Illinois | 15 | 10 | 0 | 2 | 185 | 181 |
| Boston University | 12 | 12 | 0 | 0 | 246 | 208 |
| Brigham Young | 11 | 9 | 0 | 0 | 138 | 133 |
| Chicago | 11 | 6 | 0 | 5 | 210 | 196 |
| California-Irvine | 10 | 20 | 0 | 0 | 93 | 110 |
| Stanford | 10 | 6 | 0 | 2 | 187 | 195 |
| TOTAL | 224 | 202 | 3 | 11 | 3277 | 3212 |
These law schools either already paid salaries of at least $40,000 for most of their law-school-funded, bar-passage-required positions for the Class of 2014, or decided to ensure that the vast majority of such positions for the Class of 2015 paid at least $40,000. Across these 11 law schools, the number of full-time, long-term, law-school-funded, bar-passage-required positions declined by only 22, while the number of full-time, short-term positions increased by only eight. The share of these positions that were full-time and long-term changed very little, from over 98% to nearly 95%.
- The Remaining Law Schools
Across the remaining law schools, for the Class of 2014, there were only 57 law schools reporting a total of 155 law-school-funded, full-time, long-term, bar-passage-required positions. For the Class of 2015, there were only 46 law schools reporting a total of 103 such positions. Across this set of schools, therefore, there was a decline of 52 positions, or roughly one-third, in the number of full-time, long-term, bar-passage-required positions.
Across all the remaining law schools, for the Class of 2014, there were only 24 law schools with a total of 173 full-time, short-term bar-passage-required law-school-funded positions. For the Class of 2015, there were only 19 law schools with a total of 64 full-time, short-term bar-passage-required, law-school-funded positions. Thus, full-time, short-term bar-passage-required positions that were law-school-funded declined across these law schools by over 100.
In total, then, these other law schools saw law-school-funded bar-passage-required positions decline from a total of 328 for the Class of 2014 to only 167 for the Class of 2015, a decline of nearly 50%.
Total Changes in Law-School-Funded Bar-Passage-Required Positions
Between the Class of 2014 and the Class of 2015

| | 2014 LSF BPR FTLT | 2015 LSF BPR FTLT | 2014 LSF BPR FTST | 2015 LSF BPR FTST |
| --- | --- | --- | --- | --- |
| Top 25 (10 or more LSF BPR FTLT in 2014) | 676 | 295 | 11 | 213 |
| 11 schools without significant changes | 224 | 202 | 3 | 11 |
| 14 schools with significant changes | 452 | 93 | 8 | 202 |
| Other Schools with LSF | 155 (57 schools) | 103 (46 schools) | 173 (24 schools) | 64 (19 schools) |
| Total | 831 | 398 | 184 | 277 |
(I am very grateful to Janelle Chambers for her research assistance in compiling this data and am very grateful to Scott Norberg and Bernie Burk for helpful comments on earlier drafts of this blog posting.)
May 2, 2016 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)
Sunday, May 1, 2016
Mixed Signals from the Legal Employment Market – Preliminary Results for the Class of 2015
THIS BLOG UPDATES THE EARLIER BLOG POSTING TO INCORPORATE DATA FROM THE ABA's EMPLOYMENT SUMMARY SPREADSHEETS FOR THE CLASS OF 2014 and CLASS OF 2015 AS OF MAY 3, 2016, WITH DOUBLE-COUNTED DATA FOR MITCHELL|HAMLINE IN THE CLASS OF 2015 REMOVED AND WITH ALL LAW-SCHOOL-FUNDED POSITIONS FOR BOTH YEARS REMOVED FROM THE CALCULATIONS. THE 2015 NUMBERS NOW MATCH THOSE ON THE ABA's 2015 LAW GRADUATE EMPLOYMENT DATA SHEET RELEASED ON MAY 3 WHILE THE 2014 NUMBERS NOW MATCH THOSE FOR 2014 ON THE ABA's 2015 LAW GRADUATE EMPLOYMENT DATA SHEET ONCE LAW-SCHOOL-FUNDED POSITIONS ARE REMOVED.
The Class of 2015 employment summary reports have been posted by all ABA-accredited law schools, and results have begun to be reported for some states and regions. The ABA Section of Legal Education and Admissions to the Bar released the complete Employment Summary spreadsheet for all law schools on its website yesterday (May 2) and updated it today (May 3), and likely will update it again tomorrow (to eliminate the double-counting for Hamline, William Mitchell, and Mitchell|Hamline).
In this initial post I provide a brief summary of the Class of 2015’s employment outcomes compared with the Class of 2014’s, based on data from these spreadsheets as described above.
In a subsequent post (posted on May 2) I provide a summary of changes in the reported number of law-school-funded, bar-passage-required positions between the Class of 2014 and the Class of 2015 as a result of changes in the classification and reporting of such positions.
Changes in the Percentage of Graduates and Number of Graduates in Full-Time, Long-Term Bar-Passage-Required and JD Advantage Jobs
Across all law schools for which the ABA has released employment summary data for the Class of 2015, the percentage of graduates in full-time, long-term, bar-passage-required positions and full-time, long-term JD advantage positions increased from 69% for the Class of 2014 to 70.1% for the Class of 2015. This would appear to be modestly good news. Disaggregating the two categories, full-time, long-term, bar-passage-required positions went from 58% to 59.2%, while full-time, long-term JD advantage positions went from 11% to 10.9%.
Because there was a significant decline in the number of graduates across these law schools between 2014 and 2015, however, this modest increase in the percentage of graduates in these positions masks an actual decline in the number of graduates in such positions. There were 39,984 graduates in the Class of 2015 compared with 43,832 graduates in the Class of 2014, a decline of 3,848 graduates, or 8.8%. There were 28,029 graduates in the Class of 2015 with full-time, long-term bar-passage-required or JD advantage positions, compared with 30,234 graduates in the Class of 2014 with such positions, a decline of 2,205, or 7.3%.
When these totals are disaggregated, full-time, long-term, bar-passage-required positions declined from 25,417 for the Class of 2014 to 23,687 for the Class of 2015, a decline of 1,730, or 6.8%. Full-time, long-term JD advantage positions went from 4,817 to 4,342, a decline of 475, or 9.9%.
(Please note that numbers for both 2014 and 2015 exclude law-school-funded positions from both categories. The ABA's 2015 Law Graduate Employment Data sheet compares the Class of 2014 INCLUDING law-school-funded positions with the Class of 2015 EXCLUDING law-school-funded positions, which leads to slightly different results, showing a more exaggerated decline in the number of graduates in full-time, long-term, bar-passage-required and JD advantage jobs as well as a decline in the percentage of graduates in such positions.)
Comparison of Full-Time, Long-Term Bar-Passage-Required Positions and JD Advantage Positions for the Class of 2014 and Class of 2015
| | Graduates | # FTLT BPR+JDA | % FTLT BPR+JDA | # FTLT BPR | % FTLT BPR | # FTLT JDA | % FTLT JDA |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Class of 2014 | 43,832 | 30,234 | 69% | 25,417 | 58% | 4,817 | 11% |
| Class of 2015 | 39,984 | 28,029 | 70.1% | 23,687 | 59.2% | 4,342 | 10.9% |
| Change | (3,848) | (2,205) | | (1,730) | | (475) | |
Changes in the Number and Percentage of Graduates Whose Employment Status is Unknown or Who Were Classified as Unemployed Seeking or Unemployed Not Seeking
Looking at the other end of the employment outcomes continuum, however, both the number and percentage of graduates who had unknown employment outcomes, or who were classified as unemployed seeking or unemployed not seeking, declined slightly between the Class of 2014 and the Class of 2015. For the Class of 2014, there were 5,778 graduates whose employment status was unknown or who were classified as unemployed seeking or unemployed not seeking, representing 13.2% of the 43,832 graduates. For the Class of 2015, there were only 5,200 such graduates, representing 13% of the 39,984 graduates.
Searching for Explanations
In the coming weeks and months, there likely will be a number of commentators offering suggestions for why the Class of 2015 might have seen a decline in the number of graduates obtaining full-time, long-term bar-passage-required or JD advantage positions.
Part of the decline likely is attributable to the decline in the number and percentage of graduates passing the July bar exam, as reported by the NCBE in its annual statistics publications for each of the last three years.
| Year | First-Time Bar Takers in July from ABA-Accredited Law Schools* | First-Time Bar Passers in July from ABA-Accredited Law Schools | July Pass Rate Among First-Time Takers from ABA-Accredited Law Schools |
| --- | --- | --- | --- |
| 2013 | 47,465 | 38,909 | 82% |
| 2014 | 44,282 | 34,333 | 78% |
| 2015 | 39,955 | 29,772 | 75% |
*Note that the NCBE’s classification of first-time takers is over-inclusive: it reflects not just May graduates taking the bar exam for the first time in July, but also graduates from a prior year who may be taking the bar exam for the first time in a given jurisdiction even if they previously took it in another jurisdiction. Thus, first-time bar passers include some people who are not part of that year’s graduating cohort.
In the two-year period between 2013 and 2015, then, the number of first-time takers from ABA-accredited law schools who passed the July bar exam and became eligible for jobs requiring bar passage declined by roughly 9,100, or nearly 23.5%. Moreover, the percentage of all first-time bar takers sitting for the February exam rather than the July exam increased slightly between 2013 and 2015, from 18.7% to 19.7%, which might mean slightly more May 2015 graduates were not positioned to accept a full-time, long-term, bar-passage-required or JD advantage position as of March 15, 2016, because they may have been studying for and taking the February 2016 bar exam.
Part of the decline also likely is attributable to market conditions in some parts of the country. For example, a recent story about graduates of Texas law schools noted that the decline in oil prices and tort reform may have impacted hiring in the Texas legal market for graduates of the Class of 2015. Once the full set of employment outcomes is available, it will be easier to assess the extent to which certain states or certain regions might have seen better or worse results than other states or regions.
Part of the decline also may be a manifestation of the impact of technology on the legal services market, with the possibility that the legal services market will have slightly fewer entry level positions over the near term.
One Possible Counterpoint
If this decline in the number of full-time, long-term, bar-passage-required positions is a manifestation of a weakening job market for law graduates, then one would expect salary data to show weakness as well. Once NALP publishes its report on employment results for the Class of 2015 later this summer, we will be able to assess whether salary trends are consistent with a weakening legal services market or suggest that the market remains somewhat competitive. If the decline in graduates taking full-time, long-term, bar-passage-required or JD advantage jobs is counterbalanced by a continuation of recent years’ modest year-over-year increases in mean and median salaries for law graduates, it might suggest that there is less market weakness than this initial employment summary indicates.
Concluding Thoughts
For those thinking that the recent news about the improving situation with respect to applicants to law school is the beginning of an upward trend that will gradually return first-year class sizes to the 45,000 to 46,000 range, this employment outcomes data provides a cautionary tale. The employment market for law school graduates appears to have stagnated, and even declined to some extent, over the last two years. Risk-averse potential applicants who weigh post-graduate employment opportunities when deciding whether to invest in a legal education may therefore remain skittish about applying, such that this year’s good news on the applicant front may be somewhat short-lived.
(I am very grateful for the research assistance of Janelle Chambers in gathering data for this blog posting prior to the release of the ABA Employment Summary spreadsheet and for very helpful comments on earlier drafts of this blog posting from Scott Norberg and Bernie Burk and for the helpful insights of Debby Merritt as we worked on reconciling data in the ABA spreadsheets.)
May 1, 2016 in Current events, Data on legal education, Scholarship on legal education | Permalink | Comments (1)
Sunday, April 24, 2016
Projections for Law School Enrollment for Fall 2016
In this blog posting I do two things. First, I provide a detailed analysis estimating the likely total applicant pool at the end of the current cycle based on trends from March through the end of the cycle in 2013, 2014, and 2015. Second, given the increased strength of the applicant pool, I suggest that law schools in the top 60 or 70 of the USNEWS rankings will see more enrollment growth and profile stability than law schools further down the rankings continuum.
ESTIMATES OF THE TOTAL NUMBER OF APPLICANTS
Reviewing the 2013, 2014, and 2015 Cycles to Inform the 2016 Cycle
The table set forth below shows the number of applicants in the admissions cycle as of early March in 2013, 2014, 2015 and 2016 along with the projected total applicant pool (based on percentage of applicants at that point in the cycle in the previous year) and the actual total applicant pool at the end of each cycle (with an estimate of the 2016 total applicant pool).
| Cycle | Current Volume Summary Date | Applicants | % of Cycle in Previous Year on This Date | Projected Total Applicant Pool (based on % of cycle) | End-of-Cycle Applicant Pool |
| --- | --- | --- | --- | --- | --- |
| 2013 | Mar. 8, 2013 | 46,587 | 84% | 55,460 | 59,400 (actual) |
| 2014 | Mar. 7, 2014 | 42,068 | 79% | 53,250 | 55,700 (actual) |
| 2015 | Mar. 6, 2015 | 39,646 | 76% | 52,160 | 54,500 (actual) |
| 2016 | Mar. 4, 2016 | 42,981 | 76% | 56,553 | 57,500 (estimate) |
In each of the last three years, a modest surge in late applicants meant the final applicant count exceeded the March projection by more than 2,000, although the margin has shrunk each year (from roughly 4,000 in 2013 to roughly 2,300 in 2015). This “late surge” suggests that the projection based on the applicant pool as of March 4, 2016 (just over 56,500) likely understates the end-of-cycle total applicant pool. To be somewhat conservative, I am estimating that the final applicant pool in 2016 will exceed the early-March projection by roughly 1,000, the smallest such increase in the last four years, yielding an estimated total applicant pool of 57,500 (up about 5.5% from 2015). This would be the first increase in applicants since 2010.
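The projection arithmetic described above can be sketched in a few lines. This is a minimal illustration of the method, not official LSAC methodology; the function name is my own, and the figures are those reported in the table.

```python
# Project the end-of-cycle applicant pool from a mid-cycle snapshot, assuming
# applications arrive on roughly the same schedule as in the prior year.
def project_pool(applicants_to_date: int, pct_of_prior_cycle: float) -> int:
    """Divide applicants to date by the share of the prior cycle complete at this date."""
    return round(applicants_to_date / pct_of_prior_cycle)

# 2016 cycle: 42,981 applicants as of March 4, when 76% of the 2015 cycle had
# been reached. Yields roughly the 56,553 projection shown in the table (the
# small gap reflects the unrounded percentage used in the original calculation).
print(project_pool(42_981, 0.76))
```

The same call reproduces the earlier cycles, e.g. `project_pool(46_587, 0.84)` gives roughly the 55,460 projection for March 2013.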
ESTIMATES FOR ADMITTED APPLICANTS AND MATRICULANTS
The chart below shows the number of applicants, admitted applicants, and matriculants over the last four years, along with an estimate for fall 2016 based on the assumption above that we will have a total of 57,500 applicants this cycle. With 3,000 more applicants than in 2014-15, I am assuming 2,400 more admitted applicants (roughly 80% of the additional applicants), and then assuming the percentage of admitted applicants who matriculate will be close to the four-year average of 87.6%. This would yield a first-year entering class of 39,150, up about 5.6% from 2015. (Using this process last April, I estimated a first-year enrollment of 36,975, just 83 fewer than the actual first-year enrollment of 37,058.)
Estimates of Admitted Students and Matriculants for 2016 Based on Trends in 2012-2015
| Year | Applicants | Admitted Students | Percent of Applicants Admitted | Matriculants | Percent of Admitted Matriculating |
| --- | --- | --- | --- | --- | --- |
| 2012 | 67,900 | 50,600 | 74.5% | 44,481 | 87.9% |
| 2013 | 59,400 | 45,700 | 76.9% | 39,675 | 86.8% |
| 2014 | 55,700 | 43,500 | 78.1% | 37,924 | 87.2% |
| 2015 | 54,500 | 42,300 | 77.6% | 37,058 | 87.6% |
| 2016 (est.) | 57,500 | 44,700 | 77.7% | 39,150 | 87.6% |
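The 2016 row of the table above can be reproduced from the stated assumptions. This is a sketch of that arithmetic only, assuming (as in the text) that 80% of the additional applicants are admitted and that the four-year average yield of 87.6% holds.

```python
# Sketch of the 2016 enrollment estimate from the assumptions stated above.
applicants_2015, admitted_2015 = 54_500, 42_300
applicants_2016 = 57_500  # estimated end-of-cycle applicant pool

extra_applicants = applicants_2016 - applicants_2015            # 3,000
admitted_2016 = admitted_2015 + round(0.80 * extra_applicants)  # 42,300 + 2,400 = 44,700
matriculants_2016 = round(admitted_2016 * 0.876)                # roughly 39,150 after rounding

print(admitted_2016, matriculants_2016)
```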
DIFFERENTIAL IMPACT ON ENROLLMENT AND PROFILES ACROSS DIFFERENT CATEGORIES OF LAW SCHOOLS
Earlier this year Ian Ayres noted that lower-ranked law schools have benefited from the rankings concerns of higher-ranked law schools. In the last few years, as higher-ranked law schools admitted fewer applicants in an effort to maintain their LSAT/GPA profiles, they left more applicants for lower-ranked law schools to admit. In this admissions cycle, the strength of the pool of applicants means things likely will swing the other way. Higher-ranked law schools likely will be admitting more students, leaving fewer students for lower-ranked law schools to admit.
INCREASES IN APPLICANTS WITH HIGH LSATs BODE WELL FOR HIGHER RANKED LAW SCHOOLS
For the first time in the last five years, we are seeing a year-over-year increase in the number of applicants with LSATs of 165 or higher. As of the April 15 Current Volume Summary, there were 7,054 applicants with LSATs of 165 or higher, compared with 6,519 on April 17, 2015. Another 130 applicants with LSATs of 165 or higher applied during the balance of the 2014-15 admissions cycle, for a total of 6,649. I am presently assuming another 146 such applicants in the balance of the 2015-16 admissions cycle, for a total of 7,200. On average over the past four years, 82.6% of these applicants have matriculated. I expect the rate to be slightly higher this year, because a number of top-60 or top-70 law schools dealing with revenue pressures from decreased enrollment in recent years are likely to take advantage of the stronger applicant pool to increase their first-year enrollment without seeing too much erosion in their entering class profiles. Thus, I think we will see roughly 6,000 matriculants this year with LSATs of 165 or higher, an increase of nearly 500 from fall 2015.
Five-Year Trend in Applicants and Matriculants with LSATs of 165+ and Estimates for 2016

| Year | Applicants with LSATs of 165+ | Matriculants with LSATs of 165+ | Percent of Applicants Matriculating |
| --- | --- | --- | --- |
| 2010 | 12,177 | 9,477 | 77.8% |
| 2011 | 11,190 | 8,952 | 80% |
| 2012 | 9,196 | 7,571 | 82.3% |
| 2013 | 7,496 | 6,154 | 82.1% |
| 2014 | 7,477 | 6,189 | 82.8% |
| 2015 | 6,649 | 5,505 | 82.8% |
| 2016 (est.) | 7,200 | 6,000 | 83.3% |
In addition, the number of applicants with LSATs of 160-164 also has increased in this cycle, from roughly 6,500 at this point in 2014-15 to over 6,800 in 2015-16. This likely means at least 300 more applicants with LSATs of 160-164 at the end of the cycle, which likely will generate roughly 240 additional matriculants (about 80% of the 300 additional applicants) in this range compared with the 2014-15 admissions cycle. Combining these categories, when this admissions cycle ends, there likely will be 740 more matriculants with LSATs of 160 or higher than in the 2014-15 cycle, rising from roughly 11,200 to nearly 12,000.
This increase in the quality of the applicant pool means law schools ranked in the top 60 or 70 or so (those with median LSATs near or above 160) collectively could welcome more than 1,200 additional matriculants without meaningfully impacting their profiles. (If the top-70 law schools garner 600 of the 740 additional applicants with LSATs of 160 or higher, they also could admit almost as many additional applicants with LSATs below their medians without impacting their profiles: for top-70 law schools focused on profile AND revenue, every additional matriculant with an LSAT above 160 who helps the law school maintain its median LSAT allows the school to add a matriculant with an LSAT below 160.) (Of course, not all law schools will have the financial strength to continue using scholarship resources to attract top applicants, so there likely will be some variability among top-70 schools in enrollment growth or decline and in profile retention or erosion.)
Continuing But Slowing Declines in Applicants with LSATs Between 150-159 Likely Will Present Challenges for Some Law Schools with Median LSATs Between 150-159
| Year | LSAT of 140-144 | LSAT of 145-149 | LSAT of 150-154 | LSAT of 155-159 |
| --- | --- | --- | --- | --- |
| 2013 | 6,114 | 9,439 | 11,430 | 10,920 |
| 2014 | 5,893 | 8,428 | 10,587 | 9,919 |
| 2015 | 6,214 | 8,665 | 10,518 | 9,681 |
| 2016 (est.) | 6,500 | 9,000 | 10,400 | 9,600 |
Based on the numbers of applicants with LSATs between 150-159 as of the April 15 Current Volume Summary, the pool of applicants in this range is likely to remain flat or continue to show a modest decline as reflected in the table above. If law schools in the top-60 or top-70 do take advantage of the increase in applicants with LSATs of 160 or higher to increase their enrollment, then fewer of these 20,000 applicants with LSATs between 150-159 will be available to law schools with median LSATs in those ranges. This will put pressure on law schools with median LSATs of 150-159 to admit fewer applicants or to dip deeper into the applicant pool to fill their classes. (Note that while the pool of applicants with LSATs between 150-159 is flat to slightly down, the pool of applicants with LSATs between 140-149 appears to be increasing again this year, for the second year in a row.) Once again, enrollment results and profile results are likely to vary somewhat widely across law schools depending upon their relative financial strength and their ability to continue to use scholarship assistance to compete for qualified applicants.
CONCLUSION
If the estimates regarding applicants and matriculants above are accurate, we will see roughly 2,100 more matriculants in the 2015-16 cycle. The increased strength of the applicant pool, and the anticipated admissions strategies of top-ranked schools dealing with revenue pressures from reduced enrollment in the last few years, likely mean that most of the increase in matriculants, perhaps 1,200 or more, will come at relatively highly ranked law schools, perhaps the top 60 or 70.
This anticipated increase in enrollment among top law schools likely will decrease the number of applicants in the 150-159 LSAT range available to lower-ranked law schools, particularly given that the number of applicants with LSATs of 150-159 already looks like it could be slightly smaller this year. This likely will leave law schools outside the top-60 or top-70 facing challenging decisions of shrinking enrollment further to hold profile (and dealing with further revenue declines) or accepting declines in profile in exchange for stable or larger enrollments (and the corresponding revenue).
With continued growth in applicants between 140-149 to go along with the projection of a slight decline in the number of applicants with LSATs of 150-159, many law schools ranked outside the top-60 or top-70 may find it difficult to maintain their LSAT profiles as the pool of applicants from which they can draw their matriculants will be weighted more to the lower end of the LSAT distribution.
QUESTIONS TO CONSIDER
First, what might explain the growth in the number of applicants with LSATs of 160 or more for the first time in the last several years? This group had been the “market leaders” in walking away from legal education in recent years. Is this a one-time bounce or is this group going to continue to return to legal education in larger numbers?
Second, why is the middle group -- those with LSATs of 150-159 -- not showing an uptick in applicants, when there is growth both among those with LSATs of 160 or higher and among those with LSATs of 140-149? Applicants with LSATs of 150-159 are more likely to be able to pass the bar exam upon completing law school than applicants with LSATs of 140-149. With bar passage rates falling significantly, particularly among graduates of law schools with lower LSAT profiles, one might have expected fewer people with LSATs of 140-149 to be applying to law school (as they are most at risk of failing the bar), but this cycle shows continued modest growth in that pool of applicants, while the group of applicants with LSATs of 150-159 is flat to down slightly.
Third, will this strengthening of the quality of the applicant pool portend an improvement in bar passage results in July 2019? It is too early to answer this question. Once actual enrollment profiles are available in December, it will be easier to analyze the possible impact on bar passage results.
(I am very grateful for thoughtful comments from Bernie Burk and Scott Norberg on an earlier draft of this blog posting.)
April 24, 2016 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)
Friday, March 11, 2016
Conditional Scholarships Reprise – Of Sticks and Carrots and Asking Questions
A few years ago the Council for the Section of Legal Education and Admissions to the Bar mandated greater transparency regarding conditional scholarships, requiring law schools that offer conditional scholarships to publicize on their webpages, and to applicants receiving conditional scholarship offers, the number of conditional scholarships awarded to students and the number that had been reduced or eliminated over each of the prior three academic years.
Applicants previously had not been aware of how many students were getting conditional scholarships and didn’t know how likely they were to keep the conditional scholarships given the law school’s grading curve. They were generally unduly optimistic about their likelihood of retaining a scholarship. The mandated disclosure was designed to ameliorate this information asymmetry and optimism bias.
I have written about conditional scholarships on several occasions over the last several years, initially noting the need for greater transparency and then analyzing the data on conditional scholarships once its publication was mandated. I posted the most recent summary in December 2015, covering the 2014-15 academic year and comparing it with the 2011-12 academic year. Notably, over the last few years, while more than two dozen law schools have shifted away from using conditional scholarships, the percentage of first-year students with conditional scholarships remained at roughly 27%, although slightly fewer first-year students saw their scholarships reduced or eliminated (7.8%, down from 9.4%).
With tuition deposits due in the next several weeks, prospective law students likely are comparing the varied opportunities they may have in terms of law schools and scholarship offers. I write at this time to highlight the need for applicants receiving conditional scholarship offers to ask questions of admissions officials regarding conditional scholarships at their law schools, both with respect to traditional conditional scholarships and with respect to a new type of conditional scholarship that apparently is being offered by at least one law school and perhaps others. Prospective students need to be proactive in combatting their own propensity for optimism bias, and pre-law advisors need to help them do so.
The Need to Ask Questions with Respect to Traditional Conditional Scholarships that Function as a Stick
Traditional conditional scholarships operate as a “stick.” If a student doesn’t maintain a defined GPA or class rank, the student’s scholarship is reduced or eliminated.
Law schools are required to publish, and to provide to conditional scholarship recipients, the number of conditional scholarship recipients and the number whose scholarships were reduced or eliminated in each of the prior three years. This is helpful generally, but it doesn’t necessarily help specific applicants all that much.
For example, assume a law school’s published data indicates that 80 students received conditional scholarships in each of the prior three years and that 20 students saw their scholarships reduced or eliminated each year. At first blush, this makes it look like the average conditional scholarship recipient has a 75% (60/80) chance of retaining her scholarship. But who is the average conditional scholarship recipient? Assuming all students had to meet the same “condition” – perhaps maintain a first-year GPA of 3.0 -- it is likely that conditional scholarship recipients in the top quarter of the LSAT/GPA distribution for entering students at the law school had perhaps a 90-95% likelihood of retaining their scholarship, while conditional scholarship recipients near the middle or below the middle of the LSAT/GPA distribution for entering students at the law school had perhaps a 50-60% likelihood of retaining their scholarship.
Recognizing this likely disparity, conditional scholarship recipients should be asking the admissions officials at the law schools from which they are receiving conditional scholarship offers what additional information the admissions officials can provide about the extent to which a student with a comparable profile and comparable condition was likely to see his conditional scholarship reduced or eliminated. Were those at the top end of the LSAT/GPA distribution more likely to retain their conditional scholarship? Were those further down the LSAT/GPA distribution less likely to retain their conditional scholarships? How did the nature of the condition impact the likelihood that a student with a given profile retained her scholarship?
Law schools should have this information available and should be willing to provide answers to these questions. Prospective students need answers to these questions to be best positioned to calculate the expected value of a conditional scholarship over three years so that the student can make meaningful cost-comparisons across law schools.
The Need to Ask Questions with Respect to New Conditional Scholarships that Function as a Stick and a Carrot
At least one law school, and possibly others, has what appears to be a new type of “conditional” scholarship, best described as both a “stick” and a “carrot.” In addition to reducing or eliminating a student’s conditional scholarship if the student fails to maintain a given GPA or class rank, the “carrot” approach offers students AN INCREASED SCHOLARSHIP if the student attains a given GPA or class rank.
For example, assume a given law school has the same published information as in the previous example – 80 students received conditional scholarships and 20 students had their scholarships reduced or eliminated.
An applicant receives a conditional scholarship for 50% tuition and is informed that the scholarship will be eliminated if she fails to maintain a cumulative GPA of 2.5 at the end of the first year. But she also is informed that the scholarship will increase to 75% if she obtains a GPA of 3.5 and to 100% if she obtains a GPA of 3.7.
This student needs to ask several questions of the admissions officials at the law school. First, she needs to ask whether, given her LSAT/GPA profile and her renewal threshold (2.5 GPA), she has the average likelihood of maintaining her scholarship (75%) or perhaps a higher or lower likelihood. (If the school offers 100% scholarships with a renewal condition of 3.5, 75% scholarships with a renewal condition of 3.0, and 50% scholarships with a renewal condition of 2.5, it may be that the students with 50% scholarships have a higher likelihood of retaining their scholarships than those with larger scholarships but correspondingly higher conditions.)
Second, however, the student also needs to ask how many students in the previous two or three years who came into school with an LSAT/GPA profile comparable to hers managed to get a 3.5 GPA or a 3.7 GPA. For a prospective student with an LSAT/GPA in the bottom half of the entering class LSAT/GPA distribution, it well may be that few, if any, comparable students managed to get a 3.5 GPA or a 3.7 GPA at the end of the first year.
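This is ultimately an expected-value calculation. The sketch below makes it concrete; the tuition figure and all of the outcome probabilities are assumptions that a prospective student would need the law school's data to replace:

```python
# Hypothetical expected three-year value of the "stick and carrot"
# scholarship in the example above (50% award; eliminated below a 2.5 GPA,
# raised to 75% at 3.5 and to 100% at 3.7). All inputs are assumptions.

tuition = 45_000          # assumed annual tuition

# First-year award is guaranteed: 50% of tuition.
year1_value = 0.50 * tuition

# Assumed probabilities of each outcome at the end of the first year:
p_lose = 0.10             # GPA below 2.5 -> scholarship eliminated
p_keep_50 = 0.80          # GPA 2.5-3.49 -> stays at 50%
p_raise_75 = 0.07         # GPA 3.5-3.69 -> rises to 75%
p_raise_100 = 0.03        # GPA 3.7+ -> rises to 100%

expected_upper_level_pct = (
    p_lose * 0.0 + p_keep_50 * 0.50 + p_raise_75 * 0.75 + p_raise_100 * 1.00
)
# Years two and three at the expected scholarship percentage:
expected_total = year1_value + 2 * expected_upper_level_pct * tuition
print(f"Expected scholarship value over three years: ${expected_total:,.0f}")
```

Note how sensitive the answer is to the "carrot" probabilities: if few comparable students ever reach a 3.5 or 3.7 GPA, the carrot adds almost nothing to the expected value, which is exactly why the questions above matter.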
New Creative Efforts to Play on Optimism Bias of Applicants
This “carrot” approach to conditional scholarships is simply the newest technique for taking advantage of the optimism bias of prospective students. The Standard 509 disclosure obligations do not capture this type of conditional scholarship. Thus, law schools have no affirmative obligation to disclose the extent to which students in various ranges across the LSAT/GPA distribution of an entering class are likely to obtain a GPA of 3.5 or 3.7 at the end of the first year.
Indeed, this “carrot” approach could be used by any law school – even law schools that do not generally offer conditional scholarships that trigger a reporting obligation. Such a law school could offer a slightly smaller unconditional scholarship on the front end along with the “carrot” condition – the prospect of a scholarship increase if certain GPA performance thresholds are met -- and perhaps entice students who optimistically believe they are going to outperform their LSAT/GPA profile to accept the law school’s scholarship offer rather than a comparable scholarship offer from another law school that did not offer a “carrot.”
Of course, this “carrot” approach to conditional scholarships presents another information asymmetry problem and optimism bias problem. The law school would know how few students meet the GPA threshold for an increased scholarship while the prospective students would optimistically, but unrealistically, believe they are capable of meeting the threshold.
But the fact that law schools do not have an affirmative obligation to disclose the likelihood of success in meeting the GPA threshold for the enhanced scholarship award does not mean that prospective students can’t ask for very specific information about the number of students with comparable LSAT/GPA profiles who actually obtained the GPA thresholds over the prior three years. Once again, law schools should have this information available and should be willing to disclose the information.
In any of these situations, if a prospective student asks for specific information about the scholarship retention or scholarship enhancement prospects of similarly-situated students in the three prior years and a law school claims not to have the information or is not willing to share the information, this should prompt suspicion on the part of the prospective student. Law schools have this information (or should have it) and should provide answers to these questions when asked.
March 11, 2016 in Data on legal education, Innovations in legal education, Scholarship on legal education | Permalink | Comments (1)
Sunday, December 6, 2015
The Opaqueness of Bar Passage Data and the Need for Greater Transparency
There has been a great deal of discussion lately over at The Faculty Lounge regarding declines in law school admissions standards, declines in bar passage rates, and the general relationship between LSAT scores and bar passage. Much of this discussion is clouded by the lack of meaningful data regarding bar passage results. In this blog posting I will delineate several questions that just cannot be answered meaningfully based on the presently available bar passage data.
The national first-time bar passage rate among graduates of ABA-accredited law schools fell significantly in 2014. According to the NCBE’s statistics, the average pass rate from 2007-2013 for July first-time test-takers from ABA-accredited law schools was 83.6%, but it fell to 78% in 2014. (2015 data won’t be available until next spring, when it is released by the NCBE.)
There might be some reason to believe the 2014 results were aberrational, given that the objective criteria of the 2011 entering class were only modestly less robust than those of the 2010 class, and given the ExamSoft debacle with the July 2014 bar exam. But the results remain concerning because the entering classes of 2012, 2013, and 2014 showed continued erosion in objective criteria. With the median MBE scaled score among July test-takers declining in each of the last two years, the changes in entering class credentials over time suggest further declines in median MBE scaled scores (and bar passage rates) may be on the horizon.
In 2010, there were roughly 1,800 matriculants nationwide with LSATs of 144 or less. In 2012, there were roughly 2,600 matriculants nationwide with LSATs of 144 or less. In 2014, there were roughly 3,200 matriculants nationwide with LSATs of 144 or less. Recognizing that law school grades will be a better predictor of bar passage than LSAT scores, I think it is safe to say that entering law students with LSATs in this range are more likely than entering law students with higher LSATs to struggle on the bar exam. Because the number of those entering law school with LSAT scores of 144 or less has grown substantially (particularly as a percentage of the entering class, more than doubling from less than 4% in 2010 to more than 8% in 2014), many are concerned that bar passage rates will continue to decline in the coming years.
While there has been a great deal of discussion regarding declines in admission standards and corresponding declines in bar passage standards, this discussion is profoundly limited because the lack of meaningful bar passage data presently provided by state boards of law examiners and by the ABA and ABA-accredited law schools means that we do not have answers to several important questions that would inform this discussion.
- What number/percentage of graduates from each law school (and collectively across law schools) sits for the bar exam in July following graduation and in the following February? Phrased differently, what number/percentage of graduates do not take a bar exam in the year following graduation?
This is a profoundly important set of questions as we look at employment outcomes and the number/percentage of graduates employed in full-time, long-term bar passage required positions. Given that only those who pass the bar exam can be in full-time, long-term bar passage required positions, it would be helpful to know the number/percentage of graduates who “sought” eligibility for such positions by taking a bar exam and the number/percentage of graduates who did not seek such eligibility. It also would be helpful to understand whether there are significant variations across law schools in terms of the number of graduates who take a bar exam (or do not take a bar exam) and whether those who do not take a bar exam are distributed throughout the graduating class at a given law school or are concentrated among those at the bottom of the graduating class. At present, however, this information simply is not available.
- What is the first-time, bar passage rate for graduates from ABA-accredited law schools?
One might think this would be known, as ABA-accredited law schools are required to report first-time bar passage results. But the way in which first-time bar passage results are reported makes the data relatively unhelpful. Law schools are not required to report first-time bar passage for all graduates, or even for all graduates who took a bar exam. Rather, law schools are only required to report first-time bar passage results for at least 70% of the total number of graduates each year. This means we know nothing about first-time bar passage results for up to 30% of a given law school's graduates. Across all law schools, reported results account for roughly 84% of graduates, leaving a not insignificant margin of error in estimating bar passage rates.
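A rough sketch shows how wide that margin of error can be. The cohort size and reported pass rate below are invented for illustration, and the bounds assume every unreported graduate actually sat for the exam (in reality some did not, which compounds the problem):

```python
# Rough bounds on the true all-graduate pass rate when first-time results
# are reported for only part of the class. Inputs are illustrative.

graduates = 1000
covered = 840               # 84% of graduates accounted for in reports
reported_pass_rate = 0.82   # pass rate among the covered group

passed_in_covered = reported_pass_rate * covered   # 688.8

# Worst case: every unreported graduate took the exam and failed.
low_bound = passed_in_covered / graduates
# Best case: every unreported graduate took the exam and passed.
high_bound = (passed_in_covered + (graduates - covered)) / graduates

print(f"True pass rate could range from {low_bound:.1%} to {high_bound:.1%}")
```

Under these assumptions the true rate could fall anywhere from roughly 69% to 85%, a spread wide enough to make school-to-school comparisons based on reported rates quite unreliable.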
People would have been flabbergasted if the ABA had required reporting of employment outcomes for only 70% of graduates. Now that the ABA is requiring reporting on employment outcomes for all graduates, there is no good reason why the ABA should not be requiring bar passage accounting for all graduates, requiring law schools to note those who didn't take a bar exam, those who took and passed a bar exam, those who took and failed a bar exam, and those for whom bar status is unknown. (Up until recently, some boards of law examiners were not reporting results to law schools, but my understanding is that the number of state boards of law examiners not reporting results to law schools is now fairly small.)
Notably, for 2011, 2012, and 2013, the average first-time bar passage rate across all ABA-accredited law schools, as reported by the schools, was consistently higher than the rate reported by the NCBE for the corresponding years (2011 – 83.8% v. 82%; 2012 – 81.8% v. 79%; 2013 – 82.4% v. 81%). Moreover, “first-time takers” are not measured equivalently by the ABA and the NCBE. The ABA reporting requirement covers graduates who took any bar exam for the first time; the NCBE counts as a first-time taker any person taking a given jurisdiction’s bar exam for the first time. Thus, the NCBE’s set of first-time takers is broader, as it includes some people taking a bar exam for the second time (having previously taken the bar exam in another jurisdiction).
- What is the “ultimate” bar passage rate for graduates from ABA-accredited law schools?
Even though a number of commenters have noted that “ultimate” bar passage is more important than first-time bar passage, there is no publicly available data indicating the ultimate bar passage rate on a law school by law school basis for the graduates of each ABA-accredited law school. What number/percentage of graduates of a given law school who take a bar exam pass after the second attempt? What number/percentage of graduates of a given law school who take a bar exam pass after the third attempt? What number/percentage of graduates of a given law school never pass a bar exam? This information just is not publicly available at present.
While Standard 316, the bar passage accreditation standard, allows schools to meet the standard by demonstrating that 75% or more of those graduates who sat for a bar exam in the five most recent calendar years passed a bar exam, this “ultimate” bar passage data is not publicly disseminated. Thus, while first-time bar passage data is limited and incomplete for the reasons noted above, “ultimate” bar passage data on a law school by law school basis is actually not available.
The modest amount of information available on “ultimate” bar passage rates is not very helpful. The LSAC National Longitudinal Bar Passage Study contains some analysis of "ultimate" bar passage rates, but it focused on the entering class of fall 1991, which it described as “among the most academically able ever to enter” law school based on entering class statistics (page 14), a description that could not plausibly be applied to the classes entering in the last two or three years. It also does not contain any information about "ultimate" bar passage for graduates of individual law schools. In addition, Law School Transparency has recently received some information from at least one law school, which has requested anonymity. Much better “ultimate” bar passage information is needed to inform the many discussions about the relationship between entering class credentials and bar passage.
- How can we compare bar passage results from one jurisdiction to another?
Most state boards of law examiners do not present bar passage data in ways that permit meaningful analysis and comparison. Fewer than one-third of states publicly delineate between first-time takers and repeat takers on a law-school-by-law-school basis, and only a few of these provide MBE scores on a school-by-school basis. Accordingly, it is very difficult to make meaningful year-over-year comparisons in the months following the July bar exam, because data is rarely reported in a consistent manner. The NCBE does provide statistics annually (in the spring) that include bar passage rates by state for first-time test-takers from ABA-accredited schools, but the NCBE does not provide MBE scores on a state-by-state basis (although it seemingly should be able to do so).
Conclusion
There is a need for much greater transparency in bar passage data from boards of law examiners and from the ABA and ABA-accredited law schools. It well may be that some law schools would be a more meaningful investment for "at-risk" students, those whose entering credentials might suggest they are at risk of failing the bar exam, because those law schools have done a better job of helping "at risk" students learn the law so that they are capable of passing the bar exam at higher rates than graduates of other law schools with comparable numbers of at risk students. It may well be that some jurisdictions provide "at risk" students a greater likelihood of passing the bar exam. At the moment, however, that information just isn’t available. Much of the disagreement among various commentators about the relationships between admission standards and bar passage rates could be resolved with greater transparency – with the availability of much better data regarding bar passage results.
December 6, 2015 in Current events, Data on legal education, Scholarship on legal education | Permalink | Comments (0)
Friday, October 2, 2015
Part Two - The Impact of Attrition on the Composition of Graduating Classes of Law Students -- 2013-2016
In late December 2014, I posted a blog entitled Part One – The Composition of the Graduating Classes of Law Students – 2013-2016. That blog posting described how the composition of the entering classes between 2010 and 2013 has shifted. During that time, the percentage at or above an LSAT of 160 dropped by nearly 20% from 40.8% to 33.4%. Meanwhile, the percentage at or below an LSAT of 149 increased by over 50% from 14.2% to 22.5%.
But this reflects the composition of the entering classes. How do the graduating classes compare with the entering classes? This depends upon the attrition experienced by the students in a given entering class. This much belated Part Two discusses what we know about first-year attrition rates among law schools.
I have compiled attrition data from all of the fully-accredited ABA law schools outside of Puerto Rico for the last four full academic years. I have calculated average attrition rates for the class as a whole and then broken out average attrition rates by law schools in different median LSAT categories – 160+, 155-159, 150-154 and <150.
In a nutshell, overall first-year attrition increases as the median LSAT of the law school decreases. Over the last few years, while “academic attrition” has declined for law schools with median LSATs of 150 or greater, “other attrition” has increased modestly, particularly for law schools with median LSATs <150, resulting in a slight increase in overall first-year attrition between 2010 and 2013.
Overall First-Year Attrition Rates Have Increased Slightly
In calculating attrition rates, I wanted to capture those students who are no longer in law school anywhere. Thus, for these purposes, “attrition” is the sum of “academic attrition” and “other attrition.” “Academic attrition” occurs when a law school asks someone to leave because of inadequate academic performance. “Other attrition” occurs when a student departs from the law school volitionally. Both of these categories exclude “transfers.”
The following chart shows that despite the declining “LSAT profile” of the entering classes between 2010 and 2013, there has been no meaningful change in the average “academic attrition” rate. The modest increase in overall first-year attrition over this period, from roughly 5.8% to roughly 6.6%, is largely due to a growth in the “other attrition” category from roughly 2.5% to roughly 3.2%.
Overall First-Year Attrition for Classes Entering in 2010, 2011, 2012, and 2013
| Year | Beg. Enrollment | Academic Attrition | % Academic | Other Attrition | % Other | Total Attrition | % Attrition |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 2010-11 | 50,408 | 1,673 | 3.32% | 1,256 | 2.49% | 2,929 | 5.81% |
| 2011-12 | 46,477 | 1,551 | 3.34% | 1,262 | 2.72% | 2,813 | 6.06% |
| 2012-13 | 42,399 | 1,461 | 3.45% | 1,186 | 2.80% | 2,647 | 6.25% |
| 2013-14 | 38,837 | 1,316 | 3.39% | 1,236 | 3.18% | 2,552 | 6.57% |
(Calculating attrition rates for 2010-11, 2011-12, and 2012-13 is a little more complicated than one might think. For ABA reporting years 2011, 2012, and 2013, “academic attrition” was reported separately, but “other attrition” included “transfers out.” Thus, to generate the real “other attrition” number, one needs to subtract the transfers-out figures from reported “other attrition.” Because some schools occasionally listed transfers out under second-year “other attrition,” this analysis should be understood to have a little fuzziness to it for 2010-11, 2011-12, and 2012-13. For ABA reporting year 2014, transfers out were not commingled with “other attrition,” so the calculations were based solely on the sum of “academic attrition” and “other attrition.” Beginning with reporting this fall, “academic attrition” will include both involuntary academic attrition and voluntary academic attrition (students who withdrew before completing the first year but were already on academic probation).)
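The adjustment can be sketched in a few lines. The reported "other attrition" and transfers-out counts below are invented for illustration; only their difference (1,256), the enrollment, and the academic-attrition count come from the 2010-11 figures above:

```python
# A minimal sketch of the transfers-out adjustment, using 2010-11 figures.

beg_enrollment = 50_408
academic_attrition = 1_673
reported_other = 2_456     # assumed: reported "other attrition" incl. transfers out
transfers_out = 1_200      # assumed

other_attrition = reported_other - transfers_out        # 1,256
total_attrition = academic_attrition + other_attrition  # 2,929

print(f"Academic: {academic_attrition / beg_enrollment:.2%}")  # Academic: 3.32%
print(f"Other:    {other_attrition / beg_enrollment:.2%}")     # Other:    2.49%
print(f"Total:    {total_attrition / beg_enrollment:.2%}")     # Total:    5.81%
```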
Academic Attrition Rates Increase as Law School Median LSAT Decreases
Notably, there are different rates of attrition across law schools in different LSAT categories. The following chart breaks down attrition by groups of law schools based on median LSAT for the law school for the entering class each year. For each year, the chart shows the average first-year attrition rates for law schools with median LSATs of 160 or higher, for law schools with median LSATs of 155-159, for law schools with median LSATs of 150-154 and for law schools with median LSATs less than 150. In addition, it breaks out “academic attrition” and “other attrition” as separate categories for each category of law school and for each year and then provides the total overall attrition rate each year along with the four-year average attrition rate.
Average Attrition Rates by Category of Schools Based on Median LSAT
| Median LSAT | 2010-11 (Acad / Other / Total) | 2011-12 (Acad / Other / Total) | 2012-13 (Acad / Other / Total) | 2013-14 (Acad / Other / Total) | Four-Year Average |
| --- | --- | --- | --- | --- | --- |
| 160+ | 0.6 / 1.7 / 2.3 | 0.6 / 1.9 / 2.5 | 0.4 / 2.0 / 2.4 | 0.3 / 1.5 / 1.8 | 2.3 |
| 155-159 | 2.9 / 2.6 / 5.5 | 2.2 / 2.8 / 5.1 | 2.1 / 2.9 / 5.1 | 1.7 / 3.2 / 4.9 | 5.2 |
| 150-154 | 6.3 / 3.8 / 10.1 | 6.2 / 3.4 / 9.6 | 6.0 / 3.7 / 9.7 | 4.2 / 4.3 / 8.5 | 9.4 |
| <150 | 10.1 / 2.4 / 12.5 | 9.4 / 3.8 / 13.2 | 9.1 / 3.0 / 12.2 | 9.7 / 4.7 / 14.4 | 13.1 |
When looking at this data, some things are worth noting.
First, across different LSAT categories, overall attrition increases as you move from law schools with higher median LSATs to law schools with lower median LSATs, going from an average over the four years of 2.3% for law schools with median LSATs of 160+, to 5.2% for law schools with median LSATs of 155-159, to 9.4% for law schools with median LSATs of 150-154, to 13.1% for law schools with median LSATs of <150. “Academic attrition” consistently increases as median LSAT decreases, while “other attrition” is mixed. (Although this analysis is focused on four LSAT categories, the trend of having overall attrition increase as median LSAT decreases continues if you add a fifth LSAT category. In 2010-11 there was only one law school with a median LSAT of 145 or less, with only 320 students. By 2013-14, however, there were nine law schools with a median LSAT of 145 or less, with 2,075 students. The overall first-year attrition rate (encompassing academic attrition and other attrition) at these nine schools in 2013-14 was 15.9 percent. The overall attrition rate at the other 24 law schools with a median LSAT less than 150 was 13.6 percent.)
Second, over the period from 2010 to 2013, “academic attrition” generally appears flat to decreasing for schools in all LSAT categories, except in 2013-14 for law schools with median LSATs <150, where it increased slightly (largely because of the larger number of schools with median LSATs of 145 or less). By contrast, “other attrition” presents more of a mixed record, but generally appears to be increasing between 2010 and 2013 for schools in most LSAT categories. Nonetheless, average overall first-year attrition is lower in 2013-14 for law schools in the top three LSAT categories.
Third, if you are wondering why the average overall attrition could be increasing while the overall attrition rates for the top three LSAT categories are decreasing, the answer is because of the changing number of students in each category over time. As noted in Part I, the number of students and percentage of students in the top LSAT category has declined significantly, while the number of students and percentage of students in the bottom LSAT category has increased significantly. This results in the average overall attrition rate increasing even as rates in various categories are decreasing.
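This composition effect can be demonstrated with a toy example. The numbers below are invented for clarity, not taken from the tables: every category's attrition rate falls from year 1 to year 2, yet the overall average rises because enrollment shifts toward the higher-attrition categories.

```python
# Invented numbers illustrating the composition effect described above.
# (students, attrition_rate) for four LSAT categories, high to low:
year_1 = [(20_000, 0.023), (15_000, 0.055), (10_000, 0.101), (5_000, 0.125)]
year_2 = [(12_000, 0.018), (11_000, 0.049), (9_000, 0.085), (8_000, 0.124)]

def overall_rate(cohort):
    """Enrollment-weighted average attrition rate across categories."""
    total_students = sum(n for n, _ in cohort)
    total_attrition = sum(n * r for n, r in cohort)
    return total_attrition / total_students

r1, r2 = overall_rate(year_1), overall_rate(year_2)
print(f"Year 1 overall attrition: {r1:.2%}")  # Year 1 overall attrition: 5.84%
print(f"Year 2 overall attrition: {r2:.2%}")  # Year 2 overall attrition: 6.28%
```

Even though each category's rate fell (0.023 to 0.018, 0.055 to 0.049, and so on), the overall rate rose from 5.84% to 6.28% because the lower-LSAT, higher-attrition categories make up a larger share of year 2 enrollment.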
Thoughts on Attrition Rates
It makes sense that “academic attrition” increases as law school median LSAT decreases. It seems reasonable to expect that law schools with median LSATs of <155 or <150 will have higher “academic attrition” rates than those with median LSATs of 155-159 or 160 and higher.
It may make less sense, however, that “academic attrition” generally decreased across all four categories of law schools between 2010-11 and 2013-14 (with the exception of law schools with a median LSAT <150 in 2013-14), even as the LSAT profile of each entering class continued to decline. With an increase in the number and percentage of law students with LSATs of <150, particularly those with LSATs of <145, one might have anticipated that the average rate of “academic attrition” would have increased, particularly among law schools with median LSATs of 150-154 (who might have seen an increase in the number of students with LSATs less than 150) and among law schools with median LSATs of <150, given the increase in the number of law schools with median LSATs of 145 or less.
Cynics might argue that from a revenue standpoint, law schools are making a concerted effort to retain a higher percentage of a smaller group of students. But this assumes a degree of institutional purposefulness (coordination among faculty) that is rare among law schools. Moreover, my sense is that there are much more benign explanations.
First, if law schools have not adjusted their grading curves to reflect a different student profile, then the standard approach to first-year grading – which involves a forced curve at most schools -- is likely to produce a similar percentage of “at risk” students year over year even though the objective credentials of each entering class have declined.
Second, with the decline in the number of applicants to law school, one might surmise that those choosing to go to law school really are serious about their investment in a legal education and may be working harder to be successful in law school, resulting in fewer students facing academic disqualification, even though the credentials for each entering class have been weaker year over year. This may be particularly true in law schools with robust academic support programs which may be helping some students on the margin find sufficient success to avoid academic attrition.
Third, and perhaps most significantly, however, is the reality that “academic attrition” and “other attrition” are related. Indeed, that is why I have reported them together in the charts above as two components of overall attrition. Some students who might be at risk for “academic attrition” may decide to withdraw from law school voluntarily (and be classified under “other attrition” rather than “academic attrition”). In addition, it is possible that other students, particularly at law schools with median LSATs <150, may be voluntarily withdrawing from law school because they have decided that further investment in a legal education doesn’t make sense if they are performing relatively poorly, even though the law school would not have asked them to leave under the school’s policy for good academic standing.
The fact that the percentage of students in each entering class with LSATs of <150 and even <145 has increased substantially between 2010 and 2013, while the rate of overall first-year attrition has increased only modestly over this time period, suggests that the composition of graduating classes (based on LSATs) will continue to weaken into 2016 (and probably 2017 if attrition patterns did not change in 2014-15). As a result, the declines in the median MBE scaled score in 2014 and 2015 could be expected to continue in 2016 and 2017. Some law schools also are likely to see bar passage rates for their graduates decline, perhaps significantly, in 2015, 2016 and 2017.
Unanswered Questions
This analysis focuses on first-year attrition. There continues to be attrition during the second year and third year of law school, generally at lower rates, perhaps 2-3% of second-year students and 1-2% of third-year students. (On average, the number of graduates in a given class has been around 90% of the entering class.) It is not clear yet whether attrition among upper level students follows similar patterns across different categories of law schools. The publicly-reported attrition data also does not provide any information regarding the gender or ethnicity or socio-economic background of students leaving law school. Therefore, we don’t know whether there are different rates of attrition for women as compared with men or whether students of different ethnic backgrounds have different rates of attrition. We also don’t know whether first-generation law students experience attrition at greater rates than other law students, or whether students of lower socio-economic status experience attrition at greater rates than students of higher socio-economic status.
(I am very grateful for the insights of Bernie Burk and Scott Norberg on earlier drafts of this blog posting.)
October 2, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (1)
Wednesday, September 9, 2015
"In Praise of Law Reviews (And Jargon-Filled, Academic Writing)"
That is the title of a forthcoming article by Cass Sunstein in the Michigan Law Review. Sunstein has unusual standing to make this case because, in addition to his academic perches at Chicago and Harvard Law, he was tapped by President Obama to lead the Office of Information and Regulatory Affairs.
Sunstein has written a remarkably thoughtful and balanced essay that I would encourage any fairminded lawyer, law student, and law professor to read. Sunstein begins by recounting how pulling the levers of power in government made several of his fellow academics despair over the prospect of returning to academic writing. When Sunstein probed further, a colleague sent along a passage from Theodore Roosevelt:
"It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood ... ."
It's a powerful passage that can be used to diminish those who write academic articles. But Sunstein subsequently references another quote, this one from John Maynard Keynes:
[T]he ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist.
Sunstein acknowledges that Keynes' message is self-serving and unnecessarily demeans the intellect of those who carry the burden of leadership. But he also sees a kernel of truth--difficult problems often get solved by applying the grand ideas and concepts of academics. The impact of academics is rarely immediate, but it can be enduring and profoundly influential.
To illustrate his point, Sunstein identifies a list of seven recent books by academic authors (by Balkin, Vermeule, Mashaw, Kaplow & Shavell, Revesz & Livermore, Cross, and Adler). All of them can trace their origins to earlier law review articles, all of them are hunting "big game", and most if not all of them are unlikely to be of immediate practical use to the busy practitioner or judge. But Sunstein suggests that we should be taking a longer view. Some of these books were on his shelf when he served in the government. He used them to address real world problems. The rest are scaffolding to reach something higher.
Sunstein organizes the core of his essay around the criticisms of the late Yale law professor Fred Rodell, suggesting that the author of the famous Goodbye to Law Reviews got it only half right. Sure, the style and length of law review articles limit their readership, but Sunstein observes some countervailing benefits:
When they are working well, law reviews strongly discourage arguments that are glib, sloppy, circular, or narrowly ideological. They also require both development of and sympathetic engagement with competing points of view, rather than easy or rapid dismissals. Counterarguments are strongly encouraged, even mandatory. There is a kind of internal morality to the genre, one that is (I think) connected with and helps account for some of its rigidity. The morality involves respect for the integrity of the process of argument, which entails respect for a wide range of arguers as well.
As someone who has written numerous law review articles, I find this description entirely accurate. Most of my work these days is applied--designing and measuring law school courses, evaluating outcomes, and trying to re-start the labor market so it clears on something more than the LSAT scores of entering students. Within this applied realm, which borders on the arena, I am often viewed as someone who is highly creative. Yet I can safely say that nearly all of the credit goes to the mental discipline and knowledge obtained through academic writing. That process fundamentally transformed my intellect. Indeed, I felt that way at the end of my first law review comment, which took roughly 500 hours to research and write during my 2L year of law school. So I wanted to do it again.
A lot of smart people in law tend to focus on what is immediate and practical--i.e., what will help with the work on their desk. I can see this mindset in nascent form in a subset of my students who become impatient with classroom forays into legal theory or the social sciences. I don't think this group can be won over. By disposition, they can't see the value in reading academic work, so paying for its production seems even more pointless.
Granted, this conclusion does not resolve the harder issue of whether the current system of legal education over-incentivizes the production of legal scholarship by mandating, through ABA and AALS requirements, that most teachers be academic scholars. What is the optimal number of lawyer-scholars who should be subsidized by student tuition as opposed to grants or endowment? It may be less than the current number. Further, how those spots get allocated is another challenging issue with no simple resolution.
That said, Sunstein is clearly right--whether they realize it or not, every capable legal problem-solver is standing on the shoulders of prior academic work. It is misguided to conclude that future generations won't need new and better ideas vetted through an academic process.
September 9, 2015 in New and Noteworthy, Scholarship on legal education
Thursday, May 7, 2015
Revisiting Conditional Scholarships
Having been one of the people who brought attention to the issue of conditional scholarships a few years ago, I feel compelled to offer a few insights on a rekindled conversation about conditional scholarships involving Jeremy Telman and Michael Simkovic and Debby Merritt.
I am not sure what prompted Prof. Telman to write about conditional scholarships, but the first sentence of his initial post seems to be a few years late:
One of the ways in which law schools are allegedly inadequately transparent is in the award of merit scholarships conditional on the students’ achievement of a certain grade point average. (Emphasis added)
A few years ago, one accurately could have said that law schools were inadequately transparent regarding the awarding and retention of conditional scholarships. I did say that in an article Prof. Telman describes as “interesting.”
Today, this is no longer accurate, because we have much greater transparency regarding conditional scholarships given the disclosures mandated pursuant to Standard 509.
Thus, I am not sure anyone is alleging that law schools are inadequately transparent regarding conditional scholarships, and I am not sure why this is once again an item for discussion. The issue has been well settled, and law schools and prospective law students have adjusted to a new reality. Indeed, in his follow-up posting, Prof. Telman essentially acknowledges this point:
It seems we are all agreed that the disclosure problems related to conditional scholarships have largely been addressed through the ABA website that enables students to comparison shop among scholarship offers from various schools and know their chances of retaining their conditional scholarships.
That said, given that Prof. Telman got the conversation started, I have a response to one of his assertions and some observations to share.
The general context of his posting (and Prof. Simkovic’s related posts) is that college students have lived with conditional scholarships without apparent problems, so conditional scholarships shouldn’t present a concern for law students. In making his case, Prof. Telman relies on my 2011 article to support a proposition that the article actually disproves in some detail. Specifically, Prof. Telman states:
Professor Organ was able to find information about how scholarships work at 160 law schools. That means that the information was out there. Since Professor Organ was able to gather information about 160 law schools, it should not be difficult for students to gather relevant information about the one law school that they are considering attending.
He further states: “Why are law students assumed to be incapable of looking into standard grade normalizations curves for the first year?” Prof. Telman seems to be suggesting that there actually weren’t any disclosure problems because “the information was out there.” The information was not out there.
To be more precise, in putting together the article, with the efforts of research assistants as well as my own sleuthing, I was able to find sufficient information from the NAPLA-SAPLA Book of Lists, the ABA-LSAC Guide, and law school web pages to classify 160 law schools according to whether the law school had a competitive scholarship program or some other type of scholarship program. Had Prof. Telman looked carefully at the article, however, he would have noted that “only four of these 160 schools had any information posted on their webpages indicating renewal rates on scholarships.” (A point Derek Tokarz makes in the comments to Prof. Telman’s post.)
Prospective law students not only need relevant information about one law school, they need relevant and comparable information about the set of three or five or seven law schools they are considering seriously. Prior to the Standard 509 mandated disclosure of conditional scholarship information, it was profoundly difficult if not impossible for students to gather relevant information from a few or several law schools. The information simply was not “out there.”
Indeed, two of the primary points of my article were to highlight the information asymmetry between law schools and prospective law students relating to competitive scholarships and to recommend greater disclosure of the number of students receiving competitive scholarships and the number who had them renewed (or had them reduced or eliminated).
Prof. Merritt discusses in some depth this information asymmetry, noting particularly that college students who have been successful in retaining their conditional scholarships as undergrads do not appreciate the reality of the mandatory curve they will encounter in law school, a point Stephen Lubet also makes cogently in a comment to Prof. Telman’s post. (Indeed, to his credit, Prof. Telman acknowledges that prospective law students also may suffer from optimism bias in assessing their likelihood of retaining their scholarship.)
Regarding the need for greater disclosure, regardless of how savvy and sophisticated we would like to believe prospective law students might have been or might be, the nuances of conditional scholarships and mandatory curves were not things that were clearly understood in the era prior to the mandatory Standard 509 disclosure. I noted in my article that many students posting on Law School Numbers valued their scholarships based on a three-year total, regardless of whether they were conditional scholarships, suggesting these students failed to appreciate that the “value” should be discounted by the risk of non-renewal. I also spoke with pre-law advisors around the country regarding conditional scholarship and consistently was told that this information was very helpful because pre-law students (and sometimes pre-law advisors) had not appreciated the realities of conditional scholarships.
While there are other things mentioned by Prof. Telman, Prof. Simkovic and Prof. Merritt to which I could respond, this post is already long enough and I am not interested in a prolonged exchange, particularly given that many of the points to which I would respond would require a much more detailed discussion and more nuance than blog postings sometimes facilitate. My 2011 article describes my views on competitive scholarship programs and their impact on law school culture well enough. Accordingly, let me end with one additional set of observations about what has happened with conditional scholarships in an era of increased transparency.
In my follow-up article, available on SSRN, I analyzed the frequency of conditional scholarships generally and the extent to which conditional scholarships were utilized by law schools in different rankings tiers for the 2011-2012 academic year (the first year following the ABA's mandated disclosure of conditional scholarship retention rates).
For the entering class in the fall of 2011, I noted that there were 140 law schools with conditional scholarship programs, and 54 law schools with scholarship renewal based only on good academic standing, one-year scholarships, or only need-based scholarship assistance. I also noted that conditional scholarship programs were much less common among top-50 law schools than among bottom-100 law schools.
Based on the data reported in fall of 2014 compiled by the ABA for the entering class in the fall of 2013 (the 2013-2014 academic year), the percentage of all entering first-year students with conditional scholarships has increased slightly (from 26.1% in fall 2011 to 29% in fall 2013), while the percentage of all entering first-year students who had their scholarships reduced or eliminated has decreased slightly (from 9% as of summer of 2012 to 8.4% as of summer of 2014). The average renewal rate across law schools increased from 68.5% to 73%.
More significantly, however, the number of law schools with conditional scholarship programs has declined, while the number with other types of scholarship programs has increased considerably. By 2013-2014, there were 78 law schools with scholarships renewed based on good academic standing, with one-year scholarships, or with only need-based scholarship assistance -- 24 more law schools than two years earlier, a 44% increase in the number of law schools moving away from conditional scholarship programs. This would seem to indicate that at least some law schools have decided conditional scholarships aren’t as good for law schools or for law students.
May 7, 2015 in Data on legal education, Scholarship on legal education
Monday, April 13, 2015
PROJECTIONS FOR LAW SCHOOL ENROLLMENT FOR FALL 2015
This blog posting is designed to do three things. First, following up on recent discussions regarding trends in applicants by Al Brophy at The Faculty Lounge and Derek Muller at Excess of Democracy, I provide a detailed analysis to project the likely total applicant pool we can expect at the end of the cycle based on trends from March through the end of the cycle in 2013 and 2014. Second, using the likely total pool of applicants, I estimate the number of admitted students and matriculants, but also question whether the estimates might be too high given the decline in quality of the applicant pool in this cycle. Third, building on the second point, I suggest that law schools in the lower half of the top tier are likely to see unusual enrollment/profile pressure that may then have a ripple effect down through the rankings.
1. ESTIMATES OF THE TOTAL NUMBER OF APPLICANTS
Reviewing the 2013 and 2014 Cycles to Inform the 2015 Cycle
2013 Current Volume Summary

| Date | Applicants | % of Cycle | Projected Total Applicant Pool |
|---|---|---|---|
| Jan. 25, 2013 | 30,098 | 56% | 53,750 |
| Mar. 8, 2013 | 46,587 | 84% | 55,460 |
| May 17, 2013 | 55,764 | 95% | 58,700 |
| End of Cycle | | | 59,400 |

2014 Current Volume Summary

| Date | Applicants | % of Cycle | Projected Total Applicant Pool |
|---|---|---|---|
| Jan. 31, 2014 | 29,638 | 58% | 51,110 |
| Mar. 7, 2014 | 42,068 | 79% | 53,250 |
| April 25, 2014 | 48,698 | 89% | 54,720 |
| End of Cycle | | | 55,700 |

2015 Current Volume Summary

| Date | Applicants | % of Cycle | Projected Total Applicant Pool |
|---|---|---|---|
| Jan. 30, 2015 | 26,702 | 54% | 49,450 |
| Mar. 6, 2015 | 39,646 | 76% | 52,160 |
| April 3, 2015 | 45,978 | 87% | 52,848 |
| End of Cycle | | | 54,000 (Estimate) |
In each of the last two years, a modest surge in late applicants meant the final count exceeded the March/April projections by roughly 2,000 to 2,500 (in 2014, the final pool exceeded the early March projection by nearly 2,500). That would suggest that the current projection of just under 53,000 likely understates the end-of-cycle applicant pool, which I am now estimating conservatively at 54,000 -- down about 3% from 2014, and roughly 2,000 above the early March projection. That said, if the employment results for 2014 graduates, which will be released shortly, show modest improvement over 2013, I anticipate that even more people might come off the fence and apply late for the fall 2015 class.
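The projections in the tables above come from simple proportional scaling: divide the current applicant count by the share of the cycle historically complete at that report date. A minimal sketch (the function name is mine):

```python
def project_pool(current_applicants: int, pct_of_cycle_complete: float) -> int:
    """Scale the current applicant count up by the share of the
    admissions cycle historically complete at this report date."""
    return round(current_applicants / pct_of_cycle_complete)

# April 3, 2015: 45,978 applicants at roughly 87% of the cycle
print(project_pool(45_978, 0.87))  # 52848, matching the April 3 projection
```

The post then adjusts this mechanical projection upward, to about 54,000, because the final 2013 and 2014 pools beat their early-spring projections by roughly 2,000-2,500 applicants each.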
2. ESTIMATES FOR ADMITTED APPLICANTS AND MATRICULANTS
The chart below shows the number of applicants, admitted students and matriculants over the last three years along with an estimate for fall 2015 based on the assumption above that we have a total of 54,000 applicants this cycle. With 1,700 fewer applicants, I am assuming 1,000 fewer admitted students (a slight increase in the percentage admitted from 2014), and then assuming the number of matriculants will reflect the three-year average for the percentage of admitted students who matriculate – 87%. This would yield a first-year entering class of 36,975, down about 2.5% from 2014.
Estimates of Admitted Students and Matriculants for 2015 Based on Trends in 2012-2014
| Year | Applicants | Admitted Students | Percent of Applicants | Matriculants | Percent of Admitted |
|---|---|---|---|---|---|
| 2012 | 67,900 | 50,600 | 74.5% | 44,481 | 87.9% |
| 2013 | 59,400 | 45,700 | 76.9% | 39,675 | 86.8% |
| 2014 | 55,700 | 43,500 | 78.1% | 37,924 | 87.2% |
| 2015 (est.) | 54,000 | 42,500 | 78.7% | 36,975 | 87% |
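The 2015 row follows from the two assumptions stated above (1,000 fewer admits than 2014, and the three-year average yield of 87%); as arithmetic:

```python
applicants = 54_000         # estimated 2015 applicant pool
admitted = 43_500 - 1_000   # assume 1,000 fewer admits than in 2014
admit_rate = admitted / applicants
matriculants = round(admitted * 0.87)  # three-year average yield

print(f"{admit_rate:.1%}")                 # 78.7%
print(matriculants)                        # 36975
print(f"{1 - matriculants / 37_924:.1%}")  # 2.5% below the 2014 class
```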
Why These Estimates for Admitted Students and Matriculants Might be Too High
a. Significant Decline in Applicants with LSATs of 165+
Because of changes in the nature of the applicant pool in 2015, however, the estimates of the number of admitted students and number of matriculants in the chart above may be too high. In 2014, almost all of the decrease in applicants came among those with LSATs of <165. The pool of applicants with LSATs of 165+ in 2014 was only slightly smaller than in 2013 (7,477 compared with 7,496). Indeed, as a percentage of the applicant pool, those with LSATs of 165+ increased from 12.6% in 2013 to 13.4% in 2014. This resulted in a slight increase in the number of matriculants with LSATs of 165+ in 2014 compared to 2013 (6,189 compared with 6,154).
In the current cycle, however, the number of applicants with LSATs of 165+ was only 6,320 as of March 6, 2015. In 2013, there were 7,228 on March 8 (of a final total of 7,496). In 2014, there were 7,150 on March 7 (of a final total of 7,477). Thus, the average increase in applicants with LSATs of 165+ between early March and the end of the cycle is only about 4%. That would suggest that we could anticipate having roughly 6,585 applicants with LSATs of 165+ at the end of the cycle -- down nearly 900 (roughly 12%) from 2014.
Estimate of Number of Total Applicants for 2015 with LSATs of 165+ Based on Trends in 2013 and 2014
| Date | Applicants at 165+ | | Applicants at 165+ | # Increase to End of Cycle | % Increase to End of Cycle |
|---|---|---|---|---|---|
| March 8, 2013 | 7,228 | End of Cycle 2013 | 7,496 | 268 | 3.7% |
| March 7, 2014 | 7,150 | End of Cycle 2014 | 7,477 | 327 | 4.6% |
| March 6, 2015 | 6,320 | End of Cycle 2015 (est.) | 6,585 | 265 | 4.2% |
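The end-of-cycle estimate for 165+ applicants applies the average early-March-to-end-of-cycle growth from the two prior cycles to the March 2015 count; a minimal sketch (variable names are mine):

```python
# Applicants with LSATs of 165+ (figures from the post)
march = {2013: 7_228, 2014: 7_150, 2015: 6_320}
final = {2013: 7_496, 2014: 7_477}

# Average fractional growth from early March to end of cycle, 2013-2014
growth = sum(final[y] / march[y] - 1 for y in (2013, 2014)) / 2  # about 4.1%
estimate = round(march[2015] * (1 + growth))
print(estimate)  # 6582, which the post rounds to roughly 6,585
```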
On a longer term basis, if the estimates in the preceding paragraphs are accurate, the entering class in fall of 2015 will again extend the slide in the number and percentage of first-year students with LSATs of 165+ that has been underway since the class that entered in fall of 2010.
Five-Year Trend in Applicants and Matriculants with LSATs of 165+ and Estimates for 2015
| Year | Applicants with LSATs of 165+ | Matriculants with LSATs of 165+ | Percent of Applicants Matriculating |
|---|---|---|---|
| 2010 | 12,177 | 9,477 | 77.8% |
| 2011 | 11,190 | 8,952 | 80% |
| 2012 | 9,196 | 7,571 | 82.3% |
| 2013 | 7,496 | 6,154 | 82.1% |
| 2014 | 7,477 | 6,189 | 82.8% |
| 2015 (est.) | 6,585 | 5,420 | 82.4% |
Given that on average over the last three years roughly 82.4% of applicants with LSATs of 165+ actually matriculated, one could expect the 6,585 applicants to translate into roughly 5,420 matriculants with LSATs of 165+ for fall 2015, a decline of nearly 770 from 2014. Notably, this would represent a 45.9% drop in applicants with LSATs of 165+ since 2010 and a 42.8% drop in matriculants with LSATs of 165+ since 2010.
b. Modest Decrease Among Applicants with LSATs <150
On the other end of the LSAT distribution, it is a completely different story. Although the number of applicants with LSATs <150 also has declined, the decline has been more modest than among those with LSATs of 165+. Moreover, those with LSATs of <150 are much more likely to apply late in the cycle. In the last two years there has been significant growth among applicants with LSATs of <150 between early March and the end of the cycle. As a result, I would estimate that we would have 18,350 applicants with LSATs of <150 by the end of this cycle, a decline of only about 4.5%.
Estimate of Number of Total Applicants for 2015 with LSATs of <150 Based on Trends in 2013 and 2014
| Date | Applicants with LSATs of <150 | | Applicants with LSATs of <150 | # Increase | % Increase |
|---|---|---|---|---|---|
| March 8, 2013 | 13,364 | End of Cycle 2013 | 20,706 | 6,642 | 49.7% |
| March 7, 2014 | 11,662 | End of Cycle 2014 | 19,239 | 7,577 | 65% |
| March 6, 2015 | 11,467 | End of Cycle 2015 (est.) | 18,350 | 6,880 | 60% |
With applicants with LSATs <150 making up a larger percentage of the declining applicant pool, the number of matriculants with LSATs of <150 actually had grown each year up until 2014, when the slight increase in matriculants with LSATs of 165+ was mirrored by a slight decrease in matriculants with LSATs <150.
Five-Year Trend in Applicants and Matriculants with LSATs of <150 and Estimates for 2015
| Year | Applicants with LSATs of <150 | Matriculants with LSATs of <150 | Percent of Applicants Matriculating |
|---|---|---|---|
| 2010 | 26,548 | 7,013 | 26.4% |
| 2011 | 24,192 | 7,101 | 29.4% |
| 2012 | 22,089 | 7,906 | 35.8% |
| 2013 | 20,706 | 8,482 | 41% |
| 2014 | 19,239 | 8,361 | 43.5% |
| 2015 (est.) | 18,350 | 8,700 | 47.4% |
Given that the percentage of applicants with LSATs <150 matriculating has increased in each of the last five years, it seems reasonable to expect another increase -- to 47.4% -- resulting in roughly 8,700 matriculants with LSATs of <150, particularly given the decrease in the number of applicants with LSATs of 165+. Even so, that increase seems unlikely to make up for the drop of nearly 770 matriculants among those with LSATs of 165+. Notably, while the pool of applicants with LSATs <150 has decreased by 30.9% since 2010, the number of matriculants has increased by 24.2%.
Thus, while the smaller decline in applicants that is expected this year might suggest a correspondingly smaller decline in matriculants, with the weaker profile of the applicant pool in 2015 compared to 2014, it is quite possible that the total number of admitted students will be lower than the chart above suggests and that the corresponding number of matriculants also will be lower than the chart above suggests.
Phrased differently, if there really is going to be a decline of roughly 770 matriculants just in the group with LSATs of 165+, then the total decline in matriculants may well be greater than the 950 estimated in the chart above. Between 2013 and 2014, a decline in applicants of 3,700, almost all with LSATs of 164 and below, resulted in a decline in matriculants of 1,750, all with LSATs of 164 and below. If the decline in applicants is 1,700 this cycle, with over half the decline among those with LSATs of 165+, with a decline of perhaps several hundred with LSATs between 150-164, and with a modest decrease (or possibly a slight increase) among those with LSATs <150, we may well see that the decline in admitted students and in matriculants is slightly larger than estimated in the chart above.
3. PROFILE CHALLENGES AMONG ELITE SCHOOLS
One interesting side note is that the significant decrease in the number of applicants with LSATs of 165+ is likely to put significant pressure on a number of top-50 law schools as they try to hold their enrollment and their LSAT profiles. Simply put, there are not enough applicants with LSATs of 165+ to allow all the law schools in the top-50 or so to maintain their profiles and their enrollment.
If the estimates above are correct -- that there will be roughly 5,420 matriculants with LSATs of 165+ -- and if we assume that at least a few hundred of these matriculants will attend law schools ranked 50 or below due to geography or scholarships or both, and that the top 15 law schools will leverage rankings prestige (and perhaps scholarships) to hold enrollment and profile, then the decrease of roughly 770 matriculants with LSATs of 165+ is going to be felt mostly among the law schools ranked 16-50 or so.
In 2014, the top 15 law schools probably had roughly 3,800 first-year matriculants with LSATs of 165+. The schools ranked 16-50 likely had another 1,900 or so. The remaining 500-plus matriculants with LSATs of 165 and above likely were scattered among law schools lower in the rankings. Let’s assume the top-15 law schools manage to keep roughly 3,700 of the 3,800 they had in 2014, and that law schools ranked 50 and below keep roughly 500. That means the law schools ranked between 16 and 50 have to get by with 1,220 matriculants with LSATs of 165+ rather than the 1,900 they had last year. While many schools will be balancing enrollment (and revenue) against profile, this likely will be a particularly challenging year for law schools ranked between 16 and 50. To the extent that those schools look toward applicants with lower LSAT profiles to maintain enrollment, that will have a ripple effect through the law schools lower in the rankings.
April 13, 2015 in Data on legal education, Scholarship on legal education
Tuesday, January 6, 2015
The Variable Affordability of Law School – How Geography and LSAT Profile Impact Tuition Costs
I have posted to SSRN the PowerPoint slides I presented yesterday at the AALS Conference session sponsored by the Section on Law School Administration and Finance. The presentation was entitled The Variable Affordability of Law School – How Geography and LSAT Impact Tuition Cost. (I am very grateful to my research assistant, Kate Jirik, and her husband, Sam, for awesome work on the spreadsheet that supported the data I presented.)
The presentation begins with two slides summarizing data presented in my article Reflections on the Decreasing Affordability of Legal Education showing the extent to which average public school and private school tuition increased between 1985 and 2011 relative to law school graduate income. While many have observed that law school has become increasingly expensive over the last few decades, this "macro" discussion fails to highlight the extent to which differences in tuition exist at a “micro” level either based on geography or on LSAT score.
Using 2012 tuition data, the first set of slides focuses on geographic differences – noting some states where legal education generally is very expensive, some states where legal education generally is very affordable and the balance of states in which tuition costs are in the middle or have a mix of affordable and expensive.
Following those slides, there is a set of slides that describe the process I used to calculate net tuition costs after accounting for scholarships for all entering first-year students at the 195 fully accredited and ranked law schools in fall 2012 in an effort to allocate all students into a five-by-five grid with five LSAT categories (165+, 160-164, 155-159, 150-154 and <150) and five cost categories ($0-$10,000, $10,000-$20,000, $20,000-$30,000, $30,000-$40,000, and $40,000+). There then are a set of slides summarizing this data and trying to explain what we can learn from how students are allocated across the five-by-five grid, which includes a set of slides showing the average rank of the schools at which students in each LSAT/Cost category cell are enrolled.
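The five-by-five allocation described above amounts to a simple two-dimensional binning step. A sketch, with the caveat that the handling of students falling exactly on a category boundary is my assumption (the slides do not specify it):

```python
from bisect import bisect_right

LSAT_EDGES = [150, 155, 160, 165]              # <150, 150-154, 155-159, 160-164, 165+
COST_EDGES = [10_000, 20_000, 30_000, 40_000]  # $0-10k, ..., $40k+

def grid_cell(lsat: int, net_cost: float) -> tuple[int, int]:
    """Return (lsat_bin, cost_bin), each 0-4, locating one student
    in the five-by-five LSAT/net-cost grid."""
    return bisect_right(LSAT_EDGES, lsat), bisect_right(COST_EDGES, net_cost)

print(grid_cell(167, 5_000))  # (4, 0): LSAT 165+, net cost under $10,000
```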
The concluding slide sets forth a couple of short observations about the data. There was a robust discussion with some great questions following the presentation of this data.
Here are four of the slides to give you a flavor for the presentation on net cost generally and then net cost relative to LSAT categories --
January 6, 2015 in Data on legal education, Scholarship on legal education
Tuesday, November 11, 2014
What Might Have Contributed to an Historic Year-Over-Year Decline In the MBE Mean Scaled Score?
The National Conference of Bar Examiners (NCBE) has taken the position that the historic drop in the MBE Mean Scaled Score of 2.8 points between the July 2013 administration of the bar exam (144.3) and the July 2014 administration of the bar exam (141.5) is solely attributable to a decline in the quality of those taking a bar exam this July. Specifically, in a letter to law school deans, the NCBE stated that: “Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013.”
Notably, the NCBE does not indicate what other “indicators” it looked at “to challenge the results.” Rather, the NCBE boldly asserts that the only fact that explains an historic 2.8 point drop in the MBE Mean Scaled Score is “that the group that sat in July 2014 was less able than the group that sat in July 2013."
I am not persuaded.
(Neither is Brooklyn Law School Dean Nicholas Allard, who has responded by calling the letter “offensive” and by asking for a “thorough investigation of the administration and scoring of the July 2014 exam.” Nor is Derek Muller, who earlier today posted a blog suggesting that the LSAT profile of the class of 2014 did not portend the sharp drop in MBE scores.)
I can’t claim to know how the NCBE does its scaled scoring, so for purposes of this analysis, I will take the NCBE at its word that it has “double-checked” all of its calculations and found that there are no errors in its scoring.
If we accept the premise that there are no scoring issues, then the historic decline in the MBE Mean Scaled Score is attributable either to a “less able” group taking the MBE in July 2014 or to issues associated with the administration of the exam or to some combination of the two.
The NCBE essentially has ignored the possibility that issues associated with the administration of the exam might have contributed to the historic decline in the MBE Mean Scaled Score and gone “all in” on the “less able” group explanation for the historic decline in the MBE Mean Scaled Score. The problem for the NCBE is that it will be hard-pressed to demonstrate that the group that sat in July 2014 was sufficiently “less able” to explain the historic decline in the MBE Mean Scaled Score.
If one looks at the LSAT distribution of the matriculants in 2011 (who became the graduating class of 2014) and compares it with the LSAT distribution of the matriculants in 2010 (who became the graduating class of 2013), the NCBE probably is correct in noting that the group that sat in July 2014 is slightly “less able” than the group that sat in July 2013. But for the reasons set forth below, I think the NCBE is wrong to suggest that this alone accounts for the historic drop in the MBE Mean Scaled Score.
Rather, a comparison of the LSAT profile of the Class of 2014 with the LSAT profile of the Class of 2013 would suggest that one could have anticipated a modest drop in the MBE Mean Scaled Score of perhaps .5 to 1.0. The modest decrease in the LSAT profile of the Class of 2014 when compared with the Class of 2013, by itself, does not explain the historic drop of 2.8 reported in the MBE Mean Scaled Score between July 2013 and July 2014.
THINKING ABOUT GROUPS
The “group” that sat in July 2014 is comprised of two subgroups of takers – first-time takers and those who failed a bar exam and are retaking the bar exam. I am not sure the NCBE has any basis to suggest that those who failed a bar exam and are “retaking” the bar exam in 2014 were a less capable bunch than a comparable group that was “retaking” the bar exam in 2013 (or in some other year).
What about “first-time takers”? That group actually consists of two subgroups as well – those literally taking the exam for the first time and those who passed an exam in one jurisdiction and are taking the exam for the “first-time” in another jurisdiction. Again, I am not sure the NCBE has any basis to suggest that those who passed a bar exam and are taking a bar exam in another jurisdiction in 2014 were a less capable bunch than a comparable group that was taking a second bar exam in 2013.
So who’s left? Those who actually were taking a bar exam for the very first time in July 2014 – the graduates of the class of 2014. If we accept the premise that the “retakers” in 2014 were not demonstrably different than the “retakers” in 2013, then the group that was “less capable” in 2014 has to be the graduates of 2014, who the NCBE asserts are “less capable” than the graduates of 2013.
COMPARING LSAT PROFILES
The objective credentials of the class that entered law school in the fall of 2011 (class of 2014) are slightly less robust than those of the class that entered law school in the fall of 2010 (class of 2013). The question, however, is whether the drop in quality between the class of 2013 and the class of 2014 is large enough to have produced an historic drop in the MBE Mean Scaled Score of 2.8 points.
The answer to that is no.
The difference in profile between the class of 2014 and the class of 2013 does not reflect an “historic” drop in quality and would seem to explain only some of the drop in MBE Mean Scaled Score, not a 2.8 point drop in MBE Mean Scaled Score.
To understand this better, let’s look at how the trends in student quality have related to changes in the MBE Mean Scaled Score over the last decade.
Defining “student quality” can be a challenge. A year ago, I noted changes over time in three “groups” of matriculants – those with LSATs at or above 165, those with LSATs of 150-164, and those with LSATs below 150, noting that between 2010 and 2013, the number at or above 165 has declined significantly while the number below 150 has actually grown, resulting in a smaller percentage of the entering class with LSATs at or above 165 and a larger percentage of the entering class with LSATs below 150.
While the relatively simplistic calculations described above would provide some basis for anticipating declines in bar passage rates by 2016, they would not explain what is going on this year without more refinement.
In his blog posting earlier today, Derek Muller attempts to look at the strength of each class by calculating "projected MBE" scores drawing on an article from Susan Case and then comparing those to the actual MBE scores, showing some close relationship over time (until this year). I come to a similar conclusion using a different set of calculations of the "strength" of the graduating classes over the last several years based on the LSAT distribution profile of the matriculating classes three years earlier.
To develop this more refined analysis of the strength of the graduating classes over the last nine years, I used the LSAC’s National Decisions Profiles to identify the distribution of matriculants in ten five-point LSAT ranges – descending from 175-180 down to 130-134. To estimate the “strength” of the respective entering classes, I applied a prediction of bar passage rates by LSAT score to each five-point grouping and came up with a “weighted average” bar passage prediction for each class.
(In his article, Unpacking the Bar: Of Cut Scores, Competence and Crucibles, Professor Gary Rosin of the South Texas College of Law developed a statistical model for predicting bar passage rates for different LSAT scores. I used his bar passage prediction chart to assess the “relative strength” of each entering class from 2001 through 2013.
LSAT Range | Prediction of Success on the Bar Exam Based on Lowest LSAT in Range
175-180 | .98
170-174 | .97
165-169 | .95
160-164 | .91
155-159 | .85
150-154 | .76
145-149 | .65
140-144 | .50
135-139 | .36
130-134 | .25
Please note that for the purposes of classifying the relative strength of each class of matriculants, the precise accuracy of the bar passage predictions is less important than the fact of differential anticipated performance across groupings which allows for comparisons of relative strength over time.)
One problem with this approach is that the LSAC (and law schools) changed how they reported the LSAT profile of matriculants beginning with the entering class in the fall of 2010. Up until 2009, the LSAT profile data reflected the average LSAT score of those who took the LSAT more than once. Beginning with matriculants in fall 2010, the LSAT profile data reflects the highest LSAT score of those who took the LSAT more than once. This makes direct comparisons between fall 2009 (class of 2012) and years prior and fall 2010 (class of 2013) and years subsequent difficult without some type of “adjustment” of profile in 2010 and beyond.
Nonetheless, the year over year change in the 2013-2014 time frame can be compared with year over year changes in the 2005-2012 time frame.
Thus, having generated these “weighted average” bar passage projections for each entering class starting with the class that began legal education in the fall of 2002 (class of 2005), we can compare these with the MBE Mean Scaled Score for each July in which a class graduated, particularly looking at the relationship between the change in relative strength and the change in the corresponding MBE Mean Scaled Score. Those two lines are plotted below for the period from 2005-2012. (To approximate the MBE Mean Scaled Score for graphing purposes, the strength of each graduating class is calculated by multiplying the weighted average predicted bar passage percentage, which has ranged from .801 to .826, times 175.)
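To make the mechanics concrete, here is a minimal sketch of the weighted-average calculation and the ×175 scaling described above. The prediction table is taken from the text; the matriculant counts in the example are hypothetical, since the actual distributions come from the LSAC's National Decision Profiles.

```python
# Prof. Rosin's predicted bar passage rates by five-point LSAT range (table above)
PREDICTIONS = {
    "175-180": 0.98, "170-174": 0.97, "165-169": 0.95, "160-164": 0.91,
    "155-159": 0.85, "150-154": 0.76, "145-149": 0.65, "140-144": 0.50,
    "135-139": 0.36, "130-134": 0.25,
}

def weighted_average_prediction(matriculants):
    """Weighted-average bar passage prediction for an entering class.
    `matriculants` maps each five-point LSAT range to a count of matriculants."""
    total = sum(matriculants.values())
    return sum(PREDICTIONS[r] * n for r, n in matriculants.items()) / total

def class_strength(matriculants):
    """Scale the prediction to the MBE Mean Scaled Score range for graphing."""
    return weighted_average_prediction(matriculants) * 175

# Hypothetical distribution, for illustration only
example = {"165-169": 5000, "160-164": 10000, "155-159": 12000,
           "150-154": 9000, "145-149": 5000, "140-144": 2000}
print(round(weighted_average_prediction(example), 3))  # 0.817
print(round(class_strength(example), 1))               # 143.0
```

A class whose distribution shifts toward the lower LSAT bands produces a lower weighted average, which is how the relative "strength" of successive entering classes is compared over time.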
Comparison of Class Strength Based on Weighted Average Class Strength (Weighted Average Bar Passage Prediction x 175) with the MBE Mean Scaled Score for 2005-2012
What this graph highlights is that between 2005 and 2012, year to year changes in the MBE Mean Scaled Score largely “tracked” year to year changes in the “quality” of the graduating classes. But perhaps most significantly, the degree of change year over year in “quality” generally is reflected in the “degree” of change year over year in MBE Mean Scaled Scores. From 2008 to 2009, the drop in “quality” of 1.5 from 144.6 to 143.1 actually was reflected in a drop in MBE Mean Scaled Scores from 145.6 to 144.7, a drop of 0.9 points. Similarly, from 2009 to 2010, the drop in “quality” of 1.1 from 143.1 to 142 actually was reflected in a drop in the MBE Mean Scaled Scores from 144.7 to 143.6, a drop of 1.1 points. This two-year drop in quality of 2.6 points from 144.6 to 142 corresponded to a two-year drop in MBE Mean Scaled Scores of 2.0 points from 145.6 to 143.6.
How does this help us understand what happened in 2014 relative to 2013? Using the “Weighted Average Bar Passage Projection” methodology above, the decrease in quality of the class of 2014 relative to the class of 2013 reflects a change from 145.1 to 144.2 – a drop of 0.9 (less than the year-over-year changes in 2009 and 2010). Accordingly, one might anticipate a decline in MBE Mean Scaled Scores, but probably a decline slightly smaller than the declines experienced in 2009 and 2010 – declines of 0.9 and 1.1 points, respectively.
Does the decline in quality between the Class of 2013 and the Class of 2014 explain some of the decline in MBE Mean Scaled Scores? Certainly. This analysis suggests a decline comparable to or slightly less than the declines in 2009 and 2010 should have been expected.
But that is not what we have experienced. We have experienced an historic decline of 2.8 points. Yet the NCBE tells us that other indicators “all point to the fact that the group that sat in July 2014 is less able than the group that sat in July 2013.”
THE EXAMSOFT DEBACLE
What the NCBE fails to discuss, or even mention, is that there is one other “indicator” that was a distinctive aspect of the bar exam experience for the group that sat in July 2014 that the group that sat in July 2013 did not experience – the ExamSoft Debacle.
For many of those in the numerous jurisdictions that used ExamSoft in July 2014, the evening between the essay portion of the bar exam and the MBE portion was spent in needless anxiety and stress over being unable to upload the essay portion of the exam. This stress and anxiety were compounded by messaging suggesting that failure to upload in a timely manner would mean failing the bar exam (messaging that was corrected only late in the evening in some jurisdictions).
In these ExamSoft jurisdictions, I can only imagine that some number of those taking the MBE on the second day of the exam were doing so with much less sleep and much less focus than might have been the case if there had not been issues with uploading the essay portion of the exam the night before. If this resulted in “underperformance” on the MBE of just 1%-2% (perhaps missing two to four additional questions out of 200), this might have been enough to trigger a larger than expected decline in the MBE Mean Scaled Score.
ONE STATE’S EXPERIENCE BELIES THE NCBE STORY
It will be hard to assess the full reality of the July 2014 bar exam experience in historical context until 2015, when the NCBE releases its annual statistical analysis with state-by-state analyses of first-time bar passage rates. Comparisons across jurisdictions regarding the July 2014 bar exam are very difficult at present because there is no standardized format among states for reporting results – some states report overall bar passage rates, some disaggregate first-time bar passage rates, and some report school-specific bar passage rates. To make meaningful year-over-year comparisons focused on the experience of each year’s graduates, the focus should be on first-time bar passage (even though, as noted above, that measure also is somewhat over-inclusive).
Nonetheless, the experience of one state, Iowa, casts significant doubt on the NCBE “story.”
The historical first-time bar passage rates in Iowa from 2004 to 2013 ranged from a low of 86% in 2005 to a high of 93% in 2009 and again in 2013. In the nine-year period between 2005 and 2013, the year-to-year change in first-time bar passage rates never exceeded three percentage points and was plus or minus one or two points in eight of the nine years. In 2014, however, the bar passage rate fell to a new low of 84%, a decline of nine percentage points – more than four times the largest previous year-over-year decline in bar passage rates since 2004-2005.
Year | First-Time Bar Passage Rate | Change from Prior Year
2004 | 87% | --
2005 | 86% | -1
2006 | 88% | +2
2007 | 89% | +1
2008 | 90% | +1
2009 | 93% | +3
2010 | 91% | -2
2011 | 90% | -1
2012 | 92% | +2
2013 | 93% | +1
2014 | 84% | -9
The NCBE says that all indicators point to the fact that the group that sat in 2014 was “less able” than the group that sat in 2013. But here is the problem for the NCBE.
Iowa is one of the states that used ExamSoft and in which test-takers experienced problems uploading the exam. The two schools that account for the largest share of bar exam takers in Iowa are Drake and Iowa. In July 2013, those two schools had 181 first-time takers (out of 282 total takers), and 173 passed the Iowa bar exam (a 95.6% pass rate). In July 2014, those two schools had 158 first-time takers (out of 253 total), and 135 passed the Iowa bar exam (an 85.4% pass rate), a drop of 10.2 percentage points year over year.
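Both the Iowa year-over-year changes and the Drake/Iowa pass rates can be recomputed directly from the reported figures; a minimal sketch, using only numbers stated in the text:

```python
# Iowa first-time bar passage rates by year (percent), 2004-2014, from the table above
years = list(range(2004, 2015))
rates = [87, 86, 88, 89, 90, 93, 91, 90, 92, 93, 84]

# Year-over-year change in percentage points
changes = [b - a for a, b in zip(rates, rates[1:])]
print(dict(zip(years[1:], changes)))
# The -9 in 2014 dwarfs the largest prior decline of -2 (2010)

# Drake + Iowa first-time passers / takers, from the text
rate_2013 = 100 * 173 / 181
rate_2014 = 100 * 135 / 158
print(round(rate_2013, 1), round(rate_2014, 1))  # 95.6 85.4
```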
Unfortunately for the NCBE, there is no basis to claim that the Drake and Iowa graduates were “less able” in 2014 than in 2013 as there was no statistical difference in the LSAT profile of their entering classes in 2010 and in 2011 (the classes of 2013 and 2014, respectively). In both years, Iowa had a profile of 164/161/158. In both years, Drake had a profile of 158/156/153. This would seem to make it harder to argue that those in Iowa who sat in July 2014 were “less able” than those who sat in 2013, yet their performance was significantly poorer, contributing to the largest decline in bar passage rate in Iowa in over a decade. The only difference between 2013 and 2014 for graduates of Drake and Iowa taking the bar exam for the first time in Iowa is that the group that sat in July 2014 had to deal with the ExamSoft debacle while the group that sat in July 2013 did not.
TIME WILL TELL
This analysis does not “prove” that the ExamSoft debacle was partly responsible for the historic decline in the MBE Mean Scaled Score between 2013 and 2014. What I hope it does do is raise a serious question about the NCBE’s assertion that the “whole story” of the historic decline in the MBE Mean Scaled Score is captured by the assertion that the class of 2014 is simply “less able” than the class of 2013.
When the NCBE issues its annual report on 2014 sometime next year, we will be able to do a longitudinal analysis on a jurisdiction by jurisdiction basis to see whether jurisdictions which used ExamSoft had higher rates of anomalous results regarding year-over-year changes in bar passage rates for first-time takers. When the NCBE announces next fall the MBE Mean Scaled Score for July 2015, we will be able to assess whether the group that sits for the bar exam in July 2015 (which is even more demonstrably “less able” than the class of 2014 using the weighted average bar passage prediction outlined above), generates another historic decline or whether it “outperforms” its indicators by perhaps performing in a manner comparable to the class of 2014 (suggesting that something odd happened with the class of 2014).
It remains to be seen whether law school deans and others will have the patience to wait until 2015 to analyze all of the compiled data regarding bar passage in July 2014 across all jurisdictions. In the meantime, there is likely to be a significant disagreement over bar pass data and how it should be interpreted.
November 11, 2014 in Data on legal education, Data on the profession, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (4)
Monday, October 20, 2014
What Law Schools Can Learn from Dental Schools in the 1980s Regarding the Consequences of a Decline in Applicants
For four consecutive years we have seen a decline in the number of applicants to law school and a corresponding decline in the number of matriculating first-year students. Over the last year or two, some have suggested that as a result of this “market adjustment” some law schools would end up closing. Most recently, the former AALS President, Michael Olivas, in response to the financial challenges facing the Thomas Jefferson Law School, was quoted as stating that he expects several law schools to close.
To date, however, no law schools have closed (although the Western Michigan University Thomas M. Cooley Law School recently announced the closure of its Ann Arbor branch).
Have law schools found ways to cut costs and manage expenses in the face of declining revenues such that all will remain financially viable and remain in operation? Is it realistic to think that no law schools will close?
Although there may be a number of people in the legal academy who continue to believe that somehow legal education is “exceptional” – that market forces may impose financial challenges for law schools in the near term, but will not result in the closing of any law schools -- this strikes me as an unduly optimistic assessment of the situation.
To understand why, I think those in legal education can learn from the experience of those in dental education in the 1980s.
The Dental School Experience from 1975-1990
In the 1980s, dental school deans, along with provosts and presidents at their host universities, had to deal with the challenge of a significant decline in applicants to dental school.
At least partially in response to federal funding to support dental education, first-year enrollment at the country’s dental schools grew throughout the 1970s to a peak in 1979 of roughly 6,300 across roughly 60 dental schools. Even at that point, however, for a number of reasons -- improved dental health from fluoridation, reductions in federal funding, high tuition costs and debt loads -- the number of applicants had already started to decline from the mid-1970s peak of over 15,000.
By the mid-1980s, applicants had fallen to 6,300 and matriculants had fallen to 5,000. As of 1985, no dental schools had closed. But by the late 1980s and early 1990s there were fewer than 5,000 applicants and barely 4,000 first-year students – applicants had declined by more than two-thirds and first-year enrollment had declined by more than one-third from their earlier peaks. (Source – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author).)
How did dental schools and their associated universities respond to this changing market? Between 1986 and 1993, six private universities closed their dental schools: Oral Roberts University, Tulsa, Oklahoma (1986); Emory University, Atlanta, Georgia (1988); Georgetown University, Washington, D.C. (1990); Fairleigh Dickinson University, Rutherford, New Jersey (1990); Washington University, St. Louis, Missouri (1991); and Loyola University, Chicago, Illinois (1993). (Source: Dental Education at the Crossroads: Challenges and Change, Table 1.1 (Institute of Medicine 1995)). According to a New York Times article from October 29, 1987, “Georgetown, formerly the nation's largest private dental school, decided to close after a Price Waterhouse study found that the school would have a $3.6 million deficit by 1992.” (Source: Tamar Lewin, Plagued by Falling Enrollment, Dental Schools Close or Cut Back, New York Times, Oct. 29, 1987).
Some of the primary factors contributing to the closing of dental schools were described as follows:
Financial issues were repeatedly described as critical. Dental education was cited as an expensive enterprise that is or may become a drain on university resources. On average, current-year expenditures for the average dental school are about $1 million more than current revenues. … The declining size and quality of the applicant pool during the 1980s played a role in some closures by threatening the tuition base and prestige on which private schools rely. Faculty and alumni resistance to change may feed impatience among university administrators. In some institutions, the comparative isolation of dental schools within the university has provided them with few allies or at least informed colleagues and has left them ill-prepared to counter proposals for "downsizing." (Source: Dental Education at the Crossroads: Challenges and Change, at 202-203 (Institute of Medicine 1995)).
The Law School Experience from 2004-2014
In terms of applicants and enrollment over the last decade, the trends law schools have experienced look remarkably comparable to the experience of dental schools in the 1970s and 1980s. According to the LSAC Volume Summary, applicants to law schools peaked in 2004 with 100,600 applicants (and roughly 48,200 first-year students). By 2010, applicants had fallen to roughly 87,600, but first-year enrollment peaked at 52,500. Over the last four years, applicants have fallen steadily to roughly 54,700 for fall 2014, with a projected 37,000 first-years matriculating this fall, the smallest number since 1973-74, when there were 40 fewer law schools and over one thousand fewer law professors. (Source - ABA Statistics)(For the analysis supporting this projection of 37,000 first-years, see my blog post on The Legal Whiteboard from March 18, 2014.)
The two charts below compare the dental school experience from 1975 to 1990 with the law school experience in the last decade. One chart compares dental school applicants with law school applicants and one chart compares dental school first-years with law school first-years. (Note that for purposes of easy comparison, the law school numbers are presented as one-tenth of the actual numbers.)
(Sources – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author) and the LSAC’s Volume Summary (with my own estimates for 2014 based on the LSAC’s Current Volume Summary)).
The Law School Experience 2014-2019
Notably, these charts do not bode well for law schools. The law school experience tracks pretty closely the dental school experience over the first ten years reflected in the charts. For law schools, 2014 looks a lot like 1985 did for dental schools.
There might be any number of reasons why the law school experience over the next several years could differ from the dental school experience of the late 1980s and early 1990s, such that the downward trend in applicants and matriculants does not continue. The market forces associated with changes in the dental profession and dental education in the 1980s are not the same as those associated with changes in the legal profession and legal education in the 2010s, and the cost structures of dental education and legal education are not identical.
The problem for law schools, however, is that without an upward trend law schools will continue to face significant financial pressures for the next few years just as dental schools did in the late 1980s. There might be some encouraging news on the employment front over the next few years as the decreasing number of matriculants will mean a decreasing number of graduates in 2015, 2016 and 2017. Even without any meaningful growth in the employment market for law graduates, this decline in the number of graduates should mean significant increases in the percentage of graduates finding full-time, long-term employment in bar passage required jobs. Over time, this market signal may begin to gain traction among those considering law school such that the number of applicants to law school stops declining and perhaps starts increasing modestly.
But the near term remains discouraging. The number of people taking the June 2014 LSAT was down roughly 9% compared to June 2013 and the anticipation is that the number of test-takers in the most recent administration in late September was down as well compared to October 2013. Thus, applicants well might be down another 5-8% in the 2014-15 admissions cycle, resulting in perhaps as few as 51,000 applicants and perhaps as few as 35,000 matriculants in fall 2015. Even if things flatten out and begin to rebound modestly in the next few years, it would appear to be unlikely that the number of matriculants will climb back near or above 40,000 before the fall of 2017 or 2018.
Moreover, if current trends continue, the matriculants in 2015 also are going to have a significantly less robust LSAT/GPA profile than the matriculants in fall 2010. As I noted in a blog posting on March 2, 2014, between 2010 and 2013, the number of law schools with a median LSAT less than 150 grew from 9 to 32, and the number with a median LSAT of 145 or below grew from 1 to 9.
What Does this Mean for the Average Law School?
Assume you are the Dean at a hypothetical private law school that had 600 students (200 in each class) and a budget based on $18 million in JD tuition revenue in 2010-11. (This reflects a net tuition of $30,000 from each student – with nominal tuition set at $40,000 but with a discount rate of 25%.) Further assume that with this budget, your law school was providing $2.0 million annually to the university with which it is affiliated. As of 2010-11, your entering class profile reflected a median LSAT of 155 and a median GPA of 3.4.
Assume first-year enrollment declined to 170 in 2011, to 145 in 2012, and to 125 in 2013, a cumulative decrease in first-year enrollment since 2010 of 37%. As you tried to balance enrollment and profile, the law school managed to maintain its median LSAT and GPA in 2011, but saw its LSAT and GPA medians decline to 153 and 3.35 in 2012 and to 152 and 3.30 in 2013.
This means that for the 2013-14 academic year, the law school had only 440 students, a decrease of roughly 27% from its total enrollment of 600 in 2010, with a much less robust entering class profile in comparison with the entering class profile in 2010. (Note that this assumes no attrition and no transfers in or out, so if anything, it likely overstates total enrollment). (For comparison purposes, the National Jurist recently listed 25 law schools with enrollment declines of 28% or more between 2010-11 and 2013-14.)
Assume further that the law school had to increase its scholarships to attract even this smaller pool of students with less robust LSAT/GPA profiles, such that the net tuition from each first-year student beginning in fall 2012 has been only $25,500 (with nominal tuition now set at $42,500, but with a discount rate of 40%).
For the 2013-14 academic year, therefore, you were operating with a budget based on $12,411,000 in JD tuition revenue, a decrease in JD tuition revenue of over $5.5 million since the 2010-11 academic year, over 30%. (170 x $32,500 for third years ($5.525 million), 145 x $25,500 for second years ($3.698 million), and 125 x $25,500 for first-years ($3.188 million)).
What does this mean? This means you have been in budget-cutting mode for over three years. Of course, this has been a challenge for the law school, given that a significant percentage of its costs are for faculty and staff salaries and associated fringe benefits. Through the 2013-14 academic year, however, assume you cut costs by paring the library budget, eliminating summer research stipends for faculty, finding several other places to cut expenditures, cutting six staff positions and using the retirement or early retirement of ten of your 38 faculty members as a de facto “reduction in force,” resulting in net savings of $3.59 million. In addition, assume you have gotten the university to agree to waive any “draw” saving another $2 million (based on the “draw” in 2010-2011). Thus, albeit in a significantly leaner state, you managed to generate a “balanced” budget for the 2013-14 year while generating no revenue for your host university.
The problem is that the worst is yet to come, as the law school welcomes a class of first-year students much smaller than the class of third-years that graduated in May. With the continued decline in the number of applicants, the law school has lower first-year enrollment again for 2014-15: only 120 first-year students, with median LSAT and GPA declining again to 151 and 3.20. Projections for 2015-16 (based on the decline in June and October 2014 LSAT takers) suggest that the school should expect no more than 115 matriculants and may see a further decline in profile. That means the law school has only 390 students in 2014-15 and may have only 360 students in 2015-16 (an enrollment decline of 40% since 2010-11). Assuming net tuition for first-year students remains at $25,500 due to the competition on scholarships to attract students (and this may be a generous assumption), JD tuition revenue for 2014-15 and 2015-16 is estimated to be $9,945,000 and $9,180,000, respectively (a decline in revenue of nearly 50% from the 2010-11 academic year).
In reality, then, the “balanced” budget for the 2013-2014 academic year based on revenues of $12,411,000, now looks like a $2,500,000 budget shortfall in 2014-15 and a $3,200,000 budget shortfall for the 2015-16 academic year, absent significant additional budget cuts or new revenue streams (with most of the “low hanging fruit” in terms of budget cuts already “picked”).
While you may be able to make some extraordinary draws on unrestricted endowment reserves to cover some of the shortfall (assuming the law school has some endowment of its own), and may be creative in pursuing new sources of revenue (a certificate program or a Master of Laws), even if you come up with an extra $400,000 annually in extraordinary draws on endowment and an extra $400,000 annually in terms of non-JD revenue you still are looking at losses of at least $1,700,000 in 2014-15 and at least $2,400,000 in 2015-16 absent further budget cuts. Even with another round of early retirement offers to some tenured faculty and/or to staff (assuming there are still some that might qualify for early retirement), or the termination of untenured faculty and/or of staff, the budget shortfall well might remain in the $1,000,000 to $1,700,000 range for this year and next year (with similar projections for the ensuing years). This means the law school may need subsidies from the university with which it is affiliated, or may need to make even more draconian cuts than it has contemplated to date. (For indications that these estimates have some relation to reality, please see the recent stories about budget issues at Albany, Minnesota and UNLV.)
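The revenue arithmetic in this hypothetical can be sketched as a small model. Every figure below is an assumption from the scenario in the text, not data from any real school, and the pre-2011 cohorts are assumed to match the 2010-11 baseline of $30,000 net tuition per student.

```python
# Hypothetical cohorts from the scenario: entry year -> (first-year enrollment,
# net tuition per student). Not real school data.
cohorts = {
    2008: (200, 30_000),  # assumed: pre-2011 classes all at the 2010-11 baseline
    2009: (200, 30_000),
    2010: (200, 30_000),
    2011: (170, 32_500),
    2012: (145, 25_500),
    2013: (125, 25_500),
    2014: (120, 25_500),
    2015: (115, 25_500),
}

def jd_revenue(fall):
    """JD tuition revenue for the academic year starting in `fall`,
    assuming three enrolled cohorts and no attrition or transfers."""
    return sum(n * tuition
               for n, tuition in (cohorts[y] for y in range(fall - 2, fall + 1)))

# The post's $12,411,000 for 2013-14 sums components rounded to the nearest
# thousand; the exact total is $12,410,000.
for fall in (2010, 2013, 2014, 2015):
    print(f"{fall}-{(fall + 1) % 100}: ${jd_revenue(fall):,}")
```

Run as-is, this reproduces the trajectory the post describes: $18 million in 2010-11 falling to roughly $9.2 million by 2015-16, which is what turns a balanced budget into a multimillion-dollar shortfall.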
Difficult Conversations -- Difficult Decisions
This situation will make for some interesting conversations between you as the Dean of the law school and the Provost and President of the university. As noted above in the discussion of dental schools, the provost and president of a university with a law school likely will be asking: How “mission critical” is the law school to the university when the law school has transformed from a “cash cow” into a “money pit” and when reasonable projections suggest it may continue to be a money pit for the next few years? How "mission critical" is the law school when its entering class profile is significantly weaker than it was just a few years ago, particularly if that weaker profile begins to translate into lower bar passage rates and even less robust employment outcomes? How “mission critical” is the law school to the university if its faculty and alumni seem resistant to change and if the law school faculty and administration are somewhat disconnected from their colleagues in other schools and departments on campus?
Some universities are going to have difficult decisions to make (as may the Boards of Trustees of some of the independent law schools). As of 1985, no dental schools had closed, but by the late 1980s and early 1990s, roughly ten percent of the dental schools were closed in response to significant declines in the number and quality of applicants and the corresponding financial pressures. When faced with having to invest significantly to keep dental schools open, several universities decided that dental schools no longer were “mission critical” aspects of the university.
I do not believe law schools should view themselves as so exceptional that they will have more immunity to these market forces than dental schools did in the 1980s. I do not know whether ten percent of law schools will close, but just as some universities decided dental schools were no longer “mission critical” to the university, it is not only very possible, but perhaps even likely, that some universities now will decide that law schools that may require subsidies of $1 million or $2 million or more for a number of years are no longer “mission critical” to the university.
(I am grateful to Bernie Burk and Derek Muller for their helpful comments on earlier drafts of this blog posting.)
October 20, 2014 in Cross industry comparisons, Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (6)
Tuesday, October 7, 2014
Does Cooperative Placement Accelerate Law Student Professional Development?
The title of an earlier essay posed a threshold question for legal ed reform: "If We Make Legal Education More Experiential, Would it Really Matter?" (Legal Whiteboard, Feb 2014) (PDF). I answered "yes" but admitted it was only my best guess. Thus, to be more rigorous, I outlined the conditions necessary to prove the concept.
The essay below is a companion to the first essay. It is a case study of how one type and brand of experiential education -- cooperative placements at Northeastern Law -- appears to accelerate the professional development of its law students. The outcome criteria are the three apprenticeships of Educating Lawyers (2007) (aka The Carnegie Report): cognitive skills, practice skills, and professional identity.
The better outcomes flow from Northeastern's immersive, iterative, and integrative approach. First, students are immersed in full-time co-ops that last a standard 11 weeks. Second, students move through four iterations of co-ops interspersed with four quarters of upper-level classes. Third, this experiential approach is integrated into the Law School's value system -- i.e., the experiential component is perceived as central rather than marginal to the School's educational mission.
Northeastern's co-op model asks more of faculty and students, and thus may be hard to replicate. Yet there is evidence that such an approach does in fact accelerate professional development in ways that ought to please law school critics and reformers. The benefits may be well worth the costs.
[The text below was originally published as the Northeastern Law Outcomes Assessment Project (OAP) Research Bulletin No. 3]
Immersive, Iterative and Integrative:
Does Cooperative Placement Accelerate Law Student Professional Development?
A steep decline in the job prospects for entry-level lawyers has been followed by a sharp drop in law school applications. Media stories criticize traditional legal education for being too expensive while producing graduates unprepared for practice. Throughout the country, legal educators and administrators at law schools are trying to formulate an effective response.
A common thread running through many new law school initiatives is greater emphasis on experiential education. Fundamentally, experiential education is learning by doing, typically by assuming the role of the lawyer in an in-class simulation, law school clinic, externship or cooperative placement. As law schools seek to add hands-on opportunities to their curricular offerings, empirical evidence on experiential education’s impact on law student professional development becomes invaluable.
Northeastern University School of Law’s Outcomes Assessment Project (OAP) is an evidence-based approach to understanding experiential learning in the law school curriculum. A focal point of the OAP is Northeastern’s Cooperative Legal Education Program, an integral part of the school’s curriculum since the late 1960s. After completing a mostly traditional first year of law school, Northeastern students enter a quarter system in which 11-week cooperative placements alternate with 11-week upper-level courses. Through the four co-op placements during the 2L and 3L years, every Northeastern student gains the functional equivalent of nearly one year of full-time legal experience, typically across a diverse array of practice areas.
The Learning Theory of Cooperative Placement
Northeastern’s Cooperative Legal Education Program is based on a learning theory with three interconnected elements: immersion, iteration and integration.
- Immersion: Immersion in active legal work in a real-world setting enables students to feel the weight and responsibility of representing real-world clients and exercising professional judgment.
- Iteration: Iterative movement between the classroom and co-op placements provides students with concrete opportunities to connect theory with practice and understand the role of reflection and adjustment in order to improve one’s skill and judgment as a lawyer.
- Integration: Integrating experiential learning into the law school curriculum signals its high value to the law school mission — when 50 percent of the upper-level activities involve learning by doing, practice skills are on par with doctrinal learning.
The purpose of the OAP Research Bulletin No. 3 is to use preliminary project data to explore whether the immersion-iteration-integration approach to legal education has the effect of accelerating the professional development of law students.
Three Effects of Co-op Placements
The findings in Research Bulletin No. 3 are based on surveys and focus groups conducted with 2L and 3L Northeastern law students and a small number of Northeastern law graduates, who served as facilitators. In our conversations with these students and alumni, we identified three ways that co-op is impacting the professional development of students.
October 7, 2014 in Data on legal education, Important research, Scholarship on legal education
Monday, July 28, 2014
Conditional Scholarship Retention Update for the 2012-2013 Academic Year
In comparing the conditional scholarship universe between the 2011-12 academic year and the 2012-13 academic year (with a brief look at 2013-14), there are a handful of things worth noting.
First, as shown in Table 1, the number of law schools with conditional scholarships declined between 2011-12 and 2012-13 from 144 law schools to 136 law schools, and declined again for the 2013-14 academic year to 128 law schools. The number of law schools that do not have conditional scholarships grew from 49 in 2011-12 to 58 in 2012-13 to 66 in 2013-14. In addition, the number of schools with just one-year scholarships declined from five in 2011-12 to four in 2012-13, where it remained for 2013-14.
Table 1: Changes in Number of Law Schools with Conditional Scholarship Programs
Category | 2011-12 | 2012-13 | 2013-14 (indications)
Law Schools with Conditional Scholarship Programs | 144 | 136 | 128
Law Schools with One-Year Scholarships | 5 | 4 | 4
Law Schools with Scholarships that are not Conditional Scholarships | 49 | 58 | 66
Second, as shown in Table 2, the number of students receiving conditional scholarships in 2012-13 declined slightly from 2011-12, from 12786 to 12470, but the percentage of first-years with conditional scholarships actually increased from 27.3% to 29.2% (given the smaller number of first-years in 2012-13 compared to 2011-12). That said, the number of students whose scholarships were reduced or eliminated declined from 4359 to 3712, meaning that the percentage of first-years whose scholarships were reduced or eliminated dropped from 9.3% to 8.7%.
Table 2: Overall Comparisons Between 2011-12 and 2012-13
Category | 2011-12 | 2012-13
First-years* | 46778 | 42769
First-years with Conditional Scholarships** | 12786 (27.3% of first-years) | 12470 (29.2% of first-years)
First-years whose conditional scholarships were reduced or eliminated** | 4359 (9.3% of first-years) | 3712 (8.7% of first-years)
Average Renewal Rate (across law schools) | 69% | 71%
Overall Renewal Rate Among Scholarship Recipients | 65.9% | 70.2%
*Drawn from first-year enrollment at the 198 law schools included in this analysis (excluding the law schools in Puerto Rico and treating Widener as one law school for these purposes) based on information published in the Standard 509 reports.
** Based on information published in the mandated Conditional Scholarship Retention charts by each law school with a conditional scholarship program.
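The percentages and renewal rates in Table 2 follow directly from the raw counts. As a quick check, a short Python sketch (using only the figures reported above) reproduces them:

```python
# Raw counts from Table 2 (2011-12 and 2012-13 academic years)
cohorts = {
    "2011-12": {"first_years": 46778, "conditional": 12786, "reduced": 4359},
    "2012-13": {"first_years": 42769, "conditional": 12470, "reduced": 3712},
}

for year, c in cohorts.items():
    pct_conditional = 100 * c["conditional"] / c["first_years"]
    pct_reduced = 100 * c["reduced"] / c["first_years"]
    # Overall renewal rate = share of recipients whose scholarship
    # was NOT reduced or eliminated
    renewal = 100 * (c["conditional"] - c["reduced"]) / c["conditional"]
    print(f"{year}: {pct_conditional:.1f}% on conditional scholarships, "
          f"{pct_reduced:.1f}% reduced/eliminated, {renewal:.1f}% overall renewal")
# 2011-12: 27.3% on conditional scholarships, 9.3% reduced/eliminated, 65.9% overall renewal
# 2012-13: 29.2% on conditional scholarships, 8.7% reduced/eliminated, 70.2% overall renewal
```

Note that the overall renewal rate (computed across all scholarship recipients) differs from the average renewal rate across law schools, which weights each school equally regardless of how many conditional scholarships it awards.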
Third, the distribution of conditional scholarship programs across tiers of law schools is even more pronounced in 2012-13 than it was in 2011-12. Using the USNews rankings from March 2014, only 16 law schools ranked in the top 50 had conditional scholarship programs in 2012-13 and eight of those 16 had a renewal rate of 97% or higher. Three of these law schools also eliminated their conditional scholarship programs as of the fall 2013 entering class. (Moreover, only six in the top 25 had conditional scholarship programs, five of whom had a renewal rate of 97% or higher.)
As you move further down the rankings, conditional scholarship programs become more common and manifest lower scholarship retention rates on average.
Of the 53 law schools ranked between 51 and 100 (with three tied at 100), 37 law schools (nearly 70%) had conditional scholarship programs, of which two eliminated their conditional scholarship programs as of fall 2013. Notably, of the 37 law schools with conditional scholarship programs, eight had a renewal rate of 91% or better (nearly 22%), while seven had a renewal rate of 65% or less (nearly 19%), with the other 22 (nearly 60%) having renewal rates between 67% and 88%.
For law schools ranked between 104 and 146 (44 law schools in total), 35 law schools (nearly 80%) had conditional scholarship programs, of which three eliminated their conditional scholarship programs as of fall 2013. Notably, of the 35 law schools with conditional scholarship programs, six of the 35 had a renewal rate of 93% or better (roughly 17%) while 16 had a renewal rate of 65% or less (nearly 46%) (with the other 13 (roughly 37%) with renewal rates between 67% and 88%).
Finally, among the unranked schools, 47 of 51 had conditional scholarship programs – over 92% – only five of which had a renewal rate of 91% or better (nearly 11%), while 23 had a renewal rate of 65% or less (nearly 49%) (with the other 19 (roughly 40%) with renewal rates between 66% and 88%).
Tables 3 and 4 present comparative data across law schools in different USNews rankings categories. Table 3 describes the number of law schools with conditional scholarship programs and the distribution of scholarship retention rates among law schools. Table 4 describes the total number of students within each USNews rankings category along with the number of students on conditional scholarships and the number of students who had their conditional scholarship reduced or eliminated.
Table 3: Scholarship Retention Rates by USNews Ranking Categories
Category | Top 50 | 51-100 (n=53) | 104-146 (n=44) | Unranked (n=51)
Schools with Conditional Scholarship Programs | 16 | 37 | 35 | 47
Retention Rates of 90% or More | 8 | 8 | 6 | 5
Retention Rates of 66%-88% | 4 | 22 | 13 | 19
Retention Rates of 65% or Less | 4 | 7 | 16 | 23
Table 4: Number and Percentage of First-Year Students in 2012 by USNews Rankings Categories Having Conditional Scholarships and Having Conditional Scholarships Reduced or Eliminated
Category | Top 50 Law Schools (n=50) | Law Schools Ranked 51-100 (n=53) | Law Schools Ranked 104-146 (n=44) | Law Schools Ranked Alphabetically (n=51)
Number (%) of Law Schools with Conditional Scholarship Programs | 16 (32%) | 37 (70%) | 35 (79.5%) | 47 (92%)
Total First-Years at These Law Schools | 11,862 | 10,937 | 7,611 | 12,180
Number (%) of First-Years with Conditional Scholarships | 1,587 (13.4%) | 3,192 (29.2%) | 3,247 (42.7%) | 4,444 (36.5%)
Number (%) of Conditional Scholarship Recipients Whose Scholarships were Reduced or Eliminated | 154 (9.7% of conditional scholarship recipients and 1.3% of first-years) | 734 (23% of conditional scholarship recipients and 6.7% of first-years) | 1,124 (34.6% of conditional scholarship recipients and 14.8% of first-years) | 1,700 (38.3% of conditional scholarship recipients and 14% of first-years)
Overall, as shown in Table 5, the distribution of retention rates across law schools was as follows for the 2012-13 academic year: 18 law schools had retention rates less than 50%, 20 law schools had retention rates between 50% and 59.99%, 25 law schools had retention rates between 60% and 69.99%, 21 law schools had retention rates between 70% and 79.99%, 25 law schools had retention rates between 80% and 89.99%, and 27 law schools had retention rates of 90% or better.
Table 5 – Number of Law Schools with Conditional Scholarship Renewal Rates in Different Deciles
Renewal Category | Number of Schools
90% or More | 27 (16 of which were ranked in top 100)
80%-89.9% | 25 (12 of which were ranked in top 100)
70%-79.9% | 21 (10 of which were ranked in top 100)
60%-69.9% | 25 (8 of which were ranked in top 100)
50%-59.9% | 20 (5 of which were ranked in top 100)
Less than 50% | 18 (2 of which were ranked in top 100)
Notably, of the 52 law schools ranked in the top 100 with conditional scholarship programs, only two (four percent) had retention rates that were less than 50%, while 16 (nearly 31%) had retention rates of 90% or better. By contrast, of the 82 (of 95) law schools ranked 104 or lower with conditional scholarship programs, 16 (nearly 20%) had retention rates of less than 50%, while only 11 (roughly 13%) had retention rates of 90% or better.
In sum then, with several schools eliminating their conditional scholarship programs as of fall 2013, less than 50% of the law schools ranked in the top 100 (47 of 103 – nearly 46%) still had conditional scholarship programs, and of those, more than 27% (13 of 47) had retention rates for the 2012-13 academic year of 90% or better while less than 22% (10 of 47) had retention rates of 65% or less.
By contrast, as of fall 2013, more than 80% of the schools ranked below 100 (79 of 95 – roughly 83%) still had conditional scholarship programs, and of those, less than 12% (9 of 79) had retention rates for the 2012-13 academic year of 90% or better and nearly half (39 of 79 – roughly 49%) had retention rates of 65% or less.
July 28, 2014 in Data on legal education, Scholarship on legal education
Sunday, March 30, 2014
Review of The Lawyer Bubble and Tomorrow's Lawyers
Readers might enjoy my forthcoming essay, Letting Go of Old Ideas, 112 Mich L Rev _ (2014), which reviews two important new books on the legal profession, Steven Harper's The Lawyer Bubble and Richard Susskind's Tomorrow's Lawyers. If you want to know why the legal profession circa 2014 is such a rich topic for study, here is a useful clue: Harper and Susskind both critically examine this topic yet come to dramatically different conclusions that neither overlap nor conflict with one another. The complexities run that deep.
Thanks to his prolific commentary in the legal press, Harper's critique is familiar to many readers. He is angry with the elite legal establishment -- large law firms and the legal professoriate -- for succumbing to "a culture of short-termism" that focuses obsessively on the AmLaw and US News league tables. As someone in the target group, I confess that I don't remember making a conscious decision to sell out. Yet, here is the problem. When all the facts in the public domain are arrayed by a skilled trial lawyer, the question can be asked, "why didn't you stand up to this nonsense?" This is a classic example of diffusion of responsibility. When we are all equally responsible for upholding good behavior, no one is responsible. Collective denial sets in, and the profession gets a black eye.
Yet, to my mind, there is an avenue for at least partial redemption -- reading Richard Susskind's slender, 165-page book. In my Counterpoint essay, I lay out the mounting evidence that the legal industry is in the early stages of a sea change. The best theoretical treatment of this sea change is Susskind's Tomorrow's Lawyers. Yet, I am amazed at how many lawyers and law professors know essentially nothing about Susskind's work. Tomorrow's Lawyers was written for law students. It is a short, accessible book. After reading the first two paragraphs, I doubt anyone with a long-term time horizon in the legal industry will put it down without finishing it:
This book is a short introduction to the future for young and aspiring lawyers.
Tomorrow’s legal world, as predicted and described here, bears little resemblance to that of the past. Legal institutions and lawyers are at a crossroads, I claim, and are poised to change more radically over the next two decades than they have over the last two centuries. If you are a young lawyer, this revolution will happen on your watch. (p. xiii).
If you have not read Tomorrow's Lawyers, you may be setting yourself up for a Kodak moment.
March 30, 2014 in Blog posts worth reading, Current events, Important research, New and Noteworthy, Scholarship on legal education, Scholarship on the legal profession, Structural change
Monday, March 17, 2014
A Counterpoint to "The most robust legal market that ever existed in this country"
There is a line in Professor Reich-Graefe's recent essay, Keep Calm and Carry On, 27 Geo. J. Legal Ethics 55 (2014), that is attracting a lot of interest among lawyers, law students, and legal academics:
[R]ecent law school graduates and current and future law students are standing at the threshold of the most robust legal market that ever existed in this country—a legal market which will grow, exist for, and coincide with, their entire professional career.
This hopeful prediction is based on various trendlines, such as impending lawyer retirements, a massive intergenerational transfer of wealth that will take place over the coming decades, continued population growth, and the growing complexity of law and legal regulation.
Although I am bullish on future growth and dynamism in the legal industry, and I don't dispute the accuracy or relevance of any of the trendlines cited by Reich-Graefe, I think his primary prescriptive advice -- in essence, our problems will be cured with the passage of time -- is naive and potentially dangerous to those who follow it.
The Artisan Lawyer Cannot Keep Up
The primary defect in Reich-Graefe's analysis is that it is a one-sided argument that stacks up all impending positive trendlines without taking into account the substantial evidence that the artisan model of lawyering -- one-to-one consultative legal services that are tailored to the needs of individual clients -- is breaking down as a viable service delivery model.
Lawyers serve two principal constituencies--individuals and organizations. This is the Heinz-Laumann "Two-Hemisphere" theory that emerged from the Chicago Lawyers I and II studies. See Heinz et al., Urban Lawyers (2005). The breakdown in the artisan model can be observed in both hemispheres.
- People. Public defenders are understaffed, legal aid is overwhelmed, and courts are glutted with pro se litigants. Remarkably, at the same time, record numbers of law school graduates are either unemployed or underemployed. Why? Because most poor and middle-class Americans cannot afford to buy several hours of a lawyer's time to solve their legal problems.
- Organizations. The most affluent organizations, multinational corporations, are also balking at the price of legal services. As a result, foreign labor, technology, process, or some combination thereof has become a replacement for relatively expensive and unskilled junior lawyers.
The primary driver of this structural shift is the relentless growth in legal complexity. This increase in complexity arises from many sources, including globalization, technology, digitally stored information, and the sheer size and scope of multinational companies.
But here is a crucial point: the complexity itself is not new, only its relative magnitude. A century ago, as the modern industrial and administrative state was beginning to take shape, lawyers responded by organizing themselves into law firms. The advent of law firms enabled lawyers to specialize and thus more cost-effectively tackle the more complex legal problems. Further, the diffusion of the partner-associate training model (sometimes referred to as the Cravath system) enabled firms to create more specialized human capital, which put them in an ideal position to benefit from the massive surge in demand for legal services that occurred throughout the 20th century. See Henderson, Three Generations of Lawyers: Generalists, Specialists, Project Managers, 70 Maryland L Rev 373 (2011).
The legal industry is at the point where it is no longer cost effective to deal with this growing complexity with ever larger armies of artisan-trained lawyers. The key phrase here is cost effective. Law firms are ready and willing to do the work. But increasingly, clients are looking for credible substitutes on both the cost and quality fronts. Think car versus carriage, furnace versus chimney sweep, municipal water system versus a well. A similar paradigm shift is now gaining momentum in law.
The New Legal Economy
I have generated the graph below as a way to show the relationship between economic growth, which is the engine of the U.S. and world economies, and the legal complexity that accompanies it.
This chart can be broken down into three phases.
1. Rise of the law firm. From the early twentieth century to the early 1980s, the increasing complexity of law could be capably handled through additional law firm growth and specialization. Hire more junior lawyers, promote the best ones to partner, lease more office space, repeat. The complexity line has a clear bend in it. But for most lawyers, the change was so gradual that it felt like a simple linear progression. Hence, there was little urgency about the need for new methods of production.
2. Higher law firm profits. Over the last few decades, the complexity of law outpaced overall economic growth. However, because the change was gradual, law firms, particularly those with brand names, enjoyed enough market power to perennially increase billing rates without significantly improving service offerings. Corporate clients paid because the economic benefits of the legal work outweighed the higher costs. Lower- and middle-class individuals, in contrast, bought fewer legal services because they could not afford them. But as a profession, we barely noticed, primarily because the corporate market was booming. See Henderson, Letting Go of Old Ideas, 112 Mich L Rev _ (2014).
3. Search for substitutes. Law firms are feeling discomfort these days because the old formula -- hire, promote, lease more space, increase rates, repeat -- is no longer working. This is because clients are increasingly open to alternative methods of solving legal problems, and the higher profits of the last few decades have attracted new entrants. These alternatives are some combination of better, faster, and cheaper. But what they all share is a greater reliance on technology, process, and data -- modes of problem-solving that are not within the training or tradition of lawyers or legal educators. So the way forward is profoundly interdisciplinary, requiring collaboration with information technologists, systems engineers, project managers, data analysts, and experts in marketing and finance.
Why is this framework potentially difficult for many lawyers, law firms, and legal educators to accept? Probably because it requires us to cope with uncertainties related to income and status. This reluctance to accept an unpleasant message creates an appetite for analyses that say "keep calm and carry on." This is arguably good advice to the British citizenry headed into war (the origin of the saying) but bad advice to members of a legal guild who need to adapt to changing economic conditions.
There is a tremendous silver lining in this analysis. Law is a profoundly critical component of the globalized, interconnected, and highly regulated world we are entering. Lawyers, law firms, and legal educators who adapt to these changing conditions are going to be in high demand and will likely prosper economically. Further, at an institutional level, there is also the potential for new hierarchies to emerge that will rival and eventually supplant the old guard.
Examples
One of the virtues of lawyers is that we demand examples before we believe something to be true. This skepticism has benefited many a client. A good example of the emerging legal economy is the Available Positions webpage for kCura, which is a software company that focuses exclusively on the legal industry.
The current legal job market is terrible, right? Perhaps for entry-level artisan-trained lawyers. But at kCura, business is booming. Founded in 2001, the company now employs more than 370 workers and has openings for over 40 full-time professional positions, the majority of which are in Chicago at the company's LaSalle Street headquarters. Very few of these jobs require a law degree -- yet the output of the company enables lawyers to do their work faster and more accurately.
What are the jobs?
- API Technical Writer [API = Application Programming Interface]
- Big Data Architect - Software Engineering
- Business Analyst
- Enterprise Account Manager
- Group Product Manager
- Litigation Support Advice Analyst
- Manager - Software Engineering
- Marketing Associate
- Marketing Specialist -- Communications
- Marketing Specialist -- Corporate Communications and Social Media
- Product Manager -- Software and Applications Development
- QA Software Engineer -- Performance [QA = Quality Assurance]
- Scrum Team Coordinator [Scrum is a team-based software development methodology]
- Senior SalesForce Administrator
- Software Engineer (one in Chicago, another in Portland)
- Software Engineer (Front-End Developer) [Front-End = what the client sees]
- Software Engineer in Test [Test = finds and fixes software bugs]
- Technical Architect
- Technical Architect - Security
- VP of Product Development and Engineering
kCura operates exclusively within the legal industry, yet it has all the hallmarks of a great technology company. In the last few years it has racked up numerous awards based on the quality of its products, its stellar growth rate, and the workplace quality of life enjoyed by its employees.
And that is just what is happening at kCura. There are many other companies positioning themselves to take advantage of the growth opportunities in legal, though none of them bears any resemblance to traditional law firms or legal employers.
In early February, I attended a meeting in New York City of LexRedux, which is comprised of entrepreneurs working in the legal start-up space. In a 2008 essay entitled "Legal Barriers to Innovation," Professor Gillian Hadfield queried, "Where are the 'garage guys' in law?" Well, we now know they exist. At LexRedux, roughly 100 people working in the legal tech start-up space were jammed into a large open room in SoHo as a small group of angel investors and venture capitalists fielded questions on a wide range of topics related to operations, sales, and venture funding.
According to AngelList, there are as of this writing 434 companies identified as legal start-ups that have received outside capital. According to LexRedux founder Josh Kubicki, the legal sector took in $458M in start-up funding in 2013, up from essentially zero in 2008. See Kubicki, 2013 was a Big Year for Legal Startups; 2014 Could Be Bigger, Tech Cocktail, Feb 14, 2014.
The legal tech sector is starting to take shape. Why? Because the imperfections and inefficiencies inherent in the artisan model create a tremendous economic opportunity for new entrants. For a long period of time, many commentators believed that this type of entrepreneurial ferment would be impossible so long as Rule 5.4 was in place. But in recent years, it has become crystal clear that, when it comes to organizational clients where the decisionmaker for the buyer is a licensed lawyer (likely accounting for over half of the U.S. legal economy), everything up until the courthouse door or the client counseling moment can be disaggregated into a legal input or legal product that can be provided by entities owned and controlled by nonlawyers. See Henderson, Is Axiom the Bellwether of Legal Disruption in the Legal Industry? Legal Whiteboard, Nov 13, 2013.
The Legal Ecosystem of the Future
In his most recent book, Tomorrow's Lawyers, Richard Susskind describes a dynamic legal economy that bears little resemblance to the legal economy of the past 200 years. In years past, it was easier to be skeptical of Susskind because his predictions seemed so, well, futuristic and abstract. But anyone paying close attention can see evidence of a new legal ecosystem beginning to take shape that very much fits the Susskind model.
Susskind's core framework is the movement of legal work along a five-part continuum, from bespoke to standardized to systematized to productized to commoditized. Lawyers are most comfortable in the bespoke realm because it reflects our training and makes us indispensable to a resolution. Yet, the basic forces of capitalism pull the legal industry toward the commoditized end of the spectrum because the bespoke method of production is incapable of keeping up with the needs of a complex, interconnected, and highly regulated global economy.
According to Susskind, the sweet spot on the continuum is between systematized and productized, as this enables the legal solution provider to "make money while you sleep." The cost of remaining in this position (that is, to avoid commoditization) is continuous innovation. Suffice it to say, lawyers are unlikely to make the cut if they choose to hunker down in the artisan guild and eschew collaboration with other disciplines.
Below is a chart I have generated that attempts to summarize and describe the new legal ecosystem that is now taking shape. The y-axis is the Heinz-Laumann two-hemisphere framework. The x-axis is Susskind's five-part change continuum.
Those of us who are trained as lawyers and have worked in law firms will have mental frames of reference that are on the left side of the green zone. We tend to see things from the perspective of the artisan lawyer. That is our training and socialization, and many of us have prospered as members of the artisan guild.
Conversely, at the commoditized end of the continuum, businesses organized and financed by nonlawyers have entered the legal industry in order to tap into a portion of the market that can no longer be cost-effectively serviced by licensed U.S. lawyers. Yet, like most businesses, they are seeking ways to climb the value chain and grow into higher margin work. For example, United Lex is one of the leading legal process outsourcers (LPOs). Although United Lex maintains a substantial workforce in India, they are investing heavily in process, data analytics, and U.S. onshore facilities. Why? Because they want to differentiate the company based on quality and overall value-add to clients, thus staving off competition from law firms or other LPOs.
In the green zone are several new clusters of companies:
- LeanLaw. This sector is comprised of BigLaw firms that are transforming themselves through reliance on process and technology. Seyfarth Shaw has become the standard-bearer in this market niche, see What does a JD-Advantaged Job Look Like? A Job Posting for a "Legal Solutions Architect", Legal Whiteboard, Oct 15, 2013, though several other law firms have been moving under the radar to build similar capabilities.
- NewLaw. These are non-law firm legal service organizations that provide high-end services to highly sophisticated corporations. They also rely heavily on process, technology, and data. Their offerings are sometimes called "managed services." Novus Law, Axiom, Elevate, and Radiant Law are some of the leading companies in this space.
- TechLaw. These companies would not be confused with law firms. They are primarily tool makers. Their tools facilitate better, faster, or cheaper legal output. kCura, mentioned above, works primarily in the e-discovery space. Lex Machina provides analytic tools that inform the strategy and valuation of IP litigation cases. KM Standards, Neota Logic, and Exemplify provide tools and platforms that facilitate transactional practice. In the future, these companies may open the door to the standardization of a wide array of commercial transactions. And standardization drives down transaction costs and increases legal certainty -- all good from the client's perspective.
- PeopleLaw. These companies are using innovative business models to tap into the latent people hemisphere. Modria is a venture capital-financed online dispute resolution company with DNA that traces back to PayPal and the Harvard Negotiations Workshop. See Would You Bet on the Future of Online Dispute Resolution (ODR)? Legal Whiteboard, Oct 20, 2013. LegalForce is already an online tour de force in trademarks -- a service virtually every small business needs. The company is attempting to translate its brand loyalty in trademarks into a new consumer-friendly storefront experience. Its first store is in the heart of University Avenue in Palo Alto. LegalForce wants to be the virtual and physical portal that start-up entrepreneurs turn to when looking for legal advice.
Conclusion
When I write about the changes occurring in the legal marketplace, I worry that the substance and methodology of U.S. legal education provide an excellent education for a legal world that is gradually fading away, and very little preparation for the highly interdisciplinary legal world that is coming into being.
Legal educators are fiduciaries to our students and institutions. It is our job to worry about them and for them and act accordingly. Surely, the minimum acceptable response to the facts at hand is unease and a willingness to engage in deliberation and planning. Although I agree we need to stay calm, I disagree that we need to carry on. The great law schools of the 21st century will be those that adapt and change to keep pace with the legal needs of the citizenry and broader society. And that task has barely begun.
March 17, 2014 in Blog posts worth reading, Current events, Data on legal education, Data on the profession, Innovations in law, Innovations in legal education, New and Noteworthy, Scholarship on legal education, Scholarship on the legal profession, Structural change
Sunday, March 2, 2014
THOUGHTS ON FALL 2013 ENROLLMENT AND PROFILE DATA AMONG LAW SCHOOLS
DECLINING ENROLLMENT – Between fall 2012 and fall 2013, the 199 law schools in the 48 contiguous states and Hawaii (excluding the Puerto Rican schools) accredited by the ABA’s Section of Legal Education and Admissions to the Bar experienced the following first-year enrollment changes:
25 schools had a decline in first-year enrollment of 25% or more,
34 schools had a decline in first-year enrollment of 15%-24.99%,
44 schools had a decline in first-year enrollment of 5% to 14.99%,
62 schools had “flat” first-year enrollment of -4.99% to 4.99%,
19 schools had an increase in first-year enrollment of 5% to 14.99%, and
15 schools had an increase in first-year enrollment of 15% or more.
Overall, more than half (103) had a decrease in first-year enrollment of at least 5%, while roughly 17% (34) had an increase in first-year enrollment of at least 5%.
Across these 199 schools, first-year enrollment declined from 42,590 to 39,109, a decrease of 8.2%. The average decline in first-year enrollment across U.S. News “tiers” of law schools was 2.6% among top 50 schools, 8.2% among schools ranked 51-99, 7.7% among schools ranked 100-144 and 7.9% among schools ranked alphabetically.
Between fall 2010 and fall 2013, the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (excluding Belmont, LaVerne, California-Irvine, and Massachusetts-Dartmouth) experienced the following first-year enrollment changes:
28 schools had a decline in first-year enrollment of 40% or more,
29 schools had a decline in first-year enrollment of 30% to 39.99%,
43 schools had a decline in first-year enrollment of 20% to 29.99%,
43 schools had a decline in first-year enrollment of 10% to 19.99%,
36 schools had a decline in first-year enrollment of 0% to 9.99%,
10 schools had an increase in first-year enrollment of 0.01% to 9.99%, and
6 schools had an increase in first-year enrollment of 10% or more.
Overall, more than half (100) had a decrease in first-year enrollment of at least 20%, while only roughly 8% (16) had any increase in first-year enrollment.
Across these 195 schools, first-year enrollment declined from 50,408 to 38,773, a drop of 23.1%. The average decline in first-year enrollment across U.S. News “tiers” of law schools was 14.7% among top 50 schools, 22.5% among schools ranked 51-99, 22.8% among schools ranked 100-144, and 26.8% among schools ranked alphabetically.
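The percent-change arithmetic and bucketing behind these tallies is straightforward. Here is a minimal Python sketch using the two overall figures quoted above; the per-school data behind the lists would come from the ABA Standard 509 disclosures, which are not reproduced here, so the `bucket` function is purely illustrative:

```python
def pct_change(old, new):
    """Percent change from old to new; negative values are declines."""
    return (new - old) / old * 100

def bucket(change):
    """Place a school's first-year enrollment change into the 2012-2013 bands."""
    if change <= -25:
        return "decline of 25% or more"
    if change <= -15:
        return "decline of 15% to 24.99%"
    if change <= -5:
        return "decline of 5% to 14.99%"
    if change < 5:
        return "flat (-4.99% to 4.99%)"
    if change < 15:
        return "increase of 5% to 14.99%"
    return "increase of 15% or more"

# Overall figures quoted in the post:
print(f"{pct_change(42_590, 39_109):.1f}%")  # 2012-2013: -8.2%
print(f"{pct_change(50_408, 38_773):.1f}%")  # 2010-2013: -23.1%
```

Applied to each school's own numbers, `bucket` reproduces the band counts in the lists above.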
DECLINING PROFILES – Across the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (thus excluding Belmont, La Verne, California-Irvine, and Massachusetts-Dartmouth), the entering first-year class average LSAT profile (75th percentile/median/25th percentile) fell one point at all three measures between 2012 and 2013, from 159.6/157/153.5 to 158.6/156/152.5. Between 2010 and 2013, the profile fell roughly two points at the 75th percentile and median and nearly three points at the 25th percentile, from 160.5/158.1/155.2 to 158.6/156/152.5.
The average decline in median LSAT scores between 2012 and 2013 across U.S. News “tiers” of law schools was .98 among top 50 schools, 1.18 among schools ranked 51-99, .72 among schools ranked 100-144, and 1.13 among schools ranked alphabetically.
Notably, 133 law schools saw a decline in their median LSAT between 2012 and 2013, with 80 down one point, 38 down two points, 12 down three points, one down four points, one down five points, and one down six points, while 54 law schools were flat and 7 saw an increase in their median LSAT.
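A tally like this is easy to reproduce once per-school medians are in hand. A minimal sketch with made-up school names and medians (the real per-school figures come from the ABA Standard 509 disclosures):

```python
from collections import Counter

# Hypothetical 2012 and 2013 median LSATs for three schools; the
# actual per-school figures come from the ABA Standard 509 disclosures.
medians_2012 = {"School A": 160, "School B": 155, "School C": 152}
medians_2013 = {"School A": 158, "School B": 155, "School C": 151}

# Tally point changes in each school's median (negative = decline).
changes = Counter(medians_2013[s] - medians_2012[s] for s in medians_2012)

declined = sum(n for delta, n in changes.items() if delta < 0)
flat = changes.get(0, 0)
increased = sum(n for delta, n in changes.items() if delta > 0)
print(declined, flat, increased)  # 2 1 0
```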
In terms of schools experiencing “larger” declines in median LSAT scores between 2012 and 2013, five schools in the top 50 saw a three-point decline in their median LSAT, five schools ranked 51-99 saw at least a three-point decline (of which one was down four points), three schools ranked 100-144 saw a three-point decline, and two schools ranked alphabetically saw large declines – one of five points and one of six points.
The average decline in median LSAT scores between 2010 and 2013 across U.S. News “tiers” of law schools was 1.54 among top 50 schools, 2.27 among schools ranked 51-99, 2.11 among schools ranked 100-144, and 2.79 among schools ranked alphabetically. Unpacking the top 50 a little further, however, the top 20 schools saw an average decline in their median LSAT of only 1.05 between 2010 and 2013, while the bottom 15 schools in the top 50 saw an average decline of 2.53.
In terms of schools experiencing “larger” declines in median LSAT scores between 2010 and 2013, three schools in the top 50 have seen declines of four or more points, nine schools ranked 51-99 have seen declines of four or more points, 11 schools ranked 100-144 have seen declines of four or more points and 17 schools ranked alphabetically have seen declines of four or more points.
Comparing the 2012-2013 data with the 2010-2013 data, one sees that lower-ranked schools have faced a more sustained challenge in managing profile over the last few years, while schools ranked in the top 50 or top 100 had managed profile fairly well until fall 2013, when the shrinking pool of high-LSAT applicants began to show up in the profiles of highly ranked schools as well.
The overall decline in the LSAT profile of first-year students can also be demonstrated with two other reference points. In 2010, there were 74 law schools with a median LSAT of 160 or higher; in 2013, that number had fallen to 56. At the other end of the spectrum, in 2010 there were only 9 schools with a median LSAT of less than 150 and only one with a median LSAT of 145 or less. In 2013, the number of law schools with a median LSAT of less than 150 more than tripled to 32, while the number with a median LSAT of 145 or less now stands at 9 (with the low now being 143).
CONCLUDING THOUGHTS – Over the last three years, few schools have had the luxury of holding (or nearly holding) both enrollment and profile. Many schools have found themselves in a “pick your poison” scenario. Some have picked profile, holding it (or coming close) by absorbing significant declines in first-year enrollment and the corresponding loss of revenue. Others have picked enrollment, holding it (and the revenue it brings) at the expense of a significant decline in LSAT profile. Some schools, however, have not even been able to pick their poison: for them, the last three years have delivered a double whammy of significant declines in both first-year enrollment (and the corresponding revenue) and profile.
March 2, 2014 in Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (0)