Monday, May 2, 2016

Changes in Reporting and Classifying of Law-School-Funded Positions Result in Decline in Number of Graduates in Full-Time, Long-Term Law-School-Funded Bar-Passage-Required Positions

This blog posting summarizes how recent changes in the definition and reporting of law-school-funded positions have affected the number of such positions classified as full-time, long-term or full-time, short-term bar-passage-required positions for graduates in the Class of 2015. Comparing results for the Class of 2014 and the Class of 2015 shows a significant decline in the number of full-time, long-term bar-passage-required positions that are law-school-funded (from 831 to 398) and a significant increase in the number of full-time, short-term bar-passage-required positions that are law-school-funded (from 184 to 277). Overall, the number of law-school-funded bar-passage-required positions declined by one-third, from 1015 to 675, as a result of these changes.

Changes in Reporting Framework and Definition

In March 2015, the Council for the Section of Legal Education and Admissions to the Bar approved a change in the reporting of law-school-funded positions to take effect this spring with reporting of employment outcomes for the Class of 2015. Previously, law schools included law-school-funded positions within all other employment categories “above the line” and then delineated “below the line” the number of law-school-funded positions in each category. Under this approach, between the Class of 2012 and the Class of 2014, the number of full-time, long-term bar-passage-required positions that were law-school-funded increased from 517 to 831, an increase of more than 60%.

With the change, however, the Council added “Employed – Law School Funded” as a separate “above the line” category such that law-school-funded positions no longer are included in other categories (e.g., Employed – Bar Passage Required or Employed – JD Advantage), although law schools are still required to provide more detailed information about the different categories of law-school-funded jobs “below the line” on the employment summary report.

In July 2015, the Council also approved a change in the definition of when a law-school-funded position may be classified as a “long-term” position, requiring that it be a position the employer expects to last at least one year with a salary of at least $40,000 per year.

Long Term. (OLD DEFINITION) A long-term position is one that does not have a definite or indefinite term of less than one year. It may have a definite length of time as long as the time is one year or longer. It may also have an indefinite length of time as long as it is expected to last one year or more.

Long-term. (NEW DEFINITION) “A long-term position is one that the employer expects to last one year or more. A law school/university funded position that the law school expects to last one year or more may be considered long-term for purposes of this definition only if the graduate is paid at least $40,000 per year. . . .”

This change also took effect with the reporting of employment outcomes this spring for the Class of 2015.

An example might help explain how these changes might impact the classification of a given position. Assume a 2014 graduate of Law School A who took a one-year position as a lawyer with a public interest law firm as part of a “bridge-to-practice” program, working on a full-time basis and receiving a stipend of $24,000 paid partly by the law school. Law School A agreed to subsidize a portion of the stipend for a year while continuing to support the graduate’s ongoing effort to find other gainful employment during that year.

Under the Class of 2014 reporting format, this graduate probably would have been classified “above the line” in the full-time, long-term Employed – Bar Passage Required category, because the job had a defined duration of one year, even though the graduate might not have planned to stay in the position for the entire year. (This graduate also would have been listed separately “below the line” in the law-school-funded category as having a full-time, long-term bar-passage-required position.)

Following the March 2015 changes, a Class of 2015 graduate in the same job, working as a lawyer with a public interest law firm on a full-time basis and receiving a stipend of $24,000 paid partly by the law school, would have been classified “above the line” in the full-time, long-term Employed – Law School Funded category and not in full-time, long-term Employed – Bar Passage Required. (As was the case with the Class of 2014 graduates, this graduate also likely would have been listed separately “below the line” in the law-school-funded category as having a full-time, long-term bar-passage-required position.)

Following the July 2015 changes, a Class of 2015 graduate in the same job, working as a lawyer with a public interest law firm on a full-time basis and receiving a stipend of $24,000 paid partly by the law school, would be classified “above the line” in the full-time, short-term Employed – Law School Funded category. Under the new definition of “long-term,” either the lack of an employer expectation that the job would last one year or more, or the lack of a stipend of at least $40,000, would disqualify the job from “long-term” status, so it would be classified as “short-term.” (This graduate also would be listed separately “below the line” in the law-school-funded category as having a full-time, short-term bar-passage-required position.)

Consequences of the Change in Reporting Framework and Definition

With the ABA’s release of its Employment Summary report, covering employment for graduates of the Class of 2015 ten months after graduation, we can compare law-school-funded positions for the Class of 2015 with those for the Class of 2014. The following table includes results from all law schools listed in the ABA’s Employment Summary spreadsheets for the Class of 2014 and the Class of 2015.

Law School Funded Bar Passage Required, Full-Time, Long-Term and Full-Time, Short-Term Positions for the Class of 2014 and Class of 2015

YEAR            FTLT BPR LSF   FTST BPR LSF   TOTAL BPR LSF
Class of 2014        831            184            1015
Class of 2015        398            277             675

Full-time, long-term bar-passage-required positions that were law-school-funded declined by more than 50% from 831 to 398. Meanwhile, full-time, short-term bar-passage-required positions that were law-school-funded increased by roughly 50% from 184 to 277. Overall, however, law-school-funded positions that were in one of these two categories declined by 340 or by roughly 33%, from 1015 to 675.
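For readers who want to check the arithmetic, the percentage changes follow directly from the table's totals (a quick sketch in Python; all figures are taken from the table above):

```python
# Year-over-year changes in law-school-funded (LSF) bar-passage-required (BPR)
# positions, using the totals from the table above.
ftlt_2014, ftlt_2015 = 831, 398   # full-time, long-term
ftst_2014, ftst_2015 = 184, 277   # full-time, short-term

ftlt_change = (ftlt_2015 - ftlt_2014) / ftlt_2014   # decline of more than 50%
ftst_change = (ftst_2015 - ftst_2014) / ftst_2014   # increase of roughly 50%

total_2014 = ftlt_2014 + ftst_2014                  # 1015
total_2015 = ftlt_2015 + ftst_2015                  # 675
total_decline = total_2014 - total_2015             # 340, roughly one-third

print(f"FTLT change: {ftlt_change:.1%}")            # -52.1%
print(f"FTST change: {ftst_change:.1%}")            # 50.5%
print(f"Total decline: {total_decline} ({total_decline / total_2014:.1%})")
```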

Although it is not easy to know for sure, the most plausible explanation for these changes is that some jobs previously classified as full-time, long-term bar-passage-required positions carried a stipend or salary below $40,000 per year, and the law schools offering them could not raise the salary enough for the positions to keep that classification under the new regime. Alternatively, or additionally, some positions may have lost the classification because the employers did not expect the positions to last at least one year. These possibilities would explain the shift of some positions from full-time, long-term to full-time, short-term, but they would not necessarily explain the complete loss of so many law-school-funded bar-passage-required positions.

The loss of roughly one-third of the law-school-funded bar-passage-required positions might be explained partly by the decline in the number of graduates passing the July 2015 bar exam compared with July 2014.

Additionally, a portion of the loss of roughly one-third of the law-school-funded bar-passage-required positions also might be explained by the reality that there was more perceived “value” in a law school being able to claim a law-school-funded position as a full-time, long-term bar-passage-required position than as a full-time, short-term bar-passage-required position. With the change in reporting framework and definition, some law schools may have concluded that further investment in law-school-funded positions was not justifiable, particularly given how USNews accounts for these positions in its rankings (a point highlighted by Derek Muller in his post about these changes in law-school-funded positions).

Different Responses across Different Law Schools

  • The Top-25 Law Schools for Full-Time, Long-Term Law-School-Funded Bar-Passage-Required Positions for the Class of 2014

The decline in law-school-funded bar-passage-required positions was concentrated at a relatively small number of law schools. The top-25 law schools for full-time, long-term bar-passage-required positions that were law-school-funded for the Class of 2014 (those schools with 10 or more such positions) are responsible for the vast majority of the decline in such positions for the Class of 2015. Across these 25 law schools, the number of graduates in full-time, long-term bar-passage-required positions that were law-school-funded fell from 676 to 295, a drop of 381 out of the total decline of 440, or nearly 87% of the total decline in such positions. Across the same 25 law schools, the number of graduates in full-time, short-term bar-passage-required positions that were law-school-funded increased from 11 to 213, far exceeding the overall increase in such positions (which was counterbalanced by several schools greatly reducing their number of full-time, short-term bar-passage-required law-school-funded positions).

  • 14 Law Schools in the Top-25 for Law-School-Funded Positions that Saw Significant Changes in Law-School-Funded Bar-Passage-Required Positions Between the Class of 2014 and the Class of 2015

A subset of 14 law schools within this group saw the most significant changes between the Class of 2014 and the Class of 2015, accounting for 359 of the 440 decline in full-time, long-term bar-passage-required positions that were law-school-funded and for an increase from 8 to 202 in full-time, short-term bar-passage-required positions that were law-school-funded. These 14 law schools are set forth in the following table in descending order of full-time, long-term bar-passage-required law-school-funded positions for the Class of 2014.

School                 2014 LSF   2015 LSF   2014 LSF   2015 LSF   2014        2015
                       FTLT BPR   FTLT BPR   FTST BPR   FTST BPR   Graduates   Graduates
George Washington          78          6          0         19        584         465
Georgetown                 64         35          0         53        626         678
Emory                      52          0          0         20        268         308
American                   44          4          0         40        460         464
Michigan                   33          2          5         21        390         354
Southern California        31          7          0         20        217         213
Texas                      23         11          1          1        351         354
Vanderbilt                 22          0          0         12        194         185
Notre Dame                 22          4          1          0        179         179
California-Berkeley        20         11          1          2        287         278
William and Mary           19          0          0          3        215         178
California-Davis           19          9          0          0        169         185
Washington Univ.           14          2          0         10        258         228
Cornell                    11          2          0          1        191         183
TOTAL                     452         93          8        202       4389        4252

Notably, across these 14 law schools, the total number of bar-passage-required positions that were law-school-funded declined from 460 (of which only eight were short-term) for the Class of 2014 to 295 (of which 202 were short-term) for the Class of 2015. At these 14 law schools, therefore, not only did the number of full-time law-school-funded bar-passage-required positions decline by 165, over one-third, but the share of those positions that were full-time, long-term also fell dramatically, from over 98% to less than 33%.

  • 11 Law Schools in the Top-25 for Law-School-Funded Positions that Did Not See Significant Changes in Law-School-Funded Bar-Passage-Required Positions Between the Class of 2014 and Class of 2015

At the other 11 law schools among the top-25 for law-school-funded positions that were bar-passage-required in the Class of 2014 there was not a significant decline in law-school-funded positions that were bar-passage-required for the Class of 2015. These 11 law schools are set forth in the following table in descending order of the full-time, long-term bar-passage-required law-school-funded positions in the Class of 2014.

School                 2014 LSF   2015 LSF   2014 LSF   2015 LSF   2014        2015
                       FTLT BPR   FTLT BPR   FTST BPR   FTST BPR   Graduates   Graduates
New York Univ.             36         30          0          2        479         485
Virginia                   33         30          0          0        349         367
UCLA                       31         31          2          0        336         335
Columbia                   31         28          0          0        468         413
Harvard                    24         20          1          0        586         589
Illinois                   15         10          0          2        185         181
Boston University          12         12          0          0        246         208
Brigham Young              11          9          0          0        138         133
Chicago                    11          6          0          5        210         196
California-Irvine          10         20          0          0         93         110
Stanford                   10          6          0          2        187         195
TOTAL                     224        202          3         11       3277        3212

These law schools either already paid salaries of at least $40,000 for most of their law-school-funded bar-passage-required positions for the Class of 2014 or made sure that the vast majority of those positions for the Class of 2015 paid at least $40,000. The number of full-time, long-term law-school-funded bar-passage-required positions across these 11 law schools declined by only 22, while the number of full-time, short-term positions increased by only eight. The share of these positions that were full-time, long-term changed very little, from over 98% to nearly 95%.

  • The Remaining Law Schools

Across the remaining law schools, for the Class of 2014, there were 57 law schools with a total of 155 law-school-funded positions classified as full-time, long-term bar-passage-required positions. For the Class of 2015, there were only 46 law schools with a total of 103 such positions. Across this set of schools, therefore, the number of full-time, long-term law-school-funded bar-passage-required positions declined by 52, or roughly one-third.

Across all the remaining law schools, for the Class of 2014, there were 24 law schools with a total of 173 full-time, short-term bar-passage-required law-school-funded positions. For the Class of 2015, there were only 19 law schools with a total of 64 such positions. Thus, full-time, short-term bar-passage-required positions that were law-school-funded declined across these law schools by over 100.

In total, then, these other law schools saw law-school-funded bar-passage-required positions decline from a total of 328 for the Class of 2014 to only 167 for the Class of 2015, a decline of nearly 50%.

Total Changes in Law-School-Funded Bar-Passage-Required Positions Between the Class of 2014 and the Class of 2015

                                            2014 LSF    2015 LSF    2014 LSF    2015 LSF
                                            BPR FTLT    BPR FTLT    BPR FTST    BPR FTST
Top 25 (10 or more LSF BPR FTLT in 2014)       676         295          11         213
  11 Schools (no significant changes)          224         202           3          11
  14 Schools (significant changes)             452          93           8         202
Other Schools with LSF                         155         103         173          64
                                          (57 schools) (46 schools) (24 schools) (19 schools)
Total                                          831         398         184         277

(I am very grateful to Janelle Chambers for her research assistance in compiling this data and am very grateful to Scott Norberg and Bernie Burk for helpful comments on earlier drafts of this blog posting.)

May 2, 2016 in Data on legal education, Scholarship on legal education

Sunday, May 1, 2016

Mixed Signals from the Legal Employment Market – Preliminary Results for the Class of 2015

THIS BLOG UPDATES THE EARLIER BLOG POSTING TO INCORPORATE DATA FROM THE ABA's EMPLOYMENT SUMMARY SPREADSHEETS FOR THE CLASS OF 2014 and CLASS OF 2015 AS OF MAY 3, 2016, WITH DOUBLE-COUNTED DATA FOR MITCHELL|HAMLINE IN THE CLASS OF 2015 REMOVED AND WITH ALL LAW-SCHOOL-FUNDED POSITIONS FOR BOTH YEARS REMOVED FROM THE CALCULATIONS.  THE 2015 NUMBERS NOW MATCH THOSE ON THE ABA's 2015 LAW GRADUATE EMPLOYMENT DATA SHEET RELEASED ON MAY 3 WHILE THE 2014 NUMBERS NOW MATCH THOSE FOR 2014 ON THE ABA's 2015 LAW GRADUATE EMPLOYMENT DATA SHEET ONCE LAW-SCHOOL-FUNDED POSITIONS ARE REMOVED.

The Class of 2015 employment summary reports have been posted by all ABA-accredited law schools, prompting early reporting of results for some states or regions. The ABA Section of Legal Education and Admissions to the Bar released the complete Employment Summary spreadsheet for all law schools on its website yesterday (May 2), updated it today (May 3), and likely will update it again tomorrow (to eliminate the double-counting for Hamline, William Mitchell, and Mitchell|Hamline).

In this initial post I provide a brief summary of the Class of 2015’s employment outcomes compared with the Class of 2014’s, based on data from these spreadsheets as described above.

In a subsequent post (posted on May 2) I provide a summary of changes in the reported number of law-school-funded, bar-passage-required positions between the Class of 2014 and the Class of 2015 as a result of changes in the classification and reporting of such positions.

Changes in the Percentage of Graduates and Number of Graduates in Full-Time, Long-Term Bar-Passage-Required and JD Advantage Jobs

Across all law schools for which the ABA has released employment summary data for the Class of 2015, the percentage of graduates in full-time, long-term bar-passage-required positions and full-time, long-term JD advantage positions increased from 69% for the Class of 2014 to 70.1% for the Class of 2015. This would appear to be modestly good news. When the two categories are disaggregated, full-time, long-term bar-passage-required positions went from 58% to 59.2%, while full-time, long-term JD advantage positions went from 11% to 10.9%.

Because there was a significant decline in the number of graduates across these law schools between 2014 and 2015, however, this modest increase in the percentage of graduates in these positions masks an actual decline in the number of graduates in such positions. There were 39,984 graduates in the Class of 2015 compared with 43,832 graduates in the Class of 2014, a decline of 3,848 graduates, or 8.8%. There were 28,029 graduates in the Class of 2015 with full-time, long-term bar-passage-required or JD advantage positions, compared with 30,234 graduates in the Class of 2014 with such positions, a decline of 2,205, or 7.3%.

When these totals are disaggregated, full-time, long-term bar-passage-required positions declined from 25,417 for the Class of 2014 to 23,687 for the Class of 2015, a decline of 1,730, or 6.8%. For full-time, long-term JD advantage positions, the total went from 4,817 to 4,342, a decline of 475, or 9.9%.

(Please note that the numbers for both 2014 and 2015 exclude law-school-funded positions from both categories. The ABA's 2015 Law Graduate Employment Data sheet compares the Class of 2014 INCLUDING law-school-funded positions with the Class of 2015 EXCLUDING law-school-funded positions, which leads to slightly different results showing a more exaggerated decline in the number of graduates in full-time, long-term bar-passage-required and JD advantage jobs, and also a decline in the percentage of graduates in such positions.)

Comparison of Full-Time, Long-Term Bar-Passage-Required Positions and JD Advantage Positions for the Class of 2014 and Class of 2015

                 Graduates   # FTLT     % FTLT     # FTLT    % FTLT   # FTLT   % FTLT
                             BPR+JDA    BPR+JDA    BPR       BPR      JDA      JDA
Class of 2014    43,832      30,234     69%        25,417    58%      4,817    11%
Class of 2015    39,984      28,029     70.1%      23,687    59.2%    4,342    10.9%
Change           (3,848)     (2,205)               (1,730)            (475)

Changes in the Number and Percentage of Graduates Whose Employment Status is Unknown or Who Were Classified as Unemployed Seeking or Unemployed Not Seeking

Looking at the other end of the employment-outcomes continuum, however, both the number and percentage of graduates whose employment status was unknown, or who were classified as unemployed seeking or unemployed not seeking, declined slightly between the Class of 2014 and the Class of 2015. For the Class of 2014, there were 5,778 such graduates, representing 13.2% of the 43,832 graduates. For the Class of 2015, there were only 5,200 such graduates, representing 13% of the 39,984 graduates.

Searching for Explanations

In the coming weeks and months, there likely will be a number of commentators offering suggestions for why the Class of 2015 might have seen a decline in the number of graduates obtaining full-time, long-term bar-passage-required or JD advantage positions.

Part of the decline likely is attributable to the decline in the number and percentage of graduates passing the July bar exam, as reported by the NCBE in its annual statistics publications for each of the last three years.

First-Time July Bar Takers and Passers from ABA-Accredited Law Schools

Year   First-Time July Takers*   First-Time July Passers   July First-Time Pass Rate
2013         47,465                     38,909                       82%
2014         44,282                     34,333                       78%
2015         39,955                     29,772                       75%

*Note that the NCBE’s classification of first-time takers is over-inclusive: it reflects not just May graduates taking the bar exam for the first time in July, but also graduates from a prior year who might be taking the bar exam for the first time in a given jurisdiction even if they previously took it in another jurisdiction. Thus, first-time bar passers include some people who are not part of a given year's graduating cohort.

In the two-year period, then, between 2013 and 2015, the number of first-time takers from ABA-accredited law schools taking the July bar exam who passed the exam and became eligible for jobs requiring bar passage declined by roughly 9,100 and by nearly 23.5%. Moreover, the percentage of all first-time bar takers taking the February exam rather than the July exam also increased slightly between 2013 and 2015 from 18.7% to 19.7%, which might mean slightly more May 2015 graduates might not have been positioned to accept a full-time, long-term bar-passage-required or JD advantage position as of March 15, 2016, because they may have been studying for and taking the February 2016 bar exam.
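The decline figures above follow directly from the NCBE numbers in the table (a quick arithmetic check in Python):

```python
# Decline in first-time July bar passers from ABA-accredited law schools,
# 2013 to 2015, using the NCBE figures quoted above.
passers_2013, passers_2015 = 38_909, 29_772

decline = passers_2013 - passers_2015   # 9,137 -- "roughly 9,100"
pct_decline = decline / passers_2013    # about 23.5%

print(decline, f"{pct_decline:.1%}")
```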

Part of the decline also likely is attributable to market conditions in some parts of the country. For example, a recent story about graduates of Texas law schools noted that the decline in oil prices and tort reform may have impacted hiring in the Texas legal market for graduates of the Class of 2015. Once the full set of employment outcomes is available, it will be easier to assess the extent to which certain states or certain regions might have seen better or worse results than other states or regions.

Part of the decline also may be a manifestation of the impact of technology on the legal services market, with the possibility that the legal services market will have slightly fewer entry level positions over the near term.

One Possible Counterpoint

If this decline in the number of full-time, long-term bar-passage-required positions is a manifestation of a weakening job market for law graduates, then one would expect salary data also to show weakness. Once NALP publishes its report on employment results for the Class of 2015 later this summer, we will be able to assess whether salary trends are consistent with a weakening legal services market or suggest that the market remains somewhat competitive. If the decline in graduates taking full-time, long-term bar-passage-required or JD advantage jobs is counterbalanced by a continuation of the modest year-over-year increases in mean and median salaries for law graduates, it might suggest that there is less market weakness than this initial employment summary indicates.

Concluding Thoughts

For those thinking that the recent news about the improving situation with respect to applicants to law school is the beginning of an upward trend that will gradually return law schools to first-year class sizes in the 45,000 to 46,000 range, this employment outcomes data provides a cautionary tale. The fact that the employment market for law school graduates appears to have stagnated and even declined to some extent over the last two years may mean that risk averse potential law school applicants who focus on post-graduate employment opportunities when assessing whether to invest in a legal education may remain skittish about applying, such that this year’s good news on the applicant front may be somewhat short-lived.

(I am very grateful for the research assistance of Janelle Chambers in gathering data for this blog posting prior to the release of the ABA Employment Summary spreadsheet and for very helpful comments on earlier drafts of this blog posting from Scott Norberg and Bernie Burk and for the helpful insights of Debby Merritt as we worked on reconciling data in the ABA spreadsheets.)

May 1, 2016 in Current events, Data on legal education, Scholarship on legal education

Sunday, April 24, 2016

Projections for Law School Enrollment for Fall 2016

In this blog posting I do two things. First, I provide a detailed analysis estimating the likely total applicant pool at the end of the current cycle, based on trends from March through the end of the cycle in 2013, 2014, and 2015. Second, given the increased strength of the applicant pool, I suggest that law schools in the top 60 or 70 of the USNews rankings will see more enrollment growth and profile stability than law schools further down the rankings continuum.

ESTIMATES OF THE TOTAL NUMBER OF APPLICANTS

Reviewing the 2013, 2014, and 2015 Cycles to Inform the 2016 Cycle

The table set forth below shows the number of applicants in the admissions cycle as of early March in 2013, 2014, 2015 and 2016 along with the projected total applicant pool (based on percentage of applicants at that point in the cycle in the previous year) and the actual total applicant pool at the end of each cycle (with an estimate of the 2016 total applicant pool).

Current Volume      Applicants   % of Cycle in Previous   Projected Applicant   End-of-Cycle
Summary Date                     Year on This Date        Pool on This Date     Applicant Pool
Mar. 8, 2013          46,587            84%                    55,460            59,400 Actual
Mar. 7, 2014          42,068            79%                    53,250            55,700 Actual
Mar. 6, 2015          39,646            76%                    52,160            54,500 Actual
Mar. 4, 2016          42,981            76%                    56,553            57,500 Estimate

In each of the last three years, a modest surge in late applicants meant the final total applicant count exceeded the March projection by more than 2,000, with the margin getting smaller each year (dropping from roughly 4,000 in 2013 to roughly 2,300 in 2015). This “late surge” suggests that the projection based on the applicant pool as of March 4, 2016 (just over 56,500) likely understates the end-of-cycle total applicant pool. To be somewhat conservative, I am estimating that the final total applicant pool in 2016 will exceed the early March projection by roughly 1,000, the smallest such increase in the last four years, resulting in an estimated total applicant pool of 57,500 (up about 5.5% from 2015). This would be the first increase in applicants since 2010.
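The projections in the table come from simple division: the applicant count to date divided by the share of the eventual pool that had applied by roughly the same date in the previous cycle. A minimal sketch (the helper name `project_pool` is mine, not anything official; small differences from the table reflect rounding):

```python
def project_pool(applicants_to_date: int, pct_of_cycle: float) -> int:
    """Project the end-of-cycle applicant pool from a mid-cycle count,
    assuming this year's application timing mirrors last year's."""
    return round(applicants_to_date / pct_of_cycle)

# Figures from the table above.
print(project_pool(46_587, 0.84))  # 2013: ~55,461 (post reports 55,460)
print(project_pool(42_068, 0.79))  # 2014: ~53,251 (post reports 53,250)
print(project_pool(39_646, 0.76))  # 2015: ~52,166 (post reports 52,160)
print(project_pool(42_981, 0.76))  # 2016: ~56,554 (post reports 56,553)
```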

ESTIMATES FOR ADMITTED APPLICANTS AND MATRICULANTS

The table below shows the number of applicants, admitted applicants, and matriculants over the last four years, along with an estimate for fall 2016 based on the assumption above that we have a total of 57,500 applicants this cycle. With 3,000 more applicants than in 2014-15, I am assuming 2,400 more admitted applicants (roughly 80% of the additional applicants), and then assuming the number of matriculants will reflect close to the four-year average for the percentage of admitted applicants who matriculate – 87.6%. This would yield a first-year entering class of 39,150, up about 5.6% from 2015. (Using this process last April, I estimated a first-year enrollment of 36,975, 83 fewer than the actual first-year enrollment of 37,058.)
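The estimate described above chains three assumptions: 3,000 more applicants, an 80% admit rate on the marginal applicants, and an 87.6% matriculation rate. A quick sketch of that chain (variable names are mine; the post rounds the final figure to 39,150):

```python
# Fall 2016 enrollment estimate, following the assumptions stated above.
est_applicants = 57_500
applicants_2015, admitted_2015 = 54_500, 42_300

extra_applicants = est_applicants - applicants_2015            # 3,000
est_admitted = admitted_2015 + round(0.80 * extra_applicants)  # 44,700
est_matriculants = est_admitted * 0.876                        # ~39,157

print(est_admitted, round(est_matriculants))
```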

Estimates of Admitted Students and Matriculants for 2016 Based on Trends in 2012-2015

Year          Applicants   Admitted    Percent of    Matriculants   Percent of
                           Students    Applicants                   Admitted
2012            67,900      50,600       74.5%          44,481        87.9%
2013            59,400      45,700       76.9%          39,675        86.8%
2014            55,700      43,500       78.1%          37,924        87.2%
2015            54,500      42,300       77.6%          37,058        87.6%
2016 (est.)     57,500      44,700       77.7%          39,150        87.6%

DIFFERENTIAL IMPACT ON ENROLLMENT AND PROFILES ACROSS DIFFERENT CATEGORIES OF LAW SCHOOLS

Earlier this year Ian Ayres noted that lower-ranked law schools have benefited from the rankings concerns of higher-ranked law schools. In the last few years, as higher-ranked law schools admitted fewer applicants in an effort to maintain their LSAT/GPA profiles, they left more applicants for lower-ranked law schools to admit. In this admissions cycle, the strength of the pool of applicants means things likely will swing the other way. Higher-ranked law schools likely will be admitting more students, leaving fewer students for lower-ranked law schools to admit.

INCREASES IN APPLICANTS WITH HIGH LSATs BODE WELL FOR HIGHER RANKED LAW SCHOOLS

For the first time in the last five years, we are seeing a year-over-year increase in the number of applicants with LSATs of 165 or higher. As of the April 15 Current Volume Summary, there were 7,054 applicants with LSATs of 165 or higher, compared with 6,519 on April 17, 2015. Another 130 applicants with LSATs of 165 or higher applied during the balance of the 2014-15 admissions cycle, for a total of 6,649. I am presently assuming another 146 such applicants in the balance of the 2015-16 admissions cycle, for a total of 7,200. On average over the past four years, 82.6% of these applicants have matriculated. I expect the rate to be slightly higher this year, because a number of top-60 or top-70 law schools dealing with revenue pressures from decreased enrollment in recent years are likely to take advantage of the stronger applicant pool to increase their first-year enrollment without seeing too much erosion in their entering-class profile. Thus, I think we will see roughly 6,000 matriculants this year with LSATs of 165 or higher, an increase of nearly 500 from fall 2015.

Trend in Applicants and Matriculants with LSATs of 165+, 2010-2015, and Estimates for 2016

Year          Applicants with   Matriculants with   Percent of Applicants
              LSATs of 165+     LSATs of 165+       Matriculating
2010              12,177             9,477                77.8%
2011              11,190             8,952                80%
2012               9,196             7,571                82.3%
2013               7,496             6,154                82.1%
2014               7,477             6,189                82.8%
2015               6,649             5,505                82.8%
2016 (est.)        7,200             6,000                83.3%

In addition, the number of applicants with LSATs of 160-164 also has increased in this cycle, from roughly 6,500 at this point in 2014-15 to over 6,800 in 2015-16. This likely means that at the end of the cycle there will be at least 300 more applicants with LSATs of 160-164, which likely will generate roughly 240 more matriculants in this range (roughly 80% of the 300 additional applicants) than in the 2014-15 admissions cycle. Combining these categories, when this admissions cycle ends, there likely will be 740 more matriculants with LSATs of 160 or higher in the 2015-16 applicant pool than in the 2014-15 applicant pool – from roughly 11,200 to nearly 12,000.
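The projection arithmetic above can be sketched in a few lines. The applicant counts and yield rates are the figures from this post (the 83.3% and roughly 80% yields are my estimates, not official LSAC data):

```python
# Projection sketch using the figures discussed above. The yield rates
# (83.3% for 165+, ~80% for 160-164) are this post's assumptions, not LSAC data.

def projected_matriculants(applicants, matriculation_rate):
    """Project matriculants as applicants times a historical matriculation rate."""
    return round(applicants * matriculation_rate)

# LSATs of 165+: 7,200 projected applicants at an assumed 83.3% yield (~6,000)
m_165 = projected_matriculants(7200, 0.833)

# LSATs of 160-164: ~300 additional applicants at a roughly 80% yield (~240)
extra_160_164 = projected_matriculants(300, 0.80)

# Increase over the 5,505 fall 2015 matriculants with 165+, plus the 160-164 gain
increase_160_plus = (m_165 - 5505) + extra_160_164   # roughly 740
```

The point of the sketch is only that the projected increase is driven by two terms: the larger 165+ pool at a slightly higher yield, plus the incremental 160-164 applicants at the historical yield.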

This increase in the quality of the applicant pool means law schools ranked in the top 60 or top 70 or so (those with median LSATs near or above 160) collectively could welcome more than 1,200 more matriculants than last year without meaningfully impacting their profiles. (If the top 70 law schools garner 600 of the 740 additional matriculants with LSATs of 160 or higher, they also could admit almost as many additional applicants with LSATs below their medians without impacting their profiles. For top-70 law schools focused on profile AND revenue, every additional matriculant with an LSAT above 160 who helps the law school maintain its median LSAT allows the law school to add a matriculant with an LSAT of less than 160.) (Of course, not all law schools are going to have the financial strength to continue to use scholarship resources to attract top applicants, so there likely will be some variability among top-70 schools in terms of enrollment growth/decline and in terms of profile retention/erosion.)

Continuing but Slowing Declines in Applicants with LSATs of 150-159 Likely Will Present Challenges for Some Law Schools with Median LSATs in That Range

Year          LSAT of 140-144   LSAT of 145-149   LSAT of 150-154   LSAT of 155-159
2013          6,114             9,439             11,430            10,920
2014          5,893             8,428             10,587            9,919
2015          6,214             8,665             10,518            9,681
2016 (est.)   6,500             9,000             10,400            9,600

Based on the numbers of applicants with LSATs between 150-159 as of the April 15 Current Volume Summary, the pool of applicants in this range is likely to remain flat or continue to show a modest decline as reflected in the table above. If law schools in the top-60 or top-70 do take advantage of the increase in applicants with LSATs of 160 or higher to increase their enrollment, then fewer of these 20,000 applicants with LSATs between 150-159 will be available to law schools with median LSATs in those ranges. This will put pressure on law schools with median LSATs of 150-159 to admit fewer applicants or to dip deeper into the applicant pool to fill their classes. (Note that while the pool of applicants with LSATs between 150-159 is flat to slightly down, the pool of applicants with LSATs between 140-149 appears to be increasing again this year, for the second year in a row.) Once again, enrollment results and profile results are likely to vary somewhat widely across law schools depending upon their relative financial strength and their ability to continue to use scholarship assistance to compete for qualified applicants.

CONCLUSION

If the estimates regarding applicants and matriculants above are accurate, we will see roughly 2,100 more matriculants in the 2015-16 cycle. The increased strength of the applicant pool, and the anticipated admissions strategies of top-ranked schools dealing with revenue pressures from reduced enrollment in the last few years, likely will mean that most of the increase in matriculants (perhaps as many as 1,200 or more) will be among law schools that are relatively highly ranked, perhaps the top-60 or top-70.

This anticipated increase in enrollment among top law schools likely will decrease the number of applicants in the 150-159 LSAT range available to lower-ranked law schools, particularly given that the number of applicants with LSATs of 150-159 already looks like it could be slightly smaller this year. Law schools outside the top-60 or top-70 likely will face a challenging choice: shrink enrollment further to hold profile (and absorb further revenue declines), or accept declines in profile in exchange for stable or larger enrollments (and the corresponding revenue).

With continued growth in applicants between 140-149 to go along with the projection of a slight decline in the number of applicants with LSATs of 150-159, many law schools ranked outside the top-60 or top-70 may find it difficult to maintain their LSAT profiles as the pool of applicants from which they can draw their matriculants will be weighted more to the lower end of the LSAT distribution.

QUESTIONS TO CONSIDER

First, what might explain the growth in the number of applicants with LSATs of 160 or more for the first time in the last several years? This group had been the “market leaders” in walking away from legal education in recent years. Is this a one-time bounce or is this group going to continue to return to legal education in larger numbers?

Second, why is the middle group -- those with LSATs of 150-159 -- not showing an uptick in applicants, when there is growth among those with LSATs of 160 or higher AND growth among those with LSATs of 140-149? Applicants with LSATs of 150-159 are more likely to be able to pass the bar exam upon completing law school than applicants with LSATs of 140-149. With bar passage rates falling significantly, particularly among graduates of law schools with lower LSAT profiles, one might have expected that fewer people with LSATs of 140-149 would be applying to law school (as they are most at risk of bar passage failure), but this cycle shows continued modest growth in that pool of applicants while the group with LSATs of 150-159 is flat to down slightly.

Third, will this strengthening of the quality of the applicant pool portend an improvement in bar passage results in July 2019? It is too early to answer this question. Once actual enrollment profiles are available in December, it will be easier to analyze the possible impact on bar passage results.

(I am very grateful for thoughtful comments from Bernie Burk and Scott Norberg on an earlier draft of this blog posting.)

April 24, 2016 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Friday, March 11, 2016

Conditional Scholarships Reprise – Of Sticks and Carrots and Asking Questions

A few years ago the Council for the Section of Legal Education and Admissions to the Bar mandated greater transparency regarding conditional scholarships, requiring law schools that offer conditional scholarships to publicize on their webpages, and to applicants receiving conditional scholarship offers, the number of conditional scholarships awarded to students and the number that had been reduced or eliminated over each of the prior three academic years.

Applicants previously had not been aware of how many students were getting conditional scholarships and didn’t know how likely they were to keep the conditional scholarships given the law school’s grading curve. They were generally unduly optimistic about their likelihood of retaining a scholarship. The mandated disclosure was designed to ameliorate this information asymmetry and optimism bias.

I have written about conditional scholarships on several occasions over the last several years, initially noting the need for greater transparency and then analyzing the data on conditional scholarships once its publication was mandated. I posted the most recent summary in December 2015, covering the 2014-15 academic year and comparing it with the 2011-12 academic year. Notably, over the last few years, while more than two dozen law schools have shifted away from using conditional scholarships, the percentage of first-year students with conditional scholarships remained at roughly 27%, although slightly fewer first-year students saw their scholarships reduced or eliminated (7.8% down from 9.4%).

With tuition deposits due in the next several weeks, prospective law students likely are comparing the varied opportunities they may have in terms of law schools and scholarship offers.  I write at this time to highlight the need for applicants receiving conditional scholarship offers to ask questions of admissions officials regarding conditional scholarships at their law schools, both with respect to traditional conditional scholarships and with respect to a new type of conditional scholarship that apparently is being offered by at least one law school and perhaps others. Prospective students need to be proactive in combatting their own propensity for optimism bias. Pre-law advisors need to help students be more proactive in combatting their propensity for optimism bias.

The Need to Ask Questions with Respect to Traditional Conditional Scholarships that Function as a Stick

Traditional conditional scholarships operate as a “stick.” If a student doesn’t maintain a defined GPA or class rank, the student’s scholarship is reduced or eliminated.

Law schools are required to publish, and to provide to conditional scholarship recipients, the number of conditional scholarship recipients and the number whose scholarships were reduced or eliminated in each of the prior three years. This is helpful generally, but it doesn’t necessarily help specific applicants all that much.

For example, assume a law school’s published data indicates that 80 students received conditional scholarships in each of the prior three years and that 20 students saw their scholarships reduced or eliminated each year. At first blush, this makes it look like the average conditional scholarship recipient has a 75% (60/80) chance of retaining her scholarship. But who is the average conditional scholarship recipient? Assuming all students had to meet the same “condition” – perhaps maintain a first-year GPA of 3.0 -- it is likely that conditional scholarship recipients in the top quarter of the LSAT/GPA distribution for entering students at the law school had perhaps a 90-95% likelihood of retaining their scholarship, while conditional scholarship recipients near the middle or below the middle of the LSAT/GPA distribution for entering students at the law school had perhaps a 50-60% likelihood of retaining their scholarship.

Recognizing this likely disparity, conditional scholarship recipients should be asking the admissions officials at the law schools from which they are receiving conditional scholarship offers what additional information the admissions officials can provide about the extent to which a student with a comparable profile and comparable condition was likely to see his conditional scholarship reduced or eliminated. Were those at the top end of the LSAT/GPA distribution more likely to retain their conditional scholarship? Were those further down the LSAT/GPA distribution less likely to retain their conditional scholarships? How did the nature of the condition impact the likelihood that a student with a given profile retained her scholarship?

Law schools should have this information available and should be willing to provide answers to these questions.  Prospective students need answers to these questions to be best positioned to calculate the expected value of a conditional scholarship over three years so that the student can make meaningful cost-comparisons across law schools.
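To see why retention probability matters for cost comparisons, here is a minimal sketch of the expected-value calculation. The $45,000 tuition figure and the two retention probabilities are hypothetical, chosen only to mirror the disparity described above:

```python
# Hypothetical illustration: $45,000 annual tuition, a 50% conditional
# scholarship, and illustrative retention probabilities (not from any school).

def expected_three_year_cost(tuition, scholarship_pct, p_retain):
    """Expected tuition paid over three years, assuming the scholarship applies
    in year one and is retained for years two and three with probability p_retain."""
    year1 = tuition * (1 - scholarship_pct)
    later_years = 2 * (p_retain * tuition * (1 - scholarship_pct)
                       + (1 - p_retain) * tuition)
    return year1 + later_years

# The "average" recipient (75% retention) vs. a recipient near the middle
# of the entering-class profile (assume 55% retention):
avg_cost = expected_three_year_cost(45000, 0.50, 0.75)   # -> 78,750
mid_cost = expected_three_year_cost(45000, 0.50, 0.55)   # -> ~87,750
```

Under these assumptions, the same nominal "50% scholarship" is worth roughly $9,000 less over three years to the student with the lower retention probability, which is exactly the information a prospective student needs to compare offers across schools.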

The Need to Ask Questions with Respect to New Conditional Scholarships that Function as a Stick and a Carrot

At least one law school, and possibly others, has what appears to be a new type of “conditional” scholarship, which can best be described as both a “stick” and a “carrot.” In addition to reducing or eliminating a student’s conditional scholarship if the student fails to maintain a given GPA or class rank, the “carrot” approach offers students AN INCREASED SCHOLARSHIP if the student obtains a given GPA or class rank.

For example, assume a given law school has the same published information as in the previous example – 80 students received conditional scholarships and 20 students had their scholarships reduced or eliminated.

An applicant receives a conditional scholarship for 50% tuition and is informed that the scholarship will be eliminated if she fails to maintain a cumulative GPA of 2.5 at the end of the first year. But she also is informed that the scholarship will increase to 75% if she obtains a GPA of 3.5 and to 100% if she obtains a GPA of 3.7.

This student needs to ask several questions of the admissions officials at the law school. First, she needs to ask whether, given her LSAT/GPA profile and her renewal threshold (2.5 GPA), she has the average likelihood of maintaining her scholarship (75%) or perhaps a higher or lower likelihood. (If the school offers 100% scholarships with a renewal condition of 3.5, 75% scholarships with a renewal condition of 3.0, and 50% scholarships with a renewal condition of 2.5, it may be that the people with 50% scholarships have a higher likelihood of retaining their scholarships than those with larger scholarships but correspondingly higher conditions.)

Second, however, the student also needs to ask how many students in the previous two or three years who came into school with an LSAT/GPA profile comparable to hers managed to get a 3.5 GPA or a 3.7 GPA. For a prospective student with an LSAT/GPA in the bottom half of the entering class LSAT/GPA distribution, it well may be that few, if any, comparable students managed to get a 3.5 GPA or a 3.7 GPA at the end of the first year.
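The same expected-value logic applies to the carrot. A sketch, with entirely hypothetical probabilities for a student below the entering-class median (the tier structure matches the 50%/75%/100% example above):

```python
# Hypothetical sketch of the expected scholarship fraction under the
# stick-and-carrot structure above (50% base, 75% at a 3.5 GPA, 100% at 3.7,
# eliminated below 2.5). All probabilities are illustrative assumptions.

def expected_scholarship_pct(p_eliminated, p_gpa_37, p_gpa_35):
    """Expected scholarship fraction for years two and three."""
    p_keep_base = 1 - p_eliminated - p_gpa_37 - p_gpa_35
    return (0.00 * p_eliminated      # scholarship eliminated (below 2.5)
            + 1.00 * p_gpa_37        # raised to 100% (3.7 or better)
            + 0.75 * p_gpa_35        # raised to 75% (3.5 or better)
            + 0.50 * p_keep_base)    # base 50% retained

# A student below the entering-class median may have almost no chance at the
# carrot but a real chance of losing the scholarship entirely:
below_median = expected_scholarship_pct(p_eliminated=0.35, p_gpa_37=0.0, p_gpa_35=0.02)
# roughly a 33% expected scholarship, well below the 50% headline figure
```

The carrot tiers add almost nothing to the expected value for such a student, while the stick subtracts a great deal, which is why the comparable-profile data matters so much.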

New Creative Efforts to Play on Optimism Bias of Applicants

This “carrot” approach to conditional scholarships is simply the newest technique for taking advantage of the optimism bias of prospective students. The Standard 509 disclosure obligations do not capture this type of conditional scholarship. Thus, law schools do not have an affirmative obligation to disclose the extent to which students in various ranges across the LSAT/GPA distribution of an entering class are likely to obtain a GPA of 3.5 or 3.7 at the end of the first year.

Indeed, this “carrot” approach could be used by any law school – even law schools that do not generally offer conditional scholarships that trigger a reporting obligation. Such a law school could offer a slightly smaller unconditional scholarship on the front end along with the “carrot” condition – the prospect of a scholarship increase if certain GPA performance thresholds are met -- and perhaps entice students who optimistically believe they are going to outperform their LSAT/GPA profile to accept the law school’s scholarship offer rather than a comparable scholarship offer from another law school that did not offer a “carrot.”

Of course, this “carrot” approach to conditional scholarships presents another information asymmetry problem and optimism bias problem. The law school would know how few students meet the GPA threshold for an increased scholarship while the prospective students would optimistically, but unrealistically, believe they are capable of meeting the threshold.

But the fact that law schools do not have an affirmative obligation to disclose the likelihood of success in meeting the GPA threshold for the enhanced scholarship award does not mean that prospective students can’t ask for very specific information about the number of students with comparable LSAT/GPA profiles who actually obtained the GPA thresholds over the prior three years. Once again, law schools should have this information available and should be willing to disclose the information.

In any of these situations, if a prospective student asks for specific information about the scholarship retention or scholarship enhancement prospects of similarly-situated students in the three prior years and a law school claims not to have the information or is not willing to share the information, this should prompt suspicion on the part of the prospective student. Law schools have this information (or should have it) and should provide answers to these questions when asked.

March 11, 2016 in Data on legal education, Innovations in legal education, Scholarship on legal education | Permalink | Comments (0)

Tuesday, March 8, 2016

Comments on Proposed Revisions to Standard 501

Standard 501 requires law schools to have sound admissions policies and to refrain from admitting applicants who are not capable of being successful in law school and on the bar exam. For many years, Standard 501 has received little attention, while Standard 316 – the bar passage standard – has received more attention. Accreditors focused more on outcomes – bar passage results – than inputs – the academic abilities of students admitted to law school. With the decline in the number of applicants to law school and the corresponding erosion of entering class credentials at many law schools, however, Standard 501 has begun to receive more attention.

Specifically, due to concerns that some law schools might be admitting students whose entering credentials suggest that they are not likely to be successful in law school or be able to pass the bar exam, the Standards Review Committee (“SRC”) of the ABA Section of Legal Education and Admissions to the Bar (“Section”) has proposed revisions to Standard 501. The Council for the Section will consider these proposed revisions at its upcoming March meeting.  

I am in favor of most of the suggested revisions to Standard 501 (as discussed below). I am not in favor of the SRC’s Proposal 2 which proposes an “attrition” threshold above which schools would face heightened attention. In a subsequent post, I will discuss the need for a more robust Standard 308, which addresses academic standards, as a corollary to Standard 501.

“Proposal 2” Presents Two Problems

In its Proposal 2, the SRC suggests adding a new Interpretation 501-3 – “A law school having a non-transfer attrition rate above ___ percent bears the burden of demonstrating that it is in compliance with the Standard.” (The SRC anticipated that the Council would insert a number, perhaps 10%, in place of the blank.) Proposal 2 presents two problems: it is too broad in scope, and it is unlikely to be effective in practice.

The proposed interpretation focuses on “non-transfer” attrition, when it should be focused solely on “academic attrition.”

Non-transfer attrition consists of two components – “academic attrition” and “other attrition.”

Academic attrition includes students involuntarily dismissed under a law school’s academic policies or students who leave voluntarily but would have been dismissed had they remained in school. As noted in my recent blog posting analyzing attrition data, academic attrition varies widely across law schools from zero to over 20%, particularly among law schools with relatively low LSAT/GPA profiles.

Other attrition includes students who leave law school for reasons other than academic attrition. They may have decided law school is not for them, or may have had a family emergency or a physical or mental health concern that led them to withdraw from law school. Other attrition has much less variability; at most law schools it falls in the 2% to 4% range.

Law schools should be able to look at historical trends regarding student performance at their law school to predict whether applicants with certain entering class credentials are likely to experience “academic attrition.” Thus, a high “academic attrition” rate may suggest a law school is admitting too many students who are unlikely to be successful. By contrast, law schools rarely are going to be able to identify in advance those students who are likely to fall into the “other attrition” category.

Accordingly, if Proposal 2 is going to move forward, I would strongly advise that it focus solely on “academic attrition” rather than on all non-transfer attrition.

(It is conceivable that the SRC chose non-transfer attrition rather than academic attrition because it was concerned law schools would opt to classify a student’s attrition as other attrition rather than academic attrition to avoid whatever threshold might be set for academic attrition. While that risk could be addressed through careful assessment of attrition data, if there is a strong desire to use non-transfer attrition, I would suggest that the non-transfer attrition threshold be set at a slightly higher percentage to recognize that many law schools regularly experience other attrition of between 2% and 4%.)

The proposed interpretation is unlikely to accomplish its intended purpose

Regardless of whether the threshold focuses solely on academic attrition or on non-transfer attrition, however, this proposed interpretation fails to account for how law schools are likely to respond to it. The SRC may believe the proposed interpretation will make law schools refrain from admitting as many at-risk students. While that is possible, it is as likely, or more likely, that some law schools will not change their admissions practices, but simply will adjust how they implement their academic dismissal policies or grading policies to keep academic attrition or non-transfer attrition below whatever threshold is established.

For example, assume the proposed interpretation set 10% as the academic attrition “threshold” for shifting to law schools the burden of demonstrating compliance with the Standard. In the 2014-15 academic year, an academic attrition threshold of 10% would have “caught” 30 law schools. One easily can imagine that a significant number of those law schools would simply adjust their academic dismissal policies or their grading policies to keep academic attrition below the 10% threshold (even at the risk of noncompliance with the bar passage standard a few years later). (Indeed, 11 of the 30 law schools with academic attrition above 10% in 2014-15 had academic attrition rates between 10% and 11%, such that getting under 10% would not have been very difficult for those law schools.)

The SRC may be assuming that law schools don’t have “control” of how academic attrition actually functions. Perhaps the SRC believes that academic dismissal policies are fairly consistent across all law schools such that a given threshold (10%) would have comparable meaning and effect across all law schools. As noted above, however, academic attrition varies widely among similarly situated law schools, particularly those with relatively low LSAT/GPA profiles. This suggests that academic dismissal and grading policies differ across law schools or, phrased differently, that the way in which grading policies interact with academic dismissal policies varies widely. In reality, law schools have sufficient “local control” over their grading and academic dismissal policies that it would not be that difficult for law schools to avoid being “caught” by whatever academic attrition or non-transfer attrition threshold would get established in Standard 501.

The Other Three Suggested Revisions to Standard 501 are Generally Good Ideas

First, in Section 501(a), the SRC wants to replace “maintain” with “adopt, publish, and adhere to” such that the standard will read: “A law school shall adopt, publish, and adhere to sound admission policies and practices consistent with the Standards, its mission, and the objectives of its program of legal education.” This clarifies that law schools have to have policies, have to publish those policies, and have to adhere to them, all good things.

Second, in Section 501(b), the SRC wants to shift from a negative framework to a positive framework. The existing standard has a negative framework -- “A law school shall not admit an applicant who does not appear capable of satisfactorily completing its program of legal education and being admitted to the bar.”   The SRC recommends shifting to a positive framework – “A law school shall admit only applicants who appear capable of satisfactorily completing its program of legal education and being admitted to the bar.” (emphasis added) While I don’t feel strongly about this, I think it is easier to conceptualize this in the positive framework – focused on who should be admitted -- rather than on who should not be admitted. That said, it might help to maintain a singular focus on each “applicant” rather than shifting to the plural “applicants.” “A law school shall admit an applicant only if the applicant appears capable of satisfactorily completing its program of legal education and being admitted to the bar.” This keeps the focus on each individual applicant rather than the pool of applicants a law school admits.

Third, in the first interpretation – Interpretation 501-1 -- the SRC recommends adding a sentence that states: “Compliance with Standard 316 is not alone sufficient to comply with the Standard.” Standard 316 is the bar passage standard. This suggested revision is designed to highlight that Standard 501 is an independent standard that is not just derivative of Standard 316. This also strikes me as a useful change.

If Standard 501 is simply derivative of Standard 316, then there is no way to assess compliance with Standard 501 in “real time.” Rather, one would only assess compliance with Standard 501 by waiting three or four years to see whether the graduates of the law school comply with the bar passage standard. Standard 501 would be somewhat superfluous.

By adding this sentence, the SRC is suggesting that compliance with Standard 501 should be assessed “presently” by looking at the LSAT/GPA profile of matriculants, the law school’s experience with attrition, and the success of the law school’s academic support program, along with the law school’s historical bar passage results. Historical attrition and historical results on the bar exam in relation to prior graduates’ entering LSAT/GPA profile and prior graduates’ law school academic performance should inform the determination of whether the law school is continuing to admit only those applicants reasonably capable of being successful in the program of legal education and in passing the bar exam. If students with certain LSAT/GPA profiles over a three-year or four-year period consistently have performed poorly in law school and experienced academic attrition, or performed poorly on the bar exam, then the law school has data that would make it challenging for the law school to demonstrate that applicants with those profiles are “capable of satisfactorily completing its program of legal education and being admitted to the bar.”

(I am very grateful for the helpful comments of Debby Merritt and Scott Norberg on earlier drafts of this blog posting.)

March 8, 2016 in Current events, Data on legal education | Permalink | Comments (0)

Saturday, February 27, 2016

Updated Analysis of Attrition through the 2014-15 Academic Year

In October 2015, I posted a blog discussing attrition rates between 2010 and 2014. With the release of the Standard 509 reports in December, I now have compiled attrition data from all of the fully-accredited ABA law schools outside of Puerto Rico for the last five full academic years. I have calculated average attrition rates for the class as a whole and then broken out average attrition rates by law schools in different median LSAT categories – 160+, 155-159, 150-154 and <150.

In a nutshell, overall first-year attrition has increased in each of the last four years, going from 5.81% to 7.04% over that period. This overall increase, however, results largely from increases in overall attrition among schools with median LSATs less than 150, as the overall attrition rates for law schools with median LSATs of 150 or greater have generally decreased over this period. “Academic attrition” rates increase significantly as median LSAT decreases, while “other attrition” presents more of a mixed record.

Average Overall First-Year Attrition Rates Continue to Increase

In calculating attrition rates, I wanted to capture those students who are no longer in law school anywhere. Thus, for these purposes, “attrition” is the sum of “academic attrition” and “other attrition.” “Academic attrition” occurs when a law school asks someone to leave because of inadequate academic performance. As of the 2014-15 academic year, “academic attrition” also includes a student who left voluntarily but who would have been asked to leave because of academic performance had the student not left voluntarily. “Other attrition” occurs when a student departs from the law school volitionally without being at risk of academic dismissal. Both of these categories exclude “transfers.”

The following chart shows that despite the declining “LSAT profile” of the entering classes between 2010 and 2014, there had not been any meaningful change in the average “academic attrition” rate for first-year students through the 2013-14 academic year, but that academic attrition increased modestly in 2014-15 to over 4%. Some portion of this increase in academic attrition might be attributable to the continued decline in the LSAT profile of the entering class of students in 2014. Given that there was a corresponding decline in “other attrition” (for the first time in the four-year period assessed), however, at least some portion of the increase in “academic attrition” would appear to be attributable to the redefinition of “academic attrition” described in the preceding paragraph. Roughly two-thirds of the increase in overall first-year attrition over this period, from 5.81% to 7.04%, is due to growth in the “academic attrition” category from 3.32% to 4.15%.

Overall First-Year Attrition for Classes Entering in 2010, 2011, 2012, 2013 and 2014

           Beg.         Academic    %          Other       %        Total       %
           Enrollment   Attrition   Academic   Attrition   Other    Attrition   Attrition
2010-11    50,408       1,673       3.32       1,256       2.49     2,929       5.81%
2011-12    46,477       1,551       3.34       1,262       2.72     2,813       6.06%
2012-13    42,399       1,461       3.45       1,186       2.80     2,647       6.25%
2013-14    38,837       1,316       3.39       1,236       3.18     2,552       6.57%
2014-15    37,086       1,539       4.15       1,072       2.89     2,611       7.04%

(Calculating attrition rates for 2010-11, 2011-12 and 2012-13, is a little more complicated than one might think. For ABA reporting years of 2011, 2012, and 2013, “academic attrition” was reported separately, but “other attrition” included “transfers out.” Thus, to generate the real “other attrition” number, one needed to “subtract” from “other attrition” the numbers associated with “transfers out.” Because some schools occasionally listed transfers out in “second year” “other attrition,” this analysis should be understood to have a little fuzziness to it for years 2010-11, 2011-12 and 2012-13. For ABA reporting years 2014 and 2015, transfers out were not commingled with “other attrition,” so the calculations were based solely on the sum of “academic attrition” and “other attrition.”)
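A minimal sketch of the calculation described above; the helper function and the transfer-adjustment step are my own illustration, using the 2010-11 figures from the table (where transfer counts have already been removed):

```python
# Sketch of the attrition-rate calculation described above. For ABA reporting
# years 2011-2013, "other attrition" as reported included transfers out, so
# transfers must be subtracted first; the helper makes that step explicit.

def attrition_rates(beg_enrollment, academic, other_reported, transfers_out=0):
    """Return (% academic, % other, % total) attrition, excluding transfers out."""
    other = other_reported - transfers_out
    total = academic + other
    as_pct = lambda n: round(100 * n / beg_enrollment, 2)
    return as_pct(academic), as_pct(other), as_pct(total)

# 2010-11 figures from the table above (transfers already excluded here):
acad, other, total = attrition_rates(50408, 1673, 1256)
# acad = 3.32, other = 2.49, total = 5.81
```

For the 2011-2013 reporting years, the same call would pass the school's reported transfers-out count so that "other attrition" reflects only students who left law school entirely.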

Academic Attrition Rates Increase as Law School Median LSAT Decreases

Notably, there are different rates of attrition across law schools in different LSAT categories. The following chart breaks down attrition by groups of law schools based on median LSAT for the law school for the entering class each year. For each year, the chart shows the average first-year attrition rates for law schools with median LSATs of 160 or higher, for law schools with median LSATs of 155-159, for law schools with median LSATs of 150-154 and for law schools with median LSATs less than 150. In addition, it breaks out “academic attrition” and “other attrition” as separate categories for each category of law school and for each year and then provides the total overall attrition rate each year along with the five-year average total overall attrition rate.

Average Attrition Rates by Category of Schools Based on Median LSAT

 

| Median LSAT | 2010-11 | 2011-12 | 2012-13 | 2013-14 | 2014-15 | Five-Year Avg. Total |
|---|---|---|---|---|---|---|
| 160+ | 0.6 / 1.7 / 2.3 | 0.6 / 1.9 / 2.5 | 0.4 / 2.0 / 2.4 | 0.3 / 1.5 / 1.8 | 0.3 / 1.3 / 1.6 | 2.1 |
| 155-159 | 2.9 / 2.6 / 5.5 | 2.2 / 2.8 / 5.1 | 2.1 / 2.9 / 5.1 | 1.7 / 3.2 / 4.9 | 2.0 / 2.6 / 4.6 | 5.0 |
| 150-154 | 6.3 / 3.8 / 10.1 | 6.2 / 3.4 / 9.6 | 6.0 / 3.7 / 9.7 | 4.2 / 4.3 / 8.5 | 4.7 / 4.0 / 8.7 | 9.3 |
| <150 | 10.1 / 2.4 / 12.5 | 9.4 / 3.8 / 13.2 | 9.1 / 3.0 / 12.2 | 9.7 / 4.7 / 14.4 | 12.7 / 4.4 / 17.1 | 13.9 |

(Each year’s cell shows Academic / Other / Total attrition rates, in percent.)

Several things about this data are worth noting.

Attrition Rates Increase as Median LSAT Decreases

As one moves from law schools in the highest LSAT category to the lowest LSAT category, overall attrition increases, going from an average over the five years of 2.1%, to 5.0%, to 9.3%, to 13.9%. “Academic attrition” consistently increases as median LSAT decreases, while “other attrition” increased as median LSAT decreased in only three of the five years.

Although this analysis is focused on four LSAT categories, the trend of having academic attrition increase as median LSAT decreases continues if you add a fifth LSAT category. In 2010-11 there was only one law school with a median LSAT of 145 or less, with only 320 students. By 2014-15, however, there were 12 law schools with a median LSAT of 145 or less, with 2,826 students. The average academic attrition rate at these 12 schools in 2014-15 was 15.6 percent. The academic attrition rate at the other 24 law schools with a median LSAT less than 150 but more than 145 was 10.1 percent.

The Top Three Categories of Law Schools Saw Decreases in Academic Attrition Over Time

Over the period from 2010-11 to 2014-15, “academic attrition” generally appears to be flat to decreasing for schools with median LSATs of 160+. For schools with median LSATs of 155-159 and 150-154, “academic attrition” generally declined from 2010-11 to 2013-14, then increased slightly in 2014-15 (although it remained well below 2010-11 levels). The only category in which academic attrition in 2014-15 exceeded academic attrition in 2010-11 was law schools with median LSATs <150, where the academic attrition rate increased from 10.1% to 12.7%.

 
[Chart]

By contrast, “other attrition” presents more of a mixed record over time, but decreased in 2014-15 across all LSAT categories (perhaps because of the redefinition of “academic attrition” discussed above).

 
 
[Chart]
[If you are wondering why the average overall attrition rate has increased while the overall attrition rates for the top three LSAT categories have decreased, the answer is because of the changing number of students in each category over time. The number of students and percentage of students in the top LSAT category has declined significantly, while the number of students and percentage of students in the bottom LSAT category has increased significantly. This results in the average overall attrition rate increasing even as rates in various categories are decreasing.]
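The bracketed note describes a classic mix-shift effect: every category’s rate can fall while the overall rate rises, if enrollment shifts toward the higher-attrition categories. A small illustration with hypothetical enrollments and rates:

```python
def overall_rate(enrollments, rates):
    """Enrollment-weighted overall attrition rate, in percent."""
    lost = sum(n * r / 100 for n, r in zip(enrollments, rates))
    return round(100 * lost / sum(enrollments), 2)

# Hypothetical two-category example: both category rates fall year
# over year, but enrollment shifts toward the high-attrition category.
year1 = overall_rate([8000, 2000], [2.5, 13.0])
year2 = overall_rate([4000, 6000], [2.0, 12.5])
print(year1, year2)  # -> 4.6 8.3 -- the overall rate still rises
```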

Increasing Variability in Attrition Rates

While it may make sense that “academic attrition” increases as law school median LSAT decreases, when one looks at the data within each LSAT category, there is a surprising range of academic attrition rates across law schools, with variability increasing significantly as median LSAT scores decrease. There was much less variability with respect to “other attrition.”

There were 50 law schools with median LSATs of 160+ in 2014-15, of which 37 (roughly 75%) had an academic attrition rate of 0, while the other 13 had academic attrition rates less than 5% with only four having academic attrition rates of 1% or more, topping out at 3.7%.

There also were 50 law schools with median LSATs of 155-159 in 2014-15, of which 11 had an academic attrition rate of 0 (roughly 22%), while 32 of these law schools had academic attrition rates of less than 5%, and seven had academic attrition rates of more than 5%, topping out at 8.6%.

There were 59 law schools with median LSATs of 150-154 in 2014-15, of which 10 had an academic attrition rate of 0 (roughly 17%), while 28 of these law schools had an academic attrition rate of less than 5%, 12 had an academic attrition rate of 5% to 10%, eight had an academic attrition rate of 10% to 15%, and one had an academic attrition rate in excess of 15% (17.9%).

Finally, there were 36 law schools with a median LSAT <150 in 2014-15, of which none had an academic attrition rate of 0, while seven had academic attrition rates less than 5%, 13 had academic attrition rates of 5% to 10%, four had academic attrition rates of 10% to 15%, and nine had academic attrition rates of 15% or more, of which five were over 20%, with one at 33%.

 
[Chart]
This phenomenon of increasing variability in attrition rates may merit further attention. For law schools with a similar 50th percentile LSAT and 25th percentile LSAT for their entering classes, what can explain a range of academic attrition from 2% to 20%? Does one school have a much higher standard for academic good standing and dismissal? Does one school have a much more robust academic support program? Have the professors at one school failed to adjust their grading to reflect a significantly different entering class profile among their students?

How does this varied approach to academic attrition ultimately impact bar passage results? If we have two law schools with comparable entering class profiles in states with comparable cut scores and bar passage percentages, does the law school with a higher rate of academic attrition show a higher bar passage rate when compared to the law school with a much lower rate of academic attrition? (I hope to explore this question in a subsequent blog posting.)

Unanswered Questions

The publicly-reported attrition data does not provide any information regarding the gender or ethnicity or socio-economic background of students leaving law school. Therefore, we don’t know whether there are different rates of attrition for women as compared with men or whether students of different ethnic backgrounds have different rates of attrition. We also don’t know whether first-generation law students experience attrition at different rates than other law students, or whether students of lower socio-economic status experience attrition at different rates than students of higher socio-economic status. Similarly, at law schools with part-time programs, we don’t know whether part-time students and full-time students experience attrition at comparable rates.

We also do not know for sure who is experiencing attrition within a given law school. The data presented here would suggest that students on the lower end of the distribution of a law school’s entering class profile are more likely to experience academic attrition than students on the higher end of the distribution, but presently that is not easily verified.

Further Thoughts

This is an appropriate time to pay closer attention to attrition data. The Standards Review Committee recently revisited Standard 501 and suggested to the Council that attrition rates might be used to inform the appropriateness of a law school’s admission policies. I hope to discuss the Standard Review Committee’s proposal at greater length in a subsequent blog posting. Given that the Standards Review Committee also recommended changes to Standard 316, the bar passage standard, trying to develop a better understanding of the relationship between academic attrition and bar passage (discussed above) also makes sense.

(I am grateful to Bernie Burk and Debby Merritt for comments on an earlier draft of this blog posting.)

February 27, 2016 in Data on legal education | Permalink | Comments (0)

Monday, January 18, 2016

Changes in Composition of the LSAT Profiles of Matriculants and Law Schools Between 2010 and 2015

In late December 2014, I posted a blog analyzing how the distribution of matriculants across LSAT categories had changed since 2010 based on the LSAC’s National Decision Profiles and on law school 50th percentile LSATs and 25th percentile LSATs across ranges of LSAT scores. With the LSAC’s recent release of the 2014-15 National Decision Profile and the ABA’s recent release of Standard 509 data, I am posting this blog to provide an update with the 2015 data.

At one level, this is a story that has already become well understood over the last year since my blog posting, with much discussion of the relationship between declining LSAT profiles and declining median MBE scores and bar passage rates. This 2015 information indicates that the decline in the LSAT profiles of matriculants and of law schools has continued, although with some moderation.

Given that the LSAT profiles of matriculants and of law schools for fall 2013, fall 2014 and fall 2015 are less robust than those for fall 2011 and fall 2012 (the classes that graduated in 2014 and 2015, respectively), one can anticipate that the declines in median MBE scaled scores and corresponding bar passage rates in 2014 and 2015 will continue in July 2016, 2017 and 2018 absent increases in attrition (I discussed attrition rates in a blog posting in October), significant improvement in academic support programs at law schools, or improved bar preparation efforts on the part of graduates.

Tracking Changes Based on LSAC’s National Decision Profiles – 2010-2015

The following discussion summarizes data in the LSAC’s National Decision Profiles from the 2009-10 admission cycle (fall 2010) through the 2014-15 admission cycle (fall 2015).

Let’s start with the big picture. If you take the matriculants each year and break them into three broad LSAT categories – 160+, 150-159, and <150 – the following chart and graph show the changes in percentages of matriculants in each of these categories over the last six years.

Change in Percentage of Matriculants in LSAT Categories – 2010-2015

 

| LSAT | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|---|
| <150 | 14.2 | 15.7 | 19.3 | 22.5 | 23 | 23.8 |
| 150-159 | 45 | 45.3 | 44.3 | 44.1 | 43.6 | 44.2 |
| 160+ | 40.8 | 39 | 36.3 | 33.4 | 33.5 | 32 |

Change in Percentage of Matriculants in LSAT Categories – 2010-2015 (Visual) 
[Chart]

Notably, this chart and graph show almost no change in the “middle” – 150-159 (blue, dropping from 45% to 44.2%) – with most of the change at 160+ (green, decreasing from 40.8% to 32%) and at <150 (red, increasing from 14.2% to 23.8%). This chart and graph also show some stabilization between 2013 and 2014, followed by a modest decline in 2015 in the percentage of students with LSATs of 160+ and a modest increase in the percentage of students with LSATs of <150.

While I think this tells the story pretty clearly, for those interested in more detail, the following charts provide a more granular analysis.

Changes in LSAT Distributions of Matriculants – 2010-2015

| LSAT | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 | Change in # | % Change in # |
|---|---|---|---|---|---|---|---|---|
| 170+ | 3,635 | 3,330 | 2,788 | 2,072 | 2,248 | 2,022 | (1,613) | -44.4% |
| 165-169 | 5,842 | 5,622 | 4,783 | 4,082 | 3,941 | 3,483 | (2,359) | -40.4% |
| 160-164 | 10,666 | 8,678 | 7,281 | 6,442 | 6,010 | 5,743 | (3,923) | -36.8% |
| 155-159 | 11,570 | 10,657 | 9,700 | 8,459 | 7,935 | 7,780 | (3,790) | -32.8% |
| 150-154 | 10,626 | 9,885 | 8,444 | 8,163 | 7,934 | 7,805 | (1,821) | -17.1% |
| 145-149 | 5,131 | 5,196 | 5,334 | 5,541 | 5,158 | 5,274 | 143 | 2.8% |
| <145 | 1,869 | 1,888 | 2,564 | 2,930 | 3,203 | 3,084 | 1,215 | 65% |
| Total | 49,339 | 45,256 | 40,894 | 37,689 | 36,429 | 35,191 | | |

Note that in terms of the percentage change in the number of matriculants in each LSAT category, the four highest LSAT categories are all down at least 30% since 2010, with 165-169 and 170+ down over 40%, while the two lowest LSAT categories are up, with <145 being up over 60%. 
[Chart]

Note that in the line graph above, the top two LSAT categories have been combined into 165+ while the bottom two LSAT categories have been combined into <150. Perhaps most significantly, in 2010, the <150 group, with 7,000 students, was over 2,400 students smaller than the next smallest category (165+ with 9,477) and more than 4,500 students smaller than the largest category (155-159 with 11,570). By 2015, however, the <150 category had become the largest category, with 8,358, more than 500 larger than the second category (150-154, with 7,805) and more than 2,800 larger than the smallest category, 165+ with only 5,505. Moreover, 88% of the growth in the <150 category was in the <145 category (1,215 of the 1,358 more people in the <150 category were in the <145 category).

Changes in Percentage of Matriculants in LSAT Ranges – 2010-2015

 

| LSAT | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 | % Chg in % |
|---|---|---|---|---|---|---|---|
| 170+ | 7.4 | 7.4 | 6.8 | 5.5 | 6.2 | 5.7 | -23% |
| 165-169 | 11.8 | 12.4 | 11.7 | 10.8 | 10.8 | 9.9 | -16.1% |
| 160-164 | 21.6 | 19.2 | 17.8 | 17.1 | 16.5 | 16.3 | -24.5% |
| 155-159 | 23.5 | 23.5 | 23.7 | 22.4 | 21.8 | 22.1 | -6% |
| 150-154 | 21.5 | 21.8 | 20.6 | 21.7 | 21.8 | 22.2 | 3.2% |
| 145-149 | 10.4 | 11.5 | 13 | 14.7 | 14.2 | 15 | 44.2% |
| <145 | 3.8 | 4.2 | 6.3 | 7.8 | 8.8 | 8.8 | 132% |

In terms of the “composition” of the class, i.e., the percentage of matriculants in each LSAT category, we see significant declines of 20% or more at 160-164 and 170+ and significant increases of 40% at 145-149 and over 100% at <145.
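The “% Chg in %” column is simply the relative change between the 2010 and 2015 shares. A quick sketch (the function name is illustrative), using the 170+ and <145 rows:

```python
def pct_change(share_2010, share_2015):
    """Relative change between two percentage shares, rounded to a whole percent."""
    return round(100 * (share_2015 - share_2010) / share_2010)

print(pct_change(7.4, 5.7))  # 170+ share: -> -23
print(pct_change(3.8, 8.8))  # <145 share: -> 132
```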

Tracking Changes in Law Schools by Looking at the Distribution of 50th Percentile LSAT Scores Across Six LSAT Categories

Obviously, this change in the composition of the entering class has resulted in corresponding changes in the LSAT profiles of law schools. Based on the data law schools reported in their Standard 509 Reports from 2010 to 2015, the chart below lists the numbers of law schools reporting a 50th percentile LSAT within certain LSAT ranges. (This chart excludes law schools in Puerto Rico and provisionally-approved law schools.)

Number of Law Schools with a 50th Percentile LSAT in Six LSAT Categories – 2010-2015

 

| Median LSAT | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|---|
| 165+ | 30 | 31 | 26 | 23 | 21 | 21 |
| 160-164 | 47 | 41 | 39 | 31 | 29 | 28 |
| 155-159 | 59 | 57 | 56 | 53 | 51 | 48 |
| 150-154 | 50 | 52 | 53 | 56 | 59 | 59 |
| 145-149 | 9 | 14 | 22 | 28 | 29 | 33 |
| <145 | 0 | 1 | 0 | 5 | 7 | 7 |
| Total | 195 | 196 | 196 | 196 | 196 | 196 |

The table above pretty clearly demonstrates the changes that have taken place since 2010, with declines in the number of law schools with a 50th percentile LSAT in higher LSAT categories and increases in the number of law schools with a 50th percentile LSAT in the lower LSAT categories, although 2015 saw only modest changes from 2014 at 160-164 (down 1), at 155-159 (down 3) and at 145-149 (up 4). 

[Chart]

As shown in the chart above, the number of law schools with a 50th percentile LSAT of 155 or higher has declined from 136 to 97. By contrast, the number of law schools with a 50th percentile LSAT of 154 or lower has increased from 59 to 99. In 2010, therefore, there were more than twice as many law schools with a 50th percentile LSAT of 155 or higher as compared with the number with a 50th percentile LSAT of 154 or lower (136 and 59, respectively), but as of 2015, those numbers were nearly identical (97 and 99, respectively).

The “mode” in 2010 was in the 155-159 category, with nearly 60 law schools, but by 2014, the “mode” had shifted to the 150-154 category with nearly 60 law schools.

Perhaps most pronounced is the shift in the upper and lower ranges. As shown in the chart below, the number of law schools with a 50th percentile LSAT of 160 or higher has dropped by more than one-third, from 77 to 49, while the number of law schools with a 50th percentile LSAT of 149 or lower has more than quadrupled from 9 to 40. In 2010, there were only three law schools with a 50th percentile LSAT of 145 or 146; as of 2015, there were 15 law schools with a 50th percentile LSAT of 146 or lower, of which five were at 143 or lower, with the two lowest being 142 and 141.
[Chart]

   
[Chart]
Tracking Changes in Law Schools by Looking at the Distribution of 25th Percentile LSAT Scores Across Six LSAT Categories

For those who want to focus on the bottom 25th percentile of LSAT profile among law schools, the table below shows changes in distribution of the bottom 25th percentile LSAT among law schools across six LSAT categories between 2010 and 2015.

Number of Law Schools with a 25th Percentile LSAT in Six LSAT Categories – 2010-2015

 

| 25th Percentile LSAT | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|---|
| 165+ | 17 | 16 | 11 | 10 | 10 | 7 |
| 160-164 | 26 | 20 | 21 | 17 | 15 | 17 |
| 155-159 | 55 | 54 | 49 | 42 | 41 | 38 |
| 150-154 | 67 | 69 | 59 | 65 | 57 | 59 |
| 145-149 | 26 | 33 | 46 | 48 | 48 | 52 |
| <145 | 4 | 4 | 10 | 14 | 25 | 23 |
| Total | 195 | 196 | 196 | 196 | 196 | 196 |

With respect to changes between 2014 and 2015, this table shows a little more variability, with decreases in three categories – 165+ (down 3), 155-159 (down 3), and <145 (down 2) – and increases in three categories – 160-164 (up 2), 150-154 (up 2), and 145-149 (up 4).

Looking at changes between 2010 and 2015, note that the four top categories have all declined, while the number of law schools with a 25th percentile LSAT of 145-149 has doubled and the number of law schools with a 25th percentile LSAT of <145 has more than quintupled from four in 2010 (two at 144 and two at 143), to 23 in 2015, with 13 of them at 142 and below.

[Chart]

As shown in the chart below, in 2010, the number of law schools with a 25th percentile LSAT of 155 or higher and the number with a 25th percentile LSATs of 154 or lower were nearly identical (98 and 97, respectively). As of 2015, however, there were more than twice as many law schools with a 25th percentile LSAT of 154 or lower when compared with those with a 25th percentile LSAT of 155 or higher (134 and 62, respectively).
[Chart]

Moreover, between 2010 and 2015, the number of law schools with a 25th percentile LSAT of 160 or higher has fallen more than 40% from 43 to 24, while the number with a 25th percentile LSAT of 149 or lower has more than doubled from 30 to 75. 

 
[Chart]
 

Changes in Average 75th, 50th and 25th Percentile LSATs Across Fully-Accredited ABA Law Schools

One other way of looking at this is just to see how the average first-year LSAT and UGPA profiles have changed over the last six years.

Average LSATs of Matriculants at Fully-Accredited ABA Law Schools

 

| Year | 75th Percentile | 50th Percentile | 25th Percentile |
|---|---|---|---|
| 2010 | 160.5 | 158.1 | 155.2 |
| 2011 | 160.1 | 157.8 | 154.5 |
| 2012 | 159.6 | 157 | 153.6 |
| 2013 | 158.7 | 156 | 152.6 |
| 2014 | 158.2 | 155.4 | 151.8 |
| 2015 | 157.9 | 155.3 | 151.8 |
| Overall Drop | -2.6 | -2.8 | -3.4 |

(Note that these are not weighted averages based on the number of matriculants at each school, but are simply averages across law schools.)

Notably, over this same period of time the average UGPAs have fallen modestly as well from a 75th/50th/25th profile of 3.63 – 3.41 – 3.14 in 2010 to 3.6 – 3.37 – 3.09 in 2015.

Conclusion

If one focuses on the LSAT scores and UGPAs as measures of “quality” of the entering class of law students each year, then the period from 2010-2015 not only has seen a significant decline in enrollment, it also has seen a significant decline in “quality.”

The LSAC’s most recent Current Volume Report (January 8, 2016) suggests that the pool of applicants to law schools is rebounding slightly in this current cycle. With 22,662 applicants at a point in the cycle at which 40% of applicants had been received last year, one can project an applicant pool of roughly 56,600. The “quality” of applicants also appears to be stronger, with double digit percentage increases in applicants to date in LSAT categories of 165 and higher. If these trends continue in the applicant pool for the current cycle, then the fall 2015 entering class may represent the “bottom” both in terms of the number of matriculants and in terms of the “quality” of the matriculants as measured by LSAT and UGPA. Of course, we won’t know for sure about that until next December when the 2016 Standard 509 Reports are published.
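The projection above is a straight proportion: if 40% of last cycle’s applicants had been received by this point in the cycle last year, scaling the current count by 1/0.40 yields the estimate. A sketch, assuming this cycle’s pace mirrors last year’s (the function name is illustrative):

```python
def project_pool(applicants_to_date, fraction_received_by_now_last_year):
    """Project the final applicant pool from a partial-cycle count,
    assuming this cycle follows the same pace as the prior cycle."""
    return round(applicants_to_date / fraction_received_by_now_last_year)

print(project_pool(22662, 0.40))  # -> 56655, i.e. roughly 56,600
```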

(I am grateful for the helpful comments of Scott Norberg on an earlier draft of this blog.)

January 18, 2016 in Data on legal education | Permalink | Comments (1)

Thursday, December 31, 2015

Conditional Scholarship Programs: Comparing 2014-15 with 2011-12

A few years ago, the Council for the Section of Legal Education and Admissions to the Bar approved revisions to Standard 509, requiring that law schools post a chart identifying the number of conditional scholarships given to incoming first years and the number of those scholarship recipients whose scholarships were reduced or eliminated at the end of the first year.

As a result of this requirement, there is now a much greater universe of publicly available information about law school scholarship programs. In the summer of 2013, I posted to SSRN an article entitled Better Understanding the Scope of Conditional Scholarship Programs among American Law Schools, summarizing the first year of available data on conditional scholarship programs, covering the 2011-12 academic year.

Law schools have now published this data for four years, with data covering the 2014-15 academic year having just been released as of December 15.

This blog posting highlights the smaller number of law schools with conditional scholarship programs as of 2014-15, summarizes the extent to which the number and percentage of first-year students with conditional scholarships and the number and percentage of rising second-year students whose scholarships were reduced or eliminated has changed since 2011-12, and looks at how the distribution of retention rates by decile has changed since 2011-12. It also analyzes both the prevalence of conditional scholarship programs among law schools across different rankings categories and the extent to which scholarship retention rates differ among law schools across different rankings categories.

1. Number of Law Schools with Conditional Scholarship Programs Declines

Excluding the three law schools in Puerto Rico, there were 140 fully-accredited ABA law schools with conditional scholarship programs in 2011-12. For the 2014-15 academic year, however, the number of fully-accredited ABA law schools with conditional scholarship programs had dropped by 27, to 113, a decline of nearly 20%.

2. Average Conditional Scholarship Retention Rate Increases Modestly

In 2011-12, the average scholarship retention rate across the 140 law schools with conditional scholarship programs was 69%. In total, 12,681 students who entered law school in the fall of 2011 and continued into their second year of law school at the same school entered with conditional scholarships and 4,332 of those students had their scholarships reduced or eliminated, a retention rate across individual students of roughly 66%.

For the 2014-15 academic year, the average retention rate across the 113 law schools with conditional scholarship programs increased to 73.2%. In total, 10,099 students who entered law school in the fall of 2014 and continued into their second year of law school at the same school entered with conditional scholarships and 2,880 of those students had their scholarships reduced or eliminated. Thus, the retention rate across individual students also increased to roughly 71.5%.
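The gap between the 73.2% school-level average and the 71.5% student-level rate arises because the school average weights every school equally, regardless of how many conditional scholarships it awarded. A sketch with hypothetical school figures:

```python
def retention_rates(per_school):
    """per_school: list of (conditional_scholars, reduced_or_eliminated).

    Returns (unweighted average of per-school retention rates,
             pooled student-level retention rate), both in percent.
    """
    school_rates = [100 * (n - lost) / n for n, lost in per_school]
    avg_of_schools = sum(school_rates) / len(school_rates)
    total_n = sum(n for n, _ in per_school)
    total_lost = sum(lost for _, lost in per_school)
    pooled = 100 * (total_n - total_lost) / total_n
    return round(avg_of_schools, 1), round(pooled, 1)

# A small school with high retention pulls the school average
# above the pooled student-level rate:
print(retention_rates([(40, 2), (400, 140)]))  # -> (80.0, 67.7)
```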

3. Percentage of First-Year Students with Conditional Scholarships Stays the Same While the Percentage of Rising Second-Year Students Whose Scholarships were Reduced or Eliminated Declines Slightly 

Across the 194 law schools on which I compiled data for the 2011-12 academic year, the fall 2011 entering first-year class totaled 46,388. Thus, roughly 27.3% (12,681/46,388) of the students in the fall 2011 entering class were on conditional scholarships, and roughly 9.4% (4,332/46,388) of all first-year students in that class failed to retain their conditional scholarship as they moved into the second year of law school.

Interestingly, of the 37,086 first-years who matriculated at fully-accredited ABA law schools in fall 2014, roughly 27.2% (10,099/37,086) were on conditional scholarships, almost the exact same percentage as in 2011-12. But a smaller percentage, roughly 7.8% (2,880/37,086) of all the first-year students who entered law school in fall 2014 failed to retain their conditional scholarship as they moved into the second year of law school.

Therefore, even though fewer law schools had conditional scholarship programs, those with such programs offered conditional scholarships to a larger percentage of students, such that the overall percentage of students with conditional scholarships remained roughly the same (27.3% in 2011-12 compared with 27.2% in 2014-15). Nonetheless, because there was a modest increase in retention rates, a smaller percentage of the overall population of students (7.8% in 2015 compared with 9.4% in 2012) saw their conditional scholarships reduced or eliminated.
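The class-wide percentages used throughout this section are straightforward ratios against the size of the entering class; for fall 2014, for example (the function name is illustrative):

```python
def class_shares(first_years, on_conditional, reduced_or_eliminated):
    """Percent of the entering class on conditional scholarships, and
    percent of the entering class whose scholarships were later cut."""
    return (round(100 * on_conditional / first_years, 1),
            round(100 * reduced_or_eliminated / first_years, 1))

print(class_shares(37086, 10099, 2880))  # -> (27.2, 7.8)
```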

4. Distribution of Retention Rates by Decile Shows Fewer Schools with Lower Retention Rates

The distribution of retention rates by decile across all 140 law schools reporting conditional scholarship programs for 2011-12 and all 113 law schools reporting conditional scholarship programs for 2014-15 is set forth in Table 1. The biggest change reflected in Table 1 is the decrease in the number of law schools with retention rates of less than 70%, falling from 73 in 2011-12 to 40 in 2014-15.

Table 1: Number of Law Schools Reporting Retention Rates by Decile Range

| Retention Rate | 2012 | 2015 | 2012 Description | 2015 Description |
|---|---|---|---|---|
| Less than 40% | 8 | 6 | Four of the eight were ranked alphabetically | Five of the six were ranked 100 or lower |
| 40-49% | 12 | 4 | Six of the 12 were ranked between 50 and 99 | Two of the four were ranked 51-99 |
| 50-59% | 23 | 17 | 18 of the 23 were ranked 99 or lower | 14 of the 17 were ranked 100 or lower |
| 60-69% | 30 | 13 | 23 of the 30 were ranked 100 or lower | Nine of the 13 were ranked 100 or lower |
| 70-79% | 23 | 23 | 13 of the 23 were ranked in the top 100 | 17 of the 23 were ranked 100 or lower |
| 80-90% | 19 | 24 | 14 of the 19 were ranked between 51 and 145 | 10 of the 24 were ranked 51-99 |
| 90% or better | 25 | 26 | 12 of the 25 were ranked in the top 50 | 14 of the 26 were ranked in the top 100, five in the top 50 |
| TOTAL | 140 | 113 | | |

 

5. Differences in Conditional Scholarship Programs across Law School Rankings Categories

As noted in some of the descriptors in Table 1, there also were some differences based on the rankings of the law schools in U.S. News, using the March 2012 rankings and the March 2015 rankings. These differences are summarized in Tables 2A and 2B and the discussion that follows.

Table 2A – Changes in Number and Percentage of First-Year Students with Conditional Scholarships across Different Categories of Law Schools Based on U.S. News Rankings for 2012 and 2015

| Rank | # of Schools | # with Cond. Scholarships | % of Schools with Cond. Scholarships | # of 1Ls | # of 1Ls with Cond. Scholarships | % of 1Ls with Cond. Scholarships |
|---|---|---|---|---|---|---|
| Top 50 | 50 / 51 | 20 / 10 | 40 / 20 | 13,109 / 11,715 | 1,656 / 814 | 12.8 / 6.9 |
| 51-100 | 50 / 50 | 40 / 31 | 80 / 62 | 11,592 / 8,972 | 4,179 / 3,159 | 36 / 35.2 |
| 101-150 | 46 / 52 | 36 / 38 | 78 / 73 | 9,293 / 7,899 | 2,803 / 2,997 | 30.1 / 37.9 |
| Alpha. | 48 / 42 | 44 / 34 | 92 / 81 | 12,394 / 8,500 | 4,043 / 3,129 | 32.6 / 36.8 |
| TOTAL | 194 / 195 | 140 / 113 | 72 / 58 | 46,388 / 37,086 | 12,681 / 10,099 | 27.3 / 27.2 |

(Each cell shows the 2012 figure / the 2015 figure.)

Table 2A shows that the percentage of fully-accredited ABA law schools with conditional scholarship programs increases as you move down the U.S. News Rankings. Indeed, of the 27 law schools that moved away from conditional scholarship programs between 2011-12 and 2014-15, 19 were ranked in the top 100. As a result, only 41 of the top 100 law schools had conditional scholarship programs as of 2014-15, while 72 law schools ranked 101 or lower had conditional scholarship programs.

Table 2A also shows that the percentage of students with conditional scholarships is basically divided into two camps. Within the top 50 law schools, only 10 law schools have conditional scholarship programs, and five of those have retention rates of 100%, such that really only five of the 51 law schools in the top 50 have meaningful conditional scholarship programs. (At these five schools, however, roughly one in three students with conditional scholarships saw their scholarships reduced or eliminated.) Across all 51 law schools in the top 50, only 6.9% of first-year students have conditional scholarships. Throughout the rest of the law schools, however, roughly 35% to 38% of first-year law students in each rankings category have conditional scholarships.

Even though the percentage of first-year students with conditional scholarships declined among top 50 law schools between 2011-12 and 2014-15, the percentage increased among law schools ranked 101 and below, such that the overall percentage of first-year students with conditional scholarship has remained almost the same between 2011-12 and 2014-15, at slightly more than 27%.

Table 2B – Changes in Retention Rates of Conditional Scholarship Recipients across Different Categories of Law Schools Based on U.S. News and World Report Rankings for 2012 and 2015

| Rank | # Scholarships Not Retained | % of Scholarships Not Retained | % of All 1Ls Who Did Not Retain Scholarship |
|---|---|---|---|
| Top 50 | 186 / 167 | 11.2 / 20.5 | 1.4 / 1.4 |
| 51-100 | 1,452 / 672 | 34.7 / 21.3 | 12.5 / 7.5 |
| 101-150 | 1,069 / 917 | 38.1 / 30.6 | 11.5 / 11.6 |
| Alpha. | 1,625 / 1,124 | 40.2 / 35.9 | 13.1 / 13.2 |
| TOTAL | 4,332 / 2,880 | 34.2 / 28.5 | 9.4 / 7.8 |

(Each cell shows the 2012 figure / the 2015 figure.)

Table 2B shows that the percentage of all students whose conditional scholarships were reduced or eliminated in 2014-15 consistently climbs as one moves down the rankings categories – going from 1.4% among law schools ranked in the top 50, to 7.5% for law schools ranked 51-100, to 11.6% for law schools ranked 101-150, to 13.2% for law schools ranked alphabetically.

Table 2B shows that among law schools ranked in the top 50, the average percentage of conditional scholarship recipients whose scholarships were reduced or eliminated increased between 2011-12 and 2014-15 from 11.2% to 20.5%. (But remember, this is across a relatively small sample of 10 schools and 814 students.) By contrast, across each of the other three categories of law schools, the average percentage of conditional scholarship recipients whose scholarships were reduced or eliminated declined between 2011-12 and 2014-15, from 34.7% to 21.3% among schools ranked 51-100, from 38.1% to 30.6% among schools ranked 101-150, and from 40.2% to 35.9% among law schools ranked alphabetically.

Nonetheless, law schools ranked 51-100 were the only category of law schools which saw a decrease in the percentage of all rising second-year students whose scholarships were reduced or eliminated, as the percentage fell from 12.5% to 7.5%. For top 50 law schools, the combination of fewer students with conditional scholarships, but a higher rate at which scholarships were reduced or eliminated, meant that 1.4% of all students saw their scholarships reduced or eliminated, the same percentage as in 2011-12. For law schools ranked 101 and below, the combination of more students with conditional scholarships, but a lower rate at which scholarships were reduced or eliminated, meant that roughly the same percentage of all students saw their scholarships reduced or eliminated in 2014-15 as in 2011-12 (for law schools ranked 101-150, 11.6% in 2014-15 compared with 11.5% in 2011-12; for law schools ranked alphabetically, 13.2% in 2014-15 compared with 13.1% in 2011-12).  Because of the decrease in the percentage of students whose scholarships were reduced or eliminated in the category of law schools ranked 51-100, however, the percentage of all students who saw their scholarships reduced or eliminated fell from 9.4% in 2011-12 to 7.8% in 2014-15.

Conclusion

Even though 27 fewer law schools had conditional scholarship programs in 2014-15 than in 2011-12, the percentage of all first-year students on conditional scholarships in 2014-15 was nearly the same as in 2011-12 because the 113 schools with conditional scholarship programs, on average, gave conditional scholarships to a larger percentage of their students.

Only 20% of law schools in the top 50 have conditional scholarship programs, and only 10% actually reduced or eliminated scholarships for some of their students. Outside the top 50 law schools, however, more than two-thirds of law schools have conditional scholarship programs and roughly 36.5% of all law students have conditional scholarships. This means more than one-third of first-year students in law schools ranked 51 and below in the U.S. News 2015 rankings needed to be concerned about whether they would perform well enough to retain their conditional scholarships.

December 31, 2015 in Data on legal education | Permalink | Comments (0)

Tuesday, December 29, 2015

Updating the Transfer Market Analysis for 2015

This blog posting updates my blog postings here and here of December 2014 regarding what we know about the transfer market. With the release of the 2015 Standard 509 Reports, we now have two years of more detailed transfer data from which to glean insights about the transfer market among law schools.

NUMBERS AND PERCENTAGES OF TRANSFERS – 2011-2015

The number of transfers dropped to 1,979 in 2015, down from 2,187 in 2014 and 2,501 in 2013. The percentage of the previous fall’s entering class that engaged in the transfer market also dropped slightly, to 5.2%, down from 5.5% in 2014 and 5.6% in 2013. In other words, there is no reason to believe the transfer market is “growing” as a general matter. It has stayed fairly consistently in the 4.6% to 5.6% range for the last five years, with an average of 5.2%.

| | 2011 | 2012 | 2013 | 2014 | 2015 |
| --- | --- | --- | --- | --- | --- |
| Number of Transfers | 2,427 | 2,438 | 2,501 | 2,187 | 1,979 |
| Previous Year First-Year Enrollment | 52,500 | 48,700 | 44,500 | 39,700 | 37,900 |
| % of Previous First-Year Total | 4.6% | 5.0% | 5.6% | 5.5% | 5.2% |
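The bottom row of the table is simple division: transfers as a share of the prior fall's entering class. A short sketch reproduces it from the figures above:

```python
# Transfer-market share: transfers as a percentage of the prior fall's 1L class.
# Counts are taken from the table above (enrollment figures are rounded).
transfers = {2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187, 2015: 1979}
prior_1l  = {2011: 52500, 2012: 48700, 2013: 44500, 2014: 39700, 2015: 37900}

share = {yr: round(100 * transfers[yr] / prior_1l[yr], 1) for yr in transfers}
print(share)  # {2011: 4.6, 2012: 5.0, 2013: 5.6, 2014: 5.5, 2015: 5.2}
```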

SOME LAW SCHOOLS CONTINUE TO DOMINATE THE TRANSFER MARKET

The following two charts list the top 15 law schools in terms of receiving transfer students in descending order in Summer 2013 (fall 2012 entering class), Summer 2014 (fall 2013 entering class), and Summer 2015 (fall 2014 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first-year class.

Note the “repeat players” in these two charts: seven of the top ten by number in 2015 and seven of the top ten by percentage in 2015 have been among the top 15 on each list for all three years.

Largest Law Schools by Number of Transfers from 2013-2015

| School (2013) | # | School (2014) | # | School (2015) | # |
| --- | --- | --- | --- | --- | --- |
| Georgetown | 122 | Georgetown | 113 | Georgetown | 110 |
| George Wash. | 93 | George Wash. | 97 | George Wash. | 109 |
| Florida St. | 90 | Arizona St. | 66 | Arizona St. | 65 |
| Emory | 75 | Idaho | 57 | Harvard | 55 |
| Arizona St. | 73 | Cal. Berkeley | 55 | Emory | 51 |
| American | 68 | NYU | 53 | NYU | 51 |
| Texas | 59 | Emory | 50 | Cal. Berkeley | 49 |
| Columbia | 52 | Columbia | 46 | Rutgers | 45 |
| NYU | 47 | American | 44 | Columbia | 44 |
| Minnesota | 45 | UCLA | 44 | Miami | 44 |
| Arizona | 44 | Wash. Univ. | 44 | UCLA | 43 |
| Northwestern | 44 | Texas | 43 | Texas | 37 |
| UCLA | 41 | Minnesota | 37 | American | 33 |
| Cardozo | 38 | Northwestern | 35 | Florida St. | 32 |
| Southern Cal. | 37 | Harvard | 33 | Minnesota | 31 |
| TOTAL | 928 | | 817 | | 799 |
| % of All Transfers | 37.1% | | 37.4% | | 40.4% |

 

Largest Law Schools by Transfers as a Percentage of Previous First Year Class - 2013-2015

| School (2013) | % | School (2014) | % | School (2015) | % |
| --- | --- | --- | --- | --- | --- |
| Florida State | 48.1 | Arizona State | 51.6 | Arizona State | 45.5 |
| Arizona State | 48 | Idaho | 51.4 | Emory | 22.9 |
| Utah | 34.7 | Washington Univ. | 23.3 | George Wash. | 20.2 |
| Emory | 29.6 | Emory | 22.9 | Miami | 19.2 |
| Arizona | 28.9 | Georgetown | 20.8 | Georgetown | 19 |
| Minnesota | 22 | George Wash. | 20.2 | Cal. Berkeley | 17.9 |
| George Wash. | 21.8 | Cal. Berkeley | 19.4 | Florida St. | 17 |
| Georgetown | 21.2 | Florida St. | 18.2 | Florida Int’l | 16.7 |
| Rutgers-Camden | 20.7 | Rutgers-Camden | 17.1 | Minnesota | 16.1 |
| Southern Cal. | 19.7 | Southern Cal. | 17.1 | Utah | 16 |
| Texas | 19.1 | Minnesota | 16.7 | UNLV | 14.3 |
| Cincinnati | 17.5 | Utah | 15.9 | UCLA | 13.7 |
| Northwestern | 17.1 | Northwestern | 15.3 | Texas | 12.3 |
| Washington Univ. | 15.4 | UCLA | 15 | Chicago | 12.1 |
| Univ. Washington | 15.3 | Seton Hall | 14.5 | Rutgers | 12.1 |

Interestingly, the number of law schools welcoming transfers representing more than 20% of their first-year class has fallen from nine in 2013 to only three in 2015.

Nonetheless, as shown in the following chart, we are continuing to see a modest increase in concentration in the transfer market between 2011 and 2015 as the ten law schools with the most students transferring in captured an increasing share of the transfer market, from 23.5% in 2011 to 31.5% in 2015.  Nearly one-third of all transfers in 2015 transferred to one of the ten schools with the most transfers.

Top Ten Law Schools as a Percentage of All Transfers

| | 2011 | 2012 | 2013 | 2014 | 2015 |
| --- | --- | --- | --- | --- | --- |
| Total Transfers | 2,427 | 2,438 | 2,501 | 2,187 | 1,979 |
| Transfers to 10 Law Schools with Most Transfers | 570 | 587 | 724 | 625 | 623 |
| Transfers to 10 Law Schools with Most Transfers as % of Total Transfers | 23.5% | 24.1% | 28.9% | 28.6% | 31.5% |

NATIONAL AND REGIONAL MARKETS

Starting in 2014, the ABA Section of Legal Education and Admissions to the Bar began requiring schools with more than twelve transfers in to report not only the number of students who transferred in, but also the law schools from which they came (indicating the number from each law school), along with the 75th, 50th, and 25th percentile first-year law school GPAs of the students who transferred in. This allows us to look at where transfer students are coming from and going to, and at the first-year GPA profile of students transferring in to different law schools. The following chart focuses on the top 15 law schools in terms of transfers in for 2015, presented in descending USNews rank. It indicates the extent to which these law schools attracted transfers from their own geographic region, and identifies the law school that provided the largest number of transfers to each listed law school as well as the percentage of transfers that came from that school.

Percentage of Transfers from Within Geographic Region and Top Feeder School(s)

| School | # of Transfers (14/15) | Region | Regional # of Transfers (14/15) | Regional % of Transfers (14/15) | School from Which Largest Number of Transfers Came in 2015 | #/% of Transfers |
| --- | --- | --- | --- | --- | --- | --- |
| Harvard | 33/55 | NE | 6/15 | 18/27 | GWU | 6/11% |
| Columbia | 46/44 | NE | 19/19 | 41/43 | Wash. Univ. | 5/11% |
| NYU | 50/51 | NE | 20/15 | 40/29 | Georgetown | 7/14% |
| Berkeley | 55/49 | CA | 43/29 | 78/59 | Hastings | 19/39% |
| Georgetown | 113/110 | Mid-Atl | 49/43 | 43/39 | GWU | 11/10% |
| Texas | 43/37 | TX | 27/22 | 63/59 | Texas Tech | 6/16% |
| UCLA | 44/43 | CA | 31/26 | 70/60 | Hastings | 9/21% |
| Emory | 53/51 | SE | 40/31 | 75/61 | Atlanta’s John Marshall | 11/22% |
| Minnesota | 37/31 | MW | 21/17 | 57/55 | Hamline | 9/29% |
| GWU | 97/109 | Mid-Atl | 78/70 | 80/64 | American | 44/40% |
| Arizona St. | 66/65 | SW | 51/48 | 77/74 | Arizona Summit | 47/72% |
| Florida St. | 31/32 | SE | 24/27 | 77/84 | Florida Coastal | 11/34% |
| Miami | 29/44 | SE | 21/27 | 72/61 | St. Thomas | 12/27% |
| American | 44/33 | Mid-Atl | 14/6 | 32/18 | Charleston | 3/9% |
| Rutgers* | 45 | NE | 29 | 64 | Widener-Delaware | 10/22% |

*Rutgers is a unified school as of 2015, but for 2014 reported data separately for the Newark campus and the Camden campus, so this only reports the 2015 data.

For these top 15 law schools for transfer students in 2015, 10 schools obtained most of their transfers (55% or more) from within the geographic region within which the law school is located, while five schools (Harvard, Columbia, NYU, Georgetown and American) had fewer than 45% of their transfers from within the region in which they are located.  Interestingly, 11 of the 14 law schools with data for both 2014 and 2015 saw a decline in the percentage of transfers from within the region in which the law school is located. Only two law schools in 2015 had more than 70% of their transfers from within the region in which the law school is located (Arizona State and Florida State), down from seven such law schools in 2014.

Moreover, several law schools had a significant percentage of their transfers from one particular feeder school.  For Miami, roughly 27% of its transfers came from St. Thomas University (Florida); for Berkeley, roughly 39% of its transfers came from Hastings; for George Washington, 40% of its transfers came from American; and for Arizona State, 72% of its transfers came from Arizona Summit.

The chart below shows the tiers of law schools from which the largest 15 law schools in the transfer market received their transfer students.  Ten of the top 15 law schools for transfers are ranked in the top 20 in USNews, but of those 10, only six had 75% or more of their transfers from schools ranked between 1 and 99 in the USNews rankings – Harvard, Columbia, NYU, Berkeley, UCLA and George Washington.  Two additional schools, Georgetown and Texas, had at least 50% of their transfers from schools ranked between 1 and 99.  The remaining two law schools ranked in the top 20 in USNews (Emory and Minnesota) and the other five law schools in the list had at least half of their transfer students from law schools ranked 100 or lower, with five of those law schools having 75% or more of their transfers from law schools ranked 100 or lower. 

In addition, it shows that as you move down the rankings of law schools that are large players in the transfer market, the general trend in first-year law school GPA shows a significant decline, with several schools taking a number of transfers with first-year GPAs below a 3.0, including Minnesota, Arizona State, Florida State, Miami and American.

Percentage of Transfers from Different Tiers of School(s) for 2014 and 2015, Along With First-Year Law School GPA (75th/50th/25th)

(In each column, the number on the left is the 2014 number and the number on the right is the 2015 number.)


| School | # of Transfers | Rank Top 50 # | Rank Top 50 % | Rank 51-99 # | Rank 51-99 % | Rank 100+ # | Rank 100+ % | GPA 75th | GPA 50th | GPA 25th |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Harvard | 33/55 | 23/41 | 70/75 | 10/13 | 30/24 | 0/1 | 0/2 | 3.95/3.98 | 3.9/3.92 | 3.83/3.85 |
| Columbia | 46/44 | 29/30 | 63/68 | 14/10 | 30/23 | 3/4 | 7/9 | 3.81/3.82 | 3.75/3.76 | 3.69/3.66 |
| NYU | 50/51 | 41/40 | 82/78 | 7/10 | 14/20 | 2/1 | 4/2 | 3.74/3.76 | 3.62/3.68 | 3.47/3.52 |
| Berkeley | 55/49 | 17/15 | 31/31 | 27/26 | 49/53 | 11/8 | 20/16 | 3.9/3.87 | 3.75/3.81 | 3.68/3.69 |
| Georgetown | 113/110 | 27/30 | 24/27 | 38/30 | 34/27 | 48/50 | 42/45 | 3.77/3.77 | 3.67/3.66 | 3.55/3.59 |
| Texas | 43/37 | 17/10 | 40/27 | 13/13 | 30/35 | 13/14 | 30/38 | 3.62/3.6 | 3.45/3.46 | 3.11/3.32 |
| UCLA | 44/43 | 15/15 | 34/35 | 23/23 | 52/53 | 6/5 | 14/12 | 3.73/3.7 | 3.58/3.58 | 3.44/3.46 |
| Emory | 53/51 | 3/5 | 6/10 | 7/8 | 13/16 | 43/38 | 81/75 | 3.42/3.45 | 3.27/3.3 | 2.93/3.06 |
| GWU | 97/109 | 13/21 | 13/19 | 73/63 | 75/58 | 11/25 | 11/23 | 3.53/3.46 | 3.35/3.32 | 3.21/3.15 |
| Minnesota | 37/31 | 4/6 | 11/19 | 12/7 | 32/23 | 21/18 | 57/58 | 3.3/3.43 | 3.1/3.12 | 2.64/2.96 |
| Arizona St. | 66/65 | 4/0 | 6/0 | 5/6 | 8/9 | 57/59 | 86/91 | 3.51/3.5 | 3.23/3.17 | 2.97/2.95 |
| Florida St. | 31/32 | 2/0 | 6/0 | 4/2 | 13/6 | 25/30 | 81/94 | 3.29/3.32 | 3.1/3.14 | 2.9/2.96 |
| Miami | 29/44 | 1/3 | 3/7 | 4/7 | 14/16 | 24/34 | 83/77 | 3.3/3.26 | 3.07/3.05 | 2.87/2.9 |
| American | 44/33 | 2/0 | 5/0 | 14/1 | 32/3 | 28/32 | 64/97 | 3.25/3.04 | 2.94/2.89 | 2.78/2.74 |
| Rutgers | 45 | 0 | 0 | 2 | 4 | 43 | 96 | -/3.29 | -/3.05 | -/2.75 |

STILL MANY UNKNOWNS

As I noted last year, this more detailed transfer data should be very helpful to prospective law students and pre-law advisors, and to current law students who are considering transferring.  This data gives them a better idea of what transfer opportunities might be available depending upon where they go to law school (or are presently enrolled as a first-year student).

Even with this more granular data now available, however, there still are a significant number of unknowns relating to transfer students.  In an upcoming post, I will touch on some questions that remain unanswered about the transfer market as well as a few other aspects of the transfer experience.

December 29, 2015 in Data on legal education | Permalink | Comments (0)

Sunday, December 6, 2015

The Opaqueness of Bar Passage Data and the Need for Greater Transparency

There has been a great deal of discussion lately over at The Faculty Lounge regarding declines in law school admissions standards, declines in bar passage rates, and the general relationship between LSAT scores and bar passage. Much of this discussion is clouded by the lack of meaningful data regarding bar passage results.  In this blog posting I will delineate several questions that just cannot be answered meaningfully based on the presently available bar passage data.

The national first-time bar passage rate among graduates of ABA-accredited law schools fell significantly in 2014. According to the NCBE’s statistics, the average pass rate from 2007-2013 for July first-time test-takers from ABA-accredited law schools was 83.6%, but it fell to 78% in 2014. (2015 data won’t be available until next spring, when it is released by the NCBE.)

There might be some reason to believe these results were somewhat aberrational: the objective criteria of the entering class in 2011 were only modestly less robust than those of the entering class in 2010, and the July 2014 exam was marred by the ExamSoft debacle. The results are nonetheless concerning, because the objective criteria of the entering classes in 2012, 2013, and 2014 showed continued erosion. As the last two years have seen declines in the median MBE scaled score among those taking the July bar exam, the changes in entering class credentials over time suggest further declines in median MBE scaled scores (and bar passage rates) may be on the horizon.

In 2010, there were roughly 1,800 matriculants nationwide with LSATs of 144 or less. In 2012, there were roughly 2,600 matriculants nationwide with LSATs of 144 or less. In 2014, there were roughly 3,200 matriculants nationwide with LSATs of 144 or less. Recognizing that law school grades will be a better predictor of bar passage than LSAT scores, I think it is safe to say that entering law students with LSATs in this range are more likely than entering law students with higher LSATs to struggle on the bar exam.  Because the number of those entering law school with LSAT scores of 144 or less has grown substantially (particularly as a percentage of the entering class, more than doubling from less than 4% in 2010 to more than 8% in 2014), many are concerned that bar passage rates will continue to decline in the coming years.

While there has been a great deal of discussion regarding declines in admission standards and corresponding declines in bar passage standards, this discussion is profoundly limited because the lack of meaningful bar passage data presently provided by state boards of law examiners and by the ABA and ABA-accredited law schools means that we do not have answers to several important questions that would inform this discussion.

  1. What number/percentage of graduates from each law school (and collectively across law schools) sits for the bar exam in July following graduation and in the following February? Phrased differently, what number/percentage of graduates do not take a bar exam in the year following graduation?

This is a profoundly important set of questions as we look at employment outcomes and the number/percentage of graduates employed in full-time, long-term bar passage required positions. Given that only those who pass the bar exam can be in full-time, long-term bar passage required positions, it would be helpful to know the number/percentage of graduates who “sought” eligibility for such positions by taking a bar exam and the number/percentage of graduates who did not seek such eligibility. It also would be helpful to understand whether there are significant variations across law schools in terms of the number of graduates who take a bar exam (or do not take a bar exam) and whether those who do not take a bar exam are distributed throughout the graduating class at a given law school or are concentrated among those at the bottom of the graduating class. At present, however, this information simply is not available.

  2. What is the first-time bar passage rate for graduates from ABA-accredited law schools?

One might think this would be known as ABA-accredited law schools are required to report first-time bar passage results. But the way in which first-time bar passage results are reported makes the data relatively unhelpful. Law schools are not required to report first-time bar passage for all graduates or even for all graduates who took a bar exam. Rather, law schools are only required to report first-time bar passage results for at least 70% of the total number of graduates each year. This means we do not know anything about first-time bar passage results for up to 30% of graduates of a given law school. Across all law schools, reported results account for roughly 84% of graduates, leaving a not insignificant margin of error with respect to estimating bar passage rates.
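Those coverage gaps translate directly into uncertainty. A quick sketch (with hypothetical numbers; the `pass_rate_bounds` helper is mine, not an ABA metric) shows how wide the bounds on an all-graduate pass rate can be when outcomes are reported for only a fraction of the class:

```python
# Bounds on an all-graduate pass rate when outcomes are reported for only a
# fraction of the class. Numbers below are hypothetical, for illustration only.
def pass_rate_bounds(reported_rate, coverage):
    """reported_rate: pass rate among covered graduates (0-1);
    coverage: fraction of all graduates with reported outcomes (0-1)."""
    unreported = 1 - coverage
    low = reported_rate * coverage                # if all unreported grads failed
    high = reported_rate * coverage + unreported  # if all unreported grads passed
    return low, high

low, high = pass_rate_bounds(0.82, 0.84)  # e.g., 82% pass rate, 84% coverage
print(f"{low:.1%} to {high:.1%}")  # 68.9% to 84.9%
```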

People would have been flabbergasted if the ABA had required reporting of employment outcomes for only 70% of graduates. Now that the ABA is requiring reporting on employment outcomes for all graduates, there is no good reason why the ABA should not be requiring bar passage accounting for all graduates, requiring law schools to note those who didn't take a bar exam, those who took and passed a bar exam, those who took and failed a bar exam, and those for whom bar status is unknown.  (Up until recently, some boards of law examiners were not reporting results to law schools, but my understanding is that the number of state boards of law examiners not reporting results to law schools is now fairly small.)

Notably, for 2011, 2012, and 2013, the average bar passage rate for first-time takers from all ABA-accredited law schools, based on data reported by the law schools, was consistently higher than the corresponding rates reported by the NCBE (2011: 83.8% v. 82%; 2012: 81.8% v. 79%; 2013: 82.4% v. 81%). Moreover, the ABA and the NCBE do not measure first-time takers equivalently. The ABA reporting requirement focuses on graduates who took any bar exam for the first time; the NCBE counts as a first-time taker any person taking a bar exam in a given jurisdiction for the first time. Thus, the NCBE set of first-time takers is broader, as it includes some people taking a bar exam for the second time (having taken the bar exam in another jurisdiction previously).

  3. What is the “ultimate” bar passage rate for graduates from ABA-accredited law schools?

Even though a number of commenters have noted that “ultimate” bar passage is more important than first-time bar passage, there is no publicly available data indicating the ultimate bar passage rate on a law school by law school basis for the graduates of each ABA-accredited law school. What number/percentage of graduates of a given law school who take a bar exam pass after the second attempt? What number/percentage of graduates of a given law school who take a bar exam pass after the third attempt? What number/percentage of graduates of a given law school never pass a bar exam? This information just is not publicly available at present.

While Standard 316, the bar passage accreditation standard, allows schools to meet the standard by demonstrating that 75% or more of those graduates who sat for a bar exam in the five most recent calendar years passed a bar exam, this “ultimate” bar passage data is not publicly disseminated. Thus, while first-time bar passage data is limited and incomplete for the reasons noted above, “ultimate” bar passage data on a law school by law school basis is actually not available.

The modest amount of information available on “ultimate” bar passage rates is not very helpful.  The LSAC National Longitudinal Bar Passage Study contains some analysis of "ultimate" bar passage rates, but it focused on the entering class in the fall of 1991, which it described as being “among the most academically able ever to enter” law school based on entering class statistics (page 14), a description that could not be used with the classes that have entered in the last year or two or three. It also does not contain any information about "ultimate" bar passage for graduates of individual law schools.  In addition, Law School Transparency has recently received some information from at least one law school that has requested anonymity. Much better “ultimate” bar passage information is needed to better inform many of the discussions about the relationship between entering class credentials and bar passage.

  4. How can we compare bar passage results from one jurisdiction to another?

Most state boards of law examiners do not present bar passage data in ways that provide meaningful insight or a meaningful basis for comparison. Fewer than one-third of states publicly delineate between first-time takers and repeat takers on a law school by law school basis, and only a few of these provide information about MBE scores on a school by school basis. Accordingly, it is very difficult to make meaningful comparisons of year-over-year results in the months following the July bar exam, because data is rarely reported in a consistent manner. The NCBE does provide statistics annually (in the spring) that include a delineation of bar passage rates by state for first-time test-takers from ABA-accredited schools, but the NCBE does not provide MBE scores on a state by state basis (although it seemingly should be able to do this).

Conclusion

There is a need for much greater transparency in bar passage data from boards of law examiners and from the ABA and ABA-accredited law schools. It well may be that some law schools would be a more meaningful investment for "at-risk" students, those whose entering credentials might suggest they are at risk of failing the bar exam, because those law schools have done a better job of helping "at risk" students learn the law so that they are capable of passing the bar exam at higher rates than graduates of other law schools with comparable numbers of at risk students. It may well be that some jurisdictions provide "at risk" students a greater likelihood of passing the bar exam.  At the moment, however, that information just isn’t available. Much of the disagreement among various commentators about the relationships between admission standards and bar passage rates could be resolved with greater transparency – with the availability of much better data regarding bar passage results.

December 6, 2015 in Current events, Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Monday, November 9, 2015

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

This is Part IV of a blog series that focuses on alumni surveys based on data for Northeastern Law alumni who graduated between 1971 and 2012 (n = 833, 21% response rate).  Prior posts covered data related to the pre-law (Part II) and law school (Part III) experience.  This final installment summarizes data on the careers of Northeastern alumni. 

Varied Careers

One of the most significant post-law school findings from the Northeastern alumni survey is the sheer breadth of careers.  Sure, we all know in a general sense that lawyers have very diverse careers, yet I found the magnitude of that diversity both striking and surprising.

Below is a graphic that summarizes the percentage of Northeastern alumni who have worked in a particular practice setting, by decade of graduation.

% Alumni/ae who have worked in Practice Setting, by Decade of Graduation


To interpret this graphic, it is important to understand the composition of the underlying data.  The survey question asks, “Describe your previous employment history starting with your most recent employer first.”  Some graduates have only one job to report -- the one they started after graduation; others have had many.  These jobs are then classified by practice setting and binned into the six categories shown in the above graphic.  Note that the bars total well beyond 100%. Why?  Because alumni are changing not just jobs, but also practice settings—on average, at least once, but sometimes two, three, or even four times over the course of several decades.

The graphic above conveys several significant pieces of information:

General point.  Legal careers are extremely varied.  As it has tightened up, the entry-level market has become an area of intense scrutiny, and rightly so because it affects early career lawyers and law school applicant volume.  In contrast, the chart above reflects the longer view. It suggests that very able, motivated people who attend law school go on to varied careers that no one could have predicted at the time of enrollment, including--most significantly--the entering student.  These generational cohorts are a versatile group that includes a disproportionate number of leaders in industry, government, and the nonprofit world.  Law schools cannot take full credit for this; we admit people of enormous potential.  Yet many alumni tell me that their legal training and knowledge has given them an enormous leg up. One law grad who is now a successful business executive recently asked me, "Why is it JD-advantaged? Why not the advantage of the JD?" 

Northeastern.  It is somewhat surprising that for Northeastern alumni who graduated during the 1970s, 80s, and 90s, 48% have worked in government.  That is a big number.  Northeastern’s mission and faculty emphasize public service. This same emphasis appears to be reflected in the careers of its graduates.

Changing Legal Ecosystem.  As noted in Posts II and III, because the Northeastern alumni survey spans multiple decades, it is possible that responses will be influenced by changes in the underlying legal economy. Stated simply, career opportunities and competition may have changed substantially between 1971 and 2012.  Such a pattern appears to be present here.  Specifically, 30% or more of graduates of the 1990s and 2000s have worked in private industry compared to 24% or less for those graduating in the 1970s and 80s.  This would be consistent with the incomplete absorption theory discussed in Part III.  See also Henderson, “Is the Legal Profession Showing its Age,” LWB, Oct 12, 2015.

Practicing versus Non-Practicing Lawyers

Another significant finding that flows from the Northeastern alumni survey is the difference in workplace experience between practicing and non-practicing lawyers. 

Approximately 25% of respondents were working but not practicing law, with no significant difference by decade cohort. The chart below compares these two groups on 19 dimensions of workplace satisfaction. The question is drawn directly from the AJD Wave III:  “How satisfied are you with the following aspects of your current position?” 

Dimensions of Workplace Satisfaction, Practicing vs. Non-Practicing Lawyer


Choices ranged from 1 (highly dissatisfied) to 7 (highly satisfied).  The chart above summarizes the differential between the two groups.  For example, on Intellectual Challenge, we subtracted the non-practicing attorney average from the practicing attorney average.  The result is a +.35 difference for practicing attorneys, meaning that they are more likely to find intellectual challenge in their work.  Likewise, the same result holds for the substance of one's work.  
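The differential itself is just a difference of means on the survey's 1-7 scale. A minimal sketch of the computation, using invented ratings rather than actual survey responses:

```python
# Satisfaction differential: practicing-lawyer mean minus non-practicing mean
# on a 1-7 scale. Ratings below are invented, for illustration only.
def differential(practicing, non_practicing):
    mean = lambda xs: sum(xs) / len(xs)
    return round(mean(practicing) - mean(non_practicing), 2)

# Hypothetical "Intellectual Challenge" ratings for each group:
practicing = [6, 5, 7, 6, 5]
non_practicing = [5, 5, 6, 5, 5]
print(differential(practicing, non_practicing))  # 0.6
```

A positive value means practicing lawyers rated the dimension higher on average; a negative value favors non-practicing lawyers.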

In contrast, on workplace diversity, non-practicing lawyers were significantly more satisfied – on average, roughly 2/3 of a response point.  In fact, non-practicing lawyers were more likely to rate their workplaces higher on several surprising factors, including social value of work, performance reviews, work/life balance, and pro bono opportunities.

Can we generalize from these findings?

The results presented in this blog series reflect the collective experience of one law school’s alumni base – Northeastern.  There is no way to know if these results can be fairly generalized to the larger law graduate population, though there is a reasonable basis to believe that at least some of them can (e.g., the changing ecology of the legal job economy).  Yet, why speculate when the cost of collecting and analyzing the data is going down and the value of such applied research is going up?

Let me reiterate my suggestion from Part I that a consortium of law schools should begin this effort under the aegis of the American Bar Foundation (the prime architect of the AJD Project).  Northeastern has agreed to donate the survey and research tools we created as part of the Outcomes Assessment Project.   Such an initiative would enable researchers to draw stronger conclusions from these data, including potentially laudatory school-level effects that can help the rest of legal education. 

I have been researching legal education for many years.  I have spent enough time with alumni at Indiana Law, Northeastern Law, and several other law schools to gain a strong impression that law school graduates are having, on balance, important, satisfying and high-impact careers.  Further, there is strong evidence that the legal industry is undergoing a significant structural change – that is much of what the Legal Whiteboard catalogs.  This structural change topic is of great interest to prospective students, lawyers, and the mainstream press.  Yet, these two themes--the careers of alumni and structural change--are related. 

If legal education wants to influence the narrative on the value of the JD degree, it is far better to rely on data rather than rhetoric.  My sense is that data on our alumni will tell a rich, balanced story that will enable us to make better decisions for all stakeholders, including prospective law students. Further, if we don’t gather high quality facts, we can expect to get outflanked by a blogosphere and a mainstream press that are armed with little more than anecdotes.  To a large extent, that is already happening.  Now is the time to catch up.

Credits

This blog post series would not have been possible without the dedication and world-class expertise of my colleague, Evan Parker PhD, Director of Analytics at Lawyer Metrics.  Evan generated all the graphics for the Northeastern Alumni/ae Survey and was indispensable in the subsequent analysis. He is a highly talented applied statistician who specializes in data visualization.  Evan, thanks for your great work!

For other “Varied Career Path” findings, see the full Alumni/ae Survey Report at the OAP website.

Links:

Part I:  What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Part II, Alumni Surveys, Before-Law School

Part III: Alumni Surveys, During Law School


November 9, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Structural change | Permalink | Comments (0)

Wednesday, November 4, 2015

Part III: Alumni Surveys, Responses on the Law School Experience

Part II of this blog series reported that the top motivations to attend law school have remained the same for over four decades, at least for Northeastern University School of Law (NUSL).  Alumni reported the same underlying desire: to build a satisfying, intellectually challenging career where they could help individuals and improve society. This may be an image forged by pop culture and the idealism of youth, but it is also likely sincere.  It is the better side of our human nature. 

Part II also showed two motivations to attend law school – the desire for “transferable skills” and “eventual financial security” – that did appear to be shifting over time.  I suggested that these shifts are more likely about a changing ecosystem than a fundamental shift in the type of people applying to law school.

A similar ecological theme can be observed in the "During Law School" data. For example, since its reopening in 1968, Northeastern Law has required every graduate to complete four 11-week cooperative placements, usually in four different practice settings (e.g., government agency, public defender, large firm, public interest organization). As noted in Part I, students can be paid during co-op because it is a university rather than an ABA requirement. Cf. Karen Sloan, “The ABA says No to Paid Student Externships,” Nat’l L J, June 10, 2014.

One series of questions in the alumni survey specifically focused on the co-op experience, including co-op quality, what was learned, and whether they were paid.  The chart below reveals a steady, four-decade decline in the number of paid co-ops.

  [Chart: Number of paid co-ops, by year of graduation]

In the early 1970s, essentially all four co-ops were paid.  By the mid-80s, the average was down to three. Since the 2000s, the average has been two or fewer paid co-ops.

To my mind, the above trendline is compelling evidence of a steady, systemic shift in the legal ecosystem. I have written about this pattern in the past, suggesting that the rate of absorption of law grads into the licensed bar has been going down since the 1980s.  See Henderson, “Is the Legal Profession Showing its Age,” LWB, Oct 12, 2014 (noting that between 1980 and 2005, the average age of licensed lawyers increased from 39 to 49).  

When I saw this downward trendline for the first time, I recalled my numerous interviews with NUSL alumni/ae from the 1970s. In describing their co-ops, they spoke of opportunities that were plentiful and varied. I often heard the refrain, “I paid for law school mostly with my income from co-op.”  Note that during the 1970s, graduating from college was much less prevalent than today.  Law firms were also growing, with 1970 becoming a major inflection point in the rise of the large law firm. See Galanter & Palay, Tournament of Lawyers (1991) (seminal text collecting and analyzing data on the growth of large firms).

The trendline on paid co-ops also made me rethink what I heard from NUSL co-op employers.  The school has roughly 900 employers who regularly or semi-regularly participate in co-op.  I heard several regular employers express strong preferences for “third or fourth co-ops.”  Why?  Because third or fourth co-op students already had significant legal experience and needed less training to be valuable to the employer.  Training is costly.  Even if the trainee is unpaid, the lawyer-as-teacher is expending their own valuable time.  If an employer is going to provide training, they need a way to recapture that investment.  Unpaid labor for eleven weeks is one potential way; if the labor is already partially trained, that is even better.

Unfortunately, doing a great job for a co-op employer does not guarantee permanent employment or even a modest wage for temporary work.  The legal ecosystem does not reliably and consistently support those outcomes. Yet, 20, 30, or 40 years ago, the dynamics were far more favorable. 

Obviously, in the year 2015, law grads are having a difficult time finding permanent, long-term professional employment (bar passage-required, JD-advantaged, or non-legal professional jobs).  The shortage of high-quality entry-level jobs has given rise to criticisms that legal education needs more practical training.  The implicit assumption is that such a change will cure the underemployment problem.  I am skeptical that is true.

A more likely explanation for law grad underemployment is that the supply of trained lawyers is in excess of demand, partially due to demographics and partially due to the inability of most citizens to afford several hours of a lawyer's time.  This is a very difficult problem to fix. But misdiagnosing the problem does not help.

To the extent a legal employer is looking for a practice-ready law grad, Northeastern’s co-op model is as likely to deliver that outcome as anything else I have observed.  My in-depth review of how co-op affects professional development is written up in OAP Research Bulletin No. 3.  Ironically, what may be the best practice-ready model among ABA-accredited law schools is a 50-year-old program that most critics may not know exists. But see Mike Stetz, “Best Schools for Practical Training,” Nat’l Jurist, March 2015 (ranking Northeastern No. 1).

The experiential education crowd will be heartened by another “During Law School” finding.  Among 833 alumni respondents, there were more than 3,200 co-ops identified by practice setting.  Alumni were asked to identify their most valuable co-op and provide a narrative as to why. 

Below is a chart that plots the difference between the baseline frequency of a particular co-op practice setting and how often that practice setting was picked as the most valuable.  The scale is in standard deviation units, with “par” meaning that the practice setting was most valuable in the same proportion as its frequency in the overall sample.
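The report's exact formula for this scale is not spelled out here; one plausible construction (a hypothetical sketch, not necessarily the survey's actual method) is a one-proportion z-score comparing a practice setting's share of “most valuable” picks against its baseline share of all co-ops:

```python
import math

def most_valuable_gap_z(setting_coops, total_coops, setting_picks, total_picks):
    """Z-score of a setting's "most valuable" picks vs. its baseline share.

    z = 0 ("par") means the setting was picked as most valuable in the same
    proportion as its frequency in the overall co-op sample.
    """
    p0 = setting_coops / total_coops   # baseline share of all co-ops
    p1 = setting_picks / total_picks   # share of "most valuable" responses
    se = math.sqrt(p0 * (1 - p0) / total_picks)
    return (p1 - p0) / se

# Hypothetical counts, not the survey's actual numbers: a setting that is
# 15% of all co-ops but roughly 24% of "most valuable" picks scores above par.
z = most_valuable_gap_z(480, 3200, 200, 833)  # positive z => above par
```

A setting picked exactly in proportion to its frequency would score zero on this metric.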

  [Chart: Most valuable co-op, by practice setting]

It is not hard to see the common theme.  Co-ops where students can observe lawyers in action – or better yet, get stand-up time in court – were rated as much more valuable.  The table below captures some of the underlying narrative comments.

[Table: Narrative comments on the most valuable co-op]

For other “During Law School” findings, see the full Alumni/ae Survey Report at the OAP website.

Links:

Part I:  What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Part II, Alumni Surveys, Before-Law School

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

November 4, 2015 in Blog posts worth reading, Data on legal education, Data on the profession

Monday, November 2, 2015

Part II: Alumni Surveys, Pre-Law Characteristics and Motivations

Building on the introduction (Part I) of this blog series, our alumni survey of Northeastern University School of Law yielded cross-sectional data that span graduates from 1971 to 2012.  Because of the large time span, some of the most interesting responses to questions tend to fall into two buckets:

  1. What is staying the same?  Here we are looking for response patterns that are relatively stable and constant across age cohorts.
  2. What is changing?  Likewise, we are also interested in responses that appear to be changing as a function of time of graduation.

In the portion of our analysis that looked at pre-law characteristics and motivations, our most striking findings tended to fall into bucket #1. 

For example, below is a graphic summarizing responses to the question, “How important were the following goals in your decision to attend law school?”  Responses are organized by decade of graduation and ordered from most to least important for respondents who graduated in 2000 or later.

                              Goals for Attending Law School, by Decade of Graduation

[Chart: Goals for Attending Law School, by Decade of Graduation]

One of the most striking features is that the top three responses are essentially identical for all four age cohorts.  For each group, the desire to have a satisfying career, help individuals, and improve society were all, on average, very important in the decision to attend law school. 

Although there are differences across age cohorts, there remains relatively clear clustering by decade of graduation. (Query: would this same pattern hold true at other law schools?  One of the advantages of pooling data across schools is the ability to isolate a self-selection effect that operates at the school level.)

Yet, some factors appear to be changing over time, such as the importance of transferable skills and eventual financial security.  With each decade cohort, respondents are rating these factors progressively more important to their decision to attend law school. Likewise, “other goals” appear to be progressively less important.

These patterns (and other survey results I will report in Parts III and IV) suggest gradual changes in the knowledge worker ecosystem that require students to be more deliberate and focused in their decision to attend law school.  For example, the costs of higher education are going up at the same time that the financial payoffs of traditional graduate and professional education are becoming less certain.  This is an ecological effect that is bound to have an influence on students and student decision making.  Although legal education would be part of this shift, the shift itself would not be unique to law.

This interpretation is consistent with our focus group discussions with Northeastern alumni.  This group queried whether the term “transferable skills” was even part of the lexicon when they were graduating from college.  Likewise, the group commented that the decision to attend law school during the 1970s and 1980s was not difficult because tuition was relatively low and jobs, including paid co-op jobs, were relatively plentiful. Although the legal market may be tighter and more complex than in earlier decades, the Northeastern alumni commented that the tradeoffs were changing for all knowledge workers.  

For other “Before Law School” findings, see the full Alumni/ae Survey Report at the OAP website.

Links:

Part I:  What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Part III: Alumni Surveys, During Law School

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

November 2, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Important research

Part I: What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Several years ago, as the legal academy was beginning to work its way through the implications of the landmark “After the JD” Project (AJD), one of the principal investigators, Bryant Garth, commented to a group of fellow law professors that “within a few years it will be educational malpractice for law schools to not study their own alumni.”

Garth had special standing to make this claim, as he had launched the AJD during his long tenure at the American Bar Foundation and then went on to serve as Dean of Southwestern Law School in Los Angeles. While at Southwestern, Garth taught a short 1L course about legal careers that combined AJD findings with live interviews with Southwestern alumni. Despite decades of research studying lawyers, Garth gushed at how much he personally learned from these interviews and how the narratives were often surprising and inspiring, particularly for Southwestern students filled with apprehension at what the future might hold.

I had occasion to remember Garth’s observations in early 2011 when Emily Spieler, then the Dean of Northeastern University School of Law (NUSL), suggested that I study her alumni.

Northeastern Law

Northeastern is an interesting case study because for nearly 50 years the school has required four 11-week cooperative placements (or “co-ops”) as a condition of graduation. To facilitate completion within three years, the 1L year at Northeastern is taught in semesters while the 2L and 3L years are taught over eight alternating quarters. Summer-winter co-op students take classes during the fall and spring quarters, while fall-spring co-op students attend classes in the summer and winter quarters. Because co-ops are not for academic credit – they fulfill Northeastern University rather than ABA-accreditation requirements – students can be paid for the full 11 weeks. (More on that in Part III of this series.)

Dean Spieler wanted a third party to study Northeastern because, in her experience as dean, her many encounters with Northeastern alumni suggested to her that the School’s unusual education model was accelerating the professional development of its students and enabling them to make better, more informed career choices.

Acceleration of professional development is a very difficult effect to measure, but it is certainly plausible. In fact, the entire experiential law movement is largely premised on this claim. So I signed onto a multi-year initiative that we called the Outcomes Assessment Project (OAP).

The premise of the OAP was very unusual. Through a generous but anonymous benefactor, the research tools and templates developed for the OAP would be made available to other law schools interested in studying graduates. The intent is for law schools to accumulate data using similar methods and instruments, driving up the value of the data (because it is comparable across schools) while driving down the cost of collection and analysis.

There are many phases to the OAP, including those focused on admissions, the student experience, and co-op employers. Here, however, I wanted to write about what we learned from a survey of Northeastern’s alumni.

Last fall, we sent a survey instrument to Northeastern alumni who graduated from the law school between 1971 and 2012 (~4,000 law grads for whom NUSL had a current email address). The survey instrument was substantially based on the AJD Wave III survey instrument, which was sent to a representative sample of law graduates from all ABA-accredited law schools who took the bar in the year 2000.

In contrast to the AJD, which has produced remarkable knowledge about law school grads from the year 2000, the OAP Alumni/ae Survey included four decades of law graduates from a single law school. Although this is not a true longitudinal sample, which follows the same people over time, this methodology enables cross-sectional comparisons between different cohorts of graduates (e.g., by decade of graduation or pre/post AJD).

The response rate of the Northeastern alumni survey was 21% (833 total completed questionnaires), which is relatively high for a long online survey. Because the resulting sample substantially mirrored the baseline data we had for Northeastern alumni practice areas and years of graduation, we were confident that the resulting sample was both representative and reliable.

Applied Research

Similar to the AJD, the OAP Alumni/ae Survey produced enough data to keep researchers busy for several years. Hopefully, these data will eventually be archived and aggregated at the American Bar Foundation or a similar institution in order to facilitate a broader and deeper understanding of legal careers.

However, the OAP was largely set up to be applied research. What does this mean? Here, the goal is, at least in part, to obtain data that is operational in nature, thus enabling a law school to examine and test fundamental assumptions and generate insights related to its stated goals and mission. In a word, to improve.

Further, when skillfully boiled down using data visualization, the findings themselves tend to be of great interest to all law school stakeholders, including alumni, faculty, administrative staff, current students, and prospective students. Interest is particularly piqued during times of transition and uncertainty, such as now, when law schools and the practicing bar are looking to each other to provide potential answers and support.

To make the results as accessible as possible, we decided to present the preliminary Alumni Survey results in a simple three-part framework:

  • Before Law School: pre-law characteristics and motivations
  • During Law School: the law school experience
  • After Law School: job mobility and satisfaction

This week, I am going to give a sampling of findings from all three sections – findings that will likely be of interest to a non-Northeastern audience of law faculty, practicing lawyers, and students. If you are interested in reading the entire preliminary report, it can be found online at the Northeastern OAP website.

Links:

Part II, Before-Law School

Part III: Alumni Surveys, During Law School

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

November 2, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Important research

Sunday, October 25, 2015

Is there a right way to respond to the "Law School Debt Crisis" Editorial?

Amidst all the other newsworthy topics, the New York Times editorial board made law school debt the lead editorial for today's Sunday edition. And the story line is not good.  

The editorial starts with the bleak statistics for Florida Coastal Law School -- low median LSAT scores and high debt loads, casting doubt on whether its graduates can pass the bar exam and repay their federally financed student loans.  The editorial highlights Florida Coastal's for-profit status but goes on to note that the rest of legal education is not much better.

A majority of American law schools, which have nonprofit status, are increasingly engaging in such behavior, and in the process threatening the future of legal education.

Why? The most significant explanation is also the simplest — free money.

The editorial details changes in federal higher education finance that created the Direct PLUS Loan program, which, over and above Federal Stafford Loans, underwrites up to the full cost of attendance as determined by each law school.  The combination of poor job prospects and high debt has depressed applicant volume.  As the Times editorial notes, the systemic impact has been to lower admissions standards to sweep in students who will, as a group, struggle to pass the bar exam following graduation.  Virtually all of this is financed by DOE loan money.

I don't think the typical member of the legal academy understands the precarious financial condition of legal education.  The precariousness exists on two levels: (1) our financial fate is in the hands of the federal government rather than private markets; and (2) the Times editorial suggests that we have a serious appearance problem, which draws down the political capital needed to control our own destiny.  As go the political winds, so go our budgets.

I think it is important for the Association of American Law Schools (AALS) to take some decisive action in the very near future.  In this blog post, I explain where the money comes from to keep the law school doors open and why, as a consequence, we need to pay closer attention to the public image of legal education.  I then offer some unsolicited advice to the AALS leadership. 

(1) Who pays our bills?  

Over the last decade, the federal government has, as a practical matter, taken over the financing of higher ed, including legal education.  

Here is how it works.  Any law student who needs to borrow money to attend law school is strongly incentivized to borrow money from the Department of Education (DOE).  Although the DOE loans carry high interest rates -- 6.8% for Stafford Loans and 7.9% for Grad Plus -- they include built-in debt relief programs that functionally act as insurance policies for the risk that a graduate's income is insufficient to make timely loan repayments.  Law school financial aid offices are set up around this financial aid model and make it very easy for students to sign the loan documents, pay their tuition, and get disbursements for living expenses.

In the short to medium term, this is good for the federal government because the loans are viewed as income-producing assets in the budgets that get presented to and approved by Congress. But in the longer term this could backfire if a large portion of students fail to repay their full loans plus interest.  Federal government accounting rules don't require projections beyond ten years.  But already the government is beginning to see the size of the coming write-downs for the large number of graduates who are utilizing the Public Service Loan Forgiveness program, which has a ten-year loan forgiveness horizon. And it is causing the feds to revise their budgets in ways that are politically painful.  With the loan forgiveness programs for private sector law grads operating on a 20- to 25-year repayment window, the magnitude of this problem will only grow.

The enormous risk here for law schools is that Congress or the DOE will change this system of higher education finance.  For example, the Times editorial calls for capping the amount of federal loans that can be used to finance a law degree.  Currently, the limit on Stafford Loans for graduate education is $20,500, but Grad Plus loans have no limit at all.  If the DOE were to cap Grad Plus at $29,500 per year, leading to a total three-year federal outlay of $150,000 per law student, this would have an enormous adverse impact on the typical law school budget.
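The arithmetic behind that $150,000 figure is straightforward (the $29,500 Grad Plus cap is the hypothetical one discussed above, not current law):

```python
STAFFORD_ANNUAL_LIMIT = 20_500    # current annual Stafford limit for graduate students
GRAD_PLUS_ANNUAL_CAP = 29_500     # hypothetical annual cap on Grad Plus loans
YEARS_OF_LAW_SCHOOL = 3

annual_federal_loans = STAFFORD_ANNUAL_LIMIT + GRAD_PLUS_ANNUAL_CAP   # $50,000 per year
total_federal_outlay = annual_federal_loans * YEARS_OF_LAW_SCHOOL     # $150,000 over three years
```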

Law School Transparency reports that the average law school debt load for a 2014 law graduate is $118,570, but we know very little about the full distribution.  Because of the pervasiveness of the reverse Robin Hood policy, which uses tuition dollars of lower-credentialed students to finance scholarships for their higher-credentialed peers, there is likely a significant percentage of students at most law schools who graduate with more than $150,000 in law school debt.  Further, according to US News, there are twelve law schools -- including three in the T14 -- where the average law school debt load is more than $150,000.  Although there are no statistics on the percentage of law students graduating with greater than $200,000 in law school debt, law students tell me this amount is common.

I have translated this meager public information into the chart below. The area in green is the volume of money that could disappear from law school budgets if the federal government imposed a hard limit on federally financed law school lending.

[Chart: Law school debt distribution]

Why would this money be at grave risk?  Two reasons:

First, private lenders will be reluctant to cover the entire shortfall.  For decades, private lenders played an important role in law school finance.  But these lenders got pushed out of the market by the changes in federal higher ed finance described above.  Unfortunately, in the intervening years, the ratio of earning-power-to-debt has gotten too far out of whack.  To come back into this market, private lenders would need to be confident that loans would be repaid.  That likelihood is going to vary by law school and by law student, raising the cost of lending.  This means that, to varying degrees, virtually all law schools would have to sweat over money.  Unlike Grad Plus, private lenders may balk at financing full sticker tuition for lower-credentialed students trying to attend the highest ranked school that admitted them.

Second, private lenders will not offer the same loan forgiveness options, such as IBR and Public Service Loan Forgiveness, currently offered by the federal government.  With the curtailed scope of these functional insurance programs, some portion of prospective law students will likely be unwilling to sign loan documents in excess of the federal lending cap.  Even very elite schools will feel the pain here.

(2) An appearance problem in the world of politics

I would bet a lot of money that law faculty have been emailing the Times editorial to one another, criticizing its lack of nuance.  But here is our problem.  We are not in a court where a judge will listen to our elegant presentation of facts and law.  Nor are we in the world of private markets where we can expect people to reliably follow their own economic self-interest.  We are in the realm of politics where sides get drawn based on appearance and political expediency.  To make matters worse, the legal academy just got lambasted by the paper of record on the left.

It is hard to argue that a cap on federal funding of legal education would be bad policy for students, the legal profession, taxpayers, or broader society.  Such a change would:

  1. Reduce the number of law grads going into a saturated labor market;
  2. Reduce the number of low credentialed students admitted to law school who will one day struggle to pass the bar;
  3. Reduce the risk of nonpayment of students loans currently borne by US taxpayers;
  4. Put in place serious cost-containment on legal education.

For law schools, however, such a change would produce layoffs and pay reductions.  And that may be the fate of the luckier schools.   It is widely known that most law schools are running deficits.  Central universities are looking for ways to wait out the storm.  But the cliff-like quality of a federal cap on law school lending would call the question of how much support is too much.  

What's the solution?

Legal education has a cost problem, but so does the entire higher ed establishment. Here is my unsolicited advice.

The leadership of the AALS needs to take a very strong public position that the trend lines plaguing higher ed need to be reversed.  This is not risky because it is so painfully obvious.  The AALS should then, in conjunction with the ABA, send a very public delegation to the Dept of Education. The delegation should be given a very simple charge:  Help the DOE

  1. Outline the systemic problems that plague higher education 
  2. Articulate the importance of sound policy to the national interest
  3. Formulate a fair and sustainable solution. 

I have faith that my legal colleagues would do a masterful job solving the problems of higher education.  And in the process, we'll discover that we have become the architects of a new system of higher ed finance that is fair and equitable for all stakeholders, including those employed in legal education.  That's right: act decisively to ensure a fair and equitable deal.  The only drawback is that it won't be the status quo that we'd instinctively like to preserve.

October 25, 2015 in Blog posts worth reading, Current events, Data on legal education

Friday, October 2, 2015

Part Two - The Impact of Attrition on the Composition of Graduating Classes of Law Students -- 2013-2016

In late December 2014, I posted a blog entitled Part One – The Composition of the Graduating Classes of Law Students – 2013-2016.  That blog posting described how the composition of the entering classes shifted between 2010 and 2013.  During that time, the percentage at or above an LSAT of 160 dropped by nearly 20%, from 40.8% to 33.4%.  Meanwhile, the percentage at or below an LSAT of 149 increased by over 50%, from 14.2% to 22.5%.
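Those relative changes can be double-checked with a line of arithmetic (a minimal sketch using the percentages just cited):

```python
def pct_change(old, new):
    """Relative change from old to new, in percent."""
    return (new - old) / old * 100

drop_in_160_plus = pct_change(40.8, 33.4)    # share of entrants at or above LSAT 160
rise_in_149_less = pct_change(14.2, 22.5)    # share of entrants at or below LSAT 149
# roughly -18% (i.e., "nearly 20%") and roughly +58% ("over 50%"), respectively
```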

But this reflects the composition of the entering classes.  How do the graduating classes compare with the entering classes?  This depends upon the attrition experienced by the students in a given entering class.  This much-belated Part Two discusses what we know about first-year attrition rates among law schools.

I have compiled attrition data from all of the fully-accredited ABA law schools outside of Puerto Rico for the last four full academic years.  I have calculated average attrition rates for the class as a whole and then broken out average attrition rates by law schools in different median LSAT categories – 160+, 155-159, 150-154 and <150.

In a nutshell, overall first-year attrition increases as the median LSAT of the law school decreases.  Over the last few years, while “academic attrition” has declined for law schools with median LSATs of 150 or greater, “other attrition” has increased modestly, particularly for law schools with median LSATs <150, resulting in a slight increase in overall first-year attrition between 2010 and 2013.

Overall First-Year Attrition Rates Have Increased Slightly

In calculating attrition rates, I wanted to capture those students who are no longer in law school anywhere.  Thus, for these purposes, “attrition” is the sum of “academic attrition” and “other attrition.”  “Academic attrition” occurs when a law school asks someone to leave because of inadequate academic performance.  “Other attrition” occurs when a student departs from the law school volitionally. Both of these categories exclude “transfers.”

The following chart shows that despite the declining “LSAT profile” of the entering classes between 2010 and 2013, there has been no meaningful change in the average “academic attrition” rate.  The modest increase in overall first-year attrition over this period, from roughly 5.8% to roughly 6.6%, is largely due to a growth in the “other attrition” category from roughly 2.5% to roughly 3.2%.

Overall First-Year Attrition for Classes Entering in 2010, 2011, 2012, and 2013

Year      Beg. Enrollment   Academic Attrition   % Academic   Other Attrition   % Other   Total Attrition   % Attrition
2010-11        50,408             1,673             3.32           1,256          2.49         2,929            5.81
2011-12        46,477             1,551             3.34           1,262          2.72         2,813            6.06
2012-13        42,399             1,461             3.45           1,186          2.80         2,647            6.25
2013-14        38,837             1,316             3.39           1,236          3.18         2,552            6.57

 (Calculating attrition rates for 2010-11, 2011-12 and 2012-13, is a little more complicated than one might think.  For ABA reporting years of 2011, 2012, and 2013, “academic attrition” was reported separately, but “other attrition” included “transfers out.” Thus, to generate the real “other attrition” number, one needs to “subtract” from “other attrition” the numbers associated with “transfers out.” Because some schools occasionally listed transfers out in “second year” “other attrition,” this analysis should be understood to have a little fuzziness to it for years 2010-11, 2011-12 and 2012-13.  For ABA reporting year 2014, transfers out were not commingled with “other attrition,” so the calculations were based solely on the sum of “academic attrition” and “other attrition.”  Beginning with reporting this fall, “academic attrition” will include both involuntary academic attrition as well as voluntary academic attrition (students who withdrew before completing the first-year, but were already on academic probation).)
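Applying the definitions above to the reported counts reproduces the rate columns. A minimal sketch (note: the published percentage totals appear to sum the rounded component rates, so a direct calculation can differ by a hundredth of a point in some years):

```python
# (beginning enrollment, academic attrition, other attrition), per the figures reported above
cohorts = {
    "2010-11": (50408, 1673, 1256),
    "2011-12": (46477, 1551, 1262),
    "2012-13": (42399, 1461, 1186),
    "2013-14": (38837, 1316, 1236),
}

def attrition_rate(enrolled, academic, other):
    """Overall first-year attrition: academic plus other attrition, excluding transfers."""
    return (academic + other) / enrolled * 100

rates = {year: round(attrition_rate(*counts), 2) for year, counts in cohorts.items()}
# e.g., rates["2010-11"] -> 5.81 and rates["2013-14"] -> 6.57
```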

Academic Attrition Rates Increase as Law School Median LSAT Decreases

Notably, there are different rates of attrition across law schools in different LSAT categories.  The following chart breaks down attrition by groups of law schools based on median LSAT for the law school for the entering class each year.  For each year, the chart shows the average first-year attrition rates for law schools with median LSATs of 160 or higher, for law schools with median LSATs of 155-159, for law schools with median LSATs of 150-154 and for law schools with median LSATs less than 150.  In addition, it breaks out “academic attrition” and “other attrition” as separate categories for each category of law school and for each year and then provides the total overall attrition rate each year along with the four-year average attrition rate.

Average Attrition Rates by Category of Schools Based on Median LSAT

                  2010-11             2011-12             2012-13             2013-14        Four-Year
Median LSAT   Acad Other Total    Acad Other Total    Acad Other Total    Acad Other Total    Average
160+           0.6   1.7   2.3     0.6   1.9   2.5     0.4   2.0   2.4     0.3   1.5   1.8      2.3
155-159        2.9   2.6   5.5     2.2   2.8   5.1     2.1   2.9   5.1     1.7   3.2   4.9      5.2
150-154        6.3   3.8  10.1     6.2   3.4   9.6     6.0   3.7   9.7     4.2   4.3   8.5      9.4
<150          10.1   2.4  12.5     9.4   3.8  13.2     9.1   3.0  12.2     9.7   4.7  14.4     13.1

When looking at these data, a few things are worth noting. 

First, across different LSAT categories, overall attrition increases as you move from law schools with higher median LSATs to law schools with lower median LSATs, going from an average over the four years of 2.3% for law schools with median LSATs of 160+, to 5.2% for law schools with median LSATs of 155-159, to 9.4% for law schools with median LSATs of 150-154, to 13.1% for law schools with median LSATs of <150.  “Academic attrition” consistently increases as median LSAT decreases, while “other attrition” is mixed. (Although this analysis is focused on four LSAT categories, the trend of having overall attrition increase as median LSAT decreases continues if you add a fifth LSAT category. In 2010-11 there was only one law school with a median LSAT of 145 or less, with only 320 students.  By 2013-14, however, there were nine law schools with a median LSAT of 145 or less, with 2,075 students.  The overall first-year attrition rate (encompassing academic attrition and other attrition) at these nine schools in 2013-14 was 15.9 percent.  The overall attrition rate at the other 24 law schools with a median LSAT less than 150 was 13.6 percent.) 

Second, over the period from 2010-11 to 2013-14, “academic attrition” generally appears to be flat to decreasing for schools in all LSAT categories except in 2013-14 for law schools with median LSATs <150, where it increased slightly (largely because of the larger number of schools with median LSATs of 145 or less).  By contrast, “other attrition” presents more of a mixed record, but generally appears to be increasing over the period for schools in most LSAT categories.  Nonetheless, average overall first-year attrition is lower in 2013-14 for law schools in the top three LSAT categories.

Third, if you are wondering why the average overall attrition could be increasing while the overall attrition rates for the top three LSAT categories are decreasing, the answer is because of the changing number of students in each category over time.  As noted in Part I, the number of students and percentage of students in the top LSAT category has declined significantly, while the number of students and percentage of students in the bottom LSAT category has increased significantly.  This results in the average overall attrition rate increasing even as rates in various categories are decreasing.
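The composition effect described above is a classic weighted-average phenomenon (a Simpson's-paradox-style result), and a small numeric sketch makes it concrete. The enrollment and rate figures below are invented for illustration and are not the actual data.

```python
def overall_rate(enrollment, rates):
    """Enrollment-weighted average attrition rate (in percent)."""
    total = sum(enrollment.values())
    return sum(enrollment[k] * rates[k] for k in enrollment) / total

# Hypothetical: each category's attrition rate falls slightly...
rates_2010 = {"160+": 2.5, "<150": 13.0}
rates_2013 = {"160+": 2.0, "<150": 12.5}

# ...but enrollment shifts toward the high-attrition category.
enroll_2010 = {"160+": 20000, "<150": 5000}
enroll_2013 = {"160+": 10000, "<150": 15000}

print(round(overall_rate(enroll_2010, rates_2010), 2))  # -> 4.6
print(round(overall_rate(enroll_2013, rates_2013), 2))  # -> 8.3
```

Even though both category rates declined, the overall average rose from 4.6% to 8.3% purely because of the shift in who is enrolled.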

Thoughts on Attrition Rates

It makes sense that “academic attrition” increases as law school median LSAT decreases.  It seems reasonable to expect that law schools with median LSATs of 150-154 or <150 will have higher “academic attrition” rates than those with median LSATs of 155-159 or 160 and higher. 

It may make less sense, however, that “academic attrition” generally decreased across all four categories of law schools between 2010-11 and 2013-14 (with the exception of law schools with a median LSAT <150 in 2013-14), even as the LSAT profile of each entering class continued to decline.  With an increase in the number and percentage of law students with LSATs of <150, particularly those with LSATs of <145, one might have anticipated that the average rate of “academic attrition” would have increased, particularly among law schools with median LSATs of 150-154 (who might have seen an increase in the number of students with LSATs less than 150) and among law schools with median LSATs of <150, given the increase in the number of law schools with median LSATs of 145 or less. 

Cynics might argue that from a revenue standpoint, law schools are making a concerted effort to retain a higher percentage of a smaller group of students.  But this assumes a degree of institutional purposefulness (coordination among faculty) that is rare among law schools.  Moreover, my sense is that there are much more benign explanations.

First, if law schools have not adjusted their grading curves to reflect a different student profile, then the standard approach to first-year grading – which involves a forced curve at most schools -- is likely to produce a similar percentage of “at risk” students year over year even though the objective credentials of each entering class have declined. 

Second, with the decline in the number of applicants to law school, one might surmise that those choosing to go to law school really are serious about their investment in a legal education and may be working harder to be successful in law school, resulting in fewer students facing academic disqualification, even though the credentials for each entering class have been weaker year over year.  This may be particularly true in law schools with robust academic support programs which may be helping some students on the margin find sufficient success to avoid academic attrition.

Third, and perhaps most significant, is the reality that “academic attrition” and “other attrition” are related.  Indeed, that is why I have reported them together in the charts above as two components of overall attrition.  Some students who might be at risk for “academic attrition” may decide to withdraw from law school voluntarily (and be classified under “other attrition” rather than “academic attrition”). In addition, it is possible that other students, particularly at law schools with median LSATs <150, may be voluntarily withdrawing from law school because they have decided that further investment in a legal education doesn’t make sense if they are performing relatively poorly, even though the law school would not have asked them to leave under the school’s policy for good academic standing. 

The fact that the percentage of students in each entering class with LSATs of <150 and even <145 has increased substantially between 2010 and 2013, while the rate of overall first-year attrition has increased only modestly over this time period, suggests that the composition of graduating classes (based on LSATs) will continue to weaken into 2016 (and probably 2017 if attrition patterns did not change in 2014-15).  As a result, the declines in the median MBE scaled score in 2014 and 2015 could be expected to continue in 2016 and 2017.  Some law schools also are likely to see bar passage rates for their graduates decline, perhaps significantly, in 2015, 2016 and 2017.

Unanswered Questions

This analysis focuses on first-year attrition.  There continues to be attrition during the second year and third year of law school, generally at lower rates, perhaps 2-3% of second-year students and 1-2% of third-year students.  (On average, the number of graduates in a given class has been around 90% of the entering class.)  It is not clear yet whether attrition among upper level students follows similar patterns across different categories of law schools.  The publicly-reported attrition data also does not provide any information regarding the gender or ethnicity or socio-economic background of students leaving law school.  Therefore, we don’t know whether there are different rates of attrition for women as compared with men or whether students of different ethnic backgrounds have different rates of attrition.  We also don’t know whether first-generation law students experience attrition at greater rates than other law students, or whether students of lower socio-economic status experience attrition at greater rates than students of higher socio-economic status. 
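As a rough check on the "around 90%" figure, one can compound plausible per-year attrition rates. The specific inputs below are my assumption, loosely consistent with the rates discussed above (roughly 7% in the first year, 2.5% in the second, 1.5% in the third):

```python
def graduating_share(yearly_attrition):
    """Fraction of an entering class remaining after each year's attrition."""
    share = 1.0
    for rate in yearly_attrition:
        share *= (1.0 - rate)
    return share

# Assumed per-year attrition rates: ~7% 1L, ~2.5% 2L, ~1.5% 3L.
print(round(graduating_share([0.07, 0.025, 0.015]), 3))  # -> 0.893
```

Compounding these rates leaves about 89% of the entering class at graduation, consistent with the roughly 90% noted above.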

(I am very grateful for the insights of Bernie Burk and Scott Norberg on earlier drafts of this blog posting.)

October 2, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (1)

Thursday, September 17, 2015

2015 Median MBE Scaled Score Arguably Declines Less Than Expected

Natalie Kitroeff at Bloomberg published earlier today an article with the first release of the median MBE scaled score for the July 2015 Bar Exam -- 139.9 -- a decline of 1.6 points from the July 2014 score of 141.5. 

While this represents a continuation of the downward trend that started last year (when the median MBE fell a historic 2.8 points from 144.3 in July 2013), the result is nonetheless somewhat surprising. 

The historic decline in the median MBE scaled score between 2013 and 2014 corresponded to a modest decline in the LSAT score profile of the entering classes between 2010 and 2011. 

As I discussed in my December blog posting on changing compositions of the entering classes since 2010, however, the decline in LSAT score profile of the entering classes between 2011 and 2012 was much more pronounced than the decline between 2010 and 2011.  Thus, one might have expected that the decline in the median MBE scaled score for 2015 would have been even larger than the decline between 2013 and 2014. 

But instead, the decline was only 1.6 points, just slightly more than half of the 2.8 point decline of the previous year.

Why would a demonstrably greater decline in the LSAT profile of the entering class between 2011 and 2012 (compared with 2010-2011) yield a manifestly smaller decline in the median MBE scaled score between 2014 and 2015 (compared with 2013-2014)?

This likely will remain a mystery for a long time, but my guess is that the ExamSoft debacle resulted in an aberrationally large decline in the median MBE scaled score between 2013 and 2014, such that the corresponding decline between 2014 and 2015 seems disproportionately small by comparison.

Over on Law School Cafe, Debby Merritt has a very good description of the different factors that likely have impacted bar passage performance in July 2015.

Derek Muller has collected bar passage results for the several states that have released at least some results so far and has posted them on his Excess of Democracy blog.  Focusing only on overall bar passage rates, two states are "up" (North Dakota (+6%) and Iowa (+5%)), six are down between 1-5% (Missouri (-1%), Washington (-1%), Montana (-2%), Kansas (-3%), North Carolina (-4%), and West Virginia (-5%)), and four are down double digits (Mississippi (-27%), New Mexico (-12%), Oklahoma (-11%), and Wisconsin (-10%)).  (Last year 21 states were down 6% or more on first-time bar passage and six of those were down 10% or more.)

September 17, 2015 in Blog posts worth reading, Data on legal education, Data on the profession | Permalink | Comments (0)

Wednesday, August 12, 2015

Of Transfers and Law-School-Funded Positions

1.      Many Elite Law Schools with Large Numbers of Transfers also Have Large Numbers of Law-School-Funded Positions

Several weeks ago, I participated in two separate conversations.  One was about when law-school-funded positions should be categorized as full-time, long-term, bar-passage-required (FLB) positions and one was about transfer students.  This prompted me to compare those schools that are “big players” in law-school-funded positions with those schools that are big players in the “transfer” market.  Interestingly, as shown in the chart below, there is a significant amount of overlap.

For the Class of 2014, of the 15 law schools with the most graduates in FLB positions, ten had a significant (net) number of transfer students in the summer of 2012.  (The chart is sorted based on 2014 FLB positions (in bold).  To provide context, the chart also includes the 2011 net transfer data and 2013 law-school-funded FLB data for these 10 schools.)

Law School            2011 Net    2013 Law-School-   2012 Net    2014 Law-School-
                      Transfers   Funded FLB         Transfers   Funded FLB
GEORGE WASHINGTON        94            88               46           78
GEORGETOWN               63            73               75           64
EMORY                     8            62               32           52
NYU                      55            42               50           36
MICHIGAN                 36             3               28           33
SOUTHERN CALIFORNIA      20            10               28           31
UCLA                     35            31               33           31
COLUMBIA                 44            29               57           31
HARVARD                  30            11               31           26
BERKELEY                 12            25               38           20
Total                   397           374              418          402

 

Note that in both 2013 and 2014, six of the ten schools had more transfers than law-school-funded positions, suggesting that had they taken fewer transfers they might not have needed to provide as many law-school-funded positions. Phrased differently, this data suggests that with the transfer students, these law schools have too many graduates relative to the number of jobs the market can provide.

2.      Adjusting to the Employment Market or Continuing to Attract Transfers and Provide Law-School-Funded Positions?

One might expect that a natural response to this “mismatch” between the number of graduates and the number of meaningful employment opportunities provided by the market would be to have fewer graduates (and fewer law-school-funded positions).  Indeed, for many of the schools in the chart above, the simplest way to do this would involve not accepting any transfer students (or accepting very few transfer students).  The first-year enrollment at these schools appears to be fairly-well calibrated with the number of meaningful employment opportunities provided by the market.  Of course, this would mean a significant loss of revenue.

But what happened at these ten law schools in the summer of 2013 and the summer 2014 with respect to transfer students? As shown in the chart below, almost all have continued to take large numbers of transfer students.  With knowledge that a not insignificant percentage of their graduates need the support of law-school-funded positions because they can’t find market positions, these law schools continue to take large numbers of transfers.  Indeed, the total number of net transfers at these ten law schools is even higher in 2013 and 2014 than in 2011 and 2012.

 

 

Law School            2014 Net Transfers   2013 Net Transfers
GEORGE WASHINGTON             77                   71
GEORGETOWN                   106                  115
EMORY                         47                   69
NYU                           48                   46
MICHIGAN                      14                   20
SOUTHERN CALIFORNIA           27                   34
UCLA                          33                   36
COLUMBIA                      41                   50
HARVARD                       31                   34
BERKELEY                      53                   24
Total                        477                  499

 

3.   Why are These Schools Continuing to be Big Players in the Transfer Market and in Providing Law-School-Funded Jobs and Why Aren’t Other Schools Doing This as Well?

Many elite law schools are participating heavily in the transfer market and in providing law-school-funded jobs because they can and because it makes financial sense to do so.

As a general matter, only relatively elite law schools are able to attract large numbers of transfer students willing to pay $50,000 per year in tuition.  (This assumes that most transfers are paying full tuition. There is very little information available about scholarships in the transfer market, but anecdotes suggest that scholarships are uncommon.)  By taking large numbers of transfers, these schools generate revenue that funds general operations AND enables the school to fund post-graduate employment opportunities for a significant number of graduates.  According to NALP, most graduates in law-school-funded positions receive salaries of roughly $15,000-$30,000 per year.  Even if they have as many law-school-funded positions as they do transfers, the schools still net $70,000 to $88,000 per transfer student over the second and third year of law school even after accounting for the salaries for law-school-funded positions. (To be fair, some modest percentage of law-school-funded positions at several of these law schools may be great opportunities that are highly competitive and pay a salary comparable to a market salary – in excess of $40,000 per year.  Some of these may be public interest opportunities that some students find particularly attractive.  But the proliferation of law school funded positions (having grown from just over 500 in 2012 to more than 800 in 2014), with most of the growth occurring at relatively elite law schools, suggests that many of these positions do not fit the profile described in the preceding two sentences.)
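The back-of-the-envelope revenue arithmetic above can be sketched as follows, assuming full tuition of $50,000 per year and funded-position salaries of $15,000-$30,000 (the round figures mentioned above), and ignoring scholarships and overhead. With these inputs the net lands at $70,000-$85,000 per transfer, roughly the range cited above (the exact inputs behind the $88,000 upper bound are not spelled out in the text).

```python
def net_per_transfer(tuition_per_year, funded_salary, years=2):
    """Tuition revenue over the 2L and 3L years, minus one
    law-school-funded-position salary paid after graduation."""
    return tuition_per_year * years - funded_salary

# Assumed inputs: ~$50,000/yr tuition; funded positions pay ~$15,000-$30,000.
low = net_per_transfer(50_000, 30_000)   # worst case for the school
high = net_per_transfer(50_000, 15_000)  # best case for the school
print(low, high)  # -> 70000 85000
```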

Other schools would love to do this, but most simply don’t have the ability to attract significant numbers of transfer students.  Moreover, in the present legal education environment with declining enrollment at most schools, many law schools are running a deficit, and simply can’t afford to invest money in law-school-funded positions for their graduates.

Notably, up until this year, this effort was aided by the reporting of law-school-funded jobs as if they were the same as jobs provided by the market.  A school with law-school-funded positions that otherwise met the definition of FLB positions could report a higher percentage of its graduates in such positions.  This minimized the extent to which less than robust employment results might erode the schools’ ability to attract students and has allowed these elite schools to continue to attract large numbers of relatively highly-credentialed entering students (and transfers) along with the revenue they bring to the school.  For the Class of 2015, however, these law-school-funded positions will be reported separately from FLB positions provided by the market.

4.      What Questions Might this Raise for Students?

Students considering transferring to one of these elite schools should ask two questions: 1) What percentage of law-school-funded positions went to transfer students? and 2) How do employment outcomes for transfer students compare with employment outcomes for students who began at the school as first-years?  (Even with the increased attention on transparency with respect to employment outcomes, one data point not presently collected relates to employment outcomes for transfer students.)  This isn’t to suggest that all transfers end up in law-school-funded positions.  Some transfer students may outperform some of the students who began at a law school as first-years, both in terms of academic performance and in terms of relationship skills, and may outcompete them for market employment opportunities.  But students considering transferring might want to assess whether their employment prospects really will be better at the destination school than if they remained where they are, particularly if they are near the top of the class at their current school.

Students who had matriculated as first-years at one of these elite law schools might want to ask the law school administration how and why having a large number of transfers is a good thing for those who matriculated as first-years.  Having the additional revenue might enhance the educational experience in some way, but having significantly more students competing for jobs would seem to be an unnecessary challenge. 

5.      Conclusion

The data on transfers in 2013 and 2014 suggests that at many elite law schools, there will continue to be more graduates than jobs provided by the market.  As a result, these law schools are likely to continue to provide law-school-funded positions for some number of their graduates. Indeed, the prospect of law-school-funded positions as a fall-back option if a market position is not available might provide some solace for students, including transfer students, at these elite law schools. 

Nonetheless, there is a further ripple effect.  With dozens of graduates from these elite law schools in law-school-funded positions looking for market jobs, it makes it even more challenging for the next year’s graduates from these elite schools to find market jobs and almost assures that many graduates will still need the support of law-school-funded positions in the coming years.

(I am grateful to Bernie Burk and others for helpful comments on earlier drafts of this posting.)

August 12, 2015 in Data on legal education, Data on the profession | Permalink | Comments (0)

Thursday, August 6, 2015

How is the entry-level legal job market in Australia?

Not good.  There are more law graduates than jobs, yet law schools are making matters worse by admitting more students in order to generate subsidies for other parts of the university. That is the basic charge of the Australian Law Students Association (ALSA), according to this story in the Lawyers Weekly, a publication that covers the business of law in Australia.

Legal education in Australia is very different from legal education in the U.S., yet the dynamics of the two entry-level markets seem to be converging.  Law has historically been an undergraduate degree in Australia (the LLB), but in recent years the JD has been added as a new and more prestigious route into the profession. Here is the statement of an ALSA spokesperson, based on a recent survey of the ALSA membership.

ALSA are of the position that there is still an oversupply of graduates because of the increasing sizes of law schools and the duplication in the number of law schools across the country. ...

Many who have undertaken the Juris Doctor particularly expressed concerns in their survey responses, highlighting that they undertook the postgraduate law degree to further their job prospects. Instead, they are facing the worrying reality that there are fewer jobs available for law graduates as well as the fact that they are completing their degrees with a sizeable student debt.

The article then goes on to describe growing law student anxiety over employment and student loan debt.  Wow, different system but a very similar result.  

One of the advantages of the Australian LLB degree is that it is often combined with another undergraduate degree, typically by adding one year of additional study.  As a result, many LLBs don't go on to qualify for practice, but the legal training probably augments their worldly knowledge and critical thinking skills.  But alas, the Australians are starting to dilute their extremely generous higher education subsidies -- we are just much further down that road. Further, the true undercurrent here is the growing insecurity facing virtually all knowledge workers, Australian or US.  Legal education is just the bleeding edge of this problem.

August 6, 2015 in Current events, Data on legal education, Data on the profession, New and Noteworthy | Permalink | Comments (0)

Wednesday, July 22, 2015

What is more important for lawyers: where you go to law school or what you learned? (Part II)

If you're trying to maximize the financial value of an undergraduate degree, it is better to bet on course of study than college prestige.  Indeed, prestige is largely irrelevant to those who major in engineering, computer science, or math.  In contrast, prestige does matter for art & humanities grads, albeit the financial returns are significantly lower than their tech counterparts.  

These are some of the takeaways from Part I of this blog post. Part I also presented data showing that law is a mix of both: financial returns have been high (cf. "red" tech majors) and prestige matters (cf. "blue" arts & humanities crowd).  

The goal of Part II is to address the question of whether the pattern of high earnings/prestige sensitivity will change in the future. I think the answer to this question is yes, albeit most readers would agree that whether law will change is a less interesting and important question than how it will change.  Speed of change is also relevant because, as humans, we want to know if the change is going to affect us or just the next generation of lawyers.

Shifts in the Legal Market

There are a lot of changes occurring in the legal market, and those changes are altering historical patterns of how legal services are being sold and delivered to clients. In the past, I have thrown around the term structural change, yet not with any clear definition.  To advance the conversation, I need to correct that lack of precision. 

In economics, there is a literature on structural change as applied to national or regional economies (e.g. moving from a developing nation to an industrial nation; or moving from an industrial to a knowledge-based economy).  Investors also focus on structural change within a specific industry because, obviously, large changes can affect investor returns.  When I have used the term structural change on this blog, it has been much closer to investor conceptions.  Investopedia offers a useful definition even if it's somewhat colloquial: 

Definition of 'structural change': An economic condition that occurs when an industry or market changes how it functions or operates. A structural change will shift the parameters of an entity, which can be represented by significant changes in time series data.

Under this definition, the legal industry is certainly undergoing structural change.  The proportion of law graduates getting a job in private practice has been on the decline for 30 years; over the last 35 years, the average age of the licensed lawyer has climbed from 39 to 49 despite record numbers of new law school graduates; the proportion of associates to partners has plummeted since the late 1980s.  See Is the Legal Profession Showing its Age? LWB, October 12, 2014.  Since the early 2000s, long before the great recession, associate-level hiring has been cut in half. See Sea Change in the Legal Market, NALP Bulletin, August 2013.

Likewise, among consumers of legal services, there is a lot of evidence to suggest that lower- and middle-class citizens can't afford a lawyer to solve life's most basic legal problems, thus leading to a glut of pro se litigants in state courts and many more who simply go without things like contracts and wills.  This troubling trend line was obscured by a boom in corporate legal practice, albeit now even rich corporations have become more sensitive to legal costs -- the sheer volume and complexity of legal need is outstripping their budgets.  In response to the lag in lawyer productivity and innovation, a host of investor-backed enterprises are now elbowing their way into the legal industry.  See A Counterpoint to "the most robust legal market that ever existed in this country," LWB, March 17, 2014.  

The impact of all this change -- structural or otherwise -- is now being felt by law schools. Applicants are down to levels not seen since the 1970s, yet we have dozens more law schools. It has been said by many that law schools are losing money, albeit we have zero data to quantify the problem.  Based on my knowledge of my own law school and several others I am close to, I am comfortable saying that we have real changes afoot that affect how the legal education market "functions or operates."

There is a sense among many lawyers and legal academics that the legal world changed after 2008. None of the "structural" changes I cite above are pegged in any way to the events of that year.  

What did change in 2008, however, was the national conversation on the legal industry, partially due to the news coverage of the mass law firm layoffs, partially due to important books by Richard Susskind and later Brian Tamanaha and Steve Harper, and partially due to a robust blogosphere.  This change in conversation emboldened corporate legal departments to aggressively use their new found market power, with "worthless" young associates getting hit the hardest.  This new conversation in turn exposed some of the risks of attending law school, which affected law school demand.  But alas, this was all fallout from deeper shifts in the market that were building for decades. Let's not blame the messengers.

Dimensions of Change

I am confident that the future of law is going to be a lot different than its past. But I want to make sure I break these changes into more discrete, digestible parts because (a) multiple stakeholders are affected, and (b) the drivers of change are coming from multiple directions.

Dimension 1: basic supply and demand for legal education

To unpack my point regarding multiple dimensions, let's start with legal education. Some of the challenges facing law schools today are entirely within the four corners of our own house.  Yet, legal education also has challenges (and opportunities) that arise from our connection to the broader legal industry.  This can be illustrated by looking at the relationship between the cost of legal education (which law schools control, although we may blame US News or the ABA) and entry level salaries (which are driven largely by the vagaries of a client-driven market).  

The chart below looks at these factors.  My proxy for cost is average student debt (public and private law schools) supplied by the ABA.  My income variables are median entry level salaries from NALP for law firm jobs and all entry level jobs.  2002 is the first year where I have all the requisite data.  But here is my twist:  I plot debt against entry-level salary based on percentage change since 2002.  

[Chart: percentage change since 2002 in average law student debt and in median entry-level salaries (law firm jobs and all jobs)]

If a business nearly doubles its price during the same period when customer income is flat, demand is going to fall.  Thus, the sluggish entry-level market presents a difficult problem for legal education.  Sure, we can point to the favorable statistics from the AJD or the premium that a JD has historically conferred on lifetime earnings, but law professors are not the people who are signing the loan papers.  The chart above documents a changing risk/reward tradeoff.  To use the frame of Part I, the red dots are sinking into the blue dot territory, or at least that is the way prospective students are likely to view things.
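The indexing used in the chart (percentage change relative to the 2002 base year) can be sketched as follows. The debt series here is a made-up placeholder for illustration, not the actual ABA figures.

```python
def pct_change_since_base(series):
    """Convert a time series to percentage change relative to its first value."""
    base = series[0]
    return [100.0 * (v - base) / base for v in series]

# Hypothetical average-debt series, 2002 onward (NOT the real ABA numbers).
debt = [80_000, 100_000, 140_000]
print(pct_change_since_base(debt))  # -> [0.0, 25.0, 75.0]
```

Plotting each series this way puts debt and salaries on a common scale, which is what makes the divergence between the two lines visible.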

Fortunately, smaller law school classes are going to be a partial corrective to low entry-level salaries.  The biggest law school class on record entered in the fall of 2010 (52,488); in 2014, the entering class had shrunk by over 27% (37,942). When entry-level supply is reduced by 25+%, upward pressure on salaries will build.  Yet, the composition of the legal economy and the nature of legal work is clearly changing.  Further, the rate of absorption of law school graduates into the licensed bar has been slowing for decades.  See Is the Legal Profession Showing its Age? LWB, October 12, 2014. It would be foolhardy to believe that time and fiscal austerity alone are going to solve our business problems. Instead, we need to better understand our role as suppliers to a labor market.

Dimension 2:  The content of legal education

The content of legal education is not necessarily fixed or static.  We could change the content, thus affecting how the market responds.  

To provide a simple example, one of my students is starting work this fall at Kirkland & Ellis.  From a financial perspective, this is a good employment outcome.  He will be moving to Chicago with his girlfriend, who just received her MS in Information Systems from IU's Kelley School of Business.  The MS from Kelley is a very "red" degree.  It can also be completed in one year (30 credit hours).  Well before she graduated, she had competing offers from PwC and Deloitte, both in the $80,000 range.  For many Indiana Law students, an ideal post-grad outcome would be $80K in Chicago at an employer who provides challenging work and high-quality training.  Yet, my student's girlfriend got this ideal outcome in 1/3 the time and likely 1/2 the cost of an Indiana Law grad.  

Perhaps we should consider cross-pollinating these disciplines. A huge portion of the legal profession's economic challenges is attributable to flat lawyer productivity -- customers are struggling to pay for solutions to their legal needs.  Information systems are a huge part of law's productivity puzzle.  Below is a chart I use in many of my presentations on the legal industry.  The chart summarizes the emerging legal ecosystem by plotting the Heinz-Laumann two-hemisphere model against Richard Susskind's bespoke-to-commodity continuum.

[Chart: the emerging legal ecosystem, plotting the Heinz-Laumann two-hemisphere model against Susskind's bespoke-to-commodity continuum]

The key takeaway from this diagram is that the largest area of growth is going to be in the multidisciplinary green zone -- the legally trained working shoulder-to-shoulder with those skilled in information systems, statistics, software development, and computational linguistics, to name but a few.  These are "red" disciplines.  Do law schools want to be part of this movement?  Let me ask this another way -- do law schools want to be relevant to the bulk of the legal market that needs to be rationalized in order to maintain its affordability? Harvard grads will have options on Wall Street for the foreseeable future.  But 98% of law schools operate in a different market.  Further, some HLS grads, or students who might qualify for admission to Harvard, might prefer the big upside rewards that are only available in the green zone.  In short, a new hierarchy is emerging in law that is still very much up for grabs.

If an academic wants to better understand the rapidly changing nature of legal work, I would urge them to visit a large legal department with a substantial legal operations ("legal ops") staff.  These are the professionals who have been empowered by general counsel to find ways to drive up quality and drive down cost using data, process, and technology.  These are the folks who are making build-versus-buy decisions, putting pressure on law firms to innovate in order to hang on to legal work, and experimenting with NewLaw legal vendors. 

I am finishing up a story on legal ops professionals for the ABA Journal.  (By the way, legal ops exists in law firms as well as legal departments and green zone legal vendors. The role is most developed, however, in legal departments.)  My editor flagged the issue that virtually all of the legal ops people in the story did not graduate from prestigious law schools (or any law school).

My only response is that legal operations people have specialized skills and knowledge (often "red" but sometimes involving EQ) that others lack; without these skills, they can't do the job.  Legal ops people live in a world of outputs and metrics.  For example, are legal expenses and settlement amounts trending down over time -- yes or no? If so, by how much?  How much internal staff time does it take to negotiate a revenue contract? How much of this process can be automated? What will it take to get our staff to accept the new system?

As these examples show, a legal ops person is typically going to be evaluated based on measurable outputs -- do they get results? Where someone went to law school is an input that is likely irrelevant to the question.  The only qualifier is whether the curriculum of that school provided valuable, specialized domain knowledge -- most likely non-legal red skills but also skills related to teams, communication, and collaboration. 

Dimension 3:  The value of pedigree to the customer 

Law has historically been what economists call a “credence good.”  This means that a layperson has a difficult time assessing quality.  As a result, proxies for quality, such as pedigree or prestige, have historically been very important when hiring a lawyer or law firm.  

One of the reasons that the field of legal operations is gaining momentum is that it is creating tools and systems that enable clients to look past credentials to obtain information on things they really care about, such as cost, outcome, and speed of delivery. There are now companies coming into existence that are gathering data on lawyers' win-loss rates. See Another Example of Using Big Data to Improve Odds of Winning in Court, LWB, April 12, 2015.  Sure, apples-to-apples comparisons are very difficult to make -- every case is unique in some respect. But the amount of money at stake is large enough that the data challenges will be surmounted.  When that day arrives, we won't opine on the value of pedigree to legal outcomes; we'll just calculate it. More significantly, clients focused on outcomes will change their buying patterns.  Early returns I have seen suggest that the value of pedigree to legal outcomes may be close to negligible.

Do any of us care where the engineers who designed our smart phones went to college? Not really. We just care how well the smart phone works. 

In this respect, the future of law is likely headed in the direction of Google (a pure red company).  In the early days, the founders of Google favored grads of Caltech, Stanford and Berkeley.  But over time, the company learned that prestige of graduate school was a poor predictor of job success. Because Google lives and dies by its outputs, the company changed its hiring model to attract the most qualified engineers.  See George Anders, The Rare Find: How Great Talent Stands Out 1-5 (2012) (telling the story of how data changed the attitudes of Google founders regarding elite credentials and altered the Google hiring model).

I have lived long enough to know that the changes I describe above are not necessarily going to be welcomed by many lawyers and law professors.  If a group benefits from a lifelong presumption of merit, it is natural that the group will resist evidence that the presumption is not fully warranted. Indeed, much of the skepticism will be rooted in subconscious emotion.  If the presumption is dashed, those of us in the elite crowd will have to spend our days competing with others and proving ourselves, or even worse, watching our kids soldier through it.  We have little to gain and a lot to lose in the world we are heading into.  Yet, behind the Rawlsian veil of ignorance, how can we complain?

So with the red-blue crosscurrents, is law school still worth the investment?

That is a relevant and reasonable question that many young people are contemplating.  I will offer my opinion, but markets are bound to follow their own logic. 

This is a time of enormous uncertainty for young people. Education clearly opens doors, but tuition is going up much faster than earnings.  Further, competition among knowledge workers is becoming more global, which is a check on wages.  Of course, if you don't invest in education, what are your options?

I am generally on the side of Michael Simkovic and Frank McIntyre that the education provided by a law degree, on average, significantly increases lifetime earnings.  See The Economic Value of a Law Degree (April 2013).  How could it not?  The law is too interconnected with every facet of society not to, on average, enhance the law grad's critical thinking skills. Nearly 15 years out of law school, I regularly use what I learned at Chicago Law to solve problems and communicate solutions, particularly in my applied research work with law firms and legal departments. While my Chicago Law credential has value independent of the skills and knowledge I obtained (the red AJD bar chart in Part I strongly suggests that), I can't deny the additional value of the actual skills and knowledge I obtained to solve real-world business problems. It's been substantial.

In general, I also agree with Deborah Jones Merritt that there is significant evidence that the entry-level market for lawyers is weak and oversaturated.  See What Happened to the Class of 2010? Empirical Evidence of Structural Change in the Legal Profession (April 2015).  The class of 2010 is not faring as well as the class of 2000.  Indeed, the lead economist for PayScale, Katie Bardaro, recently noted that wages are stagnating in many fields, but especially in the legal profession. "More law schools are graduating people than there are jobs for them...There’s an over-saturated labor market right now. That works to drive down the pay rate.” See Susan Adams, The Law Schools Whose Grads Earn the Biggest Paychecks in 2014, Forbes, Mar. 14, 2014.

In the face of these stiff headwinds, I think law schools have an opportunity to pack more value into three years of education. See Dimension 2 above.  To be more specific, if you are a protégé of Dan Katz at Chicago-Kent, you will have a lot of career options. Ron Staudt, also at Chicago-Kent, has quietly built a pipeline into the law and technology space.  Oliver Goodenough and his colleagues at Vermont Law are making rapid progress with a tech law curriculum.  And at Georgetown Law, Tanina Rostain and Ed Walters (CEO of Fastcase) provide courses that are cutting edge.

But absent these types of future-oriented instruction, what is the value of a JD degree as it is commonly taught today? That value is clearly positive; I would even call it high.  But whether the value is sufficient to cover the cost of attendance is likely to vary from law grad to law grad.  Lord knows, in a world of variable tuition based on merit scholarships -- and merit scholarships that go away after the 1L year -- the swing in cost can be $250K plus interest.

What is killing law school applications these days is the lack of near certainty among prospective students that the time and expense of law school will pay off.  The world looks different than it did in the fall of 1997, when the vast majority of the AJD respondents entered law school. Tuition and debt loads are higher, and high-paying entry-level jobs are harder to obtain.

So what is the solution?  For students, it's to bargain shop for law schools, which is bad news for law schools.  For law schools, it's to add more value to an already valuable degree.  Some of that value will come in the form of red technical skills that will make lawyers more productive.  In turn, this will prime demand for more legal products and services.

July 22, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Legal Departments, Structural change | Permalink | Comments (0)