Monday, January 18, 2016

Changes in Composition of the LSAT Profiles of Matriculants and Law Schools Between 2010 and 2015

In late December 2014, I posted a blog entry analyzing how the distribution of matriculants across LSAT categories had changed since 2010, based on the LSAC’s National Decision Profiles and on law schools’ 50th percentile and 25th percentile LSATs across ranges of LSAT scores. With the LSAC’s recent release of the 2014-15 National Decision Profile and the ABA’s recent release of Standard 509 data, I am posting this update with the 2015 data.

At one level, this is a story that has already become well understood over the last year since my blog posting, with much discussion of the relationship between declining LSAT profiles and declining median MBE scores and bar passage rates. This 2015 information indicates that the decline in the LSAT profiles of matriculants and of law schools has continued, although with some moderation.

Given that the LSAT profiles of matriculants and of law schools for fall 2013, fall 2014 and fall 2015 are less robust than those for fall 2011 and fall 2012 (the classes that graduated in 2014 and 2015, respectively), one can anticipate that the declines in median MBE scaled scores and corresponding bar passage rates seen in 2014 and 2015 will continue in July 2016, 2017 and 2018, absent increases in attrition (I discussed attrition rates in a blog posting in October), significant improvement in academic support programs at law schools, or improved bar preparation efforts on the part of graduates.

Tracking Changes Based on LSAC’s National Decision Profiles – 2010-2015

The following discussion summarizes data in the LSAC’s National Decision Profiles from the 2009-10 admission cycle (fall 2010) through the 2014-15 admission cycle (fall 2015).

Let’s start with the big picture. If you take the matriculants each year and break them into three broad LSAT categories – 160+, 150-159, and <150 – the following chart and graph show the changes in percentages of matriculants in each of these categories over the last six years.

Change in Percentage of Matriculants in LSAT Categories – 2010-2015

 

| LSAT Category | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|---|
| <150 | 14.2 | 15.7 | 19.3 | 22.5 | 23.0 | 23.8 |
| 150-159 | 45.0 | 45.3 | 44.3 | 44.1 | 43.6 | 44.2 |
| 160+ | 40.8 | 39.0 | 36.3 | 33.4 | 33.5 | 32.0 |
[Graph: Change in Percentage of Matriculants in LSAT Categories – 2010-2015]

Notably, this chart and graph show almost no change in the “middle” category – 150-159 (blue, dropping only from 45% to 44.2%) – with most of the change at 160+ (green, decreasing from 40.8% to 32%) and at <150 (red, increasing from 14.2% to 23.8%). This chart and graph also show some stabilization between 2013 and 2014, followed by a modest decline in 2015 in the percentage of students with LSATs of 160+ and a modest increase in the percentage of students with LSATs of <150.

While I think this tells the story pretty clearly, for those interested in more detail, the following charts provide a more granular analysis.

Changes in LSAT Distributions of Matriculants – 2010-2015

 

| LSAT Range | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 | Change in # | % Change in # |
|---|---|---|---|---|---|---|---|---|
| 170+ | 3,635 | 3,330 | 2,788 | 2,072 | 2,248 | 2,022 | (1,613) | -44.4% |
| 165-169 | 5,842 | 5,622 | 4,783 | 4,082 | 3,941 | 3,483 | (2,359) | -40.4% |
| 160-164 | 10,666 | 8,678 | 7,281 | 6,442 | 6,010 | 5,743 | (3,923) | -36.8% |
| 155-159 | 11,570 | 10,657 | 9,700 | 8,459 | 7,935 | 7,780 | (3,790) | -32.8% |
| 150-154 | 10,626 | 9,885 | 8,444 | 8,163 | 7,934 | 7,805 | (1,821) | -17.1% |
| 145-149 | 5,131 | 5,196 | 5,334 | 5,541 | 5,158 | 5,274 | 143 | 2.8% |
| <145 | 1,869 | 1,888 | 2,564 | 2,930 | 3,203 | 3,084 | 1,215 | 65% |
| Total | 49,339 | 45,256 | 40,894 | 37,689 | 36,429 | 35,191 | | |

Note that in terms of the percentage change in the number of matriculants in each LSAT category, the four highest LSAT categories are all down at least 30% since 2010, with 165-169 and 170+ down over 40%, while the two lowest LSAT categories are up, with <145 being up over 60%. 
[Line graph: Number of Matriculants by LSAT Category – 2010-2015, with 165+ and <150 combined]

Note that in the line graph above, the top two LSAT categories have been combined into 165+ and the bottom two LSAT categories have been combined into <150. Perhaps most significantly, in 2010 the <150 group, with 7,000 students, was more than 2,400 students smaller than the next smallest category (165+, with 9,477) and more than 4,500 students smaller than the largest category (155-159, with 11,570). By 2015, however, the <150 category had become the largest category, with 8,358 students, more than 500 larger than the second-largest category (150-154, with 7,805) and more than 2,800 larger than the smallest category (165+, with only 5,505). Moreover, roughly 89% of the growth in the <150 category was in the <145 category (1,215 of the 1,358 additional people in the <150 category were in the <145 category).
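
For readers who want to reproduce this category arithmetic, here is a minimal Python sketch using the counts from the table above (the variable names are mine, purely for illustration):

```python
# Reproducing the combined-category arithmetic from the counts table above.
counts_2010 = {"<145": 1869, "145-149": 5131, "165-169": 5842, "170+": 3635}
counts_2015 = {"<145": 3084, "145-149": 5274, "165-169": 3483, "170+": 2022}

under_150_2010 = counts_2010["<145"] + counts_2010["145-149"]  # 7,000
under_150_2015 = counts_2015["<145"] + counts_2015["145-149"]  # 8,358
over_164_2010  = counts_2010["165-169"] + counts_2010["170+"]  # 9,477
over_164_2015  = counts_2015["165-169"] + counts_2015["170+"]  # 5,505

growth_under_150 = under_150_2015 - under_150_2010             # 1,358
growth_under_145 = counts_2015["<145"] - counts_2010["<145"]   # 1,215
print(f"{growth_under_145 / growth_under_150:.0%}")            # ~89%
```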

Changes in Percentage of Matriculants in LSAT Ranges – 2010-2015

 

| LSAT Range | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 | % Chg in % |
|---|---|---|---|---|---|---|---|
| >169 | 7.4 | 7.4 | 6.8 | 5.5 | 6.2 | 5.7 | -23% |
| 165-169 | 11.8 | 12.4 | 11.7 | 10.8 | 10.8 | 9.9 | -16.1% |
| 160-164 | 21.6 | 19.2 | 17.8 | 17.1 | 16.5 | 16.3 | -24.5% |
| 155-159 | 23.5 | 23.5 | 23.7 | 22.4 | 21.8 | 22.1 | -6% |
| 150-154 | 21.5 | 21.8 | 20.6 | 21.7 | 21.8 | 22.2 | 3.2% |
| 145-149 | 10.4 | 11.5 | 13.0 | 14.7 | 14.2 | 15.0 | 44.2% |
| <145 | 3.8 | 4.2 | 6.3 | 7.8 | 8.8 | 8.8 | 132% |

In terms of the “composition” of the class, i.e., the percentage of matriculants in each LSAT category, we see significant declines of more than 20% at 160-164 and >169, along with significant increases of more than 40% at 145-149 and over 100% at <145.
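
As a quick illustration of how the “% Chg in %” column is computed, here is a short sketch using the shares from the table above:

```python
# Sketch of the "% Chg in %" column: the relative change in each band's
# share of matriculants between 2010 and 2015.
share_2010 = {">169": 7.4, "165-169": 11.8, "160-164": 21.6, "155-159": 23.5,
              "150-154": 21.5, "145-149": 10.4, "<145": 3.8}
share_2015 = {">169": 5.7, "165-169": 9.9, "160-164": 16.3, "155-159": 22.1,
              "150-154": 22.2, "145-149": 15.0, "<145": 8.8}

for band in share_2010:
    pct_chg = (share_2015[band] - share_2010[band]) / share_2010[band] * 100
    print(f"{band}: {pct_chg:+.1f}%")  # e.g. >169: -23.0%; <145: +131.6%
```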

Tracking Changes in Law Schools by Looking at the Distribution of 50th Percentile LSAT Scores Across Six LSAT Categories

Obviously, this change in the composition of the entering class has resulted in corresponding changes in the LSAT profiles of law schools. Based on the data law schools reported in their Standard 509 Reports from 2010 to 2015, the chart below lists the numbers of law schools reporting a 50th percentile LSAT within certain LSAT ranges. (This chart excludes law schools in Puerto Rico and provisionally-approved law schools.)

Number of Law Schools with a 50th Percentile LSAT in Six LSAT Categories – 2010-2015

 

| 50th Percentile LSAT | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|---|
| 165+ | 30 | 31 | 26 | 23 | 21 | 21 |
| 160-164 | 47 | 41 | 39 | 31 | 29 | 28 |
| 155-159 | 59 | 57 | 56 | 53 | 51 | 48 |
| 150-154 | 50 | 52 | 53 | 56 | 59 | 59 |
| 145-149 | 9 | 14 | 22 | 28 | 29 | 33 |
| <145 | 0 | 1 | 0 | 5 | 7 | 7 |
| Total | 195 | 196 | 196 | 196 | 196 | 196 |

The table above pretty clearly demonstrates the changes that have taken place since 2010, with declines in the number of law schools with a 50th percentile LSAT in higher LSAT categories and increases in the number of law schools with a 50th percentile LSAT in the lower LSAT categories, although 2015 saw only modest changes from 2014 at 160-164 (down 1), at 155-159 (down 3) and at 145-149 (up 4). 

[Chart: Number of Law Schools with a 50th Percentile LSAT of 155+ vs. 154 or Lower – 2010-2015]

As shown in the chart above, the number of law schools with a 50th percentile LSAT of 155 or higher has declined from 136 to 97. By contrast, the number of law schools with a 50th percentile LSAT of 154 or lower has increased from 59 to 99. In 2010, therefore, there were more than twice as many law schools with a 50th percentile LSAT of 155 or higher as compared with the number with a 50th percentile LSAT of 154 or lower (136 and 59, respectively), but as of 2015, those numbers were nearly identical (97 and 99, respectively).

The “mode” in 2010 was in the 155-159 category, with nearly 60 law schools, but by 2014, the “mode” had shifted to the 150-154 category with nearly 60 law schools.

Perhaps most pronounced is the shift in the upper and lower ranges. As shown in the chart below, the number of law schools with a 50th percentile LSAT of 160 or higher has dropped by more than one-third, from 77 to 49, while the number of law schools with a 50th percentile LSAT of 149 or lower has more than quadrupled from 9 to 40. In 2010, there were only three law schools with a 50th percentile LSAT of 145 or 146; as of 2015, there were 15 law schools with a 50th percentile LSAT of 146 or lower, of which five were at 143 or lower, with the two lowest being 142 and 141.
[Charts: Number of Law Schools with a 50th Percentile LSAT of 160+ vs. 149 or Lower – 2010-2015]
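
A short Python sketch, using the counts from the 50th percentile table above, that reproduces these groupings (the helper function is purely illustrative):

```python
# Grouping the 50th-percentile counts into the splits discussed in the text.
# Data keyed as category -> (2010 count, 2015 count).
schools = {
    "165+": (30, 21), "160-164": (47, 28), "155-159": (59, 48),
    "150-154": (50, 59), "145-149": (9, 33), "<145": (0, 7),
}

def total(categories, year_index):  # 0 = 2010, 1 = 2015
    return sum(schools[c][year_index] for c in categories)

high = ["165+", "160-164", "155-159"]   # 50th percentile LSAT of 155+
low = ["150-154", "145-149", "<145"]    # 50th percentile LSAT of 154 or lower
print(total(high, 0), "->", total(high, 1))          # 136 -> 97
print(total(low, 0), "->", total(low, 1))            # 59 -> 99
print(total(high[:2], 0), "->", total(high[:2], 1))  # 77 -> 49 (160 or higher)
print(total(low[1:], 0), "->", total(low[1:], 1))    # 9 -> 40 (149 or lower)
```
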
Tracking Changes in Law Schools by Looking at the Distribution of 25th Percentile LSAT Scores Across Six LSAT Categories

For those who want to focus on the 25th percentile of the LSAT profile among law schools, the table below shows changes in the distribution of 25th percentile LSATs among law schools across six LSAT categories between 2010 and 2015.

Number of Law Schools with a 25th Percentile LSAT in Six LSAT Categories – 2010-2015

 

| 25th Percentile LSAT | 2010 | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|---|
| 165+ | 17 | 16 | 11 | 10 | 10 | 7 |
| 160-164 | 26 | 20 | 21 | 17 | 15 | 17 |
| 155-159 | 55 | 54 | 49 | 42 | 41 | 38 |
| 150-154 | 67 | 69 | 59 | 65 | 57 | 59 |
| 145-149 | 26 | 33 | 46 | 48 | 48 | 52 |
| <145 | 4 | 4 | 10 | 14 | 25 | 23 |
| Total | 195 | 196 | 196 | 196 | 196 | 196 |

With respect to changes between 2014 and 2015, this table shows a little more variability, with decreases in three categories – 165+ (down 3), 155-159 (down 3), and <145 (down 2) – and increases in three categories – 160-164 (up 2), 150-154 (up 2), and 145-149 (up 4).

Looking at changes between 2010 and 2015, note that the four top categories have all declined, while the number of law schools with a 25th percentile LSAT of 145-149 has doubled and the number of law schools with a 25th percentile LSAT of <145 has more than quintupled from four in 2010 (two at 144 and two at 143), to 23 in 2015, with 13 of them at 142 and below.

[Chart: Number of Law Schools with a 25th Percentile LSAT in Six LSAT Categories – 2010-2015]

As shown in the chart below, in 2010, the number of law schools with a 25th percentile LSAT of 155 or higher and the number with a 25th percentile LSATs of 154 or lower were nearly identical (98 and 97, respectively). As of 2015, however, there were more than twice as many law schools with a 25th percentile LSAT of 154 or lower when compared with those with a 25th percentile LSAT of 155 or higher (134 and 62, respectively).
[Chart: Number of Law Schools with a 25th Percentile LSAT of 155+ vs. 154 or Lower – 2010-2015]

Moreover, between 2010 and 2015, the number of law schools with a 25th percentile LSAT of 160 or higher has fallen more than 40% from 43 to 24, while the number with a 25th percentile LSAT of 149 or lower has more than doubled from 30 to 75. 

 
[Chart: Number of Law Schools with a 25th Percentile LSAT of 160+ vs. 149 or Lower – 2010-2015]
 

Changes in Average 75th, 50th and 25th Percentile LSATs Across Fully-Accredited ABA Law Schools

One other way of looking at this is just to see how the average first-year LSAT and UGPA profiles have changed over the last six years.

Average LSATs of Matriculants at Fully-Accredited ABA Law Schools

 

| Year | 75th Percentile | 50th Percentile | 25th Percentile |
|---|---|---|---|
| 2010 | 160.5 | 158.1 | 155.2 |
| 2011 | 160.1 | 157.8 | 154.5 |
| 2012 | 159.6 | 157.0 | 153.6 |
| 2013 | 158.7 | 156.0 | 152.6 |
| 2014 | 158.2 | 155.4 | 151.8 |
| 2015 | 157.9 | 155.3 | 151.8 |
| Overall Drop | -2.6 | -2.8 | -3.4 |

(Note that these are not weighted averages based on the number of matriculants at each school, but are simply averages across law schools.)
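
To illustrate the distinction flagged in the note, here is a hedged Python sketch with made-up school profiles; the numbers are hypothetical and only demonstrate how a school-level average and a matriculant-weighted average can diverge:

```python
# Hypothetical illustration: a simple (unweighted) average across schools
# vs. an average weighted by each school's number of matriculants.
schools = [  # (50th percentile LSAT, matriculants) -- invented values
    (162, 300),
    (155, 150),
    (148, 450),
]
simple_avg = sum(lsat for lsat, _ in schools) / len(schools)
weighted_avg = (sum(lsat * n for lsat, n in schools)
                / sum(n for _, n in schools))
print(round(simple_avg, 1), round(weighted_avg, 1))  # 155.0 vs. 153.8
```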

Notably, over this same period of time the average UGPAs have fallen modestly as well, from a 75th/50th/25th profile of 3.63 – 3.41 – 3.14 in 2010 to 3.6 – 3.37 – 3.09 in 2015.

Conclusion

If one focuses on the LSAT scores and UGPAs as measures of “quality” of the entering class of law students each year, then the period from 2010-2015 not only has seen a significant decline in enrollment, it also has seen a significant decline in “quality.”

The LSAC’s most recent Current Volume Report (January 8, 2016) suggests that the pool of applicants to law schools is rebounding slightly in this current cycle. With 22,662 applicants at a point in the cycle at which 40% of applicants had been received last year, one can project an applicant pool of roughly 56,600. The “quality” of applicants also appears to be stronger, with double digit percentage increases in applicants to date in LSAT categories of 165 and higher. If these trends continue in the applicant pool for the current cycle, then the fall 2015 entering class may represent the “bottom” both in terms of the number of matriculants and in terms of the “quality” of the matriculants as measured by LSAT and UGPA. Of course, we won’t know for sure about that until next December when the 2016 Standard 509 Reports are published.
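
The projection is straight-line arithmetic; a minimal sketch, assuming applicant volume accumulates this cycle at the same pace as last cycle:

```python
# Straight-line projection of the applicant pool from mid-cycle volume.
applicants_to_date = 22_662       # current-cycle applicants as of Jan. 8, 2016
share_of_final_last_cycle = 0.40  # last cycle, 40% of applicants were in by now
projected_pool = applicants_to_date / share_of_final_last_cycle
print(round(projected_pool))      # 56,655 -- i.e., "roughly 56,600"
```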

(I am grateful for the helpful comments of Scott Norberg on an earlier draft of this blog.)


Thursday, December 31, 2015

Conditional Scholarship Programs: Comparing 2014-15 with 2011-12

A few years ago, the Council for the Section of Legal Education and Admissions to the Bar approved revisions to Standard 509, requiring that law schools post a chart identifying the number of conditional scholarships given to incoming first years and the number of those scholarship recipients whose scholarships were reduced or eliminated at the end of the first year.

As a result of this requirement, there is now a much greater universe of publicly available information about law school scholarship programs. In the summer of 2013, I posted to SSRN an article entitled Better Understanding the Scope of Conditional Scholarship Programs among American Law Schools, summarizing the first year of available data on conditional scholarship programs, covering the 2011-12 academic year.

Law schools have now published this data for four years, with data covering the 2014-15 academic year having just been released as of December 15.

This blog posting highlights the smaller number of law schools with conditional scholarship programs as of 2014-15, summarizes the extent to which the number and percentage of first-year students with conditional scholarships and the number and percentage of rising second-year students whose scholarships were reduced or eliminated has changed since 2011-12, and looks at how the distribution of retention rates by decile has changed since 2011-12. It also analyzes both the prevalence of conditional scholarship programs among law schools across different rankings categories and the extent to which scholarship retention rates differ among law schools across different rankings categories.

1. Number of Law Schools with Conditional Scholarship Programs Declines

Excluding the three law schools in Puerto Rico, there were 140 fully-accredited ABA law schools with conditional scholarship programs in 2011-12. For the 2014-15 academic year, however, the number of fully-accredited ABA law schools with conditional scholarship programs had dropped by 27, to 113, a decline of nearly 20%.

2. Average Conditional Scholarship Retention Rate Increases Modestly

In 2011-12, the average scholarship retention rate across the 140 law schools with conditional scholarship programs was 69%. In total, 12,681 students who entered law school in the fall of 2011 and continued into their second year of law school at the same school entered with conditional scholarships and 4,332 of those students had their scholarships reduced or eliminated, a retention rate across individual students of roughly 66%.

For the 2014-15 academic year, the average retention rate across the 113 law schools with conditional scholarship programs increased to 73.2%. In total, 10,099 students who entered law school in the fall of 2014 and continued into their second year of law school at the same school entered with conditional scholarships and 2,880 of those students had their scholarships reduced or eliminated. Thus, the retention rate across individual students also increased to roughly 71.5%.

3. Percentage of First-Year Students with Conditional Scholarships Stays the Same While the Percentage of Rising Second-Year Students Whose Scholarships were Reduced or Eliminated Declines Slightly 

Across the 194 law schools on which I compiled data for the 2011-12 academic year, the fall 2011 entering first-year class totaled 46,388. Thus, roughly 27.3% (12,681/46,388) of the entering first-year students in the fall 2011 entering first-year class were on conditional scholarships. Roughly 9.4% (4,332/46,388) of all the first-year students in the fall 2011 entering first-year class failed to retain their conditional scholarship as they moved into the second year of law school.

Interestingly, of the 37,086 first-years who matriculated at fully-accredited ABA law schools in fall 2014, roughly 27.2% (10,099/37,086) were on conditional scholarships, almost the exact same percentage as in 2011-12. But a smaller percentage, roughly 7.8% (2,880/37,086) of all the first-year students who entered law school in fall 2014 failed to retain their conditional scholarship as they moved into the second year of law school.

Therefore, even though fewer law schools had conditional scholarship programs, those with such programs offered conditional scholarships to a larger percentage of students, such that the overall percentage of students with conditional scholarships remained roughly the same (27.3% in 2011-12 compared with 27.2% in 2014-15). Nonetheless, because there was a modest increase in retention rates, a smaller percentage of the overall population of students (7.8% in 2015 compared with 9.4% in 2012) saw their conditional scholarships reduced or eliminated.
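
For those who want to verify these figures, here is a short Python sketch computing the student-level rates from the reported totals:

```python
# Computing the student-level rates from the reported totals above.
entering_1ls    = {"2011-12": 46_388, "2014-15": 37_086}
on_conditional  = {"2011-12": 12_681, "2014-15": 10_099}
reduced_or_lost = {"2011-12": 4_332,  "2014-15": 2_880}

for year in entering_1ls:
    share_conditional = on_conditional[year] / entering_1ls[year]
    retention = 1 - reduced_or_lost[year] / on_conditional[year]
    share_lost = reduced_or_lost[year] / entering_1ls[year]
    print(f"{year}: {share_conditional:.1%} on conditional scholarships, "
          f"{retention:.1%} retained, {share_lost:.1%} of all 1Ls lost aid")
# e.g. 2014-15: 27.2% on conditional scholarships, 71.5% retained
```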

4. Distribution of Retention Rates by Decile Shows Fewer Schools with Lower Retention Rates

The distribution of retention rates by decile across all 140 law schools reporting conditional scholarship programs for 2011-12 and all 113 law schools reporting conditional scholarship programs for 2014-15 is set forth in Table 1. The biggest change reflected in Table 1 is the decrease in the number of law schools with retention rates of less than 70%, falling from 73 in 2011-12 to 40 in 2014-15.

Table 1: Number of Law Schools Reporting Retention Rates by Decile Range

| Retention Rate | # in 2012 | # in 2015 | 2012 Description | 2015 Description |
|---|---|---|---|---|
| Less than 40% | 8 | 6 | Four of the eight were ranked alphabetically | Five of the six were ranked 100 or lower |
| 40-49% | 12 | 4 | Six of the 12 were ranked between 50 and 99 | Two of the four were ranked 51-99 |
| 50-59% | 23 | 17 | 18 of the 23 were ranked 99 or lower | 14 of the 17 were ranked 100 or lower |
| 60-69% | 30 | 13 | 23 of the 30 were ranked 100 or lower | Nine of the 13 were ranked 100 or lower |
| 70-79% | 23 | 23 | 13 of the 23 were ranked in the top 100 | 17 of the 23 were ranked 100 or lower |
| 80-89% | 19 | 24 | 14 of the 19 were ranked between 51 and 145 | 10 of the 24 were ranked 51-99 |
| 90% or better | 25 | 26 | 12 of the 25 were ranked in the top 50 | 14 of the 26 were ranked in the top 100, five in the top 50 |
| TOTAL | 140 | 113 | | |

5. Differences in Conditional Scholarship Programs across Law School Rankings Categories

As noted in some of the descriptors in Table 1, there also were some differences based on the rankings of the law schools in U.S. News, using the March 2012 rankings and the March 2015 rankings. These differences are summarized in Tables 2A and 2B and the discussion that follows.

Table 2A – Changes in Number and Percentage of First-Year Students with Conditional Scholarships across Different Categories of Law Schools Based on U.S. News Rankings for 2012 and 2015

(Each cell shows the 2012 value / the 2015 value.)

| Rank | # of Schools | # with Cond. Scholarships | % with Cond. Scholarships | # of 1Ls | # of 1Ls with Cond. Scholarships | % of 1Ls with Cond. Scholarships |
|---|---|---|---|---|---|---|
| Top 50 | 50 / 51 | 20 / 10 | 40 / 20 | 13,109 / 11,715 | 1,656 / 814 | 12.8 / 6.9 |
| 51-100 | 50 / 50 | 40 / 31 | 80 / 62 | 11,592 / 8,972 | 4,179 / 3,159 | 36 / 35.2 |
| 101-150 | 46 / 52 | 36 / 38 | 78 / 73 | 9,293 / 7,899 | 2,803 / 2,997 | 30.1 / 37.9 |
| Alphabetical | 48 / 42 | 44 / 34 | 92 / 81 | 12,394 / 8,500 | 4,043 / 3,129 | 32.6 / 36.8 |
| TOTAL | 194 / 195 | 140 / 113 | 72 / 58 | 46,388 / 37,086 | 12,681 / 10,099 | 27.3 / 27.2 |

Table 2A shows that the percentage of fully-accredited ABA law schools with conditional scholarship programs increases as you move down the U.S. News Rankings. Indeed, of the 27 law schools that moved away from conditional scholarship programs between 2011-12 and 2014-15, 19 were ranked in the top 100. As a result, only 41 of the top 100 law schools had conditional scholarship programs as of 2014-15, while 72 law schools ranked 101 or lower had conditional scholarship programs.

Table 2A also shows that the percentage of students with conditional scholarships is basically divided into two camps. Within the top 50 law schools, only 10 law schools have conditional scholarship programs, and five of those have retention rates of 100%, such that really only five of the 51 law schools in the top 50 have meaningful conditional scholarship programs. (At these five schools, however, roughly one in three students with conditional scholarships saw their scholarships reduced or eliminated.) Across all 51 law schools in the top 50, only 6.9% of first-year students have conditional scholarships. Throughout the rest of the law schools, however, roughly 35% to 38% of first-year law students in each rankings category have conditional scholarships.

Even though the percentage of first-year students with conditional scholarships declined among top 50 law schools between 2011-12 and 2014-15, the percentage increased among law schools ranked 101 and below, such that the overall percentage of first-year students with conditional scholarship has remained almost the same between 2011-12 and 2014-15, at slightly more than 27%.

Table 2B – Changes in Retention Rates of Conditional Scholarship Recipients across Different Categories of Law Schools Based on U.S. News and World Report Rankings for 2012 and 2015

(Each cell shows the 2012 value / the 2015 value.)

| Rank | # Scholarships Not Retained | % of Scholarships Not Retained | % of All 1Ls Who Did Not Retain Scholarship |
|---|---|---|---|
| Top 50 | 186 / 167 | 11.2 / 20.5 | 1.4 / 1.4 |
| 51-100 | 1,452 / 672 | 34.7 / 21.3 | 12.5 / 7.5 |
| 101-150 | 1,069 / 917 | 38.1 / 30.6 | 11.5 / 11.6 |
| Alphabetical | 1,625 / 1,124 | 40.2 / 35.9 | 13.1 / 13.2 |
| TOTAL | 4,332 / 2,880 | 34.2 / 28.5 | 9.4 / 7.8 |

Table 2B shows that the percentage of all students whose conditional scholarships were reduced or eliminated in 2014-15 consistently climbs as one moves down the rankings categories – going from 1.4% among law schools ranked in the top 50, to 7.5% for law schools ranked 51-100, to 11.6% for law schools ranked 101-150, to 13.2% for law schools ranked alphabetically.

Table 2B shows that among law schools ranked in the top 50, the average percentage of conditional scholarship recipients whose scholarships were reduced or eliminated increased between 2011-12 and 2014-15 from 11.2% to 20.5%. (But remember, this is across a relatively small sample of 10 schools and 814 students.) By contrast, across each of the other three categories of law schools, the average percentage of conditional scholarship recipients whose scholarships were reduced or eliminated declined between 2011-12 and 2014-15, from 34.7% to 21.3% among schools ranked 51-100, from 38.1% to 30.6% among schools ranked 101-150, and from 40.2% to 35.9% among law schools ranked alphabetically.

Nonetheless, law schools ranked 51-100 were the only category of law schools that saw a decrease in the percentage of all rising second-year students whose scholarships were reduced or eliminated, as the percentage fell from 12.5% to 7.5%. For top 50 law schools, the combination of fewer students with conditional scholarships but a higher rate at which scholarships were reduced or eliminated meant that 1.4% of all students saw their scholarships reduced or eliminated, the same percentage as in 2011-12. For law schools ranked 101 and below, the combination of more students with conditional scholarships but a lower rate at which scholarships were reduced or eliminated meant that roughly the same percentage of all students saw their scholarships reduced or eliminated in 2014-15 as in 2011-12 (for law schools ranked 101-150, 11.6% in 2014-15 compared with 11.5% in 2011-12; for law schools ranked alphabetically, 13.2% in 2014-15 compared with 13.1% in 2011-12). Because of the decrease in the percentage of students whose scholarships were reduced or eliminated in the category of law schools ranked 51-100, however, the percentage of all students who saw their scholarships reduced or eliminated fell from 9.4% in 2011-12 to 7.8% in 2014-15.

Conclusion

Even though 27 fewer law schools had conditional scholarship programs in 2014-15 than in 2011-12, the percentage of all first-year students on conditional scholarships in 2014-15 was nearly the same as in 2011-12 because the 113 schools with conditional scholarship programs, on average, gave conditional scholarships to a larger percentage of their students.

Only 20 percent of law schools in the top 50 have conditional scholarship programs and only 10 percent actually reduced or eliminated scholarships for some of their students. Outside the top 50 law schools, however, more than two-thirds of law schools have conditional scholarship programs and roughly 36.5% of all law students have conditional scholarships. This means more than one-third of first-year students in law schools ranked 51 and below in the U.S. News 2015 rankings needed to be concerned about whether they would perform well enough to retain their conditional scholarships.


Tuesday, December 29, 2015

Updating the Transfer Market Analysis for 2015

This blog posting updates my blog postings here and here of December 2014 regarding what we know about the transfer market. With the release of the 2015 Standard 509 Reports, we now have two years of more detailed transfer data from which to glean insights about the transfer market among law schools.

NUMBERS AND PERCENTAGES OF TRANSFERS – 2011-2015

The number of transfers dropped to 1,979 in 2015, down from 2,187 in 2014 and 2,501 in 2013. The percentage of the previous fall’s entering class that engaged in the transfer market also dropped slightly to 5.2%, down from 5.5% in 2014 and 5.6% in 2013. In other words, there is no reason to believe the transfer market is “growing” as a general matter. It has been fairly consistently in the 4.6% to 5.6% range for the last five years, with an average of 5.2%.

 

| | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|
| Number of Transfers | 2,427 | 2,438 | 2,501 | 2,187 | 1,979 |
| Previous Year First-Year Enrollment | 52,500 | 48,700 | 44,500 | 39,700 | 37,900 |
| % of Previous First-Year Total | 4.6% | 5% | 5.6% | 5.5% | 5.2% |
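
A minimal sketch reproducing the percentage row from the reported totals:

```python
# Transfers as a share of the previous fall's entering class.
transfers = {2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187, 2015: 1979}
prior_1ls = {2011: 52_500, 2012: 48_700, 2013: 44_500, 2014: 39_700, 2015: 37_900}

shares = {year: transfers[year] / prior_1ls[year] for year in transfers}
for year, share in shares.items():
    print(year, f"{share:.1%}")                               # 4.6% ... 5.2%
print(f"average: {sum(shares.values()) / len(shares):.1%}")   # ~5.2%
```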

SOME LAW SCHOOLS CONTINUE TO DOMINATE THE TRANSFER MARKET

The following two charts list the top 15 law schools in terms of receiving transfer students in descending order in Summer 2013 (fall 2012 entering class), Summer 2014 (fall 2013 entering class), and Summer 2015 (fall 2014 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first-year class.

Note that in the original versions of these two charts, the “repeat players” are bolded – schools in the top 15 for all three years in black, schools in the top 15 for two of the three years in blue. Seven of the top ten by number in 2015 and seven of the top ten by percentage in 2015 have been among the top 15 on each list for all three years.

Largest Law Schools by Number of Transfers from 2013-2015

| School (2013) | # (2013) | School (2014) | # (2014) | School (2015) | # (2015) |
|---|---|---|---|---|---|
| Georgetown | 122 | Georgetown | 113 | Georgetown | 110 |
| George Wash. | 93 | George Wash. | 97 | George Wash. | 109 |
| Florida St. | 90 | Arizona St. | 66 | Arizona St. | 65 |
| Emory | 75 | Idaho | 57 | Harvard | 55 |
| Arizona St. | 73 | Cal. Berkeley | 55 | Emory | 51 |
| American | 68 | NYU | 53 | NYU | 51 |
| Texas | 59 | Emory | 50 | Cal. Berkeley | 49 |
| Columbia | 52 | Columbia | 46 | Rutgers | 45 |
| NYU | 47 | American | 44 | Columbia | 44 |
| Minnesota | 45 | UCLA | 44 | Miami | 44 |
| Arizona | 44 | Wash. Univ. | 44 | UCLA | 43 |
| Northwestern | 44 | Texas | 43 | Texas | 37 |
| UCLA | 41 | Minnesota | 37 | American | 33 |
| Cardozo | 38 | Northwestern | 35 | Florida St. | 32 |
| Southern Cal. | 37 | Harvard | 33 | Minnesota | 31 |
| TOTAL | 928 | | 817 | | 799 |
| % of All Transfers | 37.1% | | 37.4% | | 40.4% |

Largest Law Schools by Transfers as a Percentage of Previous First Year Class - 2013-2015

| School (2013) | % (2013) | School (2014) | % (2014) | School (2015) | % (2015) |
|---|---|---|---|---|---|
| Florida State | 48.1 | Arizona State | 51.6 | Arizona State | 45.5 |
| Arizona State | 48 | Idaho | 51.4 | Emory | 22.9 |
| Utah | 34.7 | Washington Univ. | 23.3 | George Wash. | 20.2 |
| Emory | 29.6 | Emory | 22.9 | Miami | 19.2 |
| Arizona | 28.9 | Georgetown | 20.8 | Georgetown | 19 |
| Minnesota | 22 | George Wash. | 20.2 | Cal. Berkeley | 17.9 |
| George Wash. | 21.8 | Cal. Berkeley | 19.4 | Florida St. | 17 |
| Georgetown | 21.2 | Florida St. | 18.2 | Florida Int’l | 16.7 |
| Rutgers – Camden | 20.7 | Rutgers – Camden | 17.1 | Minnesota | 16.1 |
| Southern Cal. | 19.7 | Southern Cal. | 17.1 | Utah | 16 |
| Texas | 19.1 | Minnesota | 16.7 | UNLV | 14.3 |
| Cincinnati | 17.5 | Utah | 15.9 | UCLA | 13.7 |
| Northwestern | 17.1 | Northwestern | 15.3 | Texas | 12.3 |
| Washington Univ. | 15.4 | UCLA | 15 | Chicago | 12.1 |
| Univ. Washington | 15.3 | Seton Hall | 14.5 | Rutgers | 12.1 |

Interestingly, the number of law schools welcoming transfers representing more than 20% of their first-year class has fallen from nine in 2013 to only three in 2015.

Nonetheless, as shown in the following chart, we are continuing to see a modest increase in concentration in the transfer market between 2011 and 2015 as the ten law schools with the most students transferring in captured an increasing share of the transfer market, from 23.5% in 2011 to 31.5% in 2015.  Nearly one-third of all transfers in 2015 transferred to one of the ten schools with the most transfers.

Top Ten Law Schools as a Percentage of All Transfers

 

| | 2011 | 2012 | 2013 | 2014 | 2015 |
|---|---|---|---|---|---|
| Total Transfers | 2,427 | 2,438 | 2,501 | 2,187 | 1,979 |
| Transfers to the 10 Law Schools with the Most Transfers | 570 | 587 | 724 | 625 | 623 |
| Top 10 as % of Total Transfers | 23.5% | 24.1% | 28.9% | 28.6% | 31.5% |
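
The concentration measure is a simple ratio; a short sketch from the table above:

```python
# Top-10 share of the transfer market, per the table above.
total_transfers = {2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187, 2015: 1979}
top10_transfers = {2011: 570, 2012: 587, 2013: 724, 2014: 625, 2015: 623}

for year in total_transfers:
    share = top10_transfers[year] / total_transfers[year]
    print(year, f"{share:.1%}")  # 23.5%, 24.1%, 28.9%, 28.6%, 31.5%
```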

NATIONAL AND REGIONAL MARKETS

Starting in 2014, the ABA Section of Legal Education and Admissions to the Bar began requiring schools with more than twelve incoming transfers to report not only the number of students who transferred in, but also the law schools from which they came (with the number from each) along with the 75th, 50th, and 25th percentile first-year law school GPAs of the students who transferred in. This allows us to look at where students are coming from and going to, and to look at the first-year GPA profile of students transferring in to different law schools. The following chart focuses on the top 15 law schools in terms of transfers in for 2015, presented in descending USNews ranking. It indicates the extent to which these law schools were attracting transfers from their geographic region and also identifies the law school that provided the largest number of transfers to each listed law school, as well as the percentage of transfers that came from that school.

Percentage of Transfers from Within Geographic Region and Top Feeder School(s)

(In the # and % columns, values are shown as 2014/2015.)

| School | # of Transfers (14/15) | Region | Regional # of Transfers (14/15) | Regional % of Transfers (14/15) | Largest Feeder School in 2015 | Feeder #/% of Transfers |
|---|---|---|---|---|---|---|
| Harvard | 33/55 | NE | 6/15 | 18/27 | GWU | 6/11% |
| Columbia | 46/44 | NE | 19/19 | 41/43 | Wash. Univ. | 5/11% |
| NYU | 50/51 | NE | 20/15 | 40/29 | Georgetown | 7/14% |
| Berkeley | 55/49 | CA | 43/29 | 78/59 | Hastings | 19/39% |
| Georgetown | 113/110 | Mid-Atl | 49/43 | 43/39 | GWU | 11/10% |
| Texas | 43/37 | TX | 27/22 | 63/59 | Texas Tech | 6/16% |
| UCLA | 44/43 | CA | 31/26 | 70/60 | Hastings | 9/21% |
| Emory | 53/51 | SE | 40/31 | 75/61 | Atlanta’s John Marshall | 11/22% |
| Minnesota | 37/31 | MW | 21/17 | 57/55 | Hamline | 9/29% |
| GWU | 97/109 | Mid-Atl | 78/70 | 80/64 | American | 44/40% |
| Arizona St. | 66/65 | SW | 51/48 | 77/74 | Arizona Summit | 47/72% |
| Florida St. | 31/32 | SE | 24/27 | 77/84 | Florida Coastal | 11/34% |
| Miami | 29/44 | SE | 21/27 | 72/61 | St. Thomas | 12/27% |
| American | 44/33 | Mid-Atl | 14/6 | 32/18 | Charleston | 3/9% |
| Rutgers* | 45 | NE | 29 | 64 | Widener-Delaware | 10/22% |

*Rutgers is a unified school as of 2015, but for 2014 reported data separately for the Newark campus and the Camden campus, so this only reports the 2015 data.

For these top 15 law schools for transfer students in 2015, 10 schools obtained most of their transfers (55% or more) from within the geographic region within which the law school is located, while five schools (Harvard, Columbia, NYU, Georgetown and American) had fewer than 45% of their transfers from within the region in which they are located.  Interestingly, 11 of the 14 law schools with data for both 2014 and 2015 saw a decline in the percentage of transfers from within the region in which the law school is located. Only two law schools in 2015 had more than 70% of their transfers from within the region in which the law school is located (Arizona State and Florida State), down from seven such law schools in 2014.

Moreover, several law schools had a significant percentage of their transfers from one particular feeder school. For Miami, roughly 27% of its transfers came from St. Thomas University (Florida); for Berkeley, roughly 39% of its transfers came from Hastings; for George Washington, 40% of its transfers came from American; and for Arizona State, 72% of its transfers came from Arizona Summit.

The chart below shows the tiers of law schools from which the largest 15 law schools in the transfer market received their transfer students.  Ten of the top 15 law schools for transfers are ranked in the top 20 in USNews, but of those 10, only six had 75% or more of their transfers from schools ranked between 1 and 99 in the USNews rankings – Harvard, Columbia, NYU, Berkeley, UCLA and George Washington.  Two additional schools, Georgetown and Texas, had at least 50% of their transfers from schools ranked between 1 and 99.  The remaining two law schools ranked in the top 20 in USNews (Emory and Minnesota) and the other five law schools in the list had at least half of their transfer students from law schools ranked 100 or lower, with five of those law schools having 75% or more of their transfers from law schools ranked 100 or lower. 

In addition, it shows that as you move down the rankings of law schools that are large players in the transfer market, the general trend in first-year law school GPA shows a significant decline, with several schools taking a number of transfers with first-year GPAs below a 3.0, including Minnesota, Arizona State, Florida State, Miami and American.

Percentage of Transfers from Different Tiers of School(s) for 2014 and 2015, Along With First-Year Law School GPA (75th/50th/25th)

(In each column, the number on the left is the 2014 number and the number on the right is the 2015 number.)

(In the original chart, highlighted cells indicated the modal tier for each law school, and color-coding flagged GPA increases (green) or decreases (red) of .05 or more points.)

 

| School | # of Transfers | Top 50 (#) | Top 50 (%) | 51-99 (#) | 51-99 (%) | 100+ (#) | 100+ (%) | GPA 75th | GPA 50th | GPA 25th |
|---|---|---|---|---|---|---|---|---|---|---|
| Harvard | 33/55 | 23/41 | 70/75 | 10/13 | 30/24 | 0/1 | 0/2 | 3.95/3.98 | 3.90/3.92 | 3.83/3.85 |
| Columbia | 46/44 | 29/30 | 63/68 | 14/10 | 30/23 | 3/4 | 7/9 | 3.81/3.82 | 3.75/3.76 | 3.69/3.66 |
| NYU | 50/51 | 41/40 | 82/78 | 7/10 | 14/20 | 2/1 | 4/2 | 3.74/3.76 | 3.62/3.68 | 3.47/3.52 |
| Berkeley | 55/49 | 17/15 | 31/31 | 27/26 | 49/53 | 11/8 | 20/16 | 3.90/3.87 | 3.75/3.81 | 3.68/3.69 |
| Georgetown | 113/110 | 27/30 | 24/27 | 38/30 | 34/27 | 48/50 | 42/45 | 3.77/3.77 | 3.67/3.66 | 3.55/3.59 |
| Texas | 43/37 | 17/10 | 40/27 | 13/13 | 30/35 | 13/14 | 30/38 | 3.62/3.60 | 3.45/3.46 | 3.11/3.32 |
| UCLA | 44/43 | 15/15 | 34/35 | 23/23 | 52/53 | 6/5 | 14/12 | 3.73/3.70 | 3.58/3.58 | 3.44/3.46 |
| Emory | 53/51 | 3/5 | 6/10 | 7/8 | 13/16 | 43/38 | 81/75 | 3.42/3.45 | 3.27/3.30 | 2.93/3.06 |
| GWU | 97/109 | 13/21 | 13/19 | 73/63 | 75/58 | 11/25 | 11/23 | 3.53/3.46 | 3.35/3.32 | 3.21/3.15 |
| Minnesota | 37/31 | 4/6 | 11/19 | 12/7 | 32/23 | 21/18 | 57/58 | 3.30/3.43 | 3.10/3.12 | 2.64/2.96 |
| Arizona St. | 66/65 | 4/0 | 6/0 | 5/6 | 8/9 | 57/59 | 86/91 | 3.51/3.50 | 3.23/3.17 | 2.97/2.95 |
| Florida St. | 31/32 | 2/0 | 6/0 | 4/2 | 13/6 | 25/30 | 81/94 | 3.29/3.32 | 3.10/3.14 | 2.90/2.96 |
| Miami | 29/44 | 1/3 | 3/7 | 4/7 | 14/16 | 24/34 | 83/77 | 3.30/3.26 | 3.07/3.05 | 2.87/2.90 |
| American | 44/33 | 2/0 | 5/0 | 14/1 | 32/3 | 28/32 | 64/97 | 3.25/3.04 | 2.94/2.89 | 2.78/2.74 |
| Rutgers | 45 | 0 | 0 | 2 | 4 | 43 | 96 | –/3.29 | –/3.05 | –/2.75 |

STILL MANY UNKNOWNS

As I noted last year, this more detailed transfer data should be very helpful to prospective law students and pre-law advisors, and to current law students who are considering transferring.  This data gives them a better idea of what transfer opportunities might be available depending upon where they go to law school (or are presently enrolled as a first-year student).

Even with this more granular data now available, however, there still are a significant number of unknowns relating to transfer students.  In an upcoming post, I will touch on some questions that remain unanswered about the transfer market as well as a few other aspects of the transfer experience.


Sunday, December 6, 2015

The Opaqueness of Bar Passage Data and the Need for Greater Transparency

There has been a great deal of discussion lately over at The Faculty Lounge regarding declines in law school admissions standards, declines in bar passage rates, and the general relationship between LSAT scores and bar passage. Much of this discussion is clouded by the lack of meaningful data regarding bar passage results.  In this blog posting I will delineate several questions that just cannot be answered meaningfully based on the presently available bar passage data.

The national first-time bar passage rate among graduates of ABA-accredited law schools fell significantly in 2014. According to the NCBE’s statistics, the average pass rate from 2007-2013 for July first-time test-takers from ABA-accredited law schools was 83.6%, but fell to 78% in 2014. (2015 data won’t be available until next Spring when it is released by the NCBE.)

While there might be some reasons to believe these results were somewhat aberrational, given that the objective criteria of the entering class in 2011 were only modestly less robust than those of the entering class in 2010, and given the ExamSoft debacle with the July 2014 bar exam, the results are concerning because the objective criteria of the entering classes in 2012, 2013 and 2014 showed continued erosion. As the last two years have seen declines in the median MBE scaled score among those taking the July bar exam, the changes in entering class credentials over time suggest further declines in median MBE scaled scores (and bar passage rates) may be on the horizon.

In 2010, there were roughly 1,800 matriculants nationwide with LSATs of 144 or less. In 2012, there were roughly 2,600 matriculants nationwide with LSATs of 144 or less. In 2014, there were roughly 3,200 matriculants nationwide with LSATs of 144 or less. Recognizing that law school grades will be a better predictor of bar passage than LSAT scores, I think it is safe to say that entering law students with LSATs in this range are more likely than entering law students with higher LSATs to struggle on the bar exam.  Because the number of those entering law school with LSAT scores of 144 or less has grown substantially (particularly as a percentage of the entering class, more than doubling from less than 4% in 2010 to more than 8% in 2014), many are concerned that bar passage rates will continue to decline in the coming years.

While there has been a great deal of discussion regarding declines in admission standards and corresponding declines in bar passage standards, this discussion is profoundly limited because the lack of meaningful bar passage data presently provided by state boards of law examiners and by the ABA and ABA-accredited law schools means that we do not have answers to several important questions that would inform this discussion.

1. What number/percentage of graduates from each law school (and collectively across law schools) sits for the bar exam in July following graduation and in the following February? Phrased differently, what number/percentage of graduates do not take a bar exam in the year following graduation?

This is a profoundly important set of questions as we look at employment outcomes and the number/percentage of graduates employed in full-time, long-term bar passage required positions. Given that only those who pass the bar exam can be in full-time, long-term bar passage required positions, it would be helpful to know the number/percentage of graduates who “sought” eligibility for such positions by taking a bar exam and the number/percentage of graduates who did not seek such eligibility. It also would be helpful to understand whether there are significant variations across law schools in terms of the number of graduates who take a bar exam (or do not take a bar exam) and whether those who do not take a bar exam are distributed throughout the graduating class at a given law school or are concentrated among those at the bottom of the graduating class. At present, however, this information simply is not available.

2. What is the first-time bar passage rate for graduates from ABA-accredited law schools?

One might think this would be known as ABA-accredited law schools are required to report first-time bar passage results. But the way in which first-time bar passage results are reported makes the data relatively unhelpful. Law schools are not required to report first-time bar passage for all graduates or even for all graduates who took a bar exam. Rather, law schools are only required to report first-time bar passage results for at least 70% of the total number of graduates each year. This means we do not know anything about first-time bar passage results for up to 30% of graduates of a given law school. Across all law schools, reported results account for roughly 84% of graduates, leaving a not insignificant margin of error with respect to estimating bar passage rates.
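
To make the margin-of-error point concrete, here is an illustrative Python sketch; the 82% pass rate among reported takers is a hypothetical input, and the sketch conservatively treats all unreported graduates as takers whose outcomes could range from all failing to all passing:

```python
# Illustrative bounds on the true overall pass rate when results are
# reported for only part of the graduating class.
reported_coverage = 0.84    # share of graduates with reported results
reported_pass_rate = 0.82   # hypothetical pass rate among those reported

# The unreported remainder could have passed at anywhere from 0% to 100%.
lower_bound = reported_coverage * reported_pass_rate
upper_bound = lower_bound + (1 - reported_coverage)
print(f"true overall rate between {lower_bound:.1%} and {upper_bound:.1%}")
# -> between 68.9% and 84.9% in this hypothetical
```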

People would have been flabbergasted if the ABA had required reporting of employment outcomes for only 70% of graduates. Now that the ABA is requiring reporting on employment outcomes for all graduates, there is no good reason why the ABA should not be requiring bar passage accounting for all graduates, requiring law schools to note those who didn't take a bar exam, those who took and passed a bar exam, those who took and failed a bar exam, and those for whom bar status is unknown.  (Up until recently, some boards of law examiners were not reporting results to law schools, but my understanding is that the number of state boards of law examiners not reporting results to law schools is now fairly small.)

Notably, for 2011, 2012, and 2013, the average bar passage rate for first-time takers from all ABA-accredited law schools based on data reported by the law schools was consistently higher than the data reported by the NCBE for the corresponding years (2011 – 83.8% v. 82%; 2012 – 81.8% v. 79%; 2013 – 82.4% v. 81%). Moreover, first-time takers are not measured equivalently by the ABA and by the NCBE. The ABA reporting requirement focuses on graduates who took any bar exam for the first time. The NCBE defines as first-time takers any person taking a bar exam in a given jurisdiction for the first time. Thus, the NCBE set of first-time takers is broader, as it includes some people taking a bar exam for the second time (having taken the bar exam in another jurisdiction previously).

3. What is the “ultimate” bar passage rate for graduates from ABA-accredited law schools?

Even though a number of commenters have noted that “ultimate” bar passage is more important than first-time bar passage, there is no publicly available data indicating the ultimate bar passage rate on a law school by law school basis for the graduates of each ABA-accredited law school. What number/percentage of graduates of a given law school who take a bar exam pass after the second attempt? What number/percentage of graduates of a given law school who take a bar exam pass after the third attempt? What number/percentage of graduates of a given law school never pass a bar exam? This information just is not publicly available at present.

While Standard 316, the bar passage accreditation standard, allows schools to meet the standard by demonstrating that 75% or more of those graduates who sat for a bar exam in the five most recent calendar years passed a bar exam, this “ultimate” bar passage data is not publicly disseminated. Thus, while first-time bar passage data is limited and incomplete for the reasons noted above, “ultimate” bar passage data on a law school by law school basis is actually not available.

The modest amount of information available on “ultimate” bar passage rates is not very helpful.  The LSAC National Longitudinal Bar Passage Study contains some analysis of "ultimate" bar passage rates, but it focused on the entering class in the fall of 1991, which it described as being “among the most academically able ever to enter” law school based on entering class statistics (page 14), a description that could not be used with the classes that have entered in the last year or two or three. It also does not contain any information about "ultimate" bar passage for graduates of individual law schools.  In addition, Law School Transparency has recently received some information from at least one law school that has requested anonymity. Much better “ultimate” bar passage information is needed to better inform many of the discussions about the relationship between entering class credentials and bar passage.

4. How can we compare bar passage results from one jurisdiction to another?

Most state boards of law examiners do not present data regarding bar passage that allows reasonable bases for analyzing the results in ways that provide meaningful insight and a meaningful basis for comparison. Fewer than one-third of states publicly provide information in which a delineation is made between first-time takers and repeat takers on a law school by law school basis and only a few of these provide information about MBE scores on a school by school basis. Accordingly, it is very difficult to make meaningful comparisons of year-over-year results in the months following the July bar exam, because data is rarely reported in a consistent manner. The NCBE does provide statistics annually (in the spring) which includes a delineation of bar passage rates by state based on first-time test takers from ABA-accredited schools, but the NCBE does not provide MBE scores on a state by state basis (although it seemingly should be able to do this).

Conclusion

There is a need for much greater transparency in bar passage data from boards of law examiners and from the ABA and ABA-accredited law schools. It well may be that some law schools would be a more meaningful investment for "at-risk" students, those whose entering credentials might suggest they are at risk of failing the bar exam, because those law schools have done a better job of helping "at risk" students learn the law so that they are capable of passing the bar exam at higher rates than graduates of other law schools with comparable numbers of at risk students. It may well be that some jurisdictions provide "at risk" students a greater likelihood of passing the bar exam.  At the moment, however, that information just isn’t available. Much of the disagreement among various commentators about the relationships between admission standards and bar passage rates could be resolved with greater transparency – with the availability of much better data regarding bar passage results.


Monday, November 9, 2015

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

This is Part IV of a blog series that focuses on alumni surveys based on data for Northeastern Law alumni who graduated between 1971 and 2012 (n = 833, 21% response rate).  Prior posts covered data related to the pre-law (Part II) and law school (Part III) experience.  This final installment summarizes data on the careers of Northeastern alumni. 

Varied Careers

One of the most significant post-law school findings from the Northeastern alumni survey is the sheer breadth of careers.  Sure, we all know in a general sense that lawyers have very diverse careers, yet I found the sheer magnitude of that diversity both striking and surprising.

Below is a graphic that summarizes the percentage of Northeastern alumni who have worked in a particular practice setting, by decade of graduation.

% Alumni/ae who have worked in Practice Setting, by Decade of Graduation


To interpret this graphic, it is important to understand the composition of the underlying data. The survey question asks, “Describe your previous employment history starting with your most recent employer first.” Some graduates have only one job to report – the one they started after graduation; others have had many. These jobs are then classified by practice setting and binned into the six categories shown in the graphic. Note that the bars total well beyond 100%. Why? Because alumni are changing not just jobs, but also practice settings – on average, at least once, but sometimes two, three, or even four times over the course of several decades.
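
A minimal sketch of this binning logic, with hypothetical job histories (the setting labels and alumni IDs are invented for illustration):

```python
# Sketch of the binning: each alum counts once per practice setting ever
# worked in, so the setting percentages can sum well past 100%.
from collections import defaultdict

histories = {  # alum id -> practice settings held (hypothetical)
    "a1": ["large firm", "government"],
    "a2": ["government"],
    "a3": ["public interest", "government", "private industry"],
}

ever_worked = defaultdict(set)
for alum, settings in histories.items():
    for setting in set(settings):   # dedupe within one alum's history
        ever_worked[setting].add(alum)

n_alumni = len(histories)
for setting, alums in sorted(ever_worked.items()):
    print(setting, f"{len(alums) / n_alumni:.0%}")  # shares total 200% here
```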

The graphic above conveys several significant pieces of information:

General point.  Legal careers are extremely varied.  As it has tightened up, the entry-level market has become an area of intense scrutiny, and rightly so, because it affects early-career lawyers and law school applicant volume.  In contrast, the chart above reflects the longer view. It suggests that very able, motivated people who attend law school go on to varied careers that no one could have predicted at the time of enrollment, including--most significantly--the entering student.  These generational cohorts are a versatile group that comprises a disproportionate number of leaders in industry, government, and the nonprofit world.  Law schools cannot take full credit for this; we admit people of enormous potential.  Yet many alumni tell me that their legal training and knowledge have given them an enormous leg up. One law grad who is now a successful business executive recently asked me, "Why is it JD-advantaged? Why not the advantage of the JD?"

Northeastern.  It is somewhat surprising that for Northeastern alumni who graduated during the 1970s, 80s, and 90s, 48% have worked in government.  That is a big number.  Northeastern’s mission and faculty emphasize public service. This same emphasis appears to be reflected in the careers of its graduates.

Changing Legal Ecosystem.  As noted in Posts II and III, because the Northeastern alumni survey spans multiple decades, it is possible that responses will be influenced by changes in the underlying legal economy. Stated simply, career opportunities and competition may have changed substantially between 1971 and 2012.  Such a pattern appears to be present here.  Specifically, 30% or more of graduates of the 1990s and 2000s have worked in private industry compared to 24% or less for those graduating in the 1970s and 80s.  This would be consistent with the incomplete absorption theory discussed in Part III.  See also Henderson, “Is the Legal Profession Showing its Age,” LWB, Oct 12, 2015.

Practicing versus Non-Practicing Lawyers

Another significant finding that flows from the Northeastern alumni survey is the difference in workplace experiences between practicing and non-practicing lawyers.

Approximately 25% of respondents were working but not practicing law, with no significant difference by decade cohort. The chart below compares these two groups on 19 dimensions of workplace satisfaction. The question is drawn directly from the AJD Wave III: “How satisfied are you with the following aspects of your current position?”

Dimensions of Workplace Satisfaction, Practicing vs. Non-Practicing Lawyer


Choices ranged from 1 (highly dissatisfied) to 7 (highly satisfied).  The chart above summarizes the differential between the two groups.  For example, on Intellectual Challenge, we subtracted the non-practicing attorney average from the practicing attorney average.  The result is a +.35 difference for practicing attorneys, meaning that they are more likely to find intellectual challenge in their work.  Likewise, the same result holds for the substance of one's work.

In contrast, on workplace diversity, non-practicing lawyers were significantly more satisfied – on average, roughly 2/3 of a response point.  In fact, non-practicing lawyers were more likely to rate their workplaces higher on several surprising factors, including social value of work, performance reviews, work/life balance, and pro bono opportunities.
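
A small sketch of the differential calculation, with hypothetical ratings on the 1-7 scale (the dimension names follow the survey; the numbers are invented):

```python
# Satisfaction differential: mean practicing rating minus mean
# non-practicing rating, per dimension (1-7 scale, invented numbers).
practicing = {
    "intellectual challenge": [6, 7, 5],
    "workplace diversity":    [4, 5, 4],
}
non_practicing = {
    "intellectual challenge": [5, 6, 5],
    "workplace diversity":    [6, 5, 5],
}

def mean(values):
    return sum(values) / len(values)

for dim in practicing:
    diff = mean(practicing[dim]) - mean(non_practicing[dim])
    print(dim, f"{diff:+.2f}")  # positive values favor practicing lawyers
```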

Can we generalize from these findings?

The results presented in this blog series reflect the collective experience of one law school’s alumni base – Northeastern.  There is no way to know if these results can be fairly generalized to the larger law graduate population, though there is a reasonable basis to believe that at least some of them can (e.g., the changing ecology of the legal job economy).  Yet, why speculate when the cost of collecting and analyzing the data is going down and the value of such applied research is going up?

Let me reiterate my suggestion from Part I that a consortium of law schools should begin this effort under the aegis of the American Bar Foundation (the prime architect of the AJD Project).  Northeastern has agreed to donate the survey and research tools we created as part of the Outcomes Assessment Project.  Such an initiative would enable researchers to draw stronger conclusions from these data, including potentially laudatory school-level effects that can help the rest of legal education.

I have been researching legal education for many years.  I have spent enough time with alumni at Indiana Law, Northeastern Law, and several other law schools to gain a strong impression that law school graduates are having, on balance, important, satisfying and high-impact careers.  Further, there is strong evidence that the legal industry is undergoing a significant structural change – that is much of what the Legal Whiteboard catalogs.  This structural change topic is of great interest to prospective students, lawyers, and the mainstream press.  Yet, these two themes--the careers of alumni and structural change--are related. 

If legal education wants to influence the narrative on the value of the JD degree, it is far better to rely on data rather than rhetoric.  My sense is that data on our alumni will tell a rich, balanced story that will enable us to make better decisions for all stakeholders, including prospective law students. Further, if we don’t gather high quality facts, we can expect to get outflanked by a blogosphere and a mainstream press that are armed with little more than anecdotes.  To a large extent, that is already happening.  Now is the time to catch up.

Credits

This blog post series would not have been possible without the dedication and world-class expertise of my colleague, Evan Parker, PhD, Director of Analytics at Lawyer Metrics.  Evan generated all the graphics for the Northeastern Alumni/ae Survey and was indispensable in the subsequent analysis. He is a highly talented applied statistician who specializes in data visualization.  Evan, thanks for your great work!

For other “Varied Career Path” findings, see the full Alumni/ae Survey Report at the OAP website.

Links:

Part I:  What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Part II, Alumni Surveys, Before-Law School

Part III: Alumni Surveys, During Law School



Wednesday, November 4, 2015

Part III: Alumni Surveys, Responses on the Law School Experience

Part II of this blog series reported that the top motivations to attend law school have remained the same for over four decades, at least for Northeastern University School of Law (NUSL).  Alumni reported the same underlying desire: to build a satisfying, intellectually challenging career where they could help individuals and improve society. This may be an image forged by pop culture and the idealism of youth, but it is also likely sincere.  It is the better side of our human nature. 

Part II also showed two motivations to attend law school – the desire for “transferable skills” and “eventual financial security” – that did appear to be shifting over time.  I suggested that these shifts are more likely about a changing ecosystem than a fundamental shift in the type of people applying to law school.

A similar ecological theme can be observed in the "During Law School" data. For example, since its reopening in 1968, Northeastern Law has required every graduate to complete four 11-week cooperative placements, usually in four different practice settings (e.g., government agency, public defender, large firm, public interest organization). As noted in Part I, students can be paid during co-op because it is a university rather than an ABA requirement. Cf. Karen Sloan, “The ABA says No to Paid Student Externships,” Nat’l L J, June 10, 2014.

One series of questions in the alumni survey specifically focused on the co-op experience, including co-op quality, what was learned, and whether they were paid.  The chart below reveals a steady, four-decade decline in the number of paid co-ops.

[Chart: Average number of paid co-ops per graduate, by year of graduation]

In the early 1970s, essentially all four co-ops were paid.  By the mid-80s, the average was down to three. Since the 2000s, the average has been two or fewer paid co-ops.

To my mind, the above trendline is compelling evidence of a steady, systemic shift in the legal ecosystem. I have written about this pattern in the past, suggesting that the rate of absorption of law grads into the licensed bar has been going down since the 1980s.  See Henderson, “Is the Legal Profession Showing its Age,” LWB, Oct 12, 2014 (noting that between 1980 and 2005, the average age of licensed lawyers increased from 39 to 49).  

When I saw this downward trendline for the first time, I recalled my numerous interviews with NUSL alumni/ae from the 1970s. In describing their co-ops, they spoke of opportunities that were plentiful and varied. I often heard the refrain, “I paid for law school mostly with my income from co-op.”  Note that during the 1970s, graduating from college was much less prevalent than today.  Law firms were also growing, with 1970 becoming a major inflection point in the rise of the large law firm. See Galanter & Palay, Tournament of Lawyers (1991) (seminal text collecting and analyzing data on the growth of large firms).

The trendline on paid co-ops also made me rethink what I heard from NUSL co-op employers. The school has roughly 900 employers who regularly or semi-regularly participate in co-op.  I heard several regular employers express strong preferences for “third or fourth co-ops." Why?  Because third or fourth co-op students already had significant legal experience and needed less training to be valuable to the employer.  Training is costly. Even if the trainee is unpaid, the lawyer-as-teacher is expending their own valuable time.  If an employer is going to provide training, they need a way to recapture that investment. Unpaid labor for eleven weeks is one potential way; if the labor is already partially trained, that is even better.

Unfortunately, doing a great job for a co-op employer does not guarantee permanent employment or even a modest wage for temporary work.  The legal ecosystem does not reliably and consistently support those outcomes. Yet, 20, 30, or 40 years ago, the dynamics were far more favorable. 

Obviously, in the year 2015, law grads are having a difficult time finding permanent, long-term professional employment (bar passage-required, JD-advantaged, or non-legal professional jobs).  The shortage of high-quality entry level jobs has given rise to criticisms that legal education needs more practical training.  The implicit assumption is that such a change will cure the underemployment problem.  I am skeptical that is true. 

A more likely explanation for law grad underemployment is that the supply of trained lawyers is in excess of demand, partially due to demographics and partially due to the inability of most citizens to afford several hours of a lawyer's time.  This is a very difficult problem to fix. But misdiagnosing the problem does not help.

To the extent a legal employer is looking for a practice-ready law grad, Northeastern’s co-op model is as likely to deliver that outcome as anything else I have observed.  My in-depth review of how co-op affects professional development is written up in OAP Research Bulletin No. 3.  Ironically, what may be the best practice-ready model among ABA-accredited law schools is a 50-year-old program that most critics may not know exists. But see Mike Stetz, “Best Schools for Practical Training,” Nat’l Jurist, March 2015 (ranking Northeastern No. 1).

The experiential education crowd will be heartened by another “During Law School” finding.  Among 833 alumni respondents, there were more than 3,200 co-ops identified by practice setting.  Alumni were asked to identify their most valuable co-op and provide a narrative as to why. 

Below is a chart that plots the difference between the baseline frequency of a particular co-op practice setting and how often that practice setting was picked as the most valuable.  The scale is in standard deviation units, with “par” meaning that the practice setting was most valuable in the same proportion as its frequency in the overall sample.

[Chart: Most valuable co-op by practice setting, in standard deviation units from “par”]
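
For readers who want to reproduce this kind of “deviation from par” chart, here is a minimal sketch of one plausible computation – a standardized residual under a binomial model.  The records, setting names, and counts below are hypothetical placeholders, not the actual survey file:

```python
from collections import Counter
import math

# Hypothetical survey records: (practice setting, picked as most valuable?).
# The real data set covered roughly 3,200 co-ops reported by 833 alumni.
coops = [
    ("judicial", True), ("large firm", False), ("government", True),
    ("public interest", False), ("judicial", True), ("large firm", False),
    ("government", False), ("public interest", True),
]

n_total = len(coops)                                    # all co-ops reported
n_picked = sum(1 for _, best in coops if best)          # "most valuable" picks

baseline = Counter(setting for setting, _ in coops)     # baseline frequency
picked = Counter(setting for setting, best in coops if best)

for setting, count in sorted(baseline.items()):
    p = count / n_total                  # baseline share of this setting
    expected = p * n_picked              # picks expected at "par"
    observed = picked.get(setting, 0)
    sd = math.sqrt(n_picked * p * (1 - p))   # binomial spread around par
    z = (observed - expected) / sd if sd else 0.0
    print(f"{setting:16s} par={expected:5.2f} observed={observed} z={z:+.2f}")
```

The denominator converts the raw gap between observed and expected picks into standard deviation units, which is what the chart’s scale reports.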

It is not hard to see the common theme.  Co-ops where students can observe lawyers in action – or better yet, get stand-up time in court – were rated as much more valuable.  The table below captures some of the underlying narrative comments.

[Table: Selected narrative comments on the most valuable co-ops]

For other “During Law School” findings, see the full Alumni/ae Survey Report at the OAP website.

Links:

Part I:  What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Part II, Alumni Surveys, Before-Law School

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

November 4, 2015 in Blog posts worth reading, Data on legal education, Data on the profession | Permalink | Comments (0)

Monday, November 2, 2015

Part II: Alumni Surveys, Pre-Law Characteristics and Motivations

Building on the introduction (Part I) of this blog series, our alumni survey of Northeastern University School of Law yielded cross-sectional data that span graduates from 1971 to 2012.  Because of the large time span, some of the most interesting responses to questions tend to fall into two buckets:

  1. What is staying the same?  Here we are looking for response patterns that are relatively stable and constant across age cohorts.
  2. What is changing?  Likewise, we are also interested in responses that appear to be changing as a function of time of graduation.

In the portion of our analysis that looked at pre-law characteristics and motivations, our most striking findings tended to fall into bucket #1. 

For example, below is a graphic summarizing responses to the question, “How important were the following goals in your decision to attend law school?” Responses are organized by decade of graduation.  They are ordered by most important to least important for respondents who graduated in 2000 or later.

                              Goals for Attending Law School, by Decade of Graduation

[Chart: Goals for attending law school, by decade of graduation]

One of the most striking features is that the top three responses are essentially identical for all four age cohorts.  For each group, the desire to have a satisfying career, help individuals, and improve society were all, on average, very important in the decision to attend law school.

Although there are differences across age cohorts, there remains relatively clear clustering by decade of graduation. (Query: would this same pattern hold true at other law schools?  One of the advantages of pooling data across schools is the ability to isolate a self-selection effect that operates at the school level.)

Yet, some factors appear to be changing over time, such as the importance of transferable skills and eventual financial security.  With each decade cohort, respondents rate these factors as progressively more important to their decision to attend law school. Likewise, “other goals” appear to be progressively less important.

These patterns (and other survey results I will report in Parts III and IV) suggest gradual changes in the knowledge worker ecosystem that require students to be more deliberate and focused in their decision to attend law school.  For example, the costs of higher education are going up across the board at the same time that the financial payoffs of traditional graduate and professional education are becoming less certain.  This is an ecological effect that is bound to have an influence on students and student decision making.  Although legal education would be part of this shift, the shift itself would not be unique to law.

This interpretation is consistent with our focus group discussions with Northeastern alumni.  This group queried whether the term “transferable skills” was even part of the lexicon when they were graduating from college.  Likewise, the group commented that the decision to attend law school during the 1970s and 1980s was not difficult because tuition was relatively low and jobs, including paid co-op jobs, were relatively plentiful. Although the legal market may be tighter and more complex than in earlier decades, the Northeastern alumni commented that the tradeoffs were changing for all knowledge workers.  

For other “Before Law School” findings, see the full Alumni/ae Survey Report at the OAP website.

Links:

Part I:  What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Part III: Alumni Surveys, During Law School

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

November 2, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Important research | Permalink | Comments (1)

Part I: What Can We Learn by Studying Law School Alumni? A Case Study of One Law School

Several years ago, as the legal academy was beginning to work its way through the implications of the landmark “After the JD” Project (AJD), one of the principal investigators, Bryant Garth, commented to a group of fellow law professors that “within a few years it will be educational malpractice for law schools to not study their own alumni.”

Garth had special standing to make this claim, as he had launched the AJD during his long tenure at the American Bar Foundation and then went on to serve as Dean of Southwestern Law School in Los Angeles. While at Southwestern, Garth taught a short 1L course about legal careers that combined AJD findings with live interviews with Southwestern alumni. Despite decades of research studying lawyers, Garth gushed at how much he personally learned from these interviews and how the narratives were often surprising and inspiring, particularly for Southwestern students filled with apprehension at what the future might hold.

I had occasion to remember Garth’s observations in early 2011 when Emily Spieler, then the Dean of Northeastern University School of Law (NUSL), suggested that I study her alumni.

Northeastern Law

Northeastern is an interesting case study because for nearly 50 years the school has required four 11-week cooperative placements (or “co-ops”) as a condition of graduation. To facilitate completion within three years, the 1L year at Northeastern is taught in semesters while the 2L and 3L years are taught over eight alternating quarters. Summer-winter co-op students take classes during the fall and spring quarters, while fall-spring co-op students attend classes in the summer and winter quarters. Because co-ops are not for academic credit – they fulfill Northeastern University rather than ABA-accreditation requirements – students can be paid for the full 11 weeks. (More on that in Part III of this series.)

Dean Spieler wanted a third party to study Northeastern because, in her experience as dean, her many encounters with Northeastern alumni suggested to her that the School’s unusual education model was accelerating the professional development of its students and enabling them to make better, more informed career choices.

Acceleration of professional development is a very difficult effect to measure, but it is certainly plausible. In fact, the entire experiential law movement is largely premised on this claim. So I signed onto a multi-year initiative that we called the Outcomes Assessment Project (OAP).

The premise of the OAP was very unusual. Through a generous but anonymous benefactor, the research tools and templates developed for the OAP would be made available to other law schools interested in studying graduates. The intent is for law schools to accumulate data using similar methods and instruments, driving up the value of the data (because it is comparable across schools) while driving down the cost of collection and analysis.

There are many phases to the OAP, including those focused on admissions, the student experience, and co-op employers. Here, however, I wanted to write about what we learned from a survey of Northeastern’s alumni.

Last fall, we sent a survey instrument to Northeastern alumni who graduated from the law school between 1971 and 2012 (~4,000 law grads for whom NUSL had a current email address). The survey instrument was substantially based on the AJD Wave III survey instrument, which was sent to a representative sample of law graduates from all ABA-accredited law schools who took the bar in the year 2000.

In contrast to the AJD, which has produced remarkable knowledge about law school grads from the year 2000, the OAP Alumni/ae Survey included four decades of law graduates from a single law school. Although this is not a true longitudinal sample, which samples the same people over time, this methodology enables cross-sectional comparisons between different cohorts of graduates (e.g., by decade of graduation or pre/post AJD).

The response rate of the Northeastern alumni survey was 21% (833 total completed questionnaires), which is relatively high for a long online survey. Because the resulting sample substantially mirrored the baseline data we had for Northeastern alumni practice areas and years of graduation, we were confident that the resulting sample was both representative and reliable.
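
For law schools replicating this survey under the OAP framework, one standard way to test whether a sample “substantially mirrors” a known alumni distribution is a chi-square goodness-of-fit comparison.  A hedged sketch – the decade shares and counts below are made up for illustration, not the actual Northeastern figures:

```python
from scipy.stats import chisquare

# Hypothetical decade-of-graduation shares for the full alumni population
population_share = {"1970s": 0.15, "1980s": 0.25, "1990s": 0.30, "2000s+": 0.30}

# Hypothetical counts among the 833 survey respondents
sample_counts = {"1970s": 120, "1980s": 210, "1990s": 255, "2000s+": 248}

n = sum(sample_counts.values())
observed = [sample_counts[d] for d in population_share]
expected = [share * n for share in population_share.values()]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A large p-value is consistent with the sample mirroring the population
# distribution -- i.e., no evidence of non-representativeness on this variable.
```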

Applied Research

Similar to the AJD, the OAP Alumni/ae Survey produced enough data to keep researchers busy for several years. Hopefully, these data will eventually be archived and aggregated at the American Bar Foundation or a similar institution in order to facilitate a broader and deeper understanding of legal careers.

However, the OAP was largely set up to be applied research. What does this mean? Here, the goal is, at least in part, to obtain data that is operational in nature, thus enabling a law school to examine and test fundamental assumptions and generate insights related to its stated goals and mission. In a word, to improve.

Further, when skillfully boiled down using data visualization, the findings themselves tend to be of great interest to all law school stakeholders, including alumni, faculty, administrative staff, current students, and prospective students. Interest is particularly piqued during times of transition and uncertainty, such as now, when law schools and the practicing bar are looking to each other to provide potential answers and support.

To make the results as accessible as possible, we decided to present the preliminary Alumni Survey results in a simple three-part framework:

  • Before Law School: pre-law characteristics and motivations
  • During Law School: the law school experience
  • After Law School: job mobility and satisfaction

This week, I am going to give a sampling of findings from all three sections – findings that will likely be of interest to a non-Northeastern audience of law faculty, practicing lawyers, and students. If you are interested in reading the entire preliminary report, it can be found online at the Northeastern OAP website.

Links:

Part II, Before-Law School

Part III: Alumni Surveys, During Law School

Part IV: Alumni Surveys, The Varied Career Paths of Law School Graduates

November 2, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Important research | Permalink | Comments (0)

Sunday, October 25, 2015

Is there a right way to respond to the "Law School Debt Crisis" Editorial?

Amidst all the other newsworthy topics, the New York Times editorial board made law school debt the lead editorial for today's Sunday edition. And the story line is not good.  

The editorial starts with the bleak statistics for Florida Coastal Law School -- low median LSAT scores and high debt loads, casting doubt on whether its graduates can pass the bar exam and repay their federally financed student loans.  The editorial highlights Florida Coastal's for-profit status but goes on to note that the rest of legal education is not much better. 

A majority of American law schools, which have nonprofit status, are increasingly engaging in such behavior, and in the process threatening the future of legal education.

Why? The most significant explanation is also the simplest — free money.

The editorial details changes in federal higher education finance that created the Direct PLUS Loan program, which, over-and-above Federal Stafford Loans, underwrites up to the full cost of attendance as determined by each law school.  The combination of poor job prospects and high debt has depressed applicant volume.  As the Times editorial notes, the systemic impact has been to lower admissions standards to sweep in students who will, as a group, struggle to pass the bar exam following graduation.  Virtually all of this is financed by DOE loan money.

I don't think the typical member of the legal academy understands the precarious financial condition of legal education.  The precariousness exists on two levels: (1) our financial fate is in the hands of the federal government rather than private markets; and (2) the Times editorial suggests that we have a serious appearance problem, which draws down the political capital needed to control our own destiny.  As the political winds go, so go our budgets. 

I think it is important for the Association of American Law Schools (AALS) to take some decisive action in the very near future.  In this blog post, I explain where the money comes from to keep the law school doors open and why, as a consequence, we need to pay closer attention to the public image of legal education.  I then offer some unsolicited advice to the AALS leadership. 

(1) Who pays our bills?  

Over the last decade, the federal government has, as a practical matter, taken over the financing of higher ed, including legal education.  

Here is how it works.  Any law student who needs to borrow money to attend law school is strongly incentivized to borrow money from the Department of Education (DOE).  Although the DOE loans carry high interest rates -- 6.8% for Stafford Loans and 7.9% for Grad Plus -- they include built-in debt relief programs that functionally act as insurance policies for the risk that a graduate's income is insufficient to make timely loan repayments.  Law school financial aid offices are set up around this financial aid model and make it very easy for students to sign the loan documents, pay their tuition, and get disbursements for living expenses.

In the short to medium term, this is good for the federal government because the loans are viewed as income-producing assets in the budgets that get presented to and approved by Congress. But in the longer term this could backfire if a large portion of students fail to repay their full loans plus interest.  Federal government accounting rules don't require projections beyond ten years.  But already the government is beginning to see the size of the coming write-downs for the large number of graduates who are utilizing the Public Service Loan Forgiveness program, which has a ten-year loan forgiveness horizon. And it is causing the feds to revise their budgets in ways that are politically painful.  With the loan forgiveness programs for private sector law grads operating on a 20- to 25-year repayment window, the magnitude of this problem will only grow.  

The enormous risk here for law schools is that Congress or the DOE will change this system of higher education finance.  For example, the Times editorial calls for capping the amount of federal loans that can be used to finance a law degree.  Currently, the limit on Stafford Loans for graduate education is $20,500, but Grad Plus loans have no limit at all.  If the DOE were to cap Grad Plus at $29,500 per year, leading to a total three-year federal outlay of $150,000 per law student, this would have an enormous adverse impact on the typical law school budget.
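
The arithmetic behind such a cap, and what the resulting debt loads look like in monthly terms, can be made concrete.  A minimal sketch using the rates and limits quoted above and the standard 10-year amortization formula (the cap scenario is the hypothetical one just described; the debt levels are those discussed below):

```python
def monthly_payment(principal, annual_rate, years=10):
    """Standard amortizing loan payment: P * r / (1 - (1 + r) ** -n)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical cap scenario described above: Stafford plus a capped Grad PLUS
stafford, grad_plus_cap, years_of_school = 20_500, 29_500, 3
total_federal = (stafford + grad_plus_cap) * years_of_school
print(f"Three-year federal outlay under the cap: ${total_federal:,}")  # $150,000

# Ten-year monthly payments at the Grad PLUS rate (7.9%) for the debt
# levels discussed in this post
for debt in (118_570, 150_000, 200_000):
    print(f"${debt:,} of debt -> about ${monthly_payment(debt, 0.079):,.0f}/month")
```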

Law School Transparency reports that the average law school debt load for a 2014 law graduate is $118,570, but we know very little about the full distribution.  Because of the pervasiveness of the reverse Robin Hood policy, which uses tuition dollars of low-credentialed students to finance scholarships for their high-credentialed peers, there is likely a significant percentage of students at most law schools who graduate with more than $150,000 in law school debt.  Further, according to US News, there are twelve law schools -- including three in the T14 -- where the average law school debt load is more than $150,000.  Although there are no statistics on the percentage of law students graduating with greater than $200,000 in law school debt, law students tell me this amount is common. 

I have translated this meager public information into the chart below. The area in green is the volume of money that could disappear from law school budgets if the federal government imposed a hard limit on federally financed law school lending.

[Chart: Estimated distribution of law school debt; the green area marks the dollars at risk under a federal lending cap]

Why would this money be at grave risk?  Two reasons:

First, private lenders will be reluctant to cover the entire shortfall.  For decades, private lenders played an important role in law school finance.  But these lenders got pushed out of the market by the changes in federal higher ed finance described above.  Unfortunately, in the intervening years, the ratio of earning-power-to-debt has gotten too far out of whack.  To come back into this market, private lenders would need to be confident that loans would be repaid.  That likelihood is going to vary by law school and by law student, raising the cost of lending.  This means that, to varying degrees, virtually all law schools would have to sweat over money.  Unlike Grad Plus, private lenders may balk at financing full sticker tuition for lower-credentialed students trying to attend the highest ranked school that admitted them.

Second, private lenders will not offer the same loan forgiveness options, such as IBR and Public Service Loan Forgiveness, currently offered by the federal government.  With the curtailed scope of these functional insurance programs, some portion of prospective law students will likely be unwilling to sign loan documents in excess of the federal lending cap.  Even very elite schools will feel the pain here.

(2) An appearance problem in the world of politics

I would bet a lot of money that law faculty have been emailing the Times editorial to one another, criticizing its lack of nuance.  But here is our problem.  We are not in a court where a judge will listen to our elegant presentation of facts and law.  Nor are we in the world of private markets where we can expect people to reliably follow their own economic self-interest.  We are in the realm of politics where sides get drawn based on appearance and political expediency.  To make matters worse, the legal academy just got lambasted by the paper of record on the left.

It is hard to argue that a cap on federal funding of legal education would be bad policy for students, the legal profession, taxpayers, or broader society.  Such a change would:

  1. Reduce the number of law grads going into a saturated labor market;
  2. Reduce the number of low credentialed students admitted to law school who will one day struggle to pass the bar;
  3. Reduce the risk of nonpayment of students loans currently borne by US taxpayers;
  4. Put in place serious cost-containment on legal education.

For law schools, however, such a change would produce layoffs and pay reductions.  And that may be the fate of the luckier schools.   It is widely known that most law schools are running deficits.  Central universities are looking for ways to wait out the storm.  But the cliff-like quality of a federal cap on law school lending would call the question of how much support is too much.  

What's the solution?

Legal education has a cost problem, but so does the entire higher ed establishment. Here is my unsolicited advice.

The leadership of the AALS needs to take a very strong public position that the trend lines plaguing higher ed need to be reversed.  This is not risky because it is so painfully obvious.  The AALS should then, in conjunction with the ABA, send a very public delegation to the Dept of Education. The delegation should be given a very simple charge:  Help the DOE

  1. Outline the systemic problems that plague higher education 
  2. Articulate the importance of sound policy to the national interest
  3. Formulate a fair and sustainable solution. 

I have faith that my legal colleagues would do a masterful job solving the problems of higher education.  And in the process, we'll discover that we have become the architects of a new system of higher ed finance that is fair and equitable for all stakeholders, including those employed in legal education.  That's right: act decisively to ensure a fair and equitable deal.  The only drawback is that it won't be the status quo that we'd instinctively like to preserve. 

October 25, 2015 in Blog posts worth reading, Current events, Data on legal education | Permalink | Comments (25)

Friday, October 2, 2015

Part Two - The Impact of Attrition on the Composition of Graduating Classes of Law Students -- 2013-2016

In late December 2014, I posted a blog entitled Part One – The Composition of the Graduating Classes of Law Students – 2013-2016.  That blog posting described how the composition of the entering classes shifted between 2010 and 2013.  During that time, the percentage of matriculants at or above an LSAT of 160 dropped by nearly 20% in relative terms, from 40.8% to 33.4%.  Meanwhile, the percentage at or below an LSAT of 149 increased by over 50% in relative terms, from 14.2% to 22.5%. 

But this reflects the composition of the entering classes.   How do the graduating classes compare with the entering classes?  This depends upon the attrition experienced by the students in a given entering class.  This much belated Part Two discusses what we know about first-year attrition rates among law schools.

I have compiled attrition data from all of the fully-accredited ABA law schools outside of Puerto Rico for the last four full academic years.  I have calculated average attrition rates for the class as a whole and then broken out average attrition rates by law schools in different median LSAT categories – 160+, 155-159, 150-154 and <150.

In a nutshell, overall first-year attrition increases as the median LSAT of the law school decreases.  Over the last few years, while “academic attrition” has declined for law schools with median LSATs of 150 or greater, “other attrition” has increased modestly, particularly for law schools with median LSATs <150, resulting in a slight increase in overall first-year attrition between 2010 and 2013.

Overall First-Year Attrition Rates Have Increased Slightly

In calculating attrition rates, I wanted to capture those students who are no longer in law school anywhere.  Thus, for these purposes, “attrition” is the sum of “academic attrition” and “other attrition.”  “Academic attrition” occurs when a law school asks someone to leave because of inadequate academic performance.  “Other attrition” occurs when a student departs from the law school volitionally. Both of these categories exclude “transfers.”

The following chart shows that despite the declining “LSAT profile” of the entering classes between 2010 and 2013, there has been no meaningful change in the average “academic attrition” rate.  The modest increase in overall first-year attrition over this period, from roughly 5.8% to roughly 6.6%, is largely due to a growth in the “other attrition” category from roughly 2.5% to roughly 3.2%.

Overall First-Year Attrition for Classes Entering in 2010, 2011, 2012, and 2013

| Entering Class | Beg. Enrollment | Academic Attrition | % Academic | Other Attrition | % Other | Total Attrition | % Attrition |
|----------------|-----------------|--------------------|------------|-----------------|---------|-----------------|-------------|
| 2010-11        | 50408           | 1673               | 3.32       | 1256            | 2.49    | 2929            | 5.81%       |
| 2011-12        | 46477           | 1551               | 3.34       | 1262            | 2.72    | 2813            | 6.06%       |
| 2012-13        | 42399           | 1461               | 3.45       | 1186            | 2.80    | 2647            | 6.25%       |
| 2013-14        | 38837           | 1316               | 3.39       | 1236            | 3.18    | 2552            | 6.57%       |
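
As a quick arithmetic check, the percentages in the chart follow directly from the raw counts.  The short sketch below reproduces them; note that the published totals appear to sum the rounded components, so a recomputed overall rate can differ by a hundredth of a point:

```python
# (beginning enrollment, academic attrition, other attrition) per entering class
classes = {
    "2010-11": (50408, 1673, 1256),
    "2011-12": (46477, 1551, 1262),
    "2012-13": (42399, 1461, 1186),
    "2013-14": (38837, 1316, 1236),
}

for year, (enrolled, academic, other) in classes.items():
    total = academic + other          # "attrition" excludes transfers out
    print(f"{year}: academic {academic / enrolled:.2%}, "
          f"other {other / enrolled:.2%}, overall {total / enrolled:.2%}")
```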

(Calculating attrition rates for 2010-11, 2011-12 and 2012-13 is a little more complicated than one might think.  For ABA reporting years of 2011, 2012, and 2013, “academic attrition” was reported separately, but “other attrition” included “transfers out.” Thus, to generate the real “other attrition” number, one needs to subtract the numbers associated with “transfers out” from “other attrition.” Because some schools occasionally listed transfers out in “second year” “other attrition,” this analysis should be understood to have a little fuzziness to it for years 2010-11, 2011-12 and 2012-13.  For ABA reporting year 2014, transfers out were not commingled with “other attrition,” so the calculations were based solely on the sum of “academic attrition” and “other attrition.”  Beginning with reporting this fall, “academic attrition” will include both involuntary academic attrition as well as voluntary academic attrition (students who withdrew before completing the first year but were already on academic probation).)

Academic Attrition Rates Increase as Law School Median LSAT Decreases

Notably, there are different rates of attrition across law schools in different LSAT categories.  The following chart breaks down attrition by groups of law schools based on median LSAT for the law school for the entering class each year.  For each year, the chart shows the average first-year attrition rates for law schools with median LSATs of 160 or higher, for law schools with median LSATs of 155-159, for law schools with median LSATs of 150-154 and for law schools with median LSATs less than 150.  In addition, it breaks out “academic attrition” and “other attrition” as separate categories for each category of law school and for each year and then provides the total overall attrition rate each year along with the four-year average attrition rate.

Average Attrition Rates by Category of Schools Based on Median LSAT

| Median LSAT | 2010-11 (Acad / Other / Total) | 2011-12 (Acad / Other / Total) | 2012-13 (Acad / Other / Total) | 2013-14 (Acad / Other / Total) | Four-Year Average (Total) |
|-------------|--------------------------------|--------------------------------|--------------------------------|--------------------------------|---------------------------|
| 160+        | 0.6 / 1.7 / 2.3                | 0.6 / 1.9 / 2.5                | 0.4 / 2.0 / 2.4                | 0.3 / 1.5 / 1.8                | 2.3                       |
| 155-159     | 2.9 / 2.6 / 5.5                | 2.2 / 2.8 / 5.1                | 2.1 / 2.9 / 5.1                | 1.7 / 3.2 / 4.9                | 5.2                       |
| 150-154     | 6.3 / 3.8 / 10.1               | 6.2 / 3.4 / 9.6                | 6.0 / 3.7 / 9.7                | 4.2 / 4.3 / 8.5                | 9.4                       |
| <150        | 10.1 / 2.4 / 12.5              | 9.4 / 3.8 / 13.2               | 9.1 / 3.0 / 12.2               | 9.7 / 4.7 / 14.4               | 13.1                      |

When looking at this data, some things are worth noting. 

First, across different LSAT categories, overall attrition increases as you move from law schools with higher median LSATs to law schools with lower median LSATs, going from an average over the four years of 2.3% for law schools with median LSATs of 160+, to 5.2% for law schools with median LSATs of 155-159, to 9.4% for law schools with median LSATs of 150-154, to 13.1% for law schools with median LSATs of <150.  “Academic attrition” consistently increases as median LSAT decreases, while “other attrition” is mixed. (Although this analysis is focused on four LSAT categories, the trend of having overall attrition increase as median LSAT decreases continues if you add a fifth LSAT category. In 2010-11 there was only one law school with a median LSAT of 145 or less, with only 320 students.  By 2013-14, however, there were nine law schools with a median LSAT of 145 or less, with 2,075 students.  The overall first-year attrition rate (encompassing academic attrition and other attrition) at these nine schools in 2013-14 was 15.9 percent.  The overall attrition rate at the other 24 law schools with a median LSAT less than 150 was 13.6 percent.) 

Second, over the period from 2010-2013, “academic attrition” generally appears to be flat to decreasing for schools in all LSAT categories, except in 2013-14 for law schools with median LSATs <150, where it increased slightly (largely because of the larger number of schools with median LSATs of 145 or less).  By contrast, “other attrition” presents more of a mixed record, but generally appears to be increasing between 2010 and 2013 for schools in most LSAT categories.  Nonetheless, average overall first-year attrition is lower in 2013-14 for law schools in the top three LSAT categories.

Third, if you are wondering why the average overall attrition could be increasing while the overall attrition rates for the top three LSAT categories are decreasing, the answer lies in the changing number of students in each category over time.  As noted in Part I, the number and percentage of students in the top LSAT category has declined significantly, while the number and percentage of students in the bottom LSAT category has increased significantly.  This shift in composition pushes the average overall attrition rate up even as the rates within various categories are decreasing.
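
This mixture effect is easy to demonstrate with stylized numbers.  The sketch below uses hypothetical enrollment counts and rates (not the actual data) to show an overall average rising even as every category's rate falls:

```python
# Hypothetical (students, attrition rate) by LSAT category for two years.
# Every category's rate falls from year 1 to year 2, but enrollment shifts
# toward the high-attrition categories.
year_1 = [(20_000, 0.025), (20_000, 0.055), (8_000, 0.100), (4_000, 0.125)]
year_2 = [(12_000, 0.020), (14_000, 0.050), (8_000, 0.095), (9_000, 0.120)]

def overall_rate(cohort):
    students = sum(n for n, _ in cohort)
    attrited = sum(n * rate for n, rate in cohort)
    return attrited / students

print(f"Year 1 overall attrition: {overall_rate(year_1):.2%}")  # ~5.6%
print(f"Year 2 overall attrition: {overall_rate(year_2):.2%}")  # ~6.5%
```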

Thoughts on Attrition Rates

It makes sense that “academic attrition” increases as law school median LSAT decreases.  It seems reasonable to expect that law schools with median LSATs of <155 or <150 will have higher “academic attrition” rates than those with median LSATs of 155-159 or 160 and higher. 

It may make less sense, however, that “academic attrition” generally decreased across all four categories of law schools between 2010-11 and 2013-14 (with the exception of law schools with a median LSAT <150 in 2013-14), even as the LSAT profile of each entering class continued to decline.  With an increase in the number and percentage of law students with LSATs of <150, particularly those with LSATs of <145, one might have anticipated that the average rate of “academic attrition” would have increased, particularly among law schools with median LSATs of 150-154 (who might have seen an increase in the number of students with LSATs less than 150) and among law schools with median LSATs of <150, given the increase in the number of law schools with median LSATs of 145 or less. 

Cynics might argue that from a revenue standpoint, law schools are making a concerted effort to retain a higher percentage of a smaller group of students.  But this assumes a degree of institutional purposefulness (coordination among faculty) that is rare among law schools.  Moreover, my sense is that there are much more benign explanations.

First, if law schools have not adjusted their grading curves to reflect a different student profile, then the standard approach to first-year grading – which involves a forced curve at most schools -- is likely to produce a similar percentage of “at risk” students year over year even though the objective credentials of each entering class have declined. 

Second, with the decline in the number of applicants to law school, one might surmise that those choosing to go to law school really are serious about their investment in a legal education and may be working harder to be successful in law school, resulting in fewer students facing academic disqualification, even though the credentials of each entering class have been weaker year over year.  This may be particularly true at law schools with robust academic support programs, which may be helping some students on the margin find sufficient success to avoid academic attrition.

Third, and perhaps most significantly, however, is the reality that “academic attrition” and “other attrition” are related.  Indeed, that is why I have reported them together in the charts above as two components of overall attrition.  Some students who might be at risk for “academic attrition” may decide to withdraw from law school voluntarily (and be classified under “other attrition” rather than “academic attrition”). In addition, it is possible that other students, particularly at law schools with median LSATs <150, may be voluntarily withdrawing from law school because they have decided that further investment in a legal education doesn’t make sense if they are performing relatively poorly, even though the law school would not have asked them to leave under the school’s policy for good academic standing. 

The fact that the percentage of students in each entering class with LSATs of <150 and even <145 has increased substantially between 2010 and 2013, while the rate of overall first-year attrition has increased only modestly over this time period, suggests that the composition of graduating classes (based on LSATs) will continue to weaken into 2016 (and probably 2017 if attrition patterns did not change in 2014-15).  As a result, the declines in the median MBE scaled score in 2014 and 2015 could be expected to continue in 2016 and 2017.  Some law schools also are likely to see bar passage rates for their graduates decline, perhaps significantly, in 2015, 2016 and 2017.

Unanswered Questions

This analysis focuses on first-year attrition.  There continues to be attrition during the second year and third year of law school, generally at lower rates, perhaps 2-3% of second-year students and 1-2% of third-year students.  (On average, the number of graduates in a given class has been around 90% of the entering class.)  It is not clear yet whether attrition among upper level students follows similar patterns across different categories of law schools.  The publicly-reported attrition data also does not provide any information regarding the gender or ethnicity or socio-economic background of students leaving law school.  Therefore, we don’t know whether there are different rates of attrition for women as compared with men or whether students of different ethnic backgrounds have different rates of attrition.  We also don’t know whether first-generation law students experience attrition at greater rates than other law students, or whether students of lower socio-economic status experience attrition at greater rates than students of higher socio-economic status. 

(I am very grateful for the insights of Bernie Burk and Scott Norberg on earlier drafts of this blog posting.)

October 2, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (1)

Thursday, September 17, 2015

2015 Median MBE Scaled Score Arguably Declines Less Than Expected

Natalie Kitroeff at Bloomberg published earlier today an article with the first release of the median MBE scaled score for the July 2015 Bar Exam -- 139.9 -- a decline of 1.6 points from the July 2014 score of 141.5. 

While this represents a continuation of the downward trend that started last year (when the median MBE fell a historic 2.8 points from 144.3 in July 2013), the result is nonetheless somewhat surprising. 

The historic decline in the median MBE scaled score between 2013 and 2014 corresponded to a modest decline in the LSAT score profile of the entering classes between 2010 and 2011. 

As I discussed in my December blog posting on changing compositions of the entering classes since 2010, however, the decline in LSAT score profile of the entering classes between 2011 and 2012 was much more pronounced than the decline between 2010 and 2011.  Thus, one might have expected that the decline in the median MBE scaled score for 2015 would have been even larger than the decline between 2013 and 2014. 

But instead, the decline was only 1.6 points, just slightly more than half of the 2.8 point decline of the previous year.

Why would a demonstrably greater decline in the LSAT profile of the entering class between 2011 and 2012 (compared with 2010-2011) yield a manifestly smaller decline in the median MBE scaled score between 2014 and 2015 (compared with 2013-2014)?

This likely will remain a mystery for a long time, but my guess is that the ExamSoft debacle resulted in an aberrationally large decline in the median MBE scaled score between 2013 and 2014, such that the corresponding decline between 2014 and 2015 seems disproportionately smaller than one would have expected.

Over on Law School Cafe, Debby Merritt has a very good description of the different factors that likely have impacted bar passage performance in July 2015.

Derek Muller has collected bar passage results for the several states that have released at least some results so far and has posted them on his Excess of Democracy blog.  Focusing only on overall bar passage rates, two states are "up" (North Dakota (6%) and Iowa (5%)), six are down between 1-5% (Missouri (-1%), Washington (-1%), Montana (-2%), Kansas (-3%), North Carolina (-4%), West Virginia (-5%)), and four are down double-digits (Mississippi (-27%), New Mexico (-12%), Oklahoma (-11%), and Wisconsin (-10%)).  (Last year 21 states were down 6% or more on first-time bar passage and six of those were down 10% or more.)

September 17, 2015 in Blog posts worth reading, Data on legal education, Data on the profession | Permalink | Comments (0)

Wednesday, August 12, 2015

Of Transfers and Law-School-Funded Positions

1.      Many Elite Law Schools with Large Numbers of Transfers also Have Large Numbers of Law-School-Funded Positions

Several weeks ago, I participated in two separate conversations.  One was about when law-school-funded positions should be categorized as full-time, long-term, bar-passage-required (FLB) positions and one was about transfer students.  This prompted me to compare those schools that are “big players” in law-school-funded positions with those schools that are big players in the “transfer” market.  Interestingly, as shown in the chart below, there is a significant amount of overlap.

For the Class of 2014, of the 15 law schools with the most graduates in law-school-funded FLB positions, ten had a significant (net) number of transfer students in the summer of 2012.  (The chart below is sorted by 2014 law-school-funded FLB positions.  To provide context, it also includes the 2011 net transfer data and 2013 law-school-funded FLB data for these ten schools.)

| Law School          | 2011 Net Transfers | 2013 Law-School-Funded FLB | 2012 Net Transfers | 2014 Law-School-Funded FLB |
|---------------------|--------------------|----------------------------|--------------------|----------------------------|
| GEORGE WASHINGTON   | 94                 | 88                         | 46                 | 78                         |
| GEORGETOWN          | 63                 | 73                         | 75                 | 64                         |
| EMORY               | 8                  | 62                         | 32                 | 52                         |
| NYU                 | 55                 | 42                         | 50                 | 36                         |
| MICHIGAN            | 36                 | 3                          | 28                 | 33                         |
| SOUTHERN CALIFORNIA | 20                 | 10                         | 28                 | 31                         |
| UCLA                | 35                 | 31                         | 33                 | 31                         |
| COLUMBIA            | 44                 | 29                         | 57                 | 31                         |
| HARVARD             | 30                 | 11                         | 31                 | 26                         |
| BERKELEY            | 12                 | 25                         | 38                 | 20                         |
| Total               | 397                | 374                        | 418                | 402                        |

 

Note that in both 2013 and 2014, six of the ten schools had more transfers than law-school-funded positions, suggesting that had they taken fewer transfers they might not have needed to provide as many law-school-funded positions. Phrased differently, this data suggests that, once transfer students are counted, these law schools have more graduates than the number of jobs the market is able to provide for them.

2.      Adjusting to the Employment Market or Continuing to Attract Transfers and Provide Law-School-Funded Positions?

One might expect that a natural response to this “mismatch” between the number of graduates and the number of meaningful employment opportunities provided by the market would be to have fewer graduates (and fewer law-school-funded positions).  Indeed, for many of the schools in the chart above, the simplest way to do this would involve not accepting any transfer students (or accepting very few).  The first-year enrollment at these schools appears to be fairly well calibrated with the number of meaningful employment opportunities provided by the market.  Of course, this would mean a significant loss of revenue.

But what happened at these ten law schools in the summer of 2013 and the summer of 2014 with respect to transfer students? As shown in the chart below, almost all have continued to take large numbers of transfer students.  Knowing that a not insignificant percentage of their graduates will need the support of law-school-funded positions because they can’t find market positions, these law schools continue to take large numbers of transfers.  Indeed, the total number of net transfers at these ten law schools was even higher in 2013 and 2014 than in 2011 and 2012.

 

 

| Law School          | 2014 Net Transfers | 2013 Net Transfers |
|---------------------|--------------------|--------------------|
| GEORGE WASHINGTON   | 77                 | 71                 |
| GEORGETOWN          | 106                | 115                |
| EMORY               | 47                 | 69                 |
| NYU                 | 48                 | 46                 |
| MICHIGAN            | 14                 | 20                 |
| SOUTHERN CALIFORNIA | 27                 | 34                 |
| UCLA                | 33                 | 36                 |
| COLUMBIA            | 41                 | 50                 |
| HARVARD             | 31                 | 34                 |
| BERKELEY            | 53                 | 24                 |
| Total               | 477                | 499                |

 

3.   Why are These Schools Continuing to be Big Players in the Transfer Market and in Providing Law-School-Funded Jobs and Why Aren’t Other Schools Doing This as Well?

Many elite law schools are participating heavily in the transfer market and in providing law-school-funded jobs because they can and because it makes financial sense to do so.

As a general matter, only relatively elite law schools are able to attract large numbers of transfer students willing to pay $50,000 per year in tuition.  (This assumes that most transfers are paying full tuition. There is very little information available about scholarships in the transfer market, but anecdotes suggest that scholarships are uncommon.)  By taking large numbers of transfers, these schools generate revenue that funds general operations AND enables the school to fund post-graduate employment opportunities for a significant number of graduates.  According to NALP, most graduates in law-school-funded positions receive salaries of roughly $15,000-$30,000 per year.  Even if they have as many law-school-funded positions as they do transfers, the schools still net roughly $70,000 to $85,000 per transfer student over the second and third years of law school, even after accounting for the salaries of law-school-funded positions ($100,000 in tuition less a single $15,000-$30,000 salary). (To be fair, some modest percentage of law-school-funded positions at several of these law schools may be great opportunities that are highly competitive and pay a salary comparable to a market salary – in excess of $40,000 per year.  Some of these may be public interest opportunities that some students find particularly attractive.  But the proliferation of law-school-funded positions (having grown from just over 500 in 2012 to more than 800 in 2014), with most of the growth occurring at relatively elite law schools, suggests that many of these positions do not fit the profile described in the preceding two sentences.)
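
The back-of-the-envelope arithmetic can be spelled out explicitly.  A minimal sketch using the figures quoted above – full sticker tuition, two years of enrollment, and NALP's reported salary range for funded positions:

```python
tuition_per_year = 50_000            # assumed full-tuition transfer student
years_enrolled = 2                   # transfers pay for the 2L and 3L years
funded_salary = (15_000, 30_000)     # NALP's typical range for funded positions

revenue = tuition_per_year * years_enrolled     # $100,000 per transfer
net_low = revenue - max(funded_salary)          # school also funds one salary
net_high = revenue - min(funded_salary)
print(f"Net per transfer, even funding a position: ${net_low:,} to ${net_high:,}")
# -> Net per transfer, even funding a position: $70,000 to $85,000
```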

Other schools would love to do this, but most simply don’t have the ability to attract significant numbers of transfer students.  Moreover, in the present legal education environment with declining enrollment at most schools, many law schools are running a deficit, and simply can’t afford to invest money in law-school-funded positions for their graduates.

Notably, up until this year, this effort was aided by the reporting of law-school-funded jobs as if they were the same as jobs provided by the market.  A school with law-school-funded positions that otherwise met the definition of FLB positions could report a higher percentage of its graduates in such positions.  This minimized the extent to which less than robust employment results might erode the schools’ ability to attract students and has allowed these elite schools to continue to attract large numbers of relatively highly-credentialed entering students (and transfers) along with the revenue they bring to the school.  For the Class of 2015, however, these law-school-funded positions will be reported separately from FLB positions provided by the market.

4.      What Questions Might this Raise for Students?

Students considering transferring to one of these elite schools should ask two questions: 1) What percentage of law-school-funded positions went to transfer students? and 2) How do employment outcomes for transfer students compare with employment outcomes for students who began at the school as first-years?  (Even with the increased attention on transparency with respect to employment outcomes, one data point not presently collected relates to employment outcomes for transfer students.)  This isn’t to suggest that all transfers end up in law-school-funded positions.  Some transfer students may outperform some of the students who began at a law school as first-years, both in terms of academic performance and in terms of relationship skills, and may outcompete them for market employment opportunities.  But students considering a transfer might want to assess whether their employment prospects really will be better at the new school than at the school where they are currently enrolled, particularly if they are near the top of the class at their current school.

Students who matriculated as first-years at one of these elite law schools might want to ask the law school administration how and why having a large number of transfers is a good thing for those who began there.  Having the additional revenue might enhance the educational experience in some way, but having significantly more students competing for jobs would seem to be an unnecessary challenge. 

5.      Conclusion

The data on transfers in 2013 and 2014 suggests that at many elite law schools, there will continue to be more graduates than jobs provided by the market.  As a result, these law schools are likely to continue to provide law-school-funded positions for some number of their graduates. Indeed, the prospect of law-school-funded positions as a fall-back option if a market position is not available might provide some solace for students, including transfer students, at these elite law schools. 

Nonetheless, there is a further ripple effect.  With dozens of graduates from these elite law schools in law-school-funded positions still looking for market jobs, the next year’s graduates from these schools face an even more challenging search for market jobs, which almost assures that many graduates will still need the support of law-school-funded positions in the coming years.

(I am grateful to Bernie Burk and others for helpful comments on earlier drafts of this posting.)

August 12, 2015 in Data on legal education, Data on the profession | Permalink | Comments (0)

Thursday, August 6, 2015

How is the entry-level legal job market in Australia?

Not good.  There are more law graduates than jobs, yet law schools are making matters worse by admitting more students in order to generate subsidies for other parts of the university. That is the basic charge of the Australian Law Students Association (ALSA), according to this story in the Lawyers Weekly, a publication that covers the business of law in Australia.

Legal education in Australia is very different from legal education in the U.S., yet the dynamics of the two entry-level markets seem to be converging.  Law has historically been an undergraduate degree in Australia (LLB), but in recent years the JD has been added as a new and more prestigious way into the profession. Here is the statement of an ALSA spokesperson based on recent survey results of the ALSA membership.

ALSA are of the position that there is still an oversupply of graduates because of the increasing sizes of law schools and the duplication in the number of law schools across the country. ...

Many who have undertaken the Juris Doctor particularly expressed concerns in their survey responses, highlighting that they undertook the postgraduate law degree to further their job prospects. Instead, they are facing the worrying reality that there are fewer jobs available for law graduates as well as the fact that they are completing their degrees with a sizeable student debt.

The article then goes on to describe growing law student anxiety over employment and student loan debt.  Wow, different system but a very similar result.  

One of the advantages of the Australian LLB degree is that it is often combined with another undergraduate degree, typically by adding one year of additional study.  As a result, many LLBs don't go on to qualify for practice, but the legal training probably augments their worldly knowledge and critical thinking skills.  But alas, the Australians are starting to dilute their extremely generous higher education subsidies -- we are just much further down that road. Further, the true undercurrent here is the growing insecurity facing virtually all knowledge workers, Australian or US.  Legal education is just the bleeding edge of this problem.

August 6, 2015 in Current events, Data on legal education, Data on the profession, New and Noteworthy | Permalink | Comments (0)

Wednesday, July 22, 2015

What is more important for lawyers: where you go to law school or what you learned? (Part II)

If you're trying to maximize the financial value of an undergraduate degree, it is better to bet on course of study than college prestige.  Indeed, prestige is largely irrelevant to those who major in engineering, computer science, or math.  In contrast, prestige does matter for art & humanities grads, albeit the financial returns are significantly lower than their tech counterparts.  

These are some of the takeaways from Part I of this blog post. Part I also presented data showing that law is a mix of both: financial returns have been high (cf. "red" tech majors) and prestige matters (cf. "blue" arts & humanities crowd).  

The goal of Part II is to address the question of whether the pattern of high earnings/prestige sensitivity will change in the future. I think the answer is yes, albeit most readers would agree that whether law will change is a less interesting and important question than how it will change.  Speed of change is also relevant because, as humans, we want to know if the change is going to affect us or just the next generation of lawyers.

Shifts in the Legal Market

There are a lot of changes occurring in the legal market, and those changes are altering historical patterns of how legal services are being sold and delivered to clients. In the past, I have thrown around the term structural change, yet not with any clear definition.  To advance the conversation, I need to correct that lack of precision. 

In economics, there is a literature on structural change as applied to national or regional economies (e.g. moving from a developing nation to an industrial nation; or moving from an industrial to a knowledge-based economy).  Investors also focus on structural change within a specific industry because, obviously, large changes can affect investor returns.  When I have used the term structural change on this blog, it has been much closer to investor conceptions.  Investopedia offers a useful definition even if it's somewhat colloquial: 

Definition of 'structural change': An economic condition that occurs when an industry or market changes how it functions or operates. A structural change will shift the parameters of an entity, which can be represented by significant changes in time series data.

Under this definition, the legal industry is certainly undergoing structural change.  The proportion of law graduates getting a job in private practice has been on the decline for 30 years; over the last 35 years, the average age of the licensed lawyer has climbed from 39 to 49 despite record numbers of new law school graduates; the proportion of associates to partners has plummeted since the late 1980s.  See Is the Legal Profession Showing its Age? LWB, October 12, 2014.  Since the early 2000s, long before the great recession, associate-level hiring has been cut in half. See Sea Change in the Legal Market, NALP Bulletin, August 2013.

Likewise, among consumers of legal services, there is a lot of evidence to suggest that lower and middle class citizens can't afford a lawyer to solve life's most basic legal problems, thus leading to a glut of pro se litigants in state courts and many more who simply go without things like contracts and wills.  This troubling trend line was obscured by a boom in corporate legal practice, albeit now even rich corporations have become more sensitive to legal costs -- the sheer volume and complexity of legal need is outstripping their budgets.  In response to the lag in lawyer productivity and innovation, there are a ton of investor-backed enterprises now elbowing their way into the legal industry.  See A Counterpoint to "the most robust legal market that ever existed in this country," LWB, March 17, 2014.  

The impact of all this change -- structural or otherwise -- is now being felt by law schools. Applicants are down to levels not seen since the 1970s, yet we have dozens more law schools. It has been said by many that law schools are losing money, albeit we have zero data to quantify the problem.  Based on my knowledge of my own law school and several others I am close to, I am comfortable saying that we have real changes afoot that affect how the legal education market "functions or operates."

There is a sense among many lawyers and legal academics that the legal world changed after 2008. Yet none of the "structural" changes I cite above are pegged in any way to the events of that year.

What did change in 2008, however, was the national conversation about the legal industry, partially due to the news coverage of the mass law firm layoffs, partially due to important books by Richard Susskind and later Brian Tamanaha and Steve Harper, and partially due to a robust blogosphere.  This change in conversation emboldened corporate legal departments to aggressively use their newfound market power, with "worthless" young associates getting hit the hardest.  The new conversation in turn exposed some of the risks of attending law school, which affected law school demand.  But this was all fallout from deeper shifts in the market that had been building for decades. Let's not blame the messengers.

Dimensions of Change

I am confident that the future of law is going to be a lot different than its past. But I want to make sure I break these changes into more discrete, digestible parts because (a) multiple stakeholders are affected, and (b) the drivers of change are coming from multiple directions.

Dimension 1: basic supply and demand for legal education

To unpack my point regarding multiple dimensions, let's start with legal education. Some of the challenges facing law schools today are entirely within the four corners of our own house.  Yet, legal education also has challenges (and opportunities) that arise from our connection to the broader legal industry.  This can be illustrated by looking at the relationship between the cost of legal education (which law schools control, although we may blame US News or the ABA) and entry level salaries (which are driven largely by the vagaries of a client-driven market).  

The chart below looks at these factors.  My proxy for cost is average student debt (public and private law schools) supplied by the ABA.  My income variables are median entry-level salaries from NALP for law firm jobs and for all entry-level jobs.  2002 is the first year for which I have all the requisite data.  But here is my twist:  I plot debt against entry-level salary based on percentage change since 2002.
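To make the indexing concrete, here is a minimal sketch in Python of the method behind the chart; the dollar figures are made-up placeholders, not the actual ABA or NALP data.

    # Index each series to its 2002 value so debt and salary trends can be
    # compared directly. All dollar amounts below are hypothetical.
    def pct_change_from_base(series):
        base = series[0]
        return [round(100 * (v - base) / base, 1) for v in series]

    years = [2002, 2006, 2010, 2014]
    avg_debt = [80_000, 95_000, 120_000, 140_000]     # placeholder debt figures
    median_salary = [60_000, 62_000, 63_000, 62_000]  # placeholder salary figures

    print(pct_change_from_base(avg_debt))       # [0.0, 18.8, 50.0, 75.0]
    print(pct_change_from_base(median_salary))  # [0.0, 3.3, 5.0, 3.3]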

[Chart: percentage change in average law school debt and median entry-level salary, 2002 baseline]

If a business nearly doubles its price during the same period when customer income is flat, demand is going to fall.  Thus, the sluggish entry-level market presents a difficult problem for legal education.  Sure, we can point to the favorable statistics from the AJD or the premium that a JD has historically conferred on lifetime earnings, but law professors are not the people who are signing the loan papers.  The chart above documents a changing risk/reward tradeoff.  To use the frame of Part I, the red dots are sinking into the blue dot territory, or at least that is the way prospective students are likely to view things.

Fortunately, smaller law school classes are going to be a partial corrective to low entry-level salaries.  The biggest law school class on record entered in the fall of 2010 (52,488); in 2014, the entering class had shrunk by over 27% (37,942). When entry-level supply is reduced by 25+%, upward pressure on salaries will build.  Yet, the composition of the legal economy and the nature of legal work is clearly changing.  Further, the rate of absorption of law school graduates into the licensed bar has been slowing for decades.  See Is the Legal Profession Showing its Age? LWB, October 12, 2014. It would be foolhardy to believe that time and fiscal austerity alone are going to solve our business problems. Instead, we need to better understand our role as suppliers to a labor market.

Dimension 2:  The content of legal education

The content of legal education is not necessarily fixed or static.  We could change the content, thus affecting how the market responds.  

To provide a simple example, one of my students is starting work this fall at Kirkland & Ellis.  From a financial perspective, this is a good employment outcome.  He will be moving to Chicago with his girlfriend, who just received her MS in Information Systems from IU's Kelley School of Business.  The MS from Kelley is a very "red" degree.  It can also be completed in one year (30 credit hours).  Well before she graduated, she had competing offers from PwC and Deloitte, both in the $80,000 range.   For many Indiana Law students, an ideal post-grad outcome would be $80K in Chicago at an employer who provides challenging work and high-quality training.  Yet, my student's girlfriend got this ideal outcome in one-third the time and at likely half the cost of an Indiana Law grad.

Perhaps we should consider cross-pollinating these disciplines. A huge portion of the legal profession's economic challenges is attributable to flat lawyer productivity -- customers are struggling to pay for solutions to their legal needs.  Information systems are a huge part of law's productivity puzzle.  Below is a chart I use in many of my presentations on the legal industry.  The chart summarizes the emerging legal ecosystem by plotting the Heinz-Laumann two-hemisphere model against Richard Susskind's bespoke-to-commodity continuum.

[Chart: the emerging legal ecosystem -- Heinz-Laumann two-hemisphere model plotted against Susskind's bespoke-to-commodity continuum]

The key takeaway from this diagram is that the largest area of growth is going to be in the multidisciplinary green zone -- the legally trained working shoulder-to-shoulder with those skilled in information systems, statistics, software development, and computational linguistics, to name but a few.  These are "red" disciplines.  Do law schools want to be part of this movement?  Let me ask this another way -- do law schools want to be relevant to the bulk of the legal market that needs to be rationalized in order to maintain its affordability? Harvard grads will have options on Wall Street for the foreseeable future.  But 98% of law schools operate in a different market.  Further, some HLS grads, or students who might qualify for admission to Harvard, might prefer the big upside rewards that are only available in the green zone.  In short, a new hierarchy is emerging in law that is still very much up for grabs.

If an academic wants to better understand the rapidly changing nature of legal work, I would urge them to visit a large legal department with a substantial legal operations ("legal ops") staff.  These are the professionals who have been empowered by general counsel to find ways to drive up quality and drive down cost using data, process, and technology.  These are the folks who are making build-versus-buy decisions, putting pressure on law firms to innovate in order to hang on to legal work, and experimenting with NewLaw legal vendors. 

I am finishing up a story on legal ops professionals for the ABA Journal.  (By the way, legal ops exist in law firms as well as legal departments and green zone legal vendors. The role is most developed, however, in legal departments.)  My editor flagged the issue that virtually all of the legal ops people in the story did not graduate from prestigious law schools (or any law school).

My only response is that legal operations people have specialized skills and knowledge (often "red" but sometimes involving EQ) that others lack; without these skills, they can't do the job.  Legal ops people live in a world of outputs and metrics.  For example, are legal expenses and settlement amounts trending down over time -- yes or no? If so, by how much?  How much internal staff time does it take to negotiate a revenue contract? How much of this process can be automated? What will it take to get our staff to accept the new system?

As these examples show, a legal ops person is typically going to be evaluated based on measurable outputs -- do they get results? Where someone went to law school is an input that is likely irrelevant to the question.  The only qualifier is whether the curriculum of that school provided valuable, specialized domain knowledge -- most likely non-legal red skills but also skills related to teams, communication, and collaboration. 

Dimension 3:  The value of pedigree to the customer 

Law has historically been what economists call a “credence good.”  This means that a layperson has a difficult time assessing quality.  As a result, proxies for quality, such as pedigree or prestige, have historically been very important when hiring a lawyer or law firm.  

One of the reasons that the field of legal operations is gaining momentum is because it is creating tools and systems that enable clients to look past credentials to obtain information on things they really care about, such as cost, outcome, and speed of delivery. There are now companies coming into existence that are gathering data on lawyers' win-loss rates. See Another Example of Using Big Data to Improve Odds of Winning in Court, LWB, April 12, 2015.  Sure, apples-to-apples comparisons are very difficult to make -- every case is unique in some respect. But the amount of money at stake is large enough that the data challenges will be surmounted.  When that day arrives, we won't opine on the value of pedigree to legal outcomes; we'll just calculate it. More significantly, clients focused on outcomes will change their buying patterns.  Early returns I have seen suggest that the value of pedigree to legal outcomes may be close to negligible.

Do any of us care where the engineers who designed our smart phones went to college? Not really. We just care how well the smart phone works. 

In this respect, the future of law is likely headed in the direction of Google (a pure red company).  In the early days, the founders of Google favored grads of Caltech, Stanford, and Berkeley.  But over time, the company learned that the prestige of one's graduate school was a poor predictor of job success. Because Google lives and dies by its outputs, the company changed its hiring model to attract the most qualified engineers.  See George Anders, The Rare Find: How Great Talent Stands Out 1-5 (2012) (telling the story of how data changed the attitudes of Google's founders regarding elite credentials and altered the Google hiring model).

I have lived long enough to know that the changes I describe above are not necessarily going to be welcomed by many lawyers and law professors.  If a group benefits from a lifelong presumption of merit, it is natural that the group will resist evidence that the presumption is not fully warranted. Indeed, much of the skepticism will be rooted in subconscious emotion.  If the presumption is dashed, those of us in the elite crowd will have to spend our days competing with others and proving ourselves, or even worse, watching our kids soldier through it.  We have little to gain and a lot to lose in the world we are heading into.  Yet, behind the Rawlsian veil of ignorance, how can we complain?

So with the red-blue crosscurrents, is law school still worth the investment?

That is a relevant and reasonable question that many young people are contemplating.  I will offer my opinion, but markets are bound to follow their own logic. 

This is a time of enormous uncertainty for young people. Education clearly opens doors, but tuition is going up much faster than earnings.  Further, competition among knowledge workers is becoming more global, which is a check on wages.  Of course, if you don't invest in education, what are your options?

I am generally on the side of Michael Simkovic and Frank McIntyre that the education provided by a law degree, on average, significantly increases lifetime earnings.  See The Economic Value of a Law Degree (April 2013).  How could it not?  The law is too interconnected with every facet of society not to enhance, on average, the law grad's critical thinking skills. Nearly 15 years out of law school, I regularly use what I learned at Chicago Law to solve problems and communicate solutions, particularly in my applied research work with law firms and legal departments. While my Chicago Law credential has value independent of the skills and knowledge I obtained (the red AJD bar chart in Part I strongly suggests that), I can't deny the additional value of the actual skills and knowledge I obtained to solve real-world business problems. It's been substantial.

In general, I also agree with Deborah Jones Merritt that there is significant evidence that the entry-level market for lawyers is weak and oversaturated.  See What Happened to the Class of 2010? Empirical Evidence of Structural Change in the Legal Profession (April 2015).   The class of 2010 is not faring as well as the class of 2000.  Indeed, the lead economist for Payscale, Katie Bardaro, recently noted that wages are stagnating in many fields, but especially in the legal profession. "More law schools are graduating people than there are jobs for them...There’s an over-saturated labor market right now. That works to drive down the pay rate.” See Susan Adams, The Law Schools Whose Grads Earn the Biggest Paychecks in 2014, Forbes, Mar. 14, 2014. 

In the face of these stiff headwinds, I think law schools have an opportunity to pack more value into three years of education. See Dimension 2 above.  To be more specific, if you are a protege of Dan Katz at Chicago-Kent, you will have a lot of career options.  Ron Staudt, also at Chicago-Kent, has quietly built a pipeline into the law and technology space.  Oliver Goodenough and his colleague at Vermont Law are making rapid progress with a tech law curriculum.  And at Georgetown Law, Tanina Rostain and Ed Walters (CEO of Fastcase) provide courses that are cutting edge.

But absent these types of future-oriented instruction, what is the value of a JD degree as it is commonly taught today? That value is clearly positive; I would even call it high.  But whether the value is sufficient to cover the cost of attendance is likely to vary from law grad to law grad.  Lord knows, in a world of variable tuition based on merit scholarships -- including merit scholarships that go away after the 1L year -- the swing in cost can be $250K plus interest.

What is killing law school applications these days is the lack of near certainty among prospective students that the time and expense of law school will pay off.  The world looks different than it did in the fall of 1997, when the vast majority of the AJD respondents entered law school.  Tuition and debt loads are higher, and high-paying entry-level jobs are harder to obtain.

So what is the solution?  For students, it's to bargain shop for law schools, which is bad news for law schools.  For law schools, it's to add more value to an already valuable degree.  Some of that value will come in the form of red technical skills that will make lawyers more productive.  In turn, this will prime demand for more legal products and services.

July 22, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Legal Departments, Structural change

Sunday, July 19, 2015

What is more important for lawyers: where you go to law school or what you learned? (Part I)

The Economist reports a very interesting analysis from Payscale.  The questions being asked are pretty simple: If you want to generate earnings that justify the time and cost of an undergraduate education, what should you study and where should you enroll?

Lots of people have strong opinions on this set of questions, but Payscale has the data to answer them empirically. It turns out that at the undergraduate level, course of study is much more important than the prestige of the college or university you attend.  The hard evidence is shown below.

[Chart: Payscale 20-year annualized ROI by college selectivity -- engineering/CS/math (red) vs. arts & humanities (blue)]

For those working in law or thinking about attending law school, a natural question to ask is whether the legal industry is closer to the blue dot pattern (arts & humanities) or the red dot pattern (engineering/CS/math).  A second, related question is whether the future of law is more blue or more red.

This is a two-part blog post.  Part I tries to answer the first question, starting with a careful analysis of the undergraduate chart, which provides a valuable frame of reference that can be discussed more dispassionately (at least among lawyers and law students) than an analysis that questions the value of law school prestige and hierarchy.

Part II, which I will post on Wednesday, explores the second, future-oriented question.  I will tip my hand now and say that the future of law will be less blue (arts & humanities) and more red (math/CS/engineering).  Within the legal industry, there will be winners and losers; but from the perspective of broader society, this change is a very good thing.

Undergraduate ROI

In the Payscale chart above, the y-axis (vertical) is 20-year annualized returns from college fees paid.  The x-axis is selectivity, running from under 10 percent to near open admissions.  

The Payscale chart is a very good example of how data visualization can be used to communicate both core facts and useful nuance.  Here, the lede is unmistakable:  the red dots (engineering/CS/math) are overwhelmingly higher on the ROI scale than the blue dots (arts & humanities).  Sure, there are exceptions to this rule, but they don't occur very often. (Observe how rarely a blue dot is above the red fit-line.) This suggests it would be very foolish to get a blue degree and expect a red paycheck unless you have very good information (or skills or talent) that others lack.

The chart conveys another important piece of information -- the red fit-line is flat.  This means that for engineering/CS/math majors, prestige has not been very relevant to their eventual earnings.  I'll add a nuance here that some empirically savvy readers are bound to point out:  It is possible (indeed likely) that fees are higher at more selective schools. So if MIT costs twice as much as a public polytech, and both yield 12% over 20 years, one might wish they had gone to MIT.   Still, the flat trendline is surprising.  As a general matter, lower ranked schools are not dramatically cheaper than higher ranked schools, and many public schools are highly selective.  The flat red trendline suggests that there are (or were, remember these are historical data) many bargains out there.  If one is trying to maximize financial returns, the goal is to find a school that will, in the future, be well above the red fit-line (and avoid those below).  
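To put the fee-base nuance in numbers, here is a stylized Python calculation -- not Payscale's actual methodology, and with illustrative costs -- showing why the same annualized return on twice the fee base yields twice the dollar gain.

    # At an identical annualized return, the dollar gain scales with the
    # fee base. The 12% rate and both cost figures are illustrative only.
    def dollar_gain(cost, annual_roi, years=20):
        return cost * ((1 + annual_roi) ** years - 1)

    print(round(dollar_gain(100_000, 0.12)))  # hypothetical polytech fees  -> 864629
    print(round(dollar_gain(200_000, 0.12)))  # hypothetical MIT-level fees -> 1729259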

The flat red fit-line is also surprising because college selectivity is almost certainly highly correlated with ACT or SAT scores, which our society often views as measures of general intelligence. Yet, there we have it -- a flat trendline. Four years of education seem to be more relevant than a standardized test score taken during high school.  That is heartening at many levels.

A third interesting trend -- the blue fit-line slopes downward.  This suggests that in the arts & humanities, selectivity/prestige does have a financial payoff.  I don't think this will surprise many readers, although the prestige payoff is not very large.  To use a simple metaphor, if you attend a more selective college or university to get your arts or humanities degree, you are likely to have a better house in the arts & humanities neighborhood.  But on average, you won't be able to afford the same neighborhood as the engineers, computer scientists, and math majors.

What about Law?

Moving on to law, if we want to examine the relationship between earnings and law school attended, the best available evidence is probably the After the JD Study (AJD), which is a large, representative sample of law graduates who took and passed the bar in 2000.

Data from AJD Wave 3 suggests that the financial returns are relatively strong for all law school graduates -- ten years out, even graduates of Tier 4 schools have median earnings of $100,000 per year. As shown in the chart below, this is akin to shifting the blue dots up into the red territory.

[Chart: AJD Wave 3 median earnings by tier of law school attended]

The downward sloping fit-line remains, but that doesn't seem to matter very much to happiness. Other AJD data shows that regardless of tier of graduating school, AJD respondents show relatively high and uniform satisfaction with (a) the decision to become a lawyer, and (b) the value of the law degree as an investment. By 2010, 48% of respondents had no debt; only 5.1% had more than $100K in educational debt remaining. 

This is all good news.  But is it reasonable to extrapolate forward and assume the past is a fairly accurate barometer of the present and the future? 

One way to address that question is to ascertain what has changed since 2000.  As noted earlier, the AJD sample was composed of law graduates who passed the bar in the year 2000. Figures published by NALP and the ABA show that the percentage of full-time bar passage required jobs has dropped significantly over the last 13+ years -- from 77.3% for the class of 2000 to 57% for the class of 2013. That is a huge delta.

[Chart: percentage of graduates obtaining full-time bar-passage-required jobs, class of 2000 through class of 2013]

One of the reasons why law school applications have plummeted is that the career path for JD graduates has become murky.  And that is a good place to start Part II.

July 19, 2015 in Blog posts worth reading, Cross industry comparisons, Data on legal education, Data on the profession, Structural change

Thursday, May 14, 2015

Further Thoughts on the July 2014 Bar Results -- A Response to Erica Moeser

Late last fall, Erica Moeser responded to a letter from Dean Kathryn Rand of the University of North Dakota (on behalf of a large number of law school deans), reiterating that the NCBE had double-checked its scoring of the MBE on the July 2014 bar examination and could find no errors in its calculations.  Erica Moeser also took to the pages of the December 2014 issue of The Bar Examiner to further validate her conclusion that the historic drop in the mean MBE scaled score is attributable solely to the fact that the class that sat for the July 2014 bar exam was “less able” than the class that sat for the July 2013 bar exam.  In January, Dean Stephen Ferruolo of the University of San Diego also wrote to Erica Moeser requesting the release of more information on which to assess the July 2014 bar examination results in comparison with previous years’ results.  In February, Erica Moeser responded to Dean Ferruolo’s request by declining to provide more detailed information and reiterating her belief that the July 2014 scores “represent the first phase of results reflecting the dramatic and continuing downturn in law school applications.”

In an earlier blog posting, I explained why Erica Moeser is partly right (that the Class of 2014 could be understood to be slightly less able than the Class of 2013), but also explained why the decline in “quality” of the Class of 2014 does not explain the historic drop in mean MBE scaled score.  The decline in “quality” between the Class of 2013 and the Class of 2014 was modest, not historic, and would suggest that the decline in the mean MBE scaled score also should have been modest, rather than historic.  Similar declines in “quality” in the 2000s resulted in only modest declines in the MBE, suggesting that more was going on with the July 2014 exam. 

Others have written about these issues as well.  In January, Vikram Amar had a thoughtful reflection on Moeser’s statements and in recent weeks Debby Merritt has written a series of posts -- here, here, and here -- indicating in some detail why she believes, as I do, that the ExamSoft debacle in July could have impacted the MBE scaled scores in jurisdictions that used ExamSoft as well as in other jurisdictions.

I write now to take issue with four statements from Erica Moeser – three from her President’s Page in the December 2014 issue of the Bar Examiner and one from her letter responding to Dean Kathryn Rand.  I remain unpersuaded that the historic decline in the mean MBE scaled score is solely attributable to a decline in quality of the class that sat for the July 2014 bar examination and remain baffled that the NCBE refuses to acknowledge the possibility that issues with test administration may have exacerbated the decline in the performance on the July 2014 MBE.

Item One – Differential Declines in MBE Scores

In her December article, Moeser stated: 

I then looked to two areas for further corroboration. The first was internal to NCBE. Among the things I learned was that whereas the scores of those we know to be retaking the MBE dropped by 1.7 points, the score drop for those we believe to be first-time takers dropped by 2.7 points. (19% of July 2014 test takers were repeaters, and 65% were believed to be first-time takers. The remaining 16% could not be tracked because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.) The decline for retakers was not atypical; however, the decline for first-time takers was without precedent during the previous 10 years. (Emphasis in original.)

Moeser starts by supporting her case with data that is not publicly available.  This is unfortunate, because it makes the data hard to understand and critique.  Nevertheless, there are some inferences we can take from what she does disclose and some questions we can ask.  Moeser asserts that the 19% of MBE “retakers” saw an MBE drop of 1.7 points compared with MBE “retakers” in July 2013, while the 65% believed to be first-time takers saw a drop of 2.7 points compared with first-time takers in July 2013.  It would have been helpful if Erica Moeser had publicly released the declines among MBE retakers and among first-time takers in each of the previous 10 years so that patterns could be assessed, particularly in relation to the changes in class composition for each of those years.  Without that information, it is hard to do much more with Moeser’s assertion.  (I find it odd that she would reference this point without providing the underlying data.)

Nonetheless, this assertion raises other questions.  First, the overall decline in the mean MBE scaled score was 2.8 points. Moeser notes that 19% of takers (MBE retakers) had an average drop of 1.7 points, while 65% of takers (first-time takers) had an average drop of 2.7 points.  Unless there is something I am missing here, that should mean the remaining 16% of test-takers had to have an average decline of 4.51 points!  (This 16% of test-takers represents those who Moeser notes could not be tracked as first-time takers or MBE retakers “because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.”)  (Here is the equation:  2.8 = (0.19 x 1.7) + (0.65 x 2.7) + (0.16 x X).  This simplifies to 2.8 = 0.323 + 1.755 + 0.16X, so 0.722 = 0.16X, and X = 0.722/0.16 = 4.51.)  It would have helped, again, if Moeser had indicated which jurisdictions had these even larger declines in mean MBE scaled scores, as we could then look at the composition of graduates taking the bar in those jurisdictions to see if there was an unusual decline in entering class statistics in 2011 at the law schools from which most bar takers in those states graduated.
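Here is the same back-of-the-envelope calculation as a short Python sketch, using only the figures Moeser disclosed:

    # Back out the implied average MBE decline for the untracked 16%, given
    # the overall 2.8-point drop and the two group declines Moeser reports.
    overall_drop = 2.8
    groups = {"retakers": (0.19, 1.7), "first_timers": (0.65, 2.7)}

    explained = sum(share * drop for share, drop in groups.values())  # 2.078
    untracked_share = 1 - sum(share for share, _ in groups.values())  # 0.16
    print(round((overall_drop - explained) / untracked_share, 2))     # 4.51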

Item Two – The MPRE

In the December article, Moeser also stated:

I also looked at what the results from the Multistate Professional Responsibility Examination (MPRE), separately administered three times each year, might tell me. The decline in MPRE performance supports what we saw in the July 2014 MBE numbers. In 2012, 66,499 candidates generated a mean score of 97.57 (on a 50-150 scale). In 2013, 62,674 candidates generated a mean score of 95.65. In 2014, a total of 60,546 candidates generated a mean score of 93.57. Because many MPRE test takers are still enrolled in law school when they test, these scores can be seen as presaging MBE performance in 2014 and 2015.

At first blush, this looks like a pretty compelling argument, but Moeser’s selectiveness in looking at the data is troubling, as is her failure to discuss whether the MPRE and MBE are meaningfully comparable test-taking experiences.  Essentially, Moeser is making the following assertion: because the mean MPRE scaled score declined by 1.92 points between 2012 and 2013, we should have expected a large decline in the mean MBE scaled score in July 2014 (and because the mean MPRE scaled score declined another 2.08 points between 2013 and 2014, we should expect another large decline in the mean MBE scaled score in July 2015).

But the “relationship” between changes in the mean MPRE scaled score and changes in the mean MBE scaled score over the last decade does not support this assertion. If one looks at a decade’s worth of data, rather than data just for the last couple of years, the picture looks significantly more complicated, and suggests the collective performance on the MPRE may not tell us much at all about likely collective performance on the MBE in the following year. 

MPRE Year | Mean MPRE Score | Change | MBE Year | July Mean MBE Scaled Score | Change
2004 | 99.1 | -- | 2005 | 141.6 | --
2005 | 98.7 | -0.4 | 2006 | 143.3 | +1.7
2006 | 98.0 | -0.7 | 2007 | 143.7 | +0.4
2007 | 98.6 | +0.6 | 2008 | 145.6 | +1.9
2008 | 97.6 | -1.0 | 2009 | 144.5 | -1.1
2009 | 97.4 | -0.2 | 2010 | 143.6 | -0.9
2010 | 96.8 | -0.6 | 2011 | 143.8 | +0.2
2011 | 95.7 | -1.1 | 2012 | 143.4 | -0.4
2012 | 97.6 | +1.9 | 2013 | 144.3 | +0.9
2013 | 95.6 | -2.0 | 2014 | 141.5 | -2.8
2014 | 93.6 | -2.0 | 2015 | ???? | ????

The data Moeser cites from the last two years conveniently makes her point, but it is a very small sample.  The data over the last decade looks much more random.  In three of the nine years, the change is not in the same direction (MPRE 2005, 2006, 2010; MBE 2006, 2007, 2011).  In the six years where the change is in the same direction, there are two years in which the MBE change is significantly larger than the MPRE change (MPRE 2007, 2009; MBE 2008, 2010) and two years in which the MBE change is significantly smaller than the MPRE change (MPRE 2011, 2012; MBE 2012, 2013).  In only two of the nine years do the changes in the MPRE and MBE roughly approximate each other (MPRE 2008, 2013; MBE 2009, 2014).   Nonetheless, this remains a very small sample, and more analysis of data over a longer period might help us better understand how (and whether) changes in mean MPRE scores meaningfully inform changes in mean MBE scores the following year.  At this point, the predictive value seems marginal given the wide range of changes on a year-over-year basis.

Item Three – Mean LSAT Scores

In the December article, Moeser further stated:

Specifically, I looked at what happened to the overall mean LSAT score as reported by the Law School Admission Council for the first-year matriculants between 2010 (the class of 2013) and 2011 (the class of 2014). The reported mean dropped a modest amount for those completing the first year (from 157.7 to 157.4). What is unknown is the extent to which the effect of a change to reporting LSAT scores (from the average of all scores to the highest score earned) has offset what would otherwise have been a greater drop. (LSAC Research Reports indicate that roughly 30% of LSAT takers are repeaters and that this number has increased in recent years.)

This assertion is misguided for purposes of this comparison, a point Vikram Amar made in his post.  If we were comparing the first-year matriculants in 2009 with the first-year matriculants in 2010, the question of the change in reporting from average LSAT score to highest LSAT score would have mattered.  But the 2010 matriculants were the first class for which the mean was reported based on highest LSAT score and the 2011 matriculants were the second class for which the mean was reported based on highest LSAT score.  Thus, there is no “unknown” here.  The reported mean LSAT dropped only a modest amount between the matriculants in 2010 and the matriculants in 2011.  Nonetheless, the mean MBE scaled score in July 2014 decreased by an historic 2.8 points from the mean MBE scaled score in July 2013. 

Item Four – Administration Issues

In her letter to Dean Kathryn Rand, Moeser stated:  "To the extent that the statement you attached referenced both administration and scoring of the July 2014, bar examination, note that NCBE does not administer the exam; jurisdictions do."

This response not only notes that the NCBE is not responsible for administering the bar examination in the many different jurisdictions, but also implicitly suggests that issues with administration could not have contributed to the historic decline in the mean MBE scaled score.

Were there issues with administration?  Yes.   Could they have contributed to the historic decline in the mean MBE scaled score?  Yes.

Debby Merritt’s recent posts discuss the administration issues and their potential consequences in some detail.  In the more than forty states that used ExamSoft to administer the bar examination, the MBE came on Wednesday, after the essay portion of the exam on Tuesday.  But because of an ExamSoft technical problem, tens of thousands of test-takers -- who had initially been informed by their respective state boards of bar examiners that they would FAIL THE EXAM if their essay answers were not uploaded in a timely manner -- spent most of Tuesday night dealing with the profound stress of not being able to upload their exam answers and not being able to reach anyone at the board of bar examiners (who were not answering phones) or at ExamSoft (due to the flood of calls and emails from anxious, frustrated, stressed-out exam takers) to figure out what was going on and what they should do.

Given that this “administration” issue caused untold stress and anxiety for thousands of test-takers, who spent Tuesday night trying repeatedly and unsuccessfully to upload their essay answers, should it be a surprise that they might have underperformed somewhat on the MBE on Wednesday?  (If you want a sense of the stress and anxiety, check the Twitter feed for the evening of Tuesday, July 29, 2014.)

The responses from the boards of bar examiners to this administration issue were far from uniform.  Different jurisdictions granted extensions at different times of the night on Tuesday, July 29, or on Wednesday, July 30, with some granting short extensions and some granting longer extensions.  In states that put out notice of an extension earlier on Tuesday, July 29, test-takers may have had less stress and anxiety, while in states that didn’t give notice until later (or granted only a relatively short extension), or where there may not have been any communication regarding extensions of the submission deadline, test-takers likely experienced more stress and anxiety.  (It would be worth studying exactly when each jurisdiction gave notice of an extension and whether there is any correlation between the timing of that notice and the relative performance of bar takers in those states.)

The NCBE’s unwillingness to acknowledge any issues with administration of the bar examination is all the more surprising at a time when the NCBE is pushing for adoption of the Uniform Bar Examination.  On its webpage, the NCBE states: “[The UBE] is uniformly administered, graded, and scored by user jurisdictions and results in a portable score that can be transferred to other UBE jurisdictions.” (Emphasis added.)  This simply was not true in July 2014.  The Uniform Bar Examination was administered under different exam conditions across jurisdictions.  First, three of the states administering the Uniform Bar Examination in July 2014 did not use ExamSoft – Arizona, Nebraska and Wyoming -- and therefore, bar takers in those states had a vastly different “exam administration” experience than bar takers in ExamSoft jurisdictions.  Across ExamSoft jurisdictions, different approaches to extensions also meant different administration experiences. Given the significance of consistent administration for the purpose of equating performance on a standardized exam like the bar exam, that the NCBE allows such varied approaches to administering a supposedly “uniform” exam strikes me as very problematic.

Many questions remain unanswered, largely because adequate information has not been made available on which to assess the various factors that might have contributed to the historic decline in the mean MBE scaled score.  With the release of February bar results and the NCBE’s publication of the 2014 statistical report, some additional information is now available to put the results of July 2014 in context.  In my next blog posting regarding the July 2014 bar results, I will delve into some of those statistics to see what they tell us.

(Edited as of May 20 to correct the 2013 MPRE and 2014 MBE change and corresponding discussion.)

May 14, 2015 in Current events, Data on legal education, Data on the profession

Thursday, May 7, 2015

Revisiting Conditional Scholarships

Having been one of the people who brought attention to the issue of conditional scholarships a few years ago, I feel compelled to offer a few insights on a rekindled conversation about conditional scholarships involving Jeremy Telman, Michael Simkovic, and Debby Merritt.

I am not sure what prompted Prof. Telman to write about conditional scholarships, but the first sentence of his initial post seems to be a few years late: 

One of the ways in which law schools are allegedly inadequately transparent is in the award of merit scholarships conditional on the students’ achievement of a certain grade point average. (Emphasis added)

A few years ago, one accurately could have said that law schools were inadequately transparent regarding the awarding and retention of conditional scholarships.  I did say that in an article Prof. Telman describes as “interesting.” 

Today, this is no longer accurate, because we have much greater transparency regarding conditional scholarships given the disclosures mandated pursuant to Standard 509.

Thus, I am not sure anyone is alleging that law schools are inadequately transparent regarding conditional scholarships, and I am not sure why this is once again an item for discussion.  The issue has been well settled, and law schools and prospective law students have adjusted to a new reality.  Indeed, in his follow-up posting, Prof. Telman essentially acknowledges this point:

It seems we are all agreed that the disclosure problems related to conditional scholarships have largely been addressed through the ABA website that enables students to comparison shop among scholarship offers from various schools and know their chances of retaining their conditional scholarships.

That said, given that Prof. Telman got the conversation started, I have a response to one of his assertions and some observations to share.

The general context of his posting (and Prof. Simkovic’s related posts) is that college students have lived with conditional scholarships without apparent problems, so conditional scholarships shouldn’t present a concern for law students.  In making his case, Prof. Telman relies on my 2011 article to support a proposition that the article actually disproves in some detail.  Specifically, Prof. Telman states:

Professor Organ was able to find information about how scholarships work at 160 law schools.  That means that the information was out there.  Since Professor Organ was able to gather information about 160 law schools, it should not be difficult for students to gather relevant information about the one law school that they are considering attending. 

He further states:  “Why are law students assumed to be incapable of looking into standard grade normalizations curves for the first year?”  Prof. Telman seems to be suggesting that there actually weren’t any disclosure problems because “the information was out there.”  The information was not out there. 

To be more precise, in putting together the article, with the efforts of research assistants as well as my own sleuthing, I was able to find sufficient information from the NAPLA-SAPLA Book of Lists, the ABA-LSAC Guide, and law school web pages to classify 160 law schools according to whether the law school had a competitive scholarship program or some other type of scholarship program.  Had Prof. Telman looked carefully at the article, however, he would have noted that “only four of these 160 schools had any information posted on their webpages indicating renewal rates on scholarships.”  (A point Derek Tokarz makes in the comments to Prof. Telman’s post.)

Prospective law students not only need relevant information about one law school, they need relevant and comparable information about the set of three or five or seven law schools they are considering seriously.  Prior to the Standard 509 mandated disclosure of conditional scholarship information, it was profoundly difficult if not impossible for students to gather relevant information from a few or several law schools.  The information simply was not “out there.”

Indeed, two of the primary points of my article were to highlight the information asymmetry between law schools and prospective law students relating to competitive scholarships and to recommend greater disclosure of the number of students receiving competitive scholarships and the number who had them renewed (or had them reduced or eliminated).   

Prof. Merritt discusses in some depth this information asymmetry, noting particularly that college students who have been successful in retaining their conditional scholarships as undergrads do not appreciate the reality of the mandatory curve they will encounter in law school, a point Stephen Lubet also makes cogently in a comment to Prof. Telman’s post. (Indeed, to his credit, Prof. Telman acknowledges that prospective law students also may suffer from optimism bias in assessing their likelihood of retaining their scholarship.)

Regarding the need for greater disclosure, regardless of how savvy and sophisticated we would like to believe prospective law students might have been or might be, the nuances of conditional scholarships and mandatory curves were not things that were clearly understood in the era prior to the mandatory Standard 509 disclosure.  I noted in my article that many students posting on Law School Numbers valued their scholarships based on a three-year total, regardless of whether they were conditional scholarships, suggesting these students failed to appreciate that the “value” should be discounted by the risk of non-renewal.  I also spoke with pre-law advisors around the country regarding conditional scholarship and consistently was told that this information was very helpful because pre-law students (and sometimes pre-law advisors) had not appreciated the realities of conditional scholarships.

While there are other things mentioned by Prof. Telman, Prof. Simkovic and Prof. Merritt to which I could respond, this post is already long enough and I am not interested in a prolonged exchange, particularly given that many of the points to which I would respond would require a much more detailed discussion and more nuance than blog postings sometimes facilitate.  My 2011 article describes my views on competitive scholarship programs and their impact on law school culture well enough.  Accordingly, let me end with one additional set of observations about what has happened with conditional scholarships in an era of increased transparency.

In my follow up article available on SSRN, I analyzed the frequency of conditional scholarships generally and the extent to which conditional scholarships were utilized by law schools in different rankings tiers for the 2011-2012 academic year (the first year following the ABA's mandated disclosure of conditional scholarship retention rates). 

For the entering class in the fall of 2011, I noted that there were 140 law schools with conditional scholarship programs, and 54 law schools with scholarship renewal based only on good academic standing, one-year scholarships, or only need-based scholarship assistance.  I also noted that conditional scholarship programs were much less common among top-50 law schools than among bottom-100 law schools. 

Based on the data reported in fall of 2014 compiled by the ABA for the entering class in the fall of 2013 (the 2013-2014 academic year), the percentage of all entering first-year students with conditional scholarships has increased slightly (from 26.1% in fall 2011 to 29% in fall 2013), while the percentage of all entering first-year students who had their scholarships reduced or eliminated has decreased slightly (from 9% as of summer of 2012 to 8.4% as of summer of 2014).  The average renewal rate across law schools increased from 68.5% to 73%.

More significantly, however, the number of law schools with conditional scholarship programs has declined, while the number with other types of scholarship programs has increased fairly significantly.  By 2013-2014, there were 78 law schools with scholarships renewed based on good academic standing, with one-year scholarships, or with only need-based scholarship assistance -- 24 more law schools than two years earlier, an increase of more than 40% in the number of law schools moving away from conditional scholarship programs.  This would seem to indicate that at least some law schools have decided conditional scholarships aren’t as good for law schools or for law students.

May 7, 2015 in Data on legal education, Scholarship on legal education

Monday, April 13, 2015

PROJECTIONS FOR LAW SCHOOL ENROLLMENT FOR FALL 2015

          This blog posting is designed to do three things.  First, following up on recent discussions regarding trends in applicants by Al Brophy at The Faculty Lounge and Derek Muller at Excess of Democracy, I provide a detailed analysis to project the likely total applicant pool we can expect at the end of the cycle based on trends from March through the end of the cycle in 2013 and 2014.  Second, using the likely total pool of applicants, I estimate the number of admitted students and matriculants, but also question whether the estimates might be too high given the decline in quality of the applicant pool in this cycle.  Third, building on the second point, I suggest that law schools in the lower half of the top tier are likely to see unusual enrollment/profile pressure that may then have a ripple effect down through the rankings.

1. ESTIMATES OF THE TOTAL NUMBER OF APPLICANTS

Reviewing the 2013 and 2014 Cycles to Inform the 2015 Cycle

2013 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 25, 2013 | 30,098 | 56% | 53,750
Mar. 8, 2013 | 46,587 | 84% | 55,460
May 17, 2013 | 55,764 | 95% | 58,700
End of Cycle | -- | -- | 59,400

2014 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 31, 2014 | 29,638 | 58% | 51,110
Mar. 7, 2014 | 42,068 | 79% | 53,250
April 25, 2014 | 48,698 | 89% | 54,720
End of Cycle | -- | -- | 55,700

2015 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 30, 2015 | 26,702 | 54% | 49,450
Mar. 6, 2015 | 39,646 | 76% | 52,160
April 3, 2015 | 45,978 | 87% | 52,848
End of Cycle | -- | -- | 54,000 (Estimate)

        In each of the last two years, a modest surge in late applicants meant the final count exceeded the March/April projections by a couple thousand.  That would suggest that the current projection (for just under 53,000) likely understates the end of cycle applicant pool, which I am now estimating conservatively at 54,000 (down about 3% from 2014).  (In 2014, the amount by which the final pool total exceeded the early March projection was nearly 2,500.  With an estimated pool of 54,000 applicants, I am estimating that the final pool in 2015 will exceed the early March projection by roughly 2,000.)  (That said, if the employment results for 2014 graduates, which will be released shortly, show modest improvement over 2013, I anticipate that even more people might come off the fence and perhaps apply late for the fall 2015 class.)
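For the curious, the projection arithmetic reduces to one division plus the late-surge allowance discussed above; a minimal sketch in Python:

    # Divide the mid-cycle count by the share of the cycle historically
    # complete at that date, then add an allowance for late filers.
    def project_pool(applicants_to_date, pct_of_cycle_complete):
        return applicants_to_date / pct_of_cycle_complete

    march_projection = project_pool(39_646, 0.76)  # ~52,166 (table shows 52,160)
    late_surge = 2_000                             # surplus observed in 2013 and 2014
    print(round(march_projection + late_surge))    # ~54,166, conservatively 54,000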

2. ESTIMATES FOR ADMITTED APPLICANTS AND MATRICULANTS  

        The chart below shows the number of applicants, admitted students and matriculants over the last three years along with an estimate for fall 2015 based on the assumption above that we have a total of 54,000 applicants this cycle.  With 1,700 fewer applicants, I am assuming 1,000 fewer admitted students (a slight increase in the percentage admitted from 2014), and then assuming the number of matriculants will reflect the three-year average for the percentage of admitted students who matriculate – 87%.  This would yield a first-year entering class of 36,975, down about 2.5% from 2014.   

Estimates of Admitted Students and Matriculants for 2015 Based on Trends in 2012-2014

 

Year | Applicants | Admitted Students | Percent of Applicants | Matriculants | Percent of Admitted
2012 | 67,900 | 50,600 | 74.5% | 44,481 | 87.9%
2013 | 59,400 | 45,700 | 76.9% | 39,675 | 86.8%
2014 | 55,700 | 43,500 | 78.1% | 37,924 | 87.2%
2015 (est.) | 54,000 | 42,500 | 78.7% | 36,975 | 87%
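As a quick check on the chart's arithmetic, the 2015 row reduces to two multiplications; here is a minimal Python sketch (the rounding of admitted students to the nearest hundred is my assumption):

    # Chain the assumed admission and matriculation rates to reproduce the
    # 2015 estimates in the chart above.
    applicants = 54_000
    admit_rate = 0.787   # assumed share of applicants admitted
    yield_rate = 0.87    # three-year average share of admitted who matriculate

    admitted = round(applicants * admit_rate, -2)  # 42,500
    matriculants = round(admitted * yield_rate)    # 36,975
    print(int(admitted), matriculants)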

Why These Estimates for Admitted Students and Matriculants Might be Too High

        a.      Significant Decline in Applicants with LSATs of 165+

        Because of changes in the nature of the applicant pool in 2015, however, the estimates of the number of admitted students and number of matriculants in the chart above may be too high.  In 2014, almost all of the decrease in applicants came among those with LSATs of <165.  The pool of applicants with LSATs of 165+ in 2014 was only slightly smaller than in 2013 (7,477 compared with 7,496). Indeed, as a percentage of the applicant pool, those with LSATs of 165+ increased from 12.6% in 2013 to 13.4% in 2014.  This resulted in a slight increase in the number of matriculants with LSATs of 165+ in 2014 compared to 2013 (6,189 compared with 6,154).

        In the current cycle, however, the number of applicants with LSATs of 165+ was only 6,320 as of March 6, 2015. In 2013, there were 7,228 on March 8, 2013 (of a final total of 7,496).  In 2014, there were 7,150 on March 7 (of a final total of 7,477).  Thus, the average increase in applicants with LSATs of 165+ between early March and the end of the cycle is only about 4%.  That would suggest that we could anticipate having roughly 6,585 applicants with LSATs of 165+ at the end of the cycle – down nearly 900 from 2014 – over 12%.

Estimate of Number of Total Applicants for 2015 with LSATs of 165+ Based on Trends in 2013 and 2014

 

Early March Date | Applicants at 165+ | End of Cycle | Applicants at 165+ | # Increase to End of Cycle | % Increase to End of Cycle
March 8, 2013 | 7,228 | 2013 | 7,496 | 268 | 3.7%
March 7, 2014 | 7,150 | 2014 | 7,477 | 327 | 4.6%
March 6, 2015 | 6,320 | 2015 (est.) | 6,585 | 265 | 4.2%

        On a longer term basis, if the estimates in the preceding paragraphs are accurate, the entering class in fall of 2015 will again extend the slide in the number and percentage of first-year students with LSATs of 165+ that has been underway since the class that entered in fall of 2010.

Five-Year Trend in Applicants and Matriculants with LSATs of 165+  and Estimates for 2015

 

Year | Applicants with LSATs of 165+ | Matriculants with LSATs of 165+ | Percent of Applicants Matriculating
2010 | 12,177 | 9,477 | 77.8%
2011 | 11,190 | 8,952 | 80%
2012 | 9,196 | 7,571 | 82.3%
2013 | 7,496 | 6,154 | 82.1%
2014 | 7,477 | 6,189 | 82.8%
2015 (est.) | 6,585 | 5,420 | 82.4%

        Given that on average over the last three years roughly 82.4% of applicants with LSATs of 165+ actually matriculated, one could expect that the 6,585 applicants would translate into 5,420 matriculants with LSATs of 165+ for fall 2015, a decline of nearly 770 from 2014.  Notably, this would represent a 45.9% drop in applicants with LSATs of 165+ since 2010 and a 42.8% drop in matriculants with LSATs of 165+ since 2010.
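The two-step estimate for the 165+ band can be sketched in a few lines of Python; the same arithmetic, with its own rates, applies to the <150 band discussed below.

    # Grow the early-March 165+ count by the average late-cycle increase,
    # then apply the band's average matriculation rate.
    march_count = 6_320
    late_growth = 0.042   # roughly the average of 3.7% (2013) and 4.6% (2014)
    matric_rate = 0.824   # three-year average for applicants with LSATs of 165+

    end_of_cycle = march_count * (1 + late_growth)  # ~6,585
    matriculants = end_of_cycle * matric_rate       # ~5,426, rounded to 5,420 above
    print(round(end_of_cycle), round(matriculants))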

        b. Modest Decrease Among Applicants with LSATs <150

        On the other end of the LSAT distribution, it is a completely different story. Although the number of applicants with LSATs <150 also has declined, the decline has been more modest than among those with LSATs of 165+.  Moreover, those with LSATs of <150 are much more likely to apply late in the cycle.  In the last two years there has been significant growth among applicants with LSATs of <150 between early March and the end of the cycle.   As a result, I would estimate that we would have 18,350 applicants with LSATs of <150 by the end of this cycle, a decline of only about 4.5%.

Estimate of Number of Total Applicants for 2015 with LSATs of <150 Based on Trends in 2013 and 2014

 

Early March Date | Applicants with LSATs of <150 | End of Cycle | Applicants with LSATs of <150 | # Increase | % Increase
March 8, 2013 | 13,364 | 2013 | 20,706 | 6,642 | 49.7%
March 7, 2014 | 11,662 | 2014 | 19,239 | 7,577 | 65%
March 6, 2015 | 11,467 | 2015 (est.) | 18,350 | 6,880 | 60%

        With applicants with LSATs <150 making up a larger percentage of the declining applicant pool, the number of matriculants with LSATs of <150 actually had grown each year up until 2014, when the slight increase in matriculants with LSATs of 165+ was mirrored by a slight decrease in matriculants with LSATs <150. 

Five-Year Trend in Applicants and Matriculants with LSATs of <150 and Estimates for 2015

 

Year | Applicants with LSATs of <150 | Matriculants with LSATs of <150 | Percent of Applicants Matriculating
2010 | 26,548 | 7,013 | 26.4%
2011 | 24,192 | 7,101 | 29.4%
2012 | 22,089 | 7,906 | 35.8%
2013 | 20,706 | 8,482 | 41%
2014 | 19,239 | 8,361 | 43.5%
2015 (est.) | 18,350 | 8,700 | 47.4%

        Given that the percentage of applicants with LSATs <150 matriculating has increased in each of the last five years, it seems reasonable to expect another increase -- to 47.4% -- resulting in roughly 8,700 matriculants with LSATs of <150, particularly given the decrease in the number of applicants with LSATs of 165+.  Even so, this seems unlikely to make up for the drop of nearly 770 matriculants among those with LSATs of 165+.  Notably, while the pool of applicants with LSATs <150 has decreased by 30.9% since 2010, the number of matriculants has increased by 24.2%.

        Thus, while the smaller decline in applicants that is expected this year might suggest a correspondingly smaller decline in matriculants, with the weaker profile of the applicant pool in 2015 compared to 2014, it is quite possible that the total number of admitted students will be lower than the chart above suggests and that the corresponding number of matriculants also will be lower than the chart above suggests.

        Phrased differently, if there really is going to be a decline of roughly 770 matriculants just in the group with LSATs of 165+, then the total decline in matriculants may well be greater than the 950 estimated in the chart above.  Between 2013 and 2014, a decline in applicants of 3,700, almost all with LSATs of 164 and below, resulted in a decline in matriculants of 1,750, all with LSATs of 164 and below.  If the decline in applicants is 1,700 this cycle -- with over half of the decline among those with LSATs of 165+, a decline of perhaps several hundred among those with LSATs between 150 and 164, and a modest decrease (or possibly a slight increase) among those with LSATs <150 -- we may well see a decline in admitted students and in matriculants slightly larger than estimated in the chart above.

3. PROFILE CHALLENGES AMONG ELITE SCHOOLS

        One interesting side note is that the significant decrease in the number of applicants with LSATs of 165+ is likely to put significant pressure on a number of top-50 law schools as they try to hold their enrollment and their LSAT profiles.  Simply put, there are not enough applicants with LSATs of 165+ to allow all the law schools in the top-50 or so to maintain their profiles and their enrollment. 

        If the estimates above are correct -- that there will be roughly 5,420 matriculants with LSATs of 165+ -- and if we assume that at least a few hundred of these matriculants will attend law schools ranked 50 or below due to geography or scholarships or both -- and if we assume that the top 15 law schools are likely to leverage rankings prestige (and perhaps scholarships) to hold enrollment and profile -- then the decrease of roughly 770 matriculants with LSATs of 165+ is going to be felt mostly among the law schools ranked 16-50 or so.

        In 2014, the top 15 law schools probably had roughly 3,800 first-year matriculants with LSATs of 165+.  The schools ranked 16-50 likely had another 1,900 or so.  The remaining 500 or so matriculants with LSATs of 165 and above likely were scattered among law schools lower in the rankings.  Let's assume the top 15 law schools manage to keep roughly 3,700 of the 3,800 they had in 2014, and that law schools ranked 50 and below keep roughly 500.  That means the law schools ranked between 16 and 50 have to get by with roughly 1,220 matriculants with LSATs of 165+ rather than the 1,900 they had last year.  While many schools will be dealing with the challenge of maintaining enrollment (and revenue) while trying to hold profile, this likely will be a particularly difficult year for law schools ranked between 16 and 50 as they try to balance concerns about enrollment (and revenue) against concerns about profile.  To the extent that those schools look toward applicants with lower LSAT scores to maintain enrollment, that will have a ripple effect through the law schools lower in the rankings.
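
        The tier arithmetic can be laid out explicitly.  In the sketch below, the 2014 split and the "hold" figures are the rough assumptions from the paragraph above, not reported data.

    # Rough allocation of 165+ matriculants across ranking tiers
    est_2015_total = 5420                                     # estimated 165+ matriculants, 2015
    tiers_2014 = {"top 15": 3800, "16-50": 1900, "51+": 500}  # rough 2014 split

    held_top15 = 3700    # assumption: top 15 schools nearly hold their 2014 numbers
    held_51plus = 500    # assumption: schools ranked 51+ keep their share

    left_for_16_50 = est_2015_total - held_top15 - held_51plus
    print(left_for_16_50)                          # 1,220 -- versus ~1,900 in 2014
    print(tiers_2014["16-50"] - left_for_16_50)    # a shortfall of roughly 680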

April 13, 2015 in Data on legal education, Scholarship on legal education

Tuesday, January 6, 2015

The Variable Affordability of Law School – How Geography and LSAT Profile Impact Tuition Costs

I have posted to SSRN the PowerPoint slides I presented yesterday at the AALS Conference session sponsored by the Section on Law School Administration and Finance.  The presentation was entitled The Variable Affordability of Law School – How Geography and LSAT Impact Tuition Cost.   (I am very grateful to my research assistant, Kate Jirik, and her husband, Sam, for awesome work on the spreadsheet that supported the data I presented.)

The presentation begins with two slides summarizing data presented in my article Reflections on the Decreasing Affordability of Legal Education, showing the extent to which average public school and private school tuition increased between 1985 and 2011 relative to law school graduate income.  While many have observed that law school has become increasingly expensive over the last few decades, this “macro” discussion fails to highlight the extent to which differences in tuition exist at a “micro” level based either on geography or on LSAT score.

Using 2012 tuition data, the first set of slides focuses on geographic differences -- noting some states where legal education generally is very expensive, some states where legal education generally is very affordable, and the balance of states in which tuition costs are in the middle or include a mix of affordable and expensive options. 

Following those slides, there is a set of slides describing the process I used to calculate net tuition costs, after accounting for scholarships, for all entering first-year students at the 195 fully accredited and ranked law schools in fall 2012.  The goal was to allocate all students into a five-by-five grid with five LSAT categories (165+, 160-164, 155-159, 150-154 and <150) and five cost categories ($0-$10,000, $10,000-$20,000, $20,000-$30,000, $30,000-$40,000, and $40,000+).  A further set of slides then summarizes this data and tries to explain what we can learn from how students are allocated across the five-by-five grid, including slides showing the average rank of the schools at which students in each LSAT/cost cell are enrolled.
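
For those curious about the mechanics, the binning itself is straightforward.  The sketch below (Python/pandas) uses entirely hypothetical student records -- the actual analysis worked from school-level Standard 509 data in a spreadsheet -- simply to illustrate how a five-by-five grid of this kind can be built.

    import pandas as pd

    # Hypothetical student-level records (for illustration only)
    students = pd.DataFrame({
        "lsat":     [168, 162, 157, 151, 148, 166, 153],
        "net_cost": [12000, 41000, 27000, 8000, 33000, 45000, 19000],
    })

    lsat_bins = [0, 150, 155, 160, 165, 181]
    lsat_labels = ["<150", "150-154", "155-159", "160-164", "165+"]
    cost_bins = [0, 10000, 20000, 30000, 40000, float("inf")]
    cost_labels = ["$0-10K", "$10-20K", "$20-30K", "$30-40K", "$40K+"]

    students["lsat_cat"] = pd.cut(students["lsat"], lsat_bins,
                                  labels=lsat_labels, right=False)
    students["cost_cat"] = pd.cut(students["net_cost"], cost_bins,
                                  labels=cost_labels, right=False)

    # Five-by-five grid: count of students in each LSAT/cost cell
    print(pd.crosstab(students["lsat_cat"], students["cost_cat"]))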

The concluding slide sets forth a couple of short observations about the data. There was a robust discussion with some great questions following the presentation of this data.

Here are four of the slides to give you a flavor for the presentation on net cost generally and then net cost relative to LSAT categories:

[Four slide images omitted]

January 6, 2015 in Data on legal education, Scholarship on legal education

Monday, December 29, 2014

The Composition of Graduating Classes of Law Students -- 2013-2016 -- Part One

PART ONE -- Analyzing the LSAT Profile/Composition of Entering First-Years from 2010 to 2013 and 2014

In the fall of 2013, I had a series of blog postings about the changing demographics of law students.  In the first, I noted that fewer students were coming to law school from elite colleges and universities.  In the second, I noted that between 2010 and 2013 there had been a decline in the number of matriculants with high LSATs and an increase in the number of matriculants with low LSATs, such that the “composition” of the class that entered law school in the fall of 2013 was demonstrably less robust (in terms of LSAT profile) than the “composition” of the class that entered law school in the fall of 2010.  In describing this phenomenon, I noted that when the entering class of fall 2013 graduates in 2016, it might encounter greater problems with bar passage than previous classes. 

In light of the significant decline in the median MBE scaled score in July, which Derek Muller has discussed here and here, and which I have discussed here, and a significant decline in first-time bar passage rates in many jurisdictions this year, it seems like an appropriate time to look more closely at changing class profiles and the likely impact on bar passage in the next few years.

This is the first of two blog posts regarding the changing composition of entering classes and the changing composition of graduating classes.  In Part I, I analyze the distribution of LSAT scores across categories based on the LSAC’s National Decision Profiles for the years 2009-2010 through 2012-2013, and then analyze the distribution of law school median LSATs and the 25th percentile LSATs across ranges of LSAT scores.  In Part II, I will analyze how attrition trends have changed since 2010 to assess what that might tell us about the composition of graduating classes three years after entering law school as a way of thinking about the likely impact on bar passage over time.

Tracking Changes Based on National Decision Profiles – 2010-2013

The following discussion summarizes data in the LSAC’s National Decision Profiles from the 2009-10 admission cycle (fall 2010) through the 2012-13 admission cycle (fall 2013).  The National Decision Profile for the 2013-14 admission cycle (fall 2014) has not yet been released.

Let’s start with the big picture.  If you take the matriculants each year and break them into three LSAT categories – 160+, 150-159, and <150 – the following chart and graph show the changes in percentages of matriculants in each of these categories over the last four years. 

Percentage of Matriculants in LSAT Categories – 2010-2013

                        2010    2011    2012    2013

160+                40.8     39        36.3     33.4

150-159           45        45.3     44.3     44.1

<150                14.2     15.7     19.3     22.5

[Graph: Percentage of Matriculants in LSAT Categories, 2010-2013]
Notably, this chart and graph show almost no change in the “middle” category (150-159 -- purple), with most of the change at the top (160+ -- orange -- decreasing from 40.8% to 33.4%) and at the bottom (<150 -- blue -- increasing from 14.2% to 22.5%).  They also show only a modest change between 2010 and 2011, with more significant changes in 2012 and again in 2013, when the percentage of students with LSATs of 160+ declined more substantially and the percentage of students with LSATs of <150 grew more substantially.

While I think this tells the story pretty clearly, for those interested in more detail, the following charts provide a more granular analysis.

Changes in LSAT Distributions of Matriculants – 2010-2013       

                    2010      2011      2012      2013      Chg in Number   % Chg in Number

170+                3,635     3,330     2,788     2,072     (1,563)         -43.0%
165-169             5,842     5,622     4,783     4,082     (1,760)         -30.1%
160-164             10,666    8,678     7,281     6,442     (4,224)         -39.6%
155-159             11,570    10,657    9,700     8,459     (3,111)         -26.9%
150-154             10,626    9,885     8,444     8,163     (2,463)         -23.2%
145-149             5,131     5,196     5,334     5,541     410             +8.0%
<145                1,869     1,888     2,564     2,930     1,061           +56.8%

Total               49,339    45,256    40,894    37,689

Note that in terms of percentage change in the number of matriculants in each LSAT category, the five highest LSAT categories are all down at least 20%, with 160-164 down nearly 40% and 170+ down over 40%, while the two lowest LSAT categories are up, with <145 being up over 50%.

 

[Line graph: Matriculants in Combined LSAT Categories (165+, 160-164, 155-159, 150-154, <150), 2010-2013]
Note that in the line graph above, the top two categories have been combined into 165+ while the bottom two categories have been combined into <150.  Perhaps most significantly, in 2010 the <150 group, with 7,000 students, was over 2,400 students smaller than the next smallest category (165+, with 9,477) and more than 4,500 students smaller than the largest category (155-159, with 11,570).  By 2013, however, the <150 category had become the largest category, with 8,471, just surpassing the 155-159 category, with 8,459, and was 2,300 larger than the smallest category, 165+, with only 6,154.
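
The combined categories in the graph are simple sums; here is a short sketch using the figures from the matriculant table above.

    # Collapse the seven LSAT bands into the five used in the line graph
    m2010 = {"170+": 3635, "165-169": 5842, "160-164": 10666,
             "155-159": 11570, "150-154": 10626, "145-149": 5131, "<145": 1869}
    m2013 = {"170+": 2072, "165-169": 4082, "160-164": 6442,
             "155-159": 8459, "150-154": 8163, "145-149": 5541, "<145": 2930}

    for year, m in (("2010", m2010), ("2013", m2013)):
        print(year, {"165+": m["170+"] + m["165-169"],
                     "160-164": m["160-164"],
                     "155-159": m["155-159"],
                     "150-154": m["150-154"],
                     "<150": m["145-149"] + m["<145"]})
    # 2010: 165+ = 9,477, <150 = 7,000;  2013: 165+ = 6,154, <150 = 8,471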

Changes in Percentage of Matriculants in LSAT Ranges – 2010-2013

                    PERCENTAGE OF MATRICULANTS (%)

                    2010    2011    2012    2013    % Chg in %

170+                7.4     7.4     6.8     5.5     -25.7%
165-169             11.8    12.4    11.7    10.8    -8.5%
160-164             21.6    19.2    17.8    17.1    -20.8%
155-159             23.5    23.5    23.7    22.4    -4.7%
150-154             21.5    21.8    20.6    21.7    +0.9%
145-149             10.4    11.5    13.0    14.7    +41.3%
<145                3.8     4.2     6.3     7.8     +105.3%

In terms of the “composition” of the class -- that is, the percentage of matriculants in each LSAT category -- little has changed in the “middle” (155-159 and 150-154), but significant changes have occurred at the top and the bottom, with declines of more than 20% at 160-164 and 170+ and with increases of more than 40% at 145-149 and of more than 100% at <145.
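
To be precise, the “% Chg in %” column is the percent change in each band's share of the class, not in its headcount.  A sketch of that computation follows; small differences from the table reflect its rounding of the shares.

    # Share of matriculants per LSAT band, and percent change in that share
    totals = {2010: 49339, 2013: 37689}
    counts = {"170+": (3635, 2072), "165-169": (5842, 4082),
              "160-164": (10666, 6442), "155-159": (11570, 8459),
              "150-154": (10626, 8163), "145-149": (5131, 5541),
              "<145": (1869, 2930)}

    for band, (n2010, n2013) in counts.items():
        share10 = n2010 / totals[2010]
        share13 = n2013 / totals[2013]
        print(f"{band:8} {share10:.3f} {share13:.3f} "
              f"{100 * (share13 - share10) / share10:+.1f}%")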

Tracking Changes in Law School Median LSATs by LSAT Category

A different way of looking at this involves LSAT profiles among law schools over this period.  Based on the data law schools reported in their Standard 509 Reports, the chart below lists, for each year from 2010 to 2014, the number of law schools with median LSATs within certain ranges.  (This chart excludes law schools in Puerto Rico and provisionally approved law schools.)

Number of Law Schools with LSAT Medians in LSAT Categories – 2010-2014

 

                    2010    2011    2012    2013    2014

165+                30      31      26      23      21
160-164             47      41      39      31      29
155-159             59      57      56      53      51
150-154             50      52      53      56      59
145-149             9       14      22      28      29
<145                0       1       0       5       7

 

[Graph: Number of Law Schools by Median LSAT Category, 2010-2014]

The chart above pretty clearly demonstrates the changes that have taken place since 2010, with declines in the number of law schools with median LSATs in higher LSAT categories and increases in the number of law schools with median LSATs in the lower LSAT categories.  The number of law schools with median LSATs of 160 or higher has declined from 77 to 50.  By contrast, the number of law schools with median LSATs of <150 has quadrupled, from 9 to 36.   Moreover, the “mode” in 2010 was in the 155-159 category, with nearly 60 law schools, but as of 2014, the “mode” had shifted to the 150-154 category with nearly 60 law schools.
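
Mechanically, counts like these come from bucketing each school's reported median into the LSAT bands and tallying.  Here is a sketch with a made-up list of medians; the actual counts above come from the Standard 509 data.

    import pandas as pd

    # Hypothetical school medians (for illustration only)
    medians_2014 = pd.Series([170, 163, 158, 152, 147, 144, 151, 156])

    bins = [0, 145, 150, 155, 160, 165, 181]
    labels = ["<145", "145-149", "150-154", "155-159", "160-164", "165+"]
    counts = (pd.cut(medians_2014, bins, labels=labels, right=False)
                .value_counts()
                .sort_index())
    print(counts)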

Number of Law Schools with 25th Percentile LSAT in LSAT Categories – 2010-2014

 

                    2010    2011    2012    2013    2014

165+                17      16      11      10      10
160-164             26      20      21      17      15
155-159             55      54      49      42      41
150-154             67      69      59      65      57
145-149             26      33      46      48      48
<145                4       4       10      14      25

 

[Graph: Number of Law Schools by 25th Percentile LSAT Category, 2010-2014]

For those who want to focus on law schools' 25th percentile LSAT profiles, the chart above shows a trend similar to the medians, except that the number of law schools with a 25th percentile LSAT between 150 and 154 also declined (as opposed to the increase with respect to medians).  The number of law schools with 25th percentile LSATs of 160 or higher has declined from 43 to 25.  Similarly, the number of law schools with 25th percentile LSATs of 150-159 has declined from 122 to 98.  By contrast, the number of law schools with 25th percentile LSATs of 145-149 has nearly doubled, from 26 to 48, while the number with 25th percentile LSATs of <145 has increased more than sixfold, from 4 to 25. 

One other way of looking at this is just to see how the average first-year LSAT profiles have changed over the last four years. 

Average LSATs of Matriculants at Fully-Accredited ABA Law Schools

            75th Percentile    Median    25th Percentile

2010        160.5              158.1     155.2
2011        160.1              157.8     154.5
2012        159.6              157.0     153.6
2013        158.7              156.0     152.6
2014        158.2              155.4     151.8

This shows that between 2010 and 2014, the average 75th percentile LSAT declined by 2.3 points, the average median LSAT declined by 2.7 points, and the average 25th percentile LSAT declined by 3.4 points.

Conclusion

If one focuses on the LSAT score as one measure of “quality” of the entering class of law students each year, then the period from 2010-2014 not only has seen a significant decline in enrollment, it also has seen a significant decline in quality.  On an axis with high LSATs to the left and low LSATs to the right, the “composition” of the entering class of law students between 2010 and 2014 has shifted markedly to the right, as shown in the graph below.  Moreover, the shape of the curve has changed somewhat, thinning among high LSAT ranges and growing among low LSAT ranges.  

[Graph: LSAT Distribution of Matriculants, 2010 vs. 2014]

This shift in entering class composition suggests that bar passage rates are likely to continue to decline in the coming years.  But in terms of bar passage, the entering class profile is less meaningful than the graduating class profile.  In part two, I will look at attrition data from 2011 to 2014 to try to quantify the likely “composition” of the classes graduating from 2013 to 2016 (those that entered from 2010 to 2013), which will give us a more refined idea of what to expect in terms of trends in bar passage in 2015 and 2016.

(I am grateful to Bernie Burk and Alice Noble-Allgire for helpful comments on earlier drafts.)

December 29, 2014 in Data on legal education, Structural change