Saturday, December 20, 2014

Further Understanding the Transfer Market -- A Look at the 2014 Transfer Data

This post updates my recent posting on transfers to incorporate some of the newly available data on the Summer 2014 transfer market.  Derek Muller also has written about some of the transfer data, and I anticipate others will be doing so as well.

NUMBERS AND PERCENTAGES OF TRANSFERS – 2006-2008, 2011-2014

While the number of transfers dropped to 2,187 in 2014 from 2,501 in 2013, the percentage of the previous fall’s entering class that engaged in the transfer market held roughly steady at 5.5%, down slightly from 5.6% in 2013 but still above the percentages that prevailed from 2006-2008 and in 2011 and 2012.

 

| | 2006 | 2007 | 2008 | 2011 | 2012 | 2013 | 2014 |
|---|---|---|---|---|---|---|---|
| Number of Transfers | 2,265 | 2,324 | 2,400 | 2,427 | 2,438 | 2,501 | 2,187 |
| Previous Year First-Year Enrollment | 48,100 | 48,900 | 49,100 | 52,500 | 48,700 | 44,500 | 39,700 |
| % of Previous First-Year Total | 4.7% | 4.8% | 4.9% | 4.6% | 5.0% | 5.6% | 5.5% |
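The bottom row of the table is simply the ratio of the first two rows. As a quick check, here is a Python sketch (figures transcribed from the table above):

```python
# Transfer volume and prior-year 1L enrollment, transcribed from the table above.
transfers = {2006: 2265, 2007: 2324, 2008: 2400, 2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187}
first_years = {2006: 48100, 2007: 48900, 2008: 49100, 2011: 52500, 2012: 48700, 2013: 44500, 2014: 39700}

# Transfers as a share of the previous fall's entering class, to one decimal.
pct = {yr: round(100 * transfers[yr] / first_years[yr], 1) for yr in transfers}
print(pct[2013], pct[2014])  # 5.6 5.5
```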

 

SOME SCHOOLS DOMINATE THE TRANSFER MARKET – 2012-2014

The following two charts list the top 20 transfer schools in Summer 2012 (fall 2011 entering class), Summer 2013 (fall 2012 entering class) and Summer 2014 (fall 2013 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first year class.

Largest Law Schools by Number of Transfers from 2012-2014

| School | Number in 2012 | School | Number in 2013 | School | Number in 2014 |
|---|---|---|---|---|---|
| **Florida State** | 89 | **Georgetown** | 122 | **Georgetown** | 113 |
| **Georgetown** | 85 | **George Wash.** | 93 | **George Wash.** | 97 |
| **George Wash.** | 63 | **Florida St.** | 90 | **Arizona St.** | 66 |
| **Columbia** | 58 | **Emory** | 75 | Idaho | 57 |
| Mich. State | 54 | **Arizona State** | 73 | **Cal. Berkeley** | 55 |
| **NYU** | 53 | **American** | 68 | **NYU** | 53 |
| **American** | 49 | Texas | 59 | **Emory** | 50 |
| Cardozo | 48 | **Columbia** | 52 | **Columbia** | 46 |
| Loyola Marymount | 46 | **NYU** | 47 | **American** | 44 |
| Rutgers - Camden | 42 | **Minnesota** | 45 | **UCLA** | 44 |
| **Minnesota** | 42 | Arizona | 44 | Wash. Univ. | 44 |
| **Arizona State** | 42 | **Northwestern** | 44 | Texas | 43 |
| **Cal. Berkeley** | 41 | **UCLA** | 41 | **Minnesota** | 37 |
| **Emory** | 41 | Cardozo | 38 | **Northwestern** | 35 |
| **UCLA** | 39 | Southern Cal. | 37 | Harvard | 33 |
| **Northwestern** | 38 | Utah | 34 | Mich. State | 33 |
| Florida | 37 | Harvard | 34 | Loyola Marymount | 32 |
| Maryland | 34 | Florida | 33 | **Florida State** | 31 |
| Michigan | 33 | **Cal. Berkeley** | 32 | Southern Cal. | 30 |
| SMU | 31 | Wash. Univ. | 31 | Miami | 29 |
| Harvard | 31 | | | | |

Largest Law Schools by Transfers as Percentage of Previous First-Year Class

2012-2014 

| School | % 2012 | School | % 2013 | School | % 2014 |
|---|---|---|---|---|---|
| **Florida St.** | 44.5 | **Florida State** | 48.1 | **Arizona State** | 51.6 |
| **Arizona State** | 24.6 | **Arizona State** | 48.0 | Idaho | 51.4 |
| Michigan State | 17.5 | **Utah** | 34.7 | Washington Univ. | 23.3 |
| **Utah** | 17.5 | **Emory** | 29.6 | **Emory** | 22.9 |
| **Minnesota** | 17.1 | Arizona | 28.9 | **Georgetown** | 20.8 |
| **Emory** | 16.5 | **Minnesota** | 22.0 | **George Wash.** | 20.2 |
| Cal. Berkeley | 16.2 | **George Wash.** | 21.8 | Cal. Berkeley | 19.4 |
| **Rutgers - Camden** | 14.9 | **Georgetown** | 21.2 | **Florida St.** | 18.2 |
| **Georgetown** | 14.7 | **Rutgers - Camden** | 20.7 | **Rutgers - Camden** | 17.1 |
| **Southern Cal.** | 14.7 | **Southern Cal.** | 19.7 | **Southern Cal.** | 17.1 |
| **Northwestern** | 14.4 | Texas | 19.1 | **Minnesota** | 16.7 |
| Cincinnati | 14.3 | Cincinnati | 17.5 | **Utah** | 15.9 |
| **Columbia** | 14.3 | **Northwestern** | 17.1 | **Northwestern** | 15.3 |
| Buffalo | 14.2 | Washington Univ. | 15.4 | UCLA | 15.0 |
| Arizona | 14.0 | Univ. Washington | 15.3 | Seton Hall | 14.5 |
| Cardozo | 13.8 | **Columbia** | 14.2 | Florida Int. | 13.9 |
| SMU | 13.4 | American | 13.8 | Texas | 13.5 |
| Florida | 12.7 | SMU | 13.3 | **Columbia** | 13.1 |
| Chicago | 12.6 | UCLA | 13.3 | Richmond | 12.8 |
| **George Wash.** | 12.5 | Chicago | 13.0 | Univ. Washington | 12.6 |
| | | | | Houston | 12.6 |

 

Note that in these two charts, the “repeat players” -- those schools in the top 20 for all three years -- are bolded.  In 2013 and 2014, nine of the top ten schools for number of transfers repeated.  (The notable newcomer this year is Idaho, which received 55 transfers from the Concordia University School of Law after Concordia failed to receive provisional accreditation from the ABA.)  Across all three years, eight of the top ten schools for percentage of transfers repeated.

Top Ten Law Schools as a Percentage of All Transfers

 

| | 2006 | 2011 | 2012 | 2013 | 2014 |
|---|---|---|---|---|---|
| Transfers to 10 Schools with Most Transfers | 482 | 570 | 587 | 724 | 625 |
| Total Transfers | 2,265 | 2,427 | 2,438 | 2,501 | 2,187 |
| Transfers to 10 Schools with Most Transfers as % of All Transfers | 21.3% | 23.5% | 24.1% | 28.9% | 28.6% |

The chart above demonstrates an increasing concentration in the transfer market between 2006 and 2014, and even more so between 2012 and 2014, as the ten law schools with the most incoming transfers captured a growing share of the market.
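The concentration figures can be verified directly from the chart; a short Python sketch (numbers transcribed from the table above):

```python
# Transfers to the ten schools with the most incoming transfers, and
# total transfers, per the chart above.
top10 = {2006: 482, 2011: 570, 2012: 587, 2013: 724, 2014: 625}
total = {2006: 2265, 2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187}

# Top-ten share of the whole transfer market, to one decimal.
share = {yr: round(100 * top10[yr] / total[yr], 1) for yr in top10}
print(share)  # {2006: 21.3, 2011: 23.5, 2012: 24.1, 2013: 28.9, 2014: 28.6}
```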

NATIONAL AND REGIONAL MARKETS BASED ON NEW DATA

Starting this fall, the ABA Section of Legal Education and Admissions to the Bar began requiring schools with more than five incoming transfers to report not only the number of students who transferred in, but also the schools from which they came (indicating the number from each school), along with the 75th, 50th and 25th percentile first-year law school GPAs of the pool of students who transferred in to a given school (provided that at least twelve students transferred in).  This allows us to begin to explore the nature of the transfer market by looking at where students are coming from and where they are going, and by looking at the first-year GPA profile of students transferring in to different law schools.

Percentage of Transfers from Within Geographic Region and Top Feeder School(s)

| USNews Ranking | School | # Transfers | Region | Regional Transfers | Reg. % | Feeder School(s) | # |
|---|---|---|---|---|---|---|---|
| 2 | Harvard | 33 | NE | 6 | 18 | Emory-Wash. Univ. | 3 |
| 4 | Columbia | 46 | NE | 19 | 41 | Brooklyn | 5 |
| 6 | NYU | 50 | NE | 20 | 40 | Cornell | 8 |
| 9 | Berkeley | 55 | CA | 43 | 78 | Hastings | 18 |
| 12 | Northwestern | 35 | MW | 24 | 69 | DePaul-Chicago Kent-Loyola | 5 |
| 13 | Georgetown | 113 | Mid-Atl | 49 | 43 | American | 13 |
| 15 | Texas | 43 | TX | 27 | 63 | Baylor | 5 |
| 16 | UCLA | 44 | CA | 31 | 70 | Loyola Marymount | 8 |
| 18 | Wash. Univ. | 44 | MW | 20 | 45 | SLU | 4 |
| 19 | Emory | 53 | SE | 40 | 75 | Atlanta’s John Marshall | 20 |
| 20 | GWU | 97 | Mid-Atl | 78 | 80 | American | 54 |
| 20 | Minnesota | 37 | MW | 21 | 57 | William Mitchell | 6 |
| 20 | USC | 30 | CA | 22 | 73 | Southwestern | 5 |
| 31 | Arizona St. | 66 | SW | 51 | 77 | Arizona Summit | 44 |
| 45 | Florida St. | 31 | SE | 24 | 77 | Florida Coastal | 9 |
| 61 | Miami | 29 | SE | 21 | 72 | Florida Coastal | 5 |
| 72 | American | 44 | Mid-Atl | 14 | 32 | Baltimore-UDC | 6 |
| 87 | Michigan St. | 33 | MW | 33 | 100 | Thomas Cooley | 31 |
| 87 | Loyola Marymount | 32 | CA | 26 | 81 | Whittier | 15 |

 

For this set of 19 schools with the most transfer students, the vast majority drew most of their transfers from within the geographic region in which the school is located.  Only two schools (Harvard and American) received fewer than 40% of their transfers from within their region, and only four others (Columbia, NYU, Georgetown and Washington University) received fewer than 50%.  Meanwhile, ten of the 19 schools received 70% or more of their transfers from within their region.

Moreover, several schools had a significant percentage of their transfers from one particular feeder school.  For Berkeley, roughly 33% of its transfers came from Hastings; for Emory, nearly 40% of its transfers came from Atlanta’s John Marshall Law School; for George Washington, over 55% of its transfers came from American; for Arizona State, 67% of its transfers came from Arizona Summit; for Michigan State nearly 95% of its transfers came from Thomas Cooley; for Loyola Marymount, nearly 50% of its transfers came from Whittier; and for Idaho, over 95% of its transfers came from Concordia.
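These feeder-school concentrations are just each top feeder's count divided by the school's total incoming transfers; a short sketch (counts transcribed from the chart above, with Idaho's figures taken from the note about Concordia):

```python
# (total transfers in, transfers from the single largest feeder school),
# transcribed from the chart and the accompanying text.
feeders = {
    "Berkeley": (55, 18),           # Hastings
    "Emory": (53, 20),              # Atlanta's John Marshall
    "George Washington": (97, 54),  # American
    "Arizona State": (66, 44),      # Arizona Summit
    "Michigan State": (33, 31),     # Thomas Cooley
    "Loyola Marymount": (32, 15),   # Whittier
    "Idaho": (57, 55),              # Concordia
}

# Top feeder's share of each school's incoming transfers, as a whole percent.
share = {s: round(100 * top / total) for s, (total, top) in feeders.items()}
print(share["Michigan State"], share["Idaho"])  # 94 96
```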

 Percentage of Transfers from Different Tiers of School(s)

Along With First-Year Law School GPA 75th/50th/25th

| USNews Ranking | School | # of Trans. | Top 50 (# / %) | 51-99 (# / %) | 100-146 (# / %) | Unranked (# / %) | GPA 75th | GPA 50th | GPA 25th |
|---|---|---|---|---|---|---|---|---|---|
| 2 | Harvard | 33 | 23 / 70 | 10 / 30 | 0 / 0 | 0 / 0 | 3.95 | 3.90 | 3.83 |
| 4 | Columbia | 46 | 29 / 63 | 14 / 30 | 3 / 7 | 0 / 0 | 3.81 | 3.75 | 3.69 |
| 6 | NYU | 50 | 41 / 82 | 7 / 14 | 2 / 4 | 0 / 0 | 3.74 | 3.62 | 3.47 |
| 9 | Berkeley | 55 | 17 / 31 | 27 / 33 | 6 / 11 | 5 / 9 | 3.90 | 3.75 | 3.68 |
| 12 | Northwestern | 35 | 16 / 46 | 12 / 34 | 6 / 17 | 1 / 3 | 3.73 | 3.56 | 3.40 |
| 13 | Georgetown | 113 | 27 / 24 | 38 / 34 | 17 / 15 | 31 / 27 | 3.77 | 3.67 | 3.55 |
| 15 | Texas | 43 | 17 / 40 | 13 / 3 | 9 / 21 | 4 / 9 | 3.62 | 3.45 | 3.11 |
| 16 | UCLA | 44 | 15 / 34 | 23 / 52 | 2 / 5 | 4 / 9 | 3.73 | 3.58 | 3.44 |
| 18 | Wash. Univ. | 44 | 3 / 7 | 25 / 57 | 1 / 2 | 15 / 34 | 3.43 | 3.20 | 3.06 |
| 19 | Emory | 53 | 3 / 6 | 7 / 13 | 8 / 15 | 35 / 66 | 3.42 | 3.27 | 2.93 |
| 20 | GWU | 97 | 13 / 13 | 73 / 75 | 11 / 11 | 0 / 0 | 3.53 | 3.35 | 3.21 |
| 20 | Minnesota | 37 | 4 / 11 | 12 / 32 | 18 / 49 | 3 / 8 | 3.30 | 3.10 | 2.64 |
| 20 | USC | 30 | 1 / 3 | 11 / 37 | 6 / 20 | 12 / 40 | 3.71 | 3.59 | 3.44 |
| 31 | Arizona St. | 66 | 4 / 6 | 5 / 8 | 8 / 12 | 49 / 74 | 3.51 | 3.23 | 2.97 |
| 45 | Florida St. | 31 | 2 / 6 | 4 / 13 | 3 / 10 | 22 / 71 | 3.29 | 3.10 | 2.90 |
| 61 | Miami | 29 | 1 / 3 | 4 / 14 | 6 / 21 | 18 / 62 | 3.30 | 3.07 | 2.87 |
| 72 | American | 44 | 2 / 5 | 14 / 32 | 3 / 7 | 25 / 57 | 3.25 | 2.94 | 2.78 |
| 87 | Michigan St. | 33 | 0 / 0 | 0 / 0 | 1 / 3 | 32 / 97 | 3.19 | 3.05 | 2.83 |
| 87 | Loyola Mary. | 32 | 0 / 0 | 0 / 0 | 1 / 3 | 31 / 97 | 3 | 3 | 3 |

 

The chart above shows the tiers of law schools from which the largest schools in the transfer market received their transfer students.  Thirteen of the top 19 schools for transfers are ranked in the top 20 in USNews, but of those 13, only six had 80% or more of their transfers from schools ranked between 1 and 99 in the USNews rankings -- Harvard, Columbia, NYU, Northwestern, UCLA and George Washington.  Three additional schools had at least 50% of their transfers from schools ranked between 1 and 99 -- Berkeley, Georgetown and Washington University.  The other ten schools had at least half of their transfer students from schools ranked 100 or lower, with some having a significant percentage of their transfers from unranked schools (which USNews lists alphabetically).  This data largely confirms the analysis of Bill Henderson and Jeff Rensberger regarding the rankings migration of transfers -- from lower-ranked schools to higher-ranked schools.
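The tier percentages follow from dividing each tier count by the school's total incoming transfers; a small sketch using a few rows transcribed from the chart (Python here is purely illustrative):

```python
# Transfers by source-school tier for a few schools in the chart above:
# counts for (top 50, 51-99, 100-146, unranked).
tiers = {
    "Harvard": (23, 10, 0, 0),
    "NYU": (41, 7, 2, 0),
    "Georgetown": (27, 38, 17, 31),
    "Michigan St.": (0, 0, 1, 32),
}

# Share of each school's transfers coming from schools ranked 1-99.
ranked_1_99 = {s: round(100 * (c[0] + c[1]) / sum(c)) for s, c in tiers.items()}
print(ranked_1_99["Harvard"], ranked_1_99["Michigan St."])  # 100 0
```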

In addition, as you move down the rankings of transfer schools, the general trend in first-year law school GPA shows a significant decline, with several highly-ranked schools taking a number of transfers with first-year GPAs below a 3.0, including Emory, Minnesota, Arizona State, and Florida State.

STILL MANY UNKNOWNS

This new data should be very helpful to prospective law students and to current law students who are considering transferring.  This data gives them at least a little better idea of what transfer opportunities might be available to them depending upon where they go to law school as a first-year student.

Even with this more granular data now available, however, as I noted in my earlier posting on transfer students, there still are a significant number of unknowns relating to transfer students.  These unknowns cover several different points.  

First, what is the acceptance rate for transfers?  We now know how many transfers came from different schools, and we have some idea of the first-year GPA ranges of those admitted as transfers, but we do not know the acceptance rate on transfer applications.  Are a significant percentage of transfer applicants not admitted, or are most students interested in transferring finding a new home someplace?

Second, what are the motivations of transfers, and what are their demographics?  Are transfers primarily motivated by better employment opportunities perceived to be available at the higher-ranked law school?  Is some subset of transfers primarily motivated by issues regarding family or geography (with rankings and employment outcomes as secondary concerns)?

Third, how do the employment outcomes of transfer students compare with the employment outcomes of students who started at a given law school?  Does the data support the perception that those who transfer, in fact, have better employment outcomes by virtue of transferring?

Fourth, what are the social/educational experiences of transfers in their new schools and what is the learning community impact on those schools losing a significant number of students to the transfer market?

For those interested in these issues, it might make sense to design some longitudinal research projects that could help find answers to some of these questions.

December 20, 2014 in Current events, Data on legal education | Permalink | Comments (0)

Wednesday, December 10, 2014

BETTER UNDERSTANDING THE TRANSFER MARKET

What do we know about the transfer student market in legal education? 

Not enough.  But that will begin to change in the coming weeks.

NUMBER/PERCENTAGE OF TRANSFER STUDENTS HAS INCREASED MODESTLY

Up until this year, the ABA Section of Legal Education and Admissions to the Bar only asked law schools to report the number of transfer students “in” and the number of transfer students “out.”  This allowed us to understand roughly how many students are transferring and gave us some idea of where they are going, and where they are coming from, but not with any direct “matching” of exit and entrance.

Has the number and percentage of transfer students changed in recent years?

In 2010, Jeff Rensberger published an article in the Journal of Legal Education in which he analyzed much of the then-available data regarding the transfer market and evaluated some of the issues associated with transfer students.  He noted that from 2006 to 2009 the number of transfer students had remained within a range that represented roughly 5% of the rising second-year class (after accounting for other attrition): 2,265 in summer 2006, 2,324 in summer 2007, 2,400 in summer 2008, and 2,333 in summer 2009.

Using data published in the law schools’ Standard 509 reports, the number of transfers in 2011, 2012 and 2013 increased only marginally, from 2,427 to 2,438 to 2,501, but, given the declining number of law students, it increased as a percentage of the preceding year’s first-year class, from 4.6% to 5.6%.  Thus, there is a sense in which the transfer market is growing, even if not growing dramatically.

Numbers of Transfer Students 2006-2008 and 2011-2013

 

| | 2006 | 2007 | 2008 | 2011 | 2012 | 2013 |
|---|---|---|---|---|---|---|
| Number of Transfers | 2,265 | 2,324 | 2,400 | 2,427 | 2,438 | 2,501 |
| Previous Year First-Year Enrollment | 48,100 | 48,900 | 49,100 | 52,500 | 48,700 | 44,500 |
| % of Previous First-Year Total | 4.7% | 4.8% | 4.9% | 4.6% | 5.0% | 5.6% |

 

SOME SCHOOLS DOMINATE THE TRANSFER MARKET

In 2008, Bill Henderson and Brian Leiter highlighted issues associated with transfer students.   Henderson and Leiter were discussing the data from the summer of 2006.  Brian Leiter posted a list of the top ten law schools for net transfer students as a percentage of the first year class.  Bill Henderson noted the distribution of transfer students across tiers of law schools (with the law schools in the top two tiers generally having positive net transfers and the law schools in the bottom two tiers generally having negative net transfers), something Jeff Rensberger also noted in his 2010 article.   

Things haven’t changed too much since 2006.  In 2012, there were 118 law schools with fewer than 10 “transfers in” representing a total of 485 transfers – slightly less than 20% of all transfers.  On the other end, there were 21 schools with 30 or more “transfers in” totaling 996 transfers -- nearly 41% of all transfers. Thus, roughly 10% of the law schools occupied 40% of the market (increasing to nearly 44% of the market in 2013).

We also know who the leading transfer schools have been over the last three years.  The following two charts list the top 20 transfer schools in Summer 2011 (fall 2010 entering class), Summer 2012 (fall 2011 entering class) and Summer 2013 (fall 2012 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first year class.

Largest Law Schools by Number of Transfers, 2011-2013

(BOLD indicates presence on list all three years)

 

| School | Number in 2011 | School | Number in 2012 | School | Number in 2013 |
|---|---|---|---|---|---|
| **George Wash.** | 104 | **Florida State** | 89 | **Georgetown** | 122 |
| **Georgetown** | 71 | **Georgetown** | 85 | **George Wash.** | 93 |
| **Florida St.** | 57 | **George Wash.** | 63 | **Florida St.** | 90 |
| **New York Univ.** | 56 | **Columbia** | 58 | Emory | 75 |
| **American** | 53 | Michigan State | 54 | **Arizona State** | 73 |
| Michigan State | 52 | **New York Univ.** | 53 | **American** | 68 |
| **Columbia** | 46 | **American** | 49 | Texas | 59 |
| **Cardozo** | 45 | **Cardozo** | 48 | **Columbia** | 52 |
| Loyola Marymount | 44 | Loyola Marymount | 46 | **New York Univ.** | 47 |
| Washington Univ. | 42 | Rutgers - Camden | 42 | **Minnesota** | 45 |
| **Cal. Los Angeles** | 40 | **Minnesota** | 42 | Arizona | 44 |
| Michigan | 39 | **Arizona State** | 42 | **Northwestern** | 44 |
| **Northwestern** | 39 | Cal. Berkeley | 41 | **Cal. Los Angeles** | 41 |
| Rutgers - Camden | 36 | Emory | 41 | **Cardozo** | 38 |
| San Diego | 35 | **Cal. Los Angeles** | 39 | Southern Cal. | 37 |
| **Arizona State** | 34 | **Northwestern** | 38 | Utah | 34 |
| Brooklyn | 33 | Florida | 37 | Harvard | 34 |
| Cal. Hastings | 32 | Maryland | 34 | Florida | 33 |
| **Minnesota** | 31 | Michigan | 33 | Cal. Berkeley | 32 |
| Lewis & Clark | 30 | SMU | 31 | Washington Univ. | 31 |
| Harvard | 30 | Harvard | 31 | | |

   

 

Largest Law Schools by Transfers as a Percentage of Previous First Year Class

(BOLD indicates presence on list in all three years)

 

| School | % 2011 (of 2010 first-year class) | School | % 2012 (of 2011 first-year class) | School | % 2013 (of 2012 first-year class) |
|---|---|---|---|---|---|
| **Florida St.** | 28.6 | **Florida St.** | 44.5 | **Florida State** | 48.1 |
| **George Wash.** | 19.9 | **Arizona State** | 24.6 | **Arizona State** | 48.0 |
| **Utah** | 19.7 | Michigan State | 17.5 | **Utah** | 34.7 |
| **Arizona State** | 17.8 | **Utah** | 17.5 | Emory | 29.6 |
| Michigan State | 17.4 | **Minnesota** | 17.1 | Arizona | 28.9 |
| Washington and Lee | 15.3 | Emory | 16.5 | **Minnesota** | 22.0 |
| Washington Univ. | 15.2 | Cal. Berkeley | 16.2 | **George Wash.** | 21.8 |
| Loyola Marymount | 15.1 | **Rutgers - Camden** | 14.9 | **Georgetown** | 21.2 |
| **Northwestern** | 14.2 | **Georgetown** | 14.7 | **Rutgers - Camden** | 20.7 |
| Richmond | 13.7 | Southern Cal. | 14.7 | Southern Cal. | 19.7 |
| **Rutgers - Camden** | 13.4 | **Northwestern** | 14.4 | Texas | 19.1 |
| Cal. Los Angeles | 13.0 | Cincinnati | 14.3 | Cincinnati | 17.5 |
| Cal. Davis | 12.8 | **Columbia** | 14.3 | **Northwestern** | 17.1 |
| Lewis & Clark | 12.1 | Buffalo | 14.2 | Washington Univ. | 15.4 |
| **Georgetown** | 12.0 | Arizona | 14.0 | Univ. Washington | 15.3 |
| **Minnesota** | 11.9 | Cardozo | 13.8 | **Columbia** | 14.2 |
| New York Univ. | 11.8 | SMU | 13.4 | American | 13.8 |
| Cardozo | 11.8 | Florida | 12.7 | SMU | 13.3 |
| **Columbia** | 11.4 | Chicago | 12.6 | Cal. Los Angeles | 13.3 |
| Buffalo | 11.0 | **George Wash.** | 12.5 | Chicago | 13.0 |

 

Note that in these two charts, the “repeat players” -- those schools in the top 20 for all three years, 2011, 2012 and 2013 -- are bolded.  (Four of the top ten schools Leiter highlighted from the summer of 2006 remain in the top ten as of the summer of 2013, with four others still in the top 20.)  In addition, it is worth noting some significant changes between 2011 and 2013.  For example, the number of schools with 50 or more transfers increased from six to eight; only two schools had more than 70 transfers in 2011 and in 2012, but five schools did in 2013.

Leiter’s top ten law schools took in a total of 482 transfers, representing 21.3% of the 2,265 transfers that summer.  The top ten law schools in 2011 totaled 570 transfers, representing 23.5% of the 2427 transfer students that summer.  The top ten law schools in 2012 totaled 587 transfers, representing 24.1% of the 2438 transfers that summer.  The top ten law schools in 2013, however, totaled 724 students, representing 28.9% of the 2501 transfers in 2013, demonstrating an increasing concentration in the transfer market between 2006 and 2013 and even moreso between 2012 and 2013. 

In addition, three of the top four schools with the highest number of transfers were the same all three years, with Georgetown welcoming 71 in the summer of 2011, 85 in the summer of 2012, and 122 in the summer of 2013, George Washington, welcoming 104 in the summer of 2011, 63 in the summer of 2012, and 93 in the summer of 2013, and Florida State welcoming 57 in the summer of 2011, 89 in the summer of 2012 and 90 in the summer of 2013.  (Notably, Georgetown and Florida State were the two top schools for transfers in 2006, with 100 and 59 transfers in respectively.)

Similarly, three of the top four schools with the highest “percentage of transfers” were the same all three years, with Utah at 19.7% in 2011, 17.5% in 2012 and 34.7% in 2013, Arizona State at 17.8% in 2011, 24.6% in 2012 and 48% in 2013, and Florida State at 28.6% in 2011, 44.5% in 2012 and 48.1% in 2013.  The top five schools on the “percentage of transfers” chart all increased the “percentage” of transfer students they welcomed between 2011 and 2013, some significantly, which also suggests greater concentration in the transfer market between 2011 and 2013.

More specifically, there are several schools that have really “played” the transfer game in the last two years – increasing their engagement by a significant percentage.  These eight schools had 10.2% of the transfer market in 2011, but garnered 22.2% of the transfer market in 2013.

Schools with Significant Increases in Transfers 2011-2013

| School | 2011 | 2012 | 2013 | Percentage Increase |
|---|---|---|---|---|
| Texas | 6 | 9 | 59 | 883% |
| Arizona | 6 | 24 | 44 | 633% |
| Emory | 19 | 41 | 75 | 295% |
| Arizona State | 34 | 42 | 73 | 115% |
| Georgetown | 71 | 85 | 122 | 70% |
| Florida State | 57 | 89 | 90 | 58% |
| Southern Cal | 24 | 29 | 37 | 54% |
| Minnesota | 31 | 42 | 45 | 45% |
| Totals | 248 | 371 | 555 | 124% |
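The "Percentage Increase" column is simply the growth in transfer counts from 2011 to 2013; a quick check for a few rows transcribed from the chart:

```python
# (2011 transfers, 2013 transfers) for several schools in the chart above.
counts = {
    "Texas": (6, 59),
    "Arizona": (6, 44),
    "Emory": (19, 75),
    "Arizona State": (34, 73),
    "Minnesota": (31, 45),
    "Totals": (248, 555),
}

# Percentage growth from 2011 to 2013, to the nearest whole percent.
growth = {s: round(100 * (y13 - y11) / y11) for s, (y11, y13) in counts.items()}
print(growth["Texas"], growth["Totals"])  # 883 124
```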

 

REGIONAL MARKETS

There appear to be “regional” transfer markets.  In the Southeast in 2013, for example, three schools -- Florida State, Florida and Emory -- had a combined net inflow of 180 transfer students, while Stetson and Miami were roughly flat (43 transfers in and 42 transfers out, combined), and eight other schools from the region -- Florida Coastal, Charlotte, Charleston, Atlanta’s John Marshall, St. Thomas University, Ave Maria, Florida A&M, and Nova Southeastern -- had a combined net outflow of 303.  It seems reasonable to assume that many of the transfers out of these schools found their way to Emory, Florida and Florida State (and perhaps to Miami and Stetson, to the extent that Miami and Stetson lost students to Emory, Florida and Florida State).

NEW DATA – NEW INSIGHTS

Starting this fall, the ABA Section of Legal Education and Admissions to the Bar is requiring schools to report not only the number of students who have transferred in, but also the schools from which they came (indicating the number from each school), along with the 75%, 50% and 25% first-year law school GPAs of the pool of students who transferred in to a given school (provided that at least five students transferred in).  As a result, we will be able to delineate the regional transfer markets (as well as those schools with more of a national transfer market).

Notably, even though the Section of Legal Education and Admissions to the Bar is not requiring the gathering and publication of the 75%, 50%, and 25% LSAT and UGPA, one thing we are very likely to learn is that for many schools, the “LSAT/UGPA” profile of transfers in is almost certainly lower than the LSAT/UGPA profile of the first-year matriculants in the prior year, a point that both Henderson and Rensberger highlight in their analyses. 

Just look at the schools in the Southeast as an example.  Assume Emory, Florida State and Florida (large “transfer in” schools) are, in fact, admitting a significant number of transfer students from other schools in the Southeast region, such as Miami and Stetson, and schools like Florida Coastal, St. Thomas University, Charlotte, Atlanta’s John Marshall and Ave Maria (large “transfer out” schools in the Southeast).  Even if they are taking students who only came from the top quarter of the entering classes at those schools, the incoming transfers would have a significantly less robust LSAT/UGPA profile when compared with the entering class profile at Emory, Florida State or Florida in the prior year.  Virtually every student who might be transferring in to Emory, Florida or Florida State from one of these transfer out schools (other than Miami and perhaps Stetson) is likely to be in the bottom quarter of the entering class LSAT profile at Emory, Florida, and Florida State.

Comparison of Relative Profiles of Southeast Region Transfer In/Out Schools

| TRANSFER IN SCHOOLS | 2012 LSAT | 2012 UGPA | TRANSFER OUT SCHOOLS | 2012 LSAT | 2012 UGPA |
|---|---|---|---|---|---|
| Emory | 166/165/161 | 3.82/3.70/3.35 | Miami | 159/156/155 | 3.57/3.36/3.14 |
| Florida | 164/161/160 | 3.73/3.59/3.33 | Stetson | 157/157/152 | 3.52/3.28/3.02 |
| Florida State | 162/160/157 | 3.72/3.54/3.29 | St. Thomas (FL) | 150/148/146 | 3.33/3.10/2.83 |
| | | | Florida Coastal | 151/146/143 | 3.26/3.01/2.71 |
| | | | Charlotte | 150/146/142 | 3.32/2.97/2.65 |
| | | | Atlanta’s John Marshall | 153/150/148 | 3.26/2.99/2.60 |
| | | | Ave Maria | 153/148/144 | 3.48/3.10/2.81 |

 

This raises an interesting question about LSAT and UGPA profile data.  If we assume that LSAT and UGPA profile data are used not only by law schools as predictors of performance, but that third parties also use this data as evidence of the “strength” of the student body, and ultimately the graduates, of a given law school (for example, USNEWS in its rankings and employers in their assessment of the quality of schools at which to interview), what can we surmise about the impact from significant numbers of transfers?  For those law schools with a significant number/percentage of “transfers in” from law schools whose entering class profiles are seemingly much weaker, the entering class profile presently published in the Standard 509 disclosure report for each school arguably fails to accurately reflect the LSAT and UGPA quality of the graduating class.  Similarly, if the “transfers out” from a given school happen to come from the top half of the entering class profile, then for these schools as well the entering class profile presently published in the Standard 509 disclosure report for each school arguably fails to accurately reflect the LSAT and UGPA quality of the graduating class. 

Using the chart above, if Emory, Florida and Florida State are drawing a significant number of transfers from the regional transfer out schools, and if they had to report the LSAT and UGPA profile of their second-year class rather than their first-year class, their LSAT and UGPA profiles almost certainly would decline.   (The same likely would be true for other law schools with large numbers of transfers.)
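The overlap claim -- that virtually every transfer from these feeder schools would fall in the bottom quarter of the receiving school's entering-class LSAT profile -- can be checked mechanically against the chart; a sketch with a few rows transcribed from the table:

```python
# LSAT profiles (75th, 50th, 25th percentile) from the Southeast chart above.
transfer_in = {
    "Emory": (166, 165, 161),
    "Florida": (164, 161, 160),
    "Florida State": (162, 160, 157),
}
transfer_out = {
    "Florida Coastal": (151, 146, 143),
    "Atlanta's John Marshall": (153, 150, 148),
    "Ave Maria": (153, 148, 144),
}

# Even the 75th-percentile LSAT at each transfer-out school sits below the
# 25th-percentile LSAT at each transfer-in school.
below = all(out[0] < in_[2]
            for out in transfer_out.values()
            for in_ in transfer_in.values())
print(below)  # True
```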

STILL MANY UNKNOWNS

Even with more granular data available in the near future to delineate more clearly the transfer pathways between transfer out schools and transfer in schools, there still will be a significant number of unknowns relating to transfer students, regarding employment outcomes, the demographics of transfers, the experience of transfers and the motivation for transfers.

First, with respect to the employment outcomes of transfer students, how do they compare with the employment outcomes for students who started at a law school as first-years? Do the employment outcomes for transfer students track that of students who started at a law school as first-years, or is the employment market for transfer students less robust than it is for students who started at a law school as first-years?  Are the employment outcomes nonetheless better than they might have been at the school from which they transferred? These are important questions given the perception that many students transfer “up” in the rankings to improve their employment opportunities. 

Second, with respect to demographics, do students of color and women participate proportionately in the transfer market or is the market disproportionately occupied by white males?

Third, with respect to the experience of transfers, the Law School Survey of Student Engagement gathered some data from participating law schools in 2005 regarding the experience of transfers but more could be done to better understand how integrated transfer students are in the life of the learning community into which they transfer.

Fourth, with respect to the motivations of transfers, it is generally assumed that transfers are “climbing” the rankings, and Henderson’s data broadly suggests movement from lower-ranked schools to higher-ranked schools, but what percentage of transfers are doing so partly or primarily for geographic reasons – to be near family or a future career location?  How many are transferring for financial reasons because they lost a conditional scholarship after their first year of law school?  How many truly are transferring to get a JD from a higher ranked law school?  How many of those believe their job opportunities will be better at the school to which they are transferring?

We will have answers to some questions soon, but will still have many questions that remain unanswered.

December 10, 2014 in Data on legal education | Permalink | Comments (7)

Tuesday, December 2, 2014

The Market for Law School Applicants -- A Milestone to Remember

In early 2013, Michael Moffitt, the dean of Oregon Law, was interviewed by the New York Times about the tumult affecting law schools. Moffitt, who is a very thoughtful guy, responded, "I feel like I am living a business school case study.”  

I think the analogy to the business school case study is a good one.  In the nearly two years since that story was published, the market for law school applicants has actually gotten worse.

Yesterday's Dealbook column in the New York Times featured Northwestern Law Dean Dan Rodriguez (who also serves as President of the AALS) speaking candidly about the meltdown dynamics that have taken hold.  See Elizabeth Olson, "Law School is Buyer's Market, with Top Students in Demand," New York Times, Dec. 1, 2014. 

"It's insane," said Rodriguez. "We’re in hand-to-hand combat with other schools." The trendlines are indeed terrible.  Year-over-year, LSAT test-taker volume is down another 8.7%.  See Organ, LWB, Nov 11, 2014.  So we can expect the situation to get worse, at least in the near term.

I applaud Dan Rodriguez for his leadership instincts.  He is being transparent and honest.  Several years ago the leadership of the AALS went to great lengths to avoid engagement with the media. Dan has gone the opposite direction, inviting the press into our living room and kitchen.  

Want to know what leadership and judgment look like?  It looks like Dan's interview with Elizabeth Olson.  Dan's words did not solve anyone's problem, but his honesty and candor made it more likely that we help ourselves.  Because it's Northwestern, and Dan is president of the AALS (something the story did not mention but most of us know), and this was reported by Elizabeth Olson in the New York Times, the substance and tenor of discussions within law school faculties are bound to shift, at least slightly and in the direction favoring change.   

What is the de facto plan at most law schools these days?  Universities are not going to backstop law schools indefinitely. I think the sign below is not far off the mark.  

[Image: "Outrun the bear" sign]

We are indeed living through a business school case study, which is both bad and good.   At many schools -- likely well more than half --  hard choices need to be made to ensure survival.  (And for the record, virtually all schools, regardless of rank, are feeling uncomfortable levels of heat.)   A law school needs cash to pay its expenses.  But it also needs faculty and curricula to attract students. The deeper a law school cuts, the less attractive it becomes to students.  Likewise, pervasive steep discounts on tuition reflect a classic collective action problem. Some schools may eventually close, but a huge proportion of survivors are burning through their financial reserves.  

Open admissions, which might pay the bills today, will eventually force the ABA and DOE to do something neither really wants to do -- aggressively regulate legal education.  This is not a game that is likely to produce many winners.  Rather than letting this play out, individual law schools would be much better off pursuing a realistic strategic plan that can actually move the market.

The positive side of the business school case study is that a few legal academics are finding their voice and learning -- for the first time in several generations -- how to lead.  Necessity is a wonderful tutor.  Law is not an industry on the decline -- far from it.  The only thing on the decline is the archetypal artisan lawyer that law schools are geared to churn out.  Indeed, back in 2013 when Dean Moffitt commented about living through a business school case study, he was not referencing imminent failure.   Sure, Moffitt did not like the hand he was being dealt, but as the 2013 article showed, his school was proving to be remarkably resourceful in adapting.

The good news resides on the other side of a successful change effort.  The process of change is painful, yet the effects of change can be transformative and make people truly grateful for the pain that made it all possible.  In our case, for the first time in nearly a century, what we teach, and how we teach it, is actually going to matter.  If we believe serious publications like The Economist, employers in law, business, and government need creative problem solvers who are excellent communicators, adept at learning new skills, and comfortable collaborating across multiple disciplines -- this is, in fact, a meaningful subset of the growing JD-Advantage job market.

In the years to come, employers will become more aggressive in looking for the most reliable sources of talent, in part because law schools are going to seek out preferred-provider relationships with high-quality employers.  Hiring based on school prestige is a remarkably ineffective way to build a world-class workforce -- Google discovered this empirically.

From an employer perspective, the best bet is likely to be three years of specialized training, ideally where applicants are admitted based on motivation, aptitude, and past accomplishments.  The LSAT/UGPA grid method misses this by a wide margin.  After that, the design and content of curricula are going to matter.  It is amazing how much motivated students can learn and grow in three years.  And remarkably, legal educators control the quality of the soil.  It brings to mind that seemingly trite Spider-Man cliché about great power.

For those of us working in legal education, the next several years could be the best of times or the worst of times.  We get to decide.  Yesterday's article in the Times made it a little more likely that we actually have the difficult conversations needed to get to the other side. 

December 2, 2014 in Current events, Data on legal education, Innovations in legal education, New and Noteworthy, Structural change | Permalink | Comments (4)

Tuesday, November 11, 2014

What Might Have Contributed to an Historic Year-Over-Year Decline In the MBE Mean Scaled Score?

The National Conference of Bar Examiners (NCBE) has taken the position that the historic drop in the MBE Mean Scaled Score of 2.8 points between the July 2013 administration of the bar exam (144.3) and the July 2014 administration of the bar exam (141.5) is solely attributable to a decline in the quality of those taking a bar exam this July.  Specifically, in a letter to law school deans, the NCBE stated that:  “Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results.  All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013.”

Notably, the NCBE does not indicate what other “indicators” it looked at “to challenge the results.”  Rather, the NCBE boldly asserts that the only fact that explains an historic 2.8 point drop in the MBE Mean Scaled Score is “that the group that sat in July 2014 was less able than the group that sat in July 2013."

I am not persuaded.   

(Neither is Brooklyn Law School Dean Nicholas Allard, who has responded by calling the letter “offensive” and by asking for a “thorough investigation of the administration and scoring of the July 2014 exam.”  Nor is Derek Muller, who earlier today posted a blog suggesting that the LSAT profile of the class of 2014 did not portend the sharp drop in MBE scores.)

I can’t claim to know how the NCBE does its scaled scoring, so for purposes of this analysis, I will take the NCBE at its word that it has “double-checked” all of its calculations and found that there are no errors in its scoring.

If we accept the premise that there are no scoring issues, then the historic decline in the MBE Mean Scaled Score is attributable to a “less able” group taking the MBE in July 2014, to issues associated with the administration of the exam, or to some combination of the two.

The NCBE essentially has ignored the possibility that issues associated with the administration of the exam might have contributed to the historic decline in the MBE Mean Scaled Score and has gone “all in” on the “less able” group explanation.  The problem for the NCBE is that it will be hard-pressed to demonstrate that the group that sat in July 2014 was sufficiently “less able” to explain an historic 2.8-point decline.

If one looks at the LSAT distribution of the matriculants in 2011 (who became the graduating class of 2014) and compares it with the LSAT distribution of the matriculants in 2010 (who became the graduating class of 2013), the NCBE probably is correct in noting that the group that sat in July 2014 is slightly “less able” than the group that sat in July 2013.  But for the reasons set forth below, I think the NCBE is wrong to suggest that this alone accounts for the historic drop in the MBE Mean Scaled Score.

Rather, a comparison of the LSAT profile of the Class of 2014 with the LSAT profile of the Class of 2013 would suggest that one could have anticipated a modest drop in the MBE Mean Scaled Score of perhaps 0.5 to 1.0 points.  The modest decrease in the LSAT profile of the Class of 2014 when compared with the Class of 2013, by itself, does not explain the historic drop of 2.8 reported in the MBE Mean Scaled Score between July 2013 and July 2014.

THINKING ABOUT GROUPS

The “group” that sat in July 2014 is comprised of two subgroups of takers – first-time takers and those who failed a bar exam and are retaking the bar exam.  I am not sure the NCBE has any basis to suggest that those who failed a bar exam and are “retaking” the bar exam in 2014 were a less capable bunch than a comparable group that was “retaking” the bar exam in 2013 (or in some other year).

What about “first-time takers”?  That group actually consists of two subgroups as well – those literally taking the exam for the first time and those who passed an exam in one jurisdiction and are taking the exam for the “first-time” in another jurisdiction.  Again, I am not sure the NCBE has any basis to suggest that those who passed a bar exam and are taking a bar exam in another jurisdiction in 2014 were a less capable bunch than a comparable group that was taking a second bar exam in 2013.

So who’s left?  Those who actually were taking a bar exam for the very first time in July 2014 – the graduates of the class of 2014.  If we accept the premise that the “retakers” in 2014 were not demonstrably different than the “retakers” in 2013, then the group that was “less capable” in 2014 has to be the graduates of 2014, whom the NCBE asserts are “less capable” than the graduates of 2013.

COMPARING LSAT PROFILES

The objective credentials of the class that entered law school in the fall of 2011 (class of 2014) are slightly less robust than those of the class that entered law school in the fall of 2010 (class of 2013).  The question, however, is whether the drop in quality between the class of 2013 and the class of 2014 is large enough that we could have anticipated it would yield an historic drop in the MBE Mean Scaled Score of 2.8 points.

The answer to that is no.

The difference in profile between the class of 2014 and the class of 2013 does not reflect an “historic” drop in quality and would seem to explain only some of the drop in MBE Mean Scaled Score, not a 2.8 point drop in MBE Mean Scaled Score.

To understand this better, let’s look at how the trends in student quality have related to changes in the MBE Mean Scaled Score over the last decade. 

Defining “student quality” can be a challenge.  A year ago, I noted changes over time in three “groups” of matriculants – those with LSATs at or above 165, those with LSATs of 150-164, and those with LSATs below 150.  Between 2010 and 2013, the number at or above 165 declined significantly while the number below 150 actually grew, resulting in a smaller percentage of the entering class with LSATs at or above 165 and a larger percentage of the entering class with LSATs below 150.

While the relatively simplistic calculations described above would provide some basis for anticipating declines in bar passage rates by 2016, they would not explain what is going on this year without more refinement.

In his blog posting earlier today, Derek Muller attempts to look at the strength of each class by calculating "projected MBE" scores drawing on an article from Susan Case and then comparing those to the actual MBE scores, showing some close relationship over time (until this year). I come to a similar conclusion using a different set of calculations of the "strength" of the graduating classes over the last several years based on the LSAT distribution profile of the matriculating classes three years earlier.

To develop this more refined analysis of the strength of the graduating classes over the last nine years, I used the LSAC’s National Decisions Profiles to identify the distribution of matriculants in ten five-point LSAT ranges – descending from 175-180 down to 130-134.  To estimate the “strength” of the respective entering classes, I applied a prediction of bar passage rates by LSAT scores to each five point grouping and came up with a “weighted average” bar passage prediction for each class. 

(In his article, Unpacking the Bar: Of Cut Scores, Competence and Crucibles, Professor Gary Rosin of the South Texas College of Law developed a statistical model for predicting bar passage rates for different LSAT scores.  I used his bar passage prediction chart to assess the “relative strength” of each entering class from 2001 through 2013.

LSAT Range     Predicted Bar Passage (Based on Lowest LSAT in Range)
175-180        .98
170-174        .97
165-169        .95
160-164        .91
155-159        .85
150-154        .76
145-149        .65
140-144        .50
135-139        .36
130-134        .25

Please note that for the purposes of classifying the relative strength of each class of matriculants, the precise accuracy of the bar passage predictions is less important than the fact of differential anticipated performance across groupings which allows for comparisons of relative strength over time.)

One problem with this approach is that the LSAC (and law schools) changed how they reported the LSAT profile of matriculants beginning with the entering class in the fall of 2010.  Up until 2009, the LSAT profile data reflected the average LSAT score of those who took the LSAT more than once.  Beginning with matriculants in fall 2010, the LSAT profile data reflects the highest LSAT score of those who took the LSAT more than once.  This makes direct comparisons between fall 2009 (class of 2012) and years prior and fall 2010 (class of 2013) and years subsequent difficult without some type of “adjustment” of profile in 2010 and beyond.

Nonetheless, the year over year change in the 2013-2014 time frame can be compared with year over year changes in the 2005-2012 time frame.

Thus, having generated these “weighted average” bar passage projections for each entering class starting with the class that began legal education in the fall of 2002 (class of 2005), we can compare these with the MBE Mean Scaled Score for each July in which a class graduated, particularly looking at the relationship between the change in relative strength and the change in the corresponding MBE Mean Scaled Score.  Those two lines are plotted below for the period from 2005-2012.  (To approximate the MBE Mean Scaled Score for graphing purposes, the strength of each graduating class is calculated by multiplying the weighted average predicted bar passage percentage, which has ranged from .801 to .826, times 175.)
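The weighted-average method described above can be sketched in a few lines of code.  This is a minimal sketch: the per-range pass-rate predictions are Rosin's figures from the table above, but the matriculant distribution is hypothetical and for illustration only (the actual distributions come from the LSAC's National Decisions Profiles).

```python
# Rosin's predicted bar passage rate, keyed by five-point LSAT range
# (prediction based on the lowest LSAT score in each range).
PASS_PREDICTION = {
    (175, 180): 0.98, (170, 174): 0.97, (165, 169): 0.95,
    (160, 164): 0.91, (155, 159): 0.85, (150, 154): 0.76,
    (145, 149): 0.65, (140, 144): 0.50, (135, 139): 0.36,
    (130, 134): 0.25,
}

def weighted_average_prediction(matriculants):
    """Weighted-average predicted bar passage rate for an entering class,
    given a dict mapping each LSAT range to its matriculant count."""
    total = sum(matriculants.values())
    weighted = sum(PASS_PREDICTION[rng] * n for rng, n in matriculants.items())
    return weighted / total

def class_strength(matriculants):
    """Scale the weighted-average prediction by 175 to approximate the
    MBE Mean Scaled Score for graphing purposes, as described above."""
    return 175 * weighted_average_prediction(matriculants)

# Hypothetical distribution of matriculants (NOT actual LSAC data).
example_class = {
    (175, 180): 500, (170, 174): 1500, (165, 169): 5000,
    (160, 164): 9000, (155, 159): 11000, (150, 154): 10000,
    (145, 149): 6000, (140, 144): 2500, (135, 139): 800,
    (130, 134): 200,
}
```

For this hypothetical distribution the weighted-average prediction comes out to roughly 0.80 and the scaled “class strength” to roughly 140.5; the actual classes discussed above ranged from .801 to .826 on the weighted-average measure.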

Comparison of Class Strength Based on Weighted Average Class Strength (Weighted Average Bar Passage Prediction x 175) with the MBE Mean Scaled Score for 2005-2012


What this graph highlights is that between 2005 and 2012, year to year changes in the MBE Mean Scaled Score largely “tracked” year to year changes in the “quality” of the graduating classes.  But perhaps most significantly, the degree of change year over year in “quality” generally is reflected in the “degree” of change year over year in MBE Mean Scaled Scores.  From 2008 to 2009, the drop in “quality” of 1.5 from 144.6 to 143.1 actually was reflected in a drop in MBE Mean Scaled Scores from 145.6 to 144.7, a drop of 0.9 points.  Similarly, from 2009 to 2010, the drop in “quality” of 1.1 from 143.1 to 142 actually was reflected in a drop in the MBE Mean Scaled Scores from 144.7 to 143.6, a drop of 1.1 points.  This two-year drop in quality of 2.6 points from 144.6 to 142 corresponded to a two-year drop in MBE Mean Scaled Scores of 2.0 points from 145.6 to 143.6.

How does this help us understand what has happened in 2014 relative to 2013?  The decrease in quality of the class of 2014 relative to the class of 2013 using the “Weighted Average Bar Passage Projection” methodology above reflects a change from 145.1 to 144.2 – a drop of 0.9 (less than the year over year changes in quality in 2009 and 2010).  Accordingly, one might anticipate a decline in MBE Mean Scaled Scores, but probably a decline slightly smaller than the declines experienced in 2009 and 2010 – declines of 0.9 and 1.1 points, respectively.

Does the decline in quality between the Class of 2013 and the Class of 2014 explain some of the decline in MBE Mean Scaled Scores?  Certainly.  This analysis suggests a decline comparable to or slightly less than the declines in 2009 and 2010 should have been expected.

But that is not what we have experienced.  We have experienced an historic decline of 2.8 points.  Yet the NCBE tells us that the other indicators it looked at “all point to the fact that the group that sat in July 2014 is less able than the group that sat in July 2013.”

THE EXAMSOFT DEBACLE

What the NCBE fails to discuss, or even mention, is that there is one other “indicator” distinctive to the bar exam experience of the group that sat in July 2014 that the group that sat in July 2013 did not face – the ExamSoft Debacle.

For many test-takers in the numerous jurisdictions that used ExamSoft in July 2014, the evening between the essay portion of the bar exam and the MBE portion was spent in needless anxiety and stress over not being able to upload the essay portion of the exam.  This stress and anxiety were compounded by messaging suggesting that the failure to upload in a timely manner would mean failing the bar exam (messaging that was corrected only late in the evening in some jurisdictions).

In these ExamSoft jurisdictions, I can only imagine that some number of those taking the MBE on the second day of the exam were doing so with much less sleep and much less focus than might have been the case if there had not been issues with uploading the essay portion of the exam the night before.  If this resulted in “underperformance” on the MBE of just 1%-2% (perhaps missing two to four additional questions out of 200), this might have been enough to trigger a larger than expected decline in the MBE Mean Scaled Score.

ONE STATE’S EXPERIENCE BELIES THE NCBE STORY

It will be hard to assess the full reality of the July 2014 bar exam experience in historical context until 2015, when the NCBE releases its annual statistical analysis with state by state analyses of first-time bar passage rates.  It is very difficult to make comparisons across jurisdictions regarding the July 2014 bar exam at the present time because there is no standardized format among states for reporting results – some states report overall bar passage rates, some disaggregate first-time bar passage rates, and some report school-specific bar passage rates.  To make meaningful comparisons year-over-year focused on the experience of each year’s graduates, the focus should be on first-time bar passage (even though, as noted above, that also is a little over-inclusive).

Nonetheless, the experience of one state, Iowa, casts significant doubt on the NCBE “story.”

The historical first-time bar passage rates in Iowa from 2004 to 2013 ranged from a low of 86% in 2005 to a high of 93% in 2009 and again in 2013.  In the nine-year period between 2005 and 2013, the year-to-year change in first-time bar passage rates never exceeded three percentage points and was plus or minus one or two points in eight of the nine years.  In 2014, however, the bar passage rate fell to a new low of 84%, a decline of nine percentage points -- more than four times the largest previous year-over-year decline in bar passage rates since 2004-2005.

YEAR    First-Time Bar Passage Rate    Change from Prior Year
2004    87%     --
2005    86%     -1
2006    88%     +2
2007    89%     +1
2008    90%     +1
2009    93%     +3
2010    91%     -2
2011    90%     -1
2012    92%     +2
2013    93%     +1
2014    84%     -9

The NCBE says that all indicators point to the fact that the group that sat in 2014 was “less able” than the group that sat in 2013.  But here is the problem for the NCBE.

Iowa is one of the states that used ExamSoft in which test-takers experienced problems uploading the exam.  The two schools that comprise the largest share of bar exam takers in Iowa are Drake and Iowa.  In July 2013, those two schools had 181 first-time takers (out of 282 total takers) and 173 passed the Iowa bar exam (a 95.6% bar passage rate).  In 2014, those two schools had 158 first-time takers (out of 253 total) and 135 passed the Iowa bar exam (an 85.4% bar passage rate), a drop of 10.2 percentage points year over year.

Unfortunately for the NCBE, there is no basis to claim that the Drake and Iowa graduates were “less able” in 2014 than in 2013 as there was no statistical difference in the LSAT profile of their entering classes in 2010 and in 2011 (the classes of 2013 and 2014, respectively).  In both years, Iowa had a profile of 164/161/158.  In both years, Drake had a profile of 158/156/153.  This would seem to make it harder to argue that those in Iowa who sat in July 2014 were “less able” than those who sat in 2013, yet their performance was significantly poorer, contributing to the largest decline in bar passage rate in Iowa in over a decade.  The only difference between 2013 and 2014 for graduates of Drake and Iowa taking the bar exam for the first time in Iowa is that the group that sat in July 2014 had to deal with the ExamSoft debacle while the group that sat in July 2013 did not.

TIME WILL TELL

This analysis does not “prove” that the ExamSoft debacle was partly responsible for the historic decline in the MBE Mean Scaled Score between 2013 and 2014.  What I hope it does do is raise a serious question about the NCBE’s assertion that the “whole story” of the historic decline in the MBE Mean Scaled Score is captured by the assertion that the class of 2014 is simply “less able” than the class of 2013.

When the NCBE issues its annual report on 2014 sometime next year, we will be able to do a longitudinal analysis on a jurisdiction-by-jurisdiction basis to see whether jurisdictions that used ExamSoft had higher rates of anomalous year-over-year changes in bar passage rates for first-time takers.  When the NCBE announces the MBE Mean Scaled Score for July 2015 next fall, we will be able to assess whether the group that sits for the bar exam in July 2015 (which is even more demonstrably “less able” than the class of 2014 using the weighted average bar passage prediction outlined above) generates another historic decline or whether it “outperforms” its indicators by performing in a manner comparable to the class of 2014 (which would suggest that something odd happened with the class of 2014).

It remains to be seen whether law school deans and others will have the patience to wait until 2015 to analyze all of the compiled data regarding bar passage in July 2014 across all jurisdictions.  In the meantime, there is likely to be a significant disagreement over bar pass data and how it should be interpreted.

November 11, 2014 in Data on legal education, Data on the profession, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (3)

Monday, October 20, 2014

What Law Schools Can Learn from Dental Schools in the 1980s Regarding the Consequences of a Decline in Applicants

For four consecutive years we have seen a decline in the number of applicants to law school and a corresponding decline in the number of matriculating first-year students.  Over the last year or two, some have suggested that as a result of this “market adjustment” some law schools would end up closing.  Most recently, the former AALS President, Michael Olivas, in response to the financial challenges facing the Thomas Jefferson Law School, was quoted as stating that he expects several law schools to close. 

To date, however, no law schools have closed (although the Western Michigan University Thomas M. Cooley Law School recently announced the closure of its Ann Arbor branch).  

Have law schools found ways to cut costs and manage expenses in the face of declining revenues such that all will remain financially viable and remain in operation?  Is it realistic to think that no law schools will close?

Although there may be a number of people in the legal academy who continue to believe that legal education is somehow “exceptional” – that market forces may impose financial challenges for law schools in the near term but will not result in the closing of any law schools – this strikes me as an unduly optimistic assessment of the situation.

To understand why, I think those in legal education can learn from the experience of those in dental education in the 1980s.

The Dental School Experience from 1975-1990

In the 1980s, dental school deans, along with provosts and presidents at their host universities, had to deal with the challenge of a significant decline in applicants to dental school. 

At least partially in response to federal funding to support dental education, first-year enrollment at the country’s dental schools grew throughout the 1970s to a peak in 1979 of roughly 6,300 across roughly 60 dental schools.  Even at that point, however, for a number of reasons -- improved dental health from fluoridation, reductions in federal funding, high tuition costs and debt loads -- the number of applicants had already started to decline from the mid-1970s peak of over 15,000. 

By the mid-1980s, applicants had fallen to 6,300 and matriculants had fallen to 5,000.  As of 1985, no dental schools had closed.  But by the late 1980s and early 1990s there were fewer than 5,000 applicants and barely 4,000 first-year students – applicants had declined by more than two-thirds and first-year enrollment had declined by more than one-third from their earlier peaks. (Source – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author).)

How did dental schools and their associated universities respond to this changing market?  Between 1986 and 1993, six private universities closed their dental schools: Oral Roberts University, Tulsa, Oklahoma (1986); Emory University, Atlanta, Georgia (1988); Georgetown University, Washington, D.C. (1990); Fairleigh Dickinson University, Rutherford, New Jersey (1990); Washington University, St. Louis, Missouri (1991); and Loyola University, Chicago, Illinois (1993). (Source: Dental Education at the Crossroads:  Challenges and Change, Table 1.1 (Institute of Medicine 1995)).  According to a New York Times article from October 29, 1987, “Georgetown, formerly the nation's largest private dental school, decided to close after a Price Waterhouse study found that the school would have a $3.6 million deficit by 1992.” (Source: Tamar Lewin, Plagued by Falling Enrollment, Dental Schools Close or Cut Back, New York Times, Oct. 29, 1987).

Some of the primary factors contributing to the closing of dental schools were described as follows:

Financial issues were repeatedly described as critical. Dental education was cited as an expensive enterprise that is or may become a drain on university resources. On average, current-year expenditures for the average dental school are about $1 million more than current revenues. … The declining size and quality of the applicant pool during the 1980s played a role in some closures by threatening the tuition base and prestige on which private schools rely. Faculty and alumni resistance to change may feed impatience among university administrators. In some institutions, the comparative isolation of dental schools within the university has provided them with few allies or at least informed colleagues and has left them ill-prepared to counter proposals for "downsizing." (Source: Dental Education at the Crossroads:  Challenges and Change, at 202-203 (Institute of Medicine 1995)). 

The Law School Experience from 2004-2014

In terms of applicants and enrollment over the last decade, the trends law schools have experienced look remarkably comparable to the experience of dental schools in the 1970s and 1980s.  According to the LSAC Volume Summary, applicants to law schools peaked in 2004 with 100,600 applicants (and roughly 48,200 first-year students).  By 2010, applicants had fallen to roughly 87,600, but first-year enrollment peaked at 52,500.  Over the last four years, applicants have fallen steadily to roughly 54,700 for fall 2014, with a projected 37,000 first-years matriculating this fall, the smallest number since 1973-74, when there were 40 fewer law schools and over one thousand fewer law professors.  (Source - ABA Statistics)(For the analysis supporting this projection of 37,000 first-years, see my blog post on The Legal Whiteboard from March 18, 2014.)  

The two charts below compare the dental school experience from 1975 to 1990 with the law school experience in the last decade.  One chart compares dental school applicants with law school applicants and one chart compares dental school first-years with law school first-years.  (Note that for purposes of easy comparison, the law school numbers are presented as one-tenth of the actual numbers.)

[Chart: Applicants]

[Chart: First-Years]

(Sources – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author) and the LSAC’s Volume Summary  (with my own estimates for 2014 based on the LSAC’s Current Volume Summary)).

The Law School Experience 2014-2019

Notably, these charts do not bode well for law schools.  The law school experience tracks pretty closely the dental school experience over the first ten years reflected in the charts.  For law schools, 2014 looks a lot like 1985 did for dental schools.

There might be any number of reasons why the law school experience over the next several years might be different from the dental school experience in the late 1980s and early 1990s, such that the next several years do not continue the downward trend in applicants and matriculants.  The market forces associated with changes in the dental profession and dental education in the 1980s are not the same as the market forces associated with changes in the legal profession and legal education in the 2010s, and the cost structures for dental education and legal education are not exactly the same.

The problem for law schools, however, is that without an upward trend law schools will continue to face significant financial pressures for the next few years just as dental schools did in the late 1980s.  There might be some encouraging news on the employment front over the next few years as the decreasing number of matriculants will mean a decreasing number of graduates in 2015, 2016 and 2017.  Even without any meaningful growth in the employment market for law graduates, this decline in the number of graduates should mean significant increases in the percentage of graduates finding full-time, long-term employment in bar passage required jobs.  Over time, this market signal may begin to gain traction among those considering law school such that the number of applicants to law school stops declining and perhaps starts increasing modestly. 

But the near term remains discouraging.  The number of people taking the June 2014 LSAT was down roughly 9% compared to June 2013, and the anticipation is that the number of test-takers in the most recent administration in late September was down as well compared to October 2013.  Thus, applicants might well be down another 5-8% in the 2014-15 admissions cycle, resulting in perhaps as few as 51,000 applicants and perhaps as few as 35,000 matriculants in fall 2015.  Even if things flatten out and begin to rebound modestly in the next few years, it would appear to be unlikely that the number of matriculants will climb back near or above 40,000 before the fall of 2017 or 2018.

Moreover, if current trends continue, the matriculants in 2015 also are going to have a significantly less robust LSAT/GPA profile than the matriculants in fall 2010.   As I noted in a blog posting on March 2, 2014, between 2010 and 2013, the number of law schools with a median LSAT less than 150 grew from 9 to 32, and the number with a median LSAT of 145 or below grew from 1 to 9.

What Does this Mean for the Average Law School?

Assume you are the Dean at a hypothetical private law school that had 600 students (200 in each class) and a budget based on $18 million in JD tuition revenue in 2010-11.  (This reflects a net tuition of $30,000 from each student – with nominal tuition set at $40,000 but with a discount rate of 25%.)  Further assume that with this budget, your law school was providing $2.0 million annually to the university with which it is affiliated.  As of 2010-11, your entering class profile reflected a median LSAT of 155 and a median GPA of 3.4.

Assume first-year enrollment declined to 170 in 2011, to 145 in 2012, and to 125 in 2013, a cumulative decrease in first-year enrollment since 2010 of 37%.  As you tried to balance enrollment and profile, the law school managed to maintain its median LSAT and GPA in 2011, but saw its LSAT and GPA medians decline to 153 and 3.35 in 2012 and to 152 and 3.30 in 2013.

This means that for the 2013-14 academic year, the law school had only 440 students, a decrease of roughly 27% from its total enrollment of 600 in 2010, with a much less robust entering class profile in comparison with the entering class profile in 2010. (Note that this assumes no attrition and no transfers in or out, so if anything, it likely overstates total enrollment).  (For comparison purposes, the National Jurist recently listed 25 law schools with enrollment declines of 28% or more between 2010-11 and 2013-14.)

Assume further that the law school had to increase its scholarships to attract even this smaller pool of students with less robust LSAT/GPA profiles, such that the net tuition from each first-year student beginning in fall 2012 has been only $25,500 (with nominal tuition now set at $42,500, but with a discount rate of 40%). 

For the 2013-14 academic year, therefore, you were operating with a budget based on $12,411,000 in JD tuition revenue, a decrease of more than $5.5 million (over 30%) from the 2010-11 academic year (170 x $32,500 for third-years = $5.525 million; 145 x $25,500 for second-years = $3.698 million; 125 x $25,500 for first-years = $3.188 million).

What does this mean?  It means you have been in budget-cutting mode for over three years.  Of course, this has been a challenge for the law school, given that a significant percentage of its costs are faculty and staff salaries and associated fringe benefits.  Through the 2013-14 academic year, however, assume you cut costs by paring the library budget, eliminating summer research stipends for faculty, finding several other places to cut expenditures, cutting six staff positions, and using the retirement or early retirement of ten of your 38 faculty members as a de facto "reduction in force," resulting in net savings of $3.59 million.  In addition, assume you have gotten the university to agree to waive any "draw," saving another $2 million (based on the "draw" in 2010-11).  Thus, albeit in a significantly leaner state, you managed a "balanced" budget for the 2013-14 year while generating no revenue for your host university.

The problem is that the worst is yet to come, as the law school welcomes a class of first-year students much smaller than the class of third-years that graduated in May.  With the continued decline in the number of applicants, the law school has lower first-year enrollment again for 2014-15: only 120 first-year students, with median LSAT and GPA declining again, to 151 and 3.2.  Projections for 2015-16 (based on the decline in June and October 2014 LSAT takers) suggest that the school should expect no more than 115 matriculants and may see a further decline in profile.  That means the law school has only 390 students in 2014-15 and may have only 360 students in 2015-16 (an enrollment decline of 40% since 2010-11).  Assuming net tuition for first-year students also remains at $25,500, due to the competition on scholarships to attract students (and this may be a generous assumption), the JD tuition revenue for 2014-15 and 2015-16 is estimated to be $9,945,000 and $9,180,000, respectively (a decline in revenue of nearly 50% from the 2010-11 academic year).

In reality, then, the “balanced” budget for the 2013-2014 academic year based on revenues of $12,411,000, now looks like a $2,500,000 budget shortfall in 2014-15 and a $3,200,000 budget shortfall for the 2015-16 academic year, absent significant additional budget cuts or new revenue streams (with most of the “low hanging fruit” in terms of budget cuts already “picked”). 
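The arithmetic behind these revenue projections is easy to verify.  Below is a minimal sketch of the hypothetical school's JD tuition revenue, using only the class sizes and net-tuition assumptions stated above:

```python
# Hypothetical private law school from the example above.
# Net tuition assumptions: $32,500 for the cohort that entered in fall 2011;
# $25,500 for cohorts entering in fall 2012 and later.

def jd_revenue(cohorts):
    """Total JD tuition revenue; cohorts is a list of (class_size, net_tuition)."""
    return sum(size * tuition for size, tuition in cohorts)

# 2013-14: 125 first-years, 145 second-years, 170 third-years
rev_2013_14 = jd_revenue([(125, 25_500), (145, 25_500), (170, 32_500)])

# 2014-15 and 2015-16: every cohort now pays $25,500 net
rev_2014_15 = jd_revenue([(120, 25_500), (125, 25_500), (145, 25_500)])
rev_2015_16 = jd_revenue([(115, 25_500), (120, 25_500), (125, 25_500)])

print(f"{rev_2013_14:,}")  # 12,410,000 (the $12,411,000 in the text sums the rounded cohort figures)
print(f"{rev_2014_15:,}")  # 9,945,000
print(f"{rev_2015_16:,}")  # 9,180,000
print(f"{rev_2013_14 - rev_2014_15:,}")  # ~$2.5M shortfall against the 2013-14 budget
```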

While you may be able to make some extraordinary draws on unrestricted endowment reserves to cover some of the shortfall (assuming the law school has some endowment of its own), and may be creative in pursuing new sources of revenue (a certificate program or a Master of Laws), even if you come up with an extra $400,000 annually in extraordinary draws on endowment and an extra $400,000 annually in non-JD revenue, you still are looking at losses of at least $1,700,000 in 2014-15 and at least $2,400,000 in 2015-16, absent further budget cuts.  Even with another round of early retirement offers to tenured faculty and/or staff (assuming some still qualify for early retirement), or the termination of untenured faculty and/or staff, the budget shortfall may well remain in the $1,000,000 to $1,700,000 range for this year and next (with similar projections for the ensuing years).  This means the law school may need subsidies from the university with which it is affiliated, or may need to make even more draconian cuts than it has contemplated to date.  (For indications that these estimates bear some relation to reality, see the recent stories about budget issues at Albany, Minnesota and UNLV.)

Difficult Conversations -- Difficult Decisions

This situation will make for some interesting conversations between you as the Dean of the law school and the Provost and President of the university.  As noted above in the discussion of dental schools, the provost and president of a university with a law school likely will be asking:  How “mission critical” is the law school to the university when the law school has transformed from a “cash cow” into a “money pit” and when reasonable projections suggest it may continue to be a money pit for the next few years?  How "mission critical" is the law school when its entering class profile is significantly weaker than it was just a few years ago, particularly if that weaker profile begins to translate into lower bar passage rates and even less robust employment outcomes?   How “mission critical” is the law school to the university if its faculty and alumni seem resistant to change and if the law school faculty and administration are somewhat disconnected from their colleagues in other schools and departments on campus?

Some universities are going to have difficult decisions to make (as may the Boards of Trustees of some of the independent law schools).  As of 1985, no dental schools had closed, but by the late 1980s and early 1990s, roughly ten percent of the dental schools were closed in response to significant declines in the number and quality of applicants and the corresponding financial pressures.  When faced with having to invest significantly to keep dental schools open, several universities decided that dental schools no longer were “mission critical” aspects of the university. 

I do not believe law schools should view themselves as so exceptional that they will have more immunity to these market forces than dental schools did in the 1980s.  I do not know whether ten percent of law schools will close, but just as some universities decided dental schools were no longer “mission critical” to the university, it is not only very possible, but perhaps even likely, that some universities now will decide that law schools that may require subsidies of $1 million or $2 million or more for a number of years are no longer “mission critical” to the university. 

(I am grateful to Bernie Burk and Derek Muller for their helpful comments on earlier drafts of this blog posting.)

 

October 20, 2014 in Cross industry comparisons, Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (5)

Tuesday, October 7, 2014

Does Cooperative Placement Accelerate Law Student Professional Development?

The title of an earlier essay posed a threshold question for legal ed reform: "If We Make Legal Education More Experiential, Would it Really Matter?" (Legal Whiteboard, Feb 2014) (PDF). I answered "yes" but admitted it was only my best guess.  Thus, to be more rigorous, I outlined the conditions necessary to prove the concept.

The essay below is a companion to the first essay.  It is a case study on how one type and brand of experiential education -- cooperative placements at Northeastern Law -- appears to accelerate the professional development of its law students. The outcome criteria are the three apprenticeships of Educating Lawyers (2007) (aka The Carnegie Report) -- cognitive skills, practice skills, and professional identity.

The better outcomes flow from Northeastern's immersive, iterative, and integrative approach. First, students are immersed in full-time co-ops that last a standard 11 weeks. Second, students move through four iterations of co-ops interspersed with four quarters of upper-level classes. Third, this experiential approach is integrated into the Law School's value system -- i.e., the experiential component is perceived as central rather than marginal to the School's educational mission.

Northeastern's co-op model asks more of faculty and students, so it may be hard to replicate. Yet there is evidence that such an approach does in fact accelerate professional development in ways that ought to please law school critics and reformers. The benefits may well be worth the costs.

[PDF version at JD Supra]

[The text below was originally published as the Northeastern Law Outcomes Assessment Project (OAP) Research Bulletin No. 3]

Immersive, Iterative and Integrative:
Does Cooperative Placement Accelerate Law Student  Professional Development?

A steep decline in the job prospects for entry-level lawyers has been followed by a sharp drop in law school applications. Media stories criticize traditional legal education for being too expensive while producing graduates unprepared for practice. Throughout the country, legal educators and administrators at law schools are trying to formulate an effective response.

A common thread running through many new law school initiatives is greater emphasis on experiential education. Fundamentally, experiential education is learning by doing, typically by assuming the role of the lawyer in an in-class simulation, law school clinic, externship or cooperative placement. As law schools seek to add hands-on opportunities to their curricular offerings, empirical evidence on experiential education’s impact on law student professional development becomes invaluable.

Northeastern University School of Law’s Outcomes Assessment Project (OAP) is an evidence-based approach to understanding experiential learning in the law school curriculum. A focal point of the OAP is Northeastern’s Cooperative Legal Education Program, an integral part of the school’s curriculum since the late 1960s. After completing a mostly traditional first year of law school, Northeastern students enter a quarter system in which 11-week cooperative placements alternate with 11-week upper-level courses. Through the four co-op placements during the 2L and 3L years, every Northeastern student gains the functional equivalent of nearly one year of full-time legal experience, typically across a diverse array of practice areas.

The Learning Theory of Cooperative Placement

Northeastern’s Cooperative Legal Education Program is based on a learning theory with three interconnected elements: immersion, iteration and integration.

  • Immersion: Immersion in active legal work in a real-world setting enables students to feel the weight and responsibility of representing real-world clients and exercising professional judgment.
  • Iteration: Iterative movement between the classroom and co-op placements provides students with concrete opportunities to connect theory with practice and understand the role of reflection and adjustment in order to improve one’s skill and judgment as a lawyer.
  • Integration: Integrating experiential learning into the law school curriculum signals its high value to the law school mission — when 50 percent of the upper-level activities involve learning by doing, practice skills are on par with doctrinal learning.

The purpose of the OAP Research Bulletin No. 3 is to use preliminary project data to explore whether the immersion-iteration-integration approach to legal education has the effect of accelerating the professional development of law students.

Three Effects of Co-op Placements

The findings in Research Bulletin No. 3 are based on surveys and focus groups conducted with 2L and 3L Northeastern law students and a small number of Northeastern law graduates, who served as facilitators. In our conversations with these students and alumni, we identified three ways that co-op is impacting the professional development of students.


October 7, 2014 in Data on legal education, Important research, Scholarship on legal education | Permalink | Comments (0)

Thursday, September 4, 2014

Artificial Intelligence and the Law

Plexus, a NewLaw law firm based in Australia, has just released a new legal product that purports to apply artificial intelligence to a relatively common, discrete legal issue -- determining whether a proposed trade promotion (advertisement, in US parlance) complies with applicable law.

In the video below, Plexus Managing Partner Andrew Mellett (who has an MBA, not a law degree) observes that this type of legal work would ordinarily take four to six weeks to complete and cost several thousand dollars.  Mellett claims that the Plexus product can provide "a legal solution in 10 minutes" at 20% to 30% of the cost of the traditional consultative method -- no lawyer required, although Plexus lawyers were the indispensable architects of the underlying code.

From the video, it is unclear whether the innovation is an expert system -- akin to what Neota Logic or KM Standards are creating -- or artificial intelligence (AI) in the spirit of the machine learning used in some of the best predictive coding algorithms or IBM's Watson applied to legal problems.   Back when Richard Susskind published his PhD dissertation in 1987, Expert Systems in Law, an expert system was viewed as artificial intelligence -- there was no terminology to speak of because the application of technology to law was embryonic.  Now we are well past birth, as dozens of companies in the legal industry are in the toolmaking business, some living on venture or angel funding and others turning a handsome profit.

My best guess is that Plexus's new innovation is an expert system.  But frankly, the distinction does not matter very much because both expert systems and AI as applied to law are entering early toddler stage.   Of course, that suggests that those of us now working in the legal field will soon be grappling with the growth spurt of legal tech adolescence.  For law and technology, it's Detroit circa 1905.  

September 4, 2014 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, New and Noteworthy, Structural change, Video interviews | Permalink | Comments (2)

Monday, July 28, 2014

Conditional Scholarship Retention Update for the 2012-2013 Academic Year

In comparing the conditional scholarship universe between the 2011-12 academic year and the 2012-13 academic year (with a brief look at 2013-14) there are a handful of things worth noting.

First, as shown in Table 1, the number of law schools with conditional scholarships declined between 2011-12 and 2012-13 from 144 law schools to 136 law schools, and declined again for the 2013-14 academic year to 128 law schools.  The number of law schools that do not have conditional scholarships grew from 49 in 2011-12 to 58 in 2012-13 to 66 in 2013-14.  In addition, the number of schools with just one-year scholarships declined from five in 2011-12 to four in 2012-13, where it remained for 2013-14.

Table 1:  Changes in Number of Law Schools with Conditional Scholarship Programs

| Category | 2011-12 | 2012-13 | 2013-14 (indications) |
|---|---|---|---|
| Law Schools with Conditional Scholarship Programs | 144 | 136 | 128 |
| Law Schools with One-Year Scholarships | 5 | 4 | 4 |
| Law Schools with Scholarships that are not Conditional Scholarships | 49 | 58 | 66 |

Second, as shown in Table 2, the number of students receiving conditional scholarships in 2012-13 declined slightly from 2011-12, from 12786 to 12470, but the percentage of first-years with conditional scholarships actually increased from 27.3% to 29.2% (given the smaller number of first-years in 2012-13 compared to 2011-12).  That said, the number of students whose scholarships were reduced or eliminated declined from 4359 to 3712, meaning that the percentage of first-years whose scholarships were reduced or eliminated dropped from 9.3% to 8.7%.

Table 2: Overall Comparisons Between 2011-12 and 2012-13

| Category | 2011-12 | 2012-13 |
|---|---|---|
| First-years* | 46,778 | 42,769 |
| First-years with Conditional Scholarships** | 12,786 (27.3% of first-years) | 12,470 (29.2% of first-years) |
| First-years whose conditional scholarships were reduced or eliminated** | 4,359 (9.3% of first-years) | 3,712 (8.7% of first-years) |
| Average Renewal Rate (across law schools) | 69% | 71% |
| Overall Renewal Rate Among Scholarship Recipients | 65.9% | 70.2% |

*Drawn from first-year enrollment at the 198 law schools included in this analysis (excluding the law schools in Puerto Rico and treating Widener as one law school for these purposes) based on information published in the Standard 509 reports.
** Based on information published in the mandated Conditional Scholarship Retention charts by each law school with a conditional scholarship program.
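Table 2's two renewal-rate rows are computed differently: the "average" row averages each school's own renewal rate (the per-school figures are not reproduced here), while the "overall" rate follows directly from the student counts in the table.  A quick sketch of the latter:

```python
# Overall renewal rate among conditional scholarship recipients (Table 2):
# (recipients - reduced or eliminated) / recipients

def overall_renewal_rate(recipients, reduced_or_eliminated):
    return (recipients - reduced_or_eliminated) / recipients

print(f"{overall_renewal_rate(12786, 4359):.1%}")  # 2011-12: 65.9%
print(f"{overall_renewal_rate(12470, 3712):.1%}")  # 2012-13: 70.2%
```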

Third, the distribution of conditional scholarship programs across tiers of law schools is even more pronounced in 2012-13 than it was in 2011-12.  Using the USNews rankings from March 2014, only 16 law schools ranked in the top 50 had conditional scholarship programs in 2012-13, and eight of those 16 had a renewal rate of 97% or higher.  Three of these law schools also eliminated their conditional scholarship programs as of the fall 2013 entering class.  (Moreover, only six in the top 25 had conditional scholarship programs, five of which had a renewal rate of 97% or higher.)

As you move further down the rankings, conditional scholarship programs become more common and manifest lower scholarship retention rates on average.

Of the 53 law schools ranked between 51 and 100 (with three tied at 100), 37 law schools (nearly 70%) had conditional scholarship programs, of which two eliminated their conditional scholarship programs as of fall 2013.  Notably, of the 37 law schools with conditional scholarship programs, eight had a renewal rate of 91% or better (nearly 22%), while seven had a renewal rate of 65% or less (nearly 19%) (with the other 22 (nearly 60%) having renewal rates between 67% and 88%).

For law schools ranked between 104 and 146 (44 law schools in total), 35 law schools (nearly 80%) had conditional scholarship programs, of which three eliminated their conditional scholarship programs as of fall 2013.   Notably, of the 35 law schools with conditional scholarship programs, six had a renewal rate of 93% or better (roughly 17%), while 16 had a renewal rate of 65% or less (nearly 46%) (with the other 13 (roughly 37%) having renewal rates between 67% and 88%).

Finally, among the unranked schools, 47 of 51 had conditional scholarship programs – over 92% – only five of which had a renewal rate of 91% or better (nearly 11%), while 23 had a renewal rate of 65% or less (nearly 49%) (with the other 19 (roughly 40%) with renewal rates between 66% and 88%).

Tables 3 and 4 present comparative data across law schools in different USNews rankings categories.  Table 3 describes the number of law schools with conditional scholarship programs and the distribution of scholarship retention rates among law schools.  Table 4 describes the total number of students within each USNews rankings category along with the number of students on conditional scholarships and the number of students who had their conditional scholarship reduced or eliminated.

Table 3: Scholarship Retention Rates by USNews Ranking Categories

| Category | Top 50 | 51-100 (n=53) | 104-146 (n=44) | Unranked (n=51) |
|---|---|---|---|---|
| Schools with Conditional Scholarship Programs | 16 | 37 | 35 | 47 |
| Retention Rates of 90% or More | 8 | 8 | 6 | 5 |
| Retention Rates of 66%-88% | 4 | 22 | 13 | 19 |
| Retention Rates of 65% or Less | 4 | 7 | 16 | 23 |

Table 4: Number and Percentage of First-Year Students in 2012 by USNews Rankings Categories Having Conditional Scholarships and Having Conditional Scholarships Reduced or Eliminated

|  | Top 50 Law Schools (n=50) | Law Schools Ranked 51-100 (n=53) | Law Schools Ranked 104-146 (n=44) | Law Schools Ranked Alphabetically (n=51) |
|---|---|---|---|---|
| Number (%) of Law Schools with Conditional Scholarship Programs | 16 (32%) | 37 (70%) | 35 (79.5%) | 47 (92%) |
| Total First-Years at These Law Schools | 11,862 | 10,937 | 7,611 | 12,180 |
| Number (%) of First-Years with Conditional Scholarships | 1,587 (13.4%) | 3,192 (29.2%) | 3,247 (42.7%) | 4,444 (36.5%) |
| Number (%) of Conditional Scholarship Recipients Whose Scholarships were Reduced or Eliminated | 154 (9.7% of recipients; 1.3% of first-years) | 734 (23% of recipients; 6.7% of first-years) | 1,124 (34.6% of recipients; 14.8% of first-years) | 1,700 (38.3% of recipients; 14% of first-years) |
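The percentages in Table 4 all derive from the raw counts in each rankings tier; a short sketch reproduces them from the table's figures:

```python
# Reproduce the Table 4 percentages from the raw counts in each rankings tier.
tiers = {
    "Top 50":   {"first_years": 11_862, "conditional": 1_587, "cut": 154},
    "51-100":   {"first_years": 10_937, "conditional": 3_192, "cut": 734},
    "104-146":  {"first_years": 7_611,  "conditional": 3_247, "cut": 1_124},
    "Unranked": {"first_years": 12_180, "conditional": 4_444, "cut": 1_700},
}

for name, t in tiers.items():
    pct_with_scholarship = t["conditional"] / t["first_years"]   # e.g. 13.4% for Top 50
    pct_cut_of_recipients = t["cut"] / t["conditional"]          # e.g. 9.7% for Top 50
    pct_cut_of_first_years = t["cut"] / t["first_years"]         # e.g. 1.3% for Top 50
    print(f"{name}: {pct_with_scholarship:.1%} on conditional scholarships; "
          f"{pct_cut_of_recipients:.1%} of recipients (and {pct_cut_of_first_years:.1%} "
          f"of all first-years) saw reductions or eliminations")
```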

Overall, as shown in Table 5, the distribution of retention rates across law schools was as follows for the 2012-13 academic year:  18 law schools had retention rates less than 50%, 20 law schools had retention rates between 50% and 59.99%, 25 law schools had retention rates between 60% and 69.99%, 21 law schools had retention rates between 70% and 79.99%, 25 law schools had retention rates between 80% and 89.99%, and 27 law schools had retention rates of 90% or better. 

Table 5 – Number of Law Schools with Conditional Scholarship Renewal Rates in Different Deciles

| Renewal Category | Number of Schools |
|---|---|
| 90% or More | 27 (16 of which were ranked in top 100) |
| 80%-89.9% | 25 (12 of which were ranked in top 100) |
| 70%-79.9% | 21 (10 of which were ranked in top 100) |
| 60%-69.9% | 25 (8 of which were ranked in top 100) |
| 50%-59.9% | 20 (5 of which were ranked in top 100) |
| Less than 50% | 18 (2 of which were ranked in top 100) |

Notably, of the 53 law schools ranked in the top 100 with conditional scholarship programs (16 in the top 50 plus 37 ranked 51-100, per Table 3), only two (roughly four percent) had retention rates of less than 50%, while 16 (roughly 30%) had retention rates of 90% or better.  By contrast, of the 82 (of 95) law schools ranked 104 or lower with conditional scholarship programs, 16 (nearly 20%) had retention rates of less than 50%, while only 11 (roughly 13%) had retention rates of 90% or better.

In sum then, with several schools eliminating their conditional scholarship programs as of fall 2013, less than 50% of the law schools ranked in the top 100 (47 of 103 – nearly 46%) still had conditional scholarship programs, and of those, more than 27% (13 of 47) had retention rates for the 2012-13 academic year of 90% or better while less than 22% (10 of 47) had retention rates of 65% or less.

By contrast, as of fall 2013, more than 80% of the schools ranked below 100 (79 of 95 – roughly 83%) still had conditional scholarship programs, and of those, less than 12% (9 of 79) had retention rates for the 2012-13 academic year of 90% or better and nearly half (39 of 79 – roughly 49%) had retention rates of 65% or less.

July 28, 2014 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Tuesday, May 27, 2014

Another Datapoint for the Laptops Debate

In my inbox this morning was the HBS Daily Stat with the title, "You'll Absorb More if You Take Notes Longhand."  Here is the accompanying explanation:

College students who take notes on laptop computers are more likely to record lecturers’ words verbatim and are thus less likely to mentally absorb what’s being said, according to a series of experiments by Pam A. Mueller of Princeton and Daniel M. Oppenheimer of UCLA. In one study, laptop-using students recorded 65% more of lectures verbatim than did those who used longhand; a half-hour later, the laptop users performed significantly worse on conceptual questions such as “How do Japan and Sweden differ in their approaches to equality within their societies?” Longhand note takers learn by reframing lecturers’ ideas in their own words, the researchers say.

SOURCE: The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking (emphasis in the original)

Wouldn't the same analysis almost surely apply to law students?  Experience tells me that many law students would argue that they are in the minority who learn better through computer transcription.  But what if, given a choice, over half decide to use laptops?  It would be likely that many, if not most, would be making the wrong tradeoff.

Data rarely changes hearts and minds.  As a result, there is likely a gap between maximum learning/knowledge worker productivity and what we are able to accomplish in an educational or workplace setting.  Why?  People like what they are used to and rationalize why the data does not apply to them.  There is a solution to this dilemma, I suspect.  We just have not found it yet.

May 27, 2014 in Blog posts worth reading, Cross industry comparisons, Data on legal education, Fun and Learning in the classroom, New and Noteworthy | Permalink | Comments (2)

Monday, March 17, 2014

A Counterpoint to "The most robust legal market that ever existed in this country"

There is a line in Professor Reich-Graefe's recent essay, Keep Calm and Carry On, 27 Geo. J. Legal Ethics 55 (2014), that is attracting a lot of interest among lawyers, law students, and legal academics: 

[R]ecent law school graduates and current and future law students are standing at the threshold of the most robust legal market that ever existed in this country—a legal market which will grow, exist for, and coincide with, their entire professional career.

This hopeful prediction is based on various trendlines, such as impending lawyer retirements, a massive intergenerational transfer of wealth that will take place over the coming decades, continued population growth, and the growing complexity of law and legal regulation.

Although I am bullish on future growth and dynamism in the legal industry, and I don't dispute the accuracy or relevance of any of the trendlines cited by Reich-Graefe, I think his primary prescriptive advice -- in essence, our problems will be cured with the passage of time -- is naive and potentially dangerous to those who follow it.

The Artisan Lawyer Cannot Keep Up

The primary defect in Reich-Graefe's analysis is that it is a one-sided argument that stacks up all impending positive trendlines without taking into account the substantial evidence that the artisan model of lawyering -- one-to-one consultative legal services that are tailored to the needs of individual clients -- is breaking down as a viable service delivery model.  

Lawyers serve two principal constituencies--individuals and organizations.  This is the Heinz-Laumann "Two-Hemisphere" theory that emerged from the Chicago Lawyers I and II studies.  See Heinz et al, Urban Lawyers (2005). The breakdown in the artisan model can be observed in both hemispheres.

  1. People.  Public defenders are understaffed, legal aid is overwhelmed, and courts are glutted with pro se litigants.  Remarkably, at the same time, record numbers of law school graduates are either unemployed or underemployed.  Why?  Because most poor and middle-class Americans cannot afford to buy several hours of a lawyer's time to solve their legal problems.  
  2. Organizations.  The most affluent organizations, multinational corporations, are also balking at the price of legal services.  As a result, foreign labor, technology, process, or some combination thereof has become a replacement for relatively expensive and unskilled junior lawyers.

The primary driver of this structural shift is the relentless growth in legal complexity.  This increase in complexity arises from many sources, including globalization, technology, digitally stored information, and the sheer size and scope of multinational companies. 

But here is a crucial point:  the complexity itself is not new, only its relative magnitude.  A century ago, as the modern industrial and administrative state was beginning to take shape, lawyers responded by organizing themselves into law firms.  The advent of law firms enabled lawyers to specialize and thus more cost-effectively tackle the more complex legal problems. Further, the diffusion of the partner-associate training model (sometimes referred to as the Cravath system) enabled firms to create more specialized human capital, which put them in an ideal position to benefit from the massive surge in demand for legal services that occurred throughout the 20th century.  See Henderson, Three Generations of Lawyers: Generalists, Specialists, Project Managers, 70 Maryland L Rev 373 (2011). 

The legal industry is at the point where it is no longer cost effective to deal with this growing complexity with ever larger armies of artisan-trained lawyers.  The key phrase here is cost effective.  Law firms are ready and willing to do the work.  But increasingly, clients are looking for credible substitutes on both the cost and quality fronts. Think car versus carriage, furnace versus chimney sweep, municipal water system versus a well.  A similar paradigm shift is now gaining momentum in law.

The New Legal Economy

I have generated the graph below to show the relationship between economic growth, the engine of the U.S. and world economies, and the legal complexity that accompanies it.

[Chart: growth in legal complexity relative to overall economic growth over time]
This chart can be broken down into three phases.

1. Rise of the law firm. From the early twentieth century to the early 1980s, the increasing complexity of law could be capably handled by additional law firm growth and specialization. Hire more junior lawyers, promote the best ones to partner, lease more office space, repeat.  The complexity line has a clear bend in it.  But for most lawyers, the change was very gradual and felt like a simple linear progression.  Hence, there was little urgency about the need for new methods of production.

2. Higher law firm profits. Over the last few decades, the complexity of law outpaced overall economic growth.  However, because the change was gradual, law firms, particularly those with brand names, enjoyed enough market power to perennially increase billing rates without significantly improving service offerings.  Corporate clients paid because the economic benefits of the legal work outweighed the higher costs.  Lower- and middle-class individuals, in contrast, bought fewer legal services because they could not afford them.  But as a profession, we barely noticed, primarily because the corporate market was booming. See Henderson, Letting Go of Old Ideas, 114 Mich L Rev 101 (2014).

3. Search for substitutes.  Law firms are feeling discomfort these days because the old formula -- hire, promote, lease more space, increase rates, repeat -- is no longer working.  This is because clients are increasingly open to alternative methods of solving legal problems, and the higher profits of the last few decades have attracted new entrants.  These alternatives are some combination of better, faster, and cheaper.  But what they all have in common is a greater reliance on technology, process, and data -- modes of problem-solving that are not within the training or tradition of lawyers or legal educators.  So the way forward is profoundly interdisciplinary, requiring collaboration with information technologists, systems engineers, project managers, data analysts, and experts in marketing and finance.

Why is this framework potentially difficult for many lawyers, law firms, and legal educators to accept?  Probably because it requires us to cope with uncertainties related to income and status.  This reluctance to accept an unpleasant message creates an appetite for analyses that say "keep calm and carry on."  This is arguably good advice to the British citizenry headed into war (the origin of the saying) but bad advice to members of a legal guild who need to adapt to changing economic conditions.

There is a tremendous silver lining in this analysis.  Law is a profoundly critical component of the globalized, interconnected, and highly regulated world we are entering.  Lawyers, law firms, and legal educators who adapt to these changing conditions are going to be in high demand and will likely prosper economically.  Further, at an institutional level, there is also the potential for new hierarchies to emerge that will rival and eventually supplant the old guard.

Examples

One of the virtues of lawyers is that we demand examples before we believe something to be true.  This skepticism has benefited many a client.  A good example of the emerging legal economy is the Available Positions webpage for kCura, a software company that focuses exclusively on the legal industry.

The current legal job market is terrible, right?  Perhaps for entry-level artisan-trained lawyers.  But at kCura, business is booming.  Founded in 2001, the company now employs more than 370 workers and has openings for over 40 full-time professional positions, the majority of them in Chicago at the company's LaSalle Street headquarters.  Very few of these jobs require a law degree -- yet the company's output enables lawyers to do their work faster and more accurately.

What are the jobs?

  • API Technical Writer [API = Application Programming Interface]
  • Big Data Architect - Software Engineering
  • Business Analyst
  • Enterprise Account Manager
  • Group Product Manager
  • Litigation Support Advice Analyst
  • Manager - Software Engineering
  • Marketing Associate
  • Marketing Specialist -- Communications
  • Marketing Specialist -- Corporate Communications and Social Media
  • Product Manager -- Software and Applications Development
  • QA Software Engineer -- Performance [QA = Quality Assurance]
  • Scrum Team Coordinator [Scrum is a team-based software development methodology]
  • Senior SalesForce Administrator 
  • Software Engineer (one in Chicago, another in Portland)
  • Software Engineer (Front-End Developer) [Front-End = what the client sees]
  • Software Engineer in Test [Test = finds and fixes software bugs]
  • Technical Architect
  • Technical Architect - Security
  • VP of Product Development and Engineering

kCura operates exclusively within the legal industry, yet it has all the hallmarks of a great technology company. In the last few years it has racked up numerous awards based on the quality of its products, its stellar growth rate, and the workplace quality of life enjoyed by its employees.


That is just what is happening at kCura.  There are many other companies positioning themselves to take advantage of the growth opportunities in the legal sector, though none of them bears any resemblance to traditional law firms or legal employers.

In early February, I attended a meeting in New York City of LexRedux, a group of entrepreneurs working in the legal start-up space.  In a 2008 essay entitled "Legal Barriers to Innovation," Professor Gillian Hadfield queried, "Where are the 'garage guys' in law?"  Well, we now know they exist.  At LexRedux, roughly 100 people working in the legal tech start-up space were jammed into a large open room in SoHo as a small group of angel investors and venture capitalists fielded questions on a wide range of topics related to operations, sales, and venture funding.

According to AngelList, there are, as of this writing, 434 companies identified as legal start-ups that have received outside capital.  According to LexRedux founder Josh Kubicki, the legal sector took in $458M in start-up funding in 2013, up from essentially zero in 2008.  See Kubicki, 2013 Was a Big Year for Legal Startups; 2014 Could Be Bigger, Tech Cocktail, Feb 14, 2014.

The legal tech sector is starting to take shape.  Why?  Because the imperfections and inefficiencies inherent in the artisan model create a tremendous economic opportunity for new entrants.  For a long time, many commentators believed that this type of entrepreneurial ferment would be impossible so long as Rule 5.4 was in place.  But in recent years, it has become crystal clear that when it comes to organizational clients where the decisionmaker for the buyer is a licensed lawyer (likely accounting for over half of the U.S. legal economy), everything up until the courthouse door or the client counseling moment can be disaggregated into a legal input or legal product that can be provided by entities owned and controlled by nonlawyers.  See Henderson, Is Axiom the Bellwether of Disruption in the Legal Industry? Legal Whiteboard, Nov 13, 2013.

The Legal Ecosystem of the Future

In his most recent book, Tomorrow's Lawyers, Richard Susskind describes a dynamic legal economy that bears little resemblance to the legal economy of the past 200 years.  In years past, it was easier to be skeptical of Susskind because his predictions seemed so, well, futuristic and abstract.  But anyone paying close attention can see evidence of a new legal ecosystem beginning to take shape that very much fits the Susskind model.

Susskind's core framework is the movement of legal work along a five-part continuum, from bespoke to standardized to systematized to productized to commoditized.  Lawyers are most comfortable in the bespoke realm because it reflects our training and makes us indispensable to a resolution.  Yet the basic forces of capitalism pull the legal industry toward the commoditized end of the spectrum because the bespoke method of production is incapable of keeping up with the needs of a complex, interconnected, and highly regulated global economy.

According to Susskind, the sweet spot on the continuum is between systematized and productized, as this enables the legal solution provider to "make money while you sleep."  The cost of remaining in this position (that is, to avoid commoditization) is continuous innovation.  Suffice it to say, lawyers are unlikely to make the cut if they choose to hunker down in the artisan guild and eschew collaboration with other disciplines.

Below is a chart I have generated that attempts to summarize the new legal ecosystem now taking shape.  The y-axis is the Heinz-Laumann two-hemisphere framework.  The x-axis is Susskind's five-part change continuum.

[Chart: The New Legal Ecosystem]

Those of us who are trained as lawyers and have worked in law firms will have mental frames of reference that are on the left side of the green zone.  We tend to see things from the perspective of the artisan lawyer.  That is our training and socialization, and many of us have prospered as members of the artisan guild.

Conversely, at the commoditized end of the continuum, businesses organized and financed by nonlawyers have entered the legal industry in order to tap into the portion of the market that can no longer be cost-effectively served by licensed U.S. lawyers.  Yet, like most businesses, they are seeking ways to climb the value chain and grow into higher-margin work.  For example, UnitedLex is one of the leading legal process outsourcers (LPOs).  Although UnitedLex maintains a substantial workforce in India, it is investing heavily in process, data analytics, and U.S. onshore facilities.  Why?  Because it wants to differentiate the company based on quality and overall value to clients, thus staving off competition from law firms and other LPOs.

In the green zone are several new clusters of companies:

  • NewLaw.  These are non-law firm legal service organizations that provide high-end services to highly sophisticated corporations.  They also rely heavily on process, technology, and data.  Their offerings are sometimes called "managed services." Novus Law, Axiom, Elevate, and Radiant Law are some of the leading companies in this space. 
  • TechLaw.  These companies would not be confused with law firms. They are primarily tool makers.  Their tools facilitate better, faster, or cheaper legal output.  kCura, mentioned above, works primarily in the e-discovery space.  Lex Machina provides analytic tools that inform the strategy and valuation of IP litigation cases.  KM Standards, Neota Logic, and Exemplify provide tools and platforms that facilitate transactional practice.  In the future, these companies may open the door to the standardization of a wide array of commercial transactions.  And standardization drives down transaction costs and increases legal certainty -- all good from the client's perspective.
  • PeopleLaw.  These companies are using innovative business models to tap into the latent people hemisphere.  Modria is a venture capital-financed online dispute resolution company with DNA that traces back to PayPal and the Harvard Negotiations Workshop.  See Would You Bet on the Future of Online Dispute Resolution (ODR)?  Legal Whiteboard, Oct 20, 2013.  LegalForce is already an online tour de force in trademarks -- a service virtually every small business needs.  The company is attempting to translate its brand loyalty in trademarks into a new consumer-friendly storefront experience.  Its first store is on University Avenue in the heart of Palo Alto.  LegalForce wants to be the virtual and physical portal that start-up entrepreneurs turn to when looking for legal advice.

Conclusion

When I write about the changes occurring in the legal marketplace, I worry that the substance and methodology of U.S. legal education provide an excellent education for a legal world that is gradually fading away, and very little preparation for the highly interdisciplinary legal world that is coming into being.

Legal educators are fiduciaries to our students and institutions. It is our job to worry about them and for them and act accordingly.  Surely, the minimum acceptable response to the facts at hand is unease and a willingness to engage in deliberation and planning.  Although I agree we need to stay calm, I disagree that we need to carry on.  The great law schools of the 21st century will be those that adapt and change to keep pace with the legal needs of the citizenry and broader society.  And that task has barely begun.

[PDF version]

March 17, 2014 in Blog posts worth reading, Current events, Data on legal education, Data on the profession, Innovations in law, Innovations in legal education, New and Noteworthy, Scholarship on legal education, Scholarship on the legal profession, Structural change | Permalink | Comments (16)

Sunday, March 2, 2014

Thoughts on Fall 2013 Enrollment and Profile Data Among Law Schools

DECLINING ENROLLMENT – Between fall 2012 and fall 2013, the 199 law schools in the 48 contiguous states and Hawaii (excluding the Puerto Rican schools) accredited by the ABA’s Section of Legal Education and Admissions to the Bar experienced the following first-year enrollment changes:

25 schools had a decline in first-year enrollment of 25% or more,

34 schools had a decline in first-year enrollment of 15%-24.99%,

44 schools had a decline in first-year enrollment of 5% to 14.99%,

62 schools had “flat” first-year enrollment of -4.99% to 4.99%,

19 schools had an increase in first-year enrollment of 5% to 14.99%, and

15 schools had an increase in first-year enrollment of 15% or more.

Overall, more than half (103) had a decrease in first-year enrollment of at least 5%, while roughly 17% (34) had an increase in first-year enrollment of at least 5%.

Across these 199 schools, first-year enrollment declined from 42,590 to 39,109, a decrease of 8.2%.  The average decline in first-year enrollment across U.S. News “tiers” of law schools was 2.6% among top 50 schools, 8.2% among schools ranked 51-99, 7.7% among schools ranked 100-144 and 7.9% among schools ranked alphabetically.

Between fall 2010 and fall 2013, the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (excluding Belmont, LaVerne, California-Irvine, and Massachusetts-Dartmouth) experienced the following first-year enrollment changes:

28 schools had a decline in first-year enrollment of 40% or more,

29 schools had a decline in first-year enrollment of 30% to 39.99%,

43 schools had a decline in first-year enrollment of 20% to 29.99%,

43 schools had a decline in first-year enrollment of 10% to 19.99%,

36 schools had a decline in first-year enrollment of 0% to 9.99%,

10 schools had an increase in first-year enrollment of 0.01% to 9.99%, and

6 schools had an increase in first-year enrollment of 10% or more.

Overall, more than half (100) had a decrease in first-year enrollment of at least 20%, while only roughly 8% (16) had any increase in first-year enrollment.

Across these 195 schools, first-year enrollment declined from 50,408 to 38,773, a drop of 23.1%.  The average decline in first-year enrollment across U.S. News “tiers” of law schools was 14.7% among top 50 schools, 22.5% among schools ranked 51-99, 22.8% among schools ranked 100-144, and 26.8% among schools ranked alphabetically.

 

DECLINING PROFILES -- Across the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (thus excluding Belmont, LaVerne, California-Irvine, and Massachusetts-Dartmouth), the entering first-year class average LSAT profile fell one point at all three measures between 2012 and 2013, from 159.6/157/153.5 to 158.6/156/152.5.  It fell roughly two points at all three measures between 2010 and 2013, from 160.5/158.1/155.2 to 158.6/156/152.5.

The average decline in median LSAT scores between 2012 and 2013 across U.S. News “tiers” of law schools was 0.98 among top 50 schools, 1.18 among schools ranked 51-99, 0.72 among schools ranked 100-144, and 1.13 among schools ranked alphabetically.

Notably, 133 law schools saw a decline in their median LSAT between 2012 and 2013, with 80 down one point, 38 down two points, 12 down three points, one down four points, one down five points and one down six points, while 54 law schools were flat and 7 saw an increase in their median LSAT. 

In terms of schools experiencing “larger” declines in median LSAT scores between 2012 and 2013, five schools in the top 50 saw a three point decline in their median LSAT, five schools ranked 51-99 saw at least a three point decline (of which one was down four points), three schools ranked 100-144 saw a three point decline, and two schools ranked alphabetically saw large declines – one of five points and one of six points.

The average decline in median LSAT scores between 2010 and 2013 across U.S. News “tiers” of law schools was 1.54 among top 50 schools, 2.27 among schools ranked 51-99, 2.11 among schools ranked 100-144, and 2.79 among schools ranked alphabetically.  If one were to unpack the top 50 schools a little more, however, one would discover that the top 20 schools saw an average decline in their median LSAT of 1.05 between 2010 and 2013, while the bottom 15 schools in the top 50 saw an average decline in their median LSAT of 2.53.

In terms of schools experiencing “larger” declines in median LSAT scores between 2010 and 2013, three schools in the top 50 have seen declines of four or more points, nine schools ranked 51-99 have seen declines of four or more points, 11 schools ranked 100-144 have seen declines of four or more points and 17 schools ranked alphabetically have seen declines of four or more points. 

Comparing the 2012-2013 data with the 2010-2013 data, one sees that lower-ranked schools have faced a more sustained challenge in managing profile over the last few years, while schools ranked in the top 50 or top 100 had managed profile fairly well until fall 2013, when the decreased number of high-LSAT applicants really began to affect the LSAT profiles of highly ranked schools.

The overall decline in the LSAT profile of first-year students also can be demonstrated with two other reference points.  In 2010, there were 74 law schools with a median LSAT of 160; in 2013, that number had fallen to 56.  At the other end of the spectrum, in 2010 there were only 9 schools with a median LSAT of less than 150 and only one with a median LSAT of 145.  In 2013, the number of law schools with a median LSAT of less than 150 had more than tripled to 32, while the number of law schools with a median LSAT of 145 or less had grown to 9 (with the low now being 143).

 

CONCLUDING THOUGHTS – Over the last three years, few schools have had the luxury of being able to hold enrollment (or come close to holding enrollment) and being able to hold profile (or come close to holding profile).  Many schools have found themselves in a “pick your poison” scenario.  A number of schools have picked profile and made an effort to hold profile or come close to holding profile by absorbing significant declines in first-year enrollment (and the corresponding loss of revenue).  By contrast, a number of schools have picked enrollment and made an effort to hold enrollment or come close to holding enrollment (and maintaining revenue) but at the expense of absorbing a significant decline in LSAT profile.  Some schools, however, haven’t even been able to pick their poison.  For these schools, the last three years have presented something of a double whammy, as the schools have experienced both significant declines in first-year enrollment (and the corresponding loss of revenue) and significant declines in profile. 

March 2, 2014 in Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (0)

Saturday, March 1, 2014

Is the Employment Market for Law Graduates Going to be Improving?

Last fall, while making a presentation at the Midwest Association of Pre-Law Advisors Conference in St. Louis, I had the opportunity to respond to the question that is the title of this blog posting. 

Is the employment market for law graduates going to be improving?  My answer was, and is, almost certainly yes, although perhaps not immediately.

I write this to offer my perspective on the employment market for law graduates in the coming years.  A number of people have written on this topic in recent weeks and months.  Bernie Burk has a very thoughtful piece analyzing the changing job market over the last three decades.  In his concluding thoughts, he suggests that the decline in the number of law students will mean that the job market will be improving.  Paula Young, Debby Merritt, Matt Leichter, and The National Jurist also have weighed in on this issue, with some disagreement about how to understand the “market” for law graduates in the coming years.  Whether and how to include JD Advantage jobs in the analysis is frequently contested.  Bernie Burk does a thorough job analyzing the challenges of assessing whether JD Advantage jobs should be included within his definition of “law jobs” – “placements for which a law degree is typically a necessary or extremely valuable substantive preparation; or put slightly differently, jobs that a law degree typically makes a truly substantial and significant difference in obtaining or performing.”

To avoid some of these definitional challenges, this post will focus solely on the market for full-time, long-term Bar Passage Required jobs. Initially, it will analyze those jobs in relation to all graduates; then it will look more specifically at the percentage of graduates who are likely to be eligible for Bar Passage Required jobs for whom full-time, long-term Bar Passage Required jobs likely will be available, a point on which few others appear to have focused up until now.

Class of 2013 – Little if Any Good News is Likely

In the short term, for the Class of 2013, for which job results will be reported in the coming weeks, it would not be at all surprising to see little, if any, improvement in the employment results in terms of the percentage of graduates finding jobs classified as full-time, long-term Bar Passage Required jobs.

According to NALP’s data, there were 29,978 full-time, long-term Bar Passage Required jobs for 2007 graduates, a number which fell to 24,902 for 2011 graduates, and then rebounded to 26,876 for 2012 graduates, an increase of 1,974.   According to the ABA’s Employment Outcomes data, between 2011 and 2012, the number of full-time, long-term Bar Passage Required jobs grew from 24,149 to 26,066, an increase of 1,917.  (For this blog posting, I am not going to try to reconcile the slight differences in data between NALP and the ABA’s Employment Outcomes data.)

Unfortunately, however, according to the ABA's Employment Outcomes data, this growth in full-time, long-term Bar Passage Required jobs between 2011 and 2012 corresponded with a growth in the number of law graduates, from 43,979 to 46,364, an increase of 2,385.  Thus, even though the number of full-time, long-term Bar Passage Required jobs grew by 7.9%, the percentage of graduates in full-time, long-term Bar Passage Required jobs grew only slightly, from 54.9% to 56.2%.

Between 2012 and 2013 the number of full-time, long-term Bar Passage Required jobs may increase again, but the number of graduates also will be increasing, likely from 46,364 to roughly 47,250.  (For the last few years, the number of law school graduates has averaged roughly 90% of the number of first-year students who started law school three years previously.  With 52,500 first-year students in Fall 2010, there likely were roughly 47,250 May 2013 graduates on whom employment will be reported in the coming weeks.) 

If the number of full-time, long-term Bar Passage Required jobs for the 2013 graduates reported in the ABA Employment Outcomes data grows by roughly 1,000 to 27,000, an increase of nearly 4%, the percentage of graduates with such jobs would increase only slightly to 57.1%.  If the number of full-time, long-term Bar Passage Required jobs for the 2013 graduates grows only slightly, by roughly 500, to 26,500 (an increase of less than 2%), the percentage of graduates with such jobs will drop slightly, to 56.1%.  If the number of Bar Passage Required jobs is flat, at 26,000, the percentage of graduates with such jobs will drop a little more to 55%.  Between 2011 and 2013, the market might see graduates finding roughly 2,000 to 2,500 new full-time, long-term Bar Passage Required jobs, and yet still see only 55% to 57% of graduates in such jobs because of the growth in the number of graduates between 2011 and 2013.
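The arithmetic behind these three scenarios is simple division; a minimal sketch using the estimated 47,250 graduates from above:

```python
# Class of 2013 scenarios: ~47,250 estimated graduates, with three possible
# counts of FT/LT Bar Passage Required jobs (27,000 / 26,500 / 26,000).
grads_2013 = 47250
for jobs in (27000, 26500, 26000):
    pct = 100 * jobs / grads_2013
    print(f"{jobs:,} jobs -> {pct:.1f}% of graduates")  # 57.1%, 56.1%, 55.0%
```

The three printed percentages match the 57.1%, 56.1%, and 55% figures in the paragraph above.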

Classes of 2014, 2015, 2016, 2017 – An Improving Dynamic

What are the employment prospects for those currently in law school or considering starting law school in the fall of 2014?  They almost certainly will be getting better – not necessarily because there will be more jobs, but because there will be fewer graduates.

Indeed, to make this point, let’s assume that there is actually no further growth in full-time, long-term Bar Passage Required jobs between 2012 and 2017.  Assume the number of such jobs plateaus at 26,000 for graduates of the Class of 2013 and then stays at that level each year through 2017.  What percentage of law graduates over the next four years will have such jobs? 

According to the LSAC, "ABA First-Year Enrollment" has declined steadily from 2010 to the present, from 52,500 in 2010, to 48,700 in Fall 2011, to 44,500 in Fall 2012.  The ABA recently released the Fall 2013 enrollment summary noting that it had fallen to 39,675.   The LSAC's most recent Current Volume Summary, from February 21, 2014,  indicates that applicants to law school are down roughly 11% compared to last year.  Thus, it seems reasonable to project that first-year matriculants will decline again in Fall 2014.  If first-year enrollment falls by 5%, that would give us roughly 37,700 first-years.  If it falls by 10% once again, that would give us roughly 35,700 first-years.

With these estimates for the number of first-years, we can estimate the number of graduates (which, as noted above, has averaged roughly 90% of first-years for the last few years).  Even if the number of full-time, long-term Bar Passage Required jobs does not continue to rebound, but plateaus at 26,000, as the number of graduates declines over the next few years, the percentage of law graduates obtaining a full-time, long-term Bar Passage Required job, as shown in Table 1, will grow to between 77% and 84% by 2017 (depending upon first-year enrollment in fall 2014).
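The projection just described can be sketched in a few lines (assumptions, as stated above: graduates ≈ 90% of first-year enrollment three years earlier, and FT/LT Bar Passage Required jobs held flat at 26,000):

```python
# Graduating class -> first-year enrollment three years earlier.
first_years = {2013: 52500, 2014: 48700, 2015: 44500, 2016: 39675}
JOBS = 26000  # assumed plateau in FT/LT Bar Passage Required jobs

for year, fy in first_years.items():
    grads = round(fy * 0.90)          # graduates ~ 90% of first-years
    pct = round(100 * JOBS / grads)   # share of grads with FT/LT BPR jobs
    print(year, grads, f"{pct}%")
```

For 2013 through 2016 this reproduces the graduate estimates and the 55%, 59%, 65%, and 73% figures shown in Table 1.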

TABLE 1

Analysis of the Estimated Number of Full-Time, Long-Term Bar Passage Required Jobs as a Percentage of the Estimated Number of Law Graduates from 2012-2017

   

Grad. Year                      2012     2013     2014     2015     2016     2017       2017
                                                                             (5% Dec.)  (10% Dec.)

1st Yrs (3 Yrs Prior)          51600    52500    48700    44500    39675    37700*     35700*

Grads (90% of 1st Yrs.)        46364    47250*   43830*   40050*   35708*   33930*     32130*

FT/LT BPR Jobs                 26066    26000*   26000*   26000*   26000*   26000*     26000*

% of Grads in FT/LT BPR Jobs     56%     55%*     59%*     65%*     73%*     77%*       84%*

*Denotes estimated value.

An improvement in the percentage of law school graduates getting full-time, long-term Bar Passage Required jobs, from roughly 55% to between 77% and 84%, is indicative of an improving employment market for law school graduates.  Indeed, according to Bernie Burk’s analysis of the employment market over the last few decades, this rate of employment in full-time, long-term Bar Passage Required jobs would rival or exceed the high water mark for “Law Jobs” of roughly 77% that he identified for the graduates of 2005 to 2007.  (And for his purposes, “Law Jobs” included some JD Advantage jobs.)  Moreover, this assumes no growth in the number of full-time, long-term Bar Passage Required jobs; if there is even modest growth over the next few years, the percentages of grads in these jobs would be even higher than reflected in this chart.

 Full-Time, Long-Term Bar Passage Required Jobs as a Percentage of Those Eligible for Such Positions by Virtue of Having Passed a Bar Exam

Even so, many may look at this and suggest the market remains less than robust, given that perhaps 16%-23% of graduates in this “improved” market in 2017 will not obtain full-time, long-term Bar Passage Required jobs.  While some compare the number of full-time, long-term Bar Passage Required jobs to the number of law school graduates to demonstrate why the employment market for law school graduates remains unsatisfactory, this may not be the most accurate way of thinking about the market, because not all graduates will be eligible for Bar Passage Required jobs.

The National Conference of Bar Examiners indicates that over the last several years, on average, roughly 83% of graduates of ABA-accredited law schools who take a bar exam upon graduation pass it on their first attempt.

To calculate the employment market for law graduates in the coming years who are eligible for full-time, long-term Bar Passage Required jobs, let’s assume that all law graduates actually want a full-time, long-term Bar Passage Required job and therefore take a July bar exam, and let’s assume that 83% of them pass the bar exam on their first attempt.  This should give us the maximum number of graduates eligible for full-time, long-term Bar Passage Required jobs 10 months after graduation (which will be the measuring point starting with the Class of 2014). 

Even if we assume no growth in the number of full-time, long-term Bar Passage Required jobs in the coming years and simply hold the number of such jobs at a constant 26,000, the decreasing number of law graduates will mean an even better employment market for those seeking full-time, long-term Bar Passage Required jobs who are eligible for them by virtue of having passed the bar exam on their first attempt.  The share of such eligible graduates for whom jobs would be available increases from nearly 70% in 2012 and 2013 to nearly 90% by 2016 and over 90% by 2017.
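As a sketch, the eligibility calculation applies the assumed 83% first-time pass rate to the estimated graduate counts (the 2017 figure below uses the 10%-decline scenario):

```python
JOBS = 26000  # assumed plateau in FT/LT Bar Passage Required jobs
grads = {2012: 46364, 2013: 47250, 2014: 43830,
         2015: 40050, 2016: 35708, 2017: 32130}  # 2017: 10%-decline scenario

for year, g in grads.items():
    eligible = round(g * 0.83)            # assumed first-time bar passers
    pct = round(100 * JOBS / eligible)    # jobs per eligible graduate
    print(year, eligible, f"{pct}%")
```

The output tracks the progression from roughly 68% coverage in 2012 to over 90% in 2017.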

TABLE 2

Analysis of the Estimated Number of Full-Time, Long-Term Bar Passage Required Jobs as a Percentage of the Estimated Number of Law Graduates Eligible for Bar Passage Required Jobs from 2012-2017

Graduating Year                           2012     2013     2014     2015     2016     2017       2017
                                                                                      (5% Dec.)  (10% Dec.)

First-Year Enrollment (3 Yrs Prior)      51600    52500    48700    44500    39675    37700*     35700*

Graduates (90% of 1st-Yr Enrollment)     46364    47250*   43830*   40050*   35708*   33930*     32130*

83% of Graduates (NCBE Avg. for
First-Time Takers)                       38482*   39218*   36379*   33242*   29638*   28162*     26668*

FT/LT Bar Passage Jobs                   26066    26000*   26000*   26000*   26000*   26000*     26000*

% of Bar-Passing Graduates for Whom
FT/LT Bar Passage Jobs Likely Would
Be Available                               68%*     66%*     71%*     78%*     88%*     92%*       97%*

*Denotes estimated value.

Notably, these estimates probably overstate the number of graduates who will be eligible for Bar Passage Required jobs.  First, not all law school graduates want to take a bar exam as some conclude that they are not interested in practicing law as a licensed attorney.  Second, given the increasing number of law school matriculants with LSATs less than 150, one could anticipate a slightly higher rate of attrition such that fewer than 90% of matriculants graduate after three years.  Third, given the increasing number of law school matriculants with LSATs less than 150, one also could anticipate that the historical average bar passage rate of 83% might be too generous.  All of these points suggest that the number of graduates eligible for full-time, long-term Bar Passage Required jobs may decline between now and 2017 even more than is indicated in Table 2.    

From 2012 and 2013 to 2016 and 2017, we will have gone from having nearly seven full-time, long-term Bar Passage Required jobs for every ten graduates eligible for such positions by virtue of having passed a bar exam, to having nine or more such jobs for every ten eligible graduates.  That strikes me as an improving employment market.

Of course, this may not be good news for those who graduated in the last few years into one of the toughest markets in history.  It is not clear that this improving market will be improving for them.  But it also is not clear that this "excess capacity" will unduly constrain the opportunities available to law school graduates in the coming years.  This excess capacity already has been impacting the market, yet the number of full-time, long-term Bar Passage Required jobs obtained within nine months of graduation grew by nearly 2000 between 2011 and 2012.  That is one reason I think the assumption of no further growth in full-time, long-term Bar Passage Required jobs is probably fairly conservative. 

In addition, this may not be good news for those who fail to pass the bar exam on their first try and may have to look for jobs that do not require bar passage.  While a significant percentage of these graduates will pass the bar exam on their second attempt and may eventually find employment in full-time, long-term Bar Passage Required positions, it may take several months longer than they had desired and may require that they pursue other employment, perhaps JD Advantage employment, during the intervening months.  

Even assuming a flat market for full-time, long-term Bar Passage Required jobs, the significant declines in first-year enrollment will mean a significant decline in the number of law school graduates in 2016 and 2017.  As a result, we should move from slightly more than three in ten bar-eligible graduates unable to find such jobs in 2012 to fewer than one in ten in 2017.  While individual schools and local or regional markets may have more varied results on a "micro level," on a "macro level" this should be good news for current first-year students and for students considering starting law school in the fall of 2014.

Whether this improving employment situation will be enough to change the trend in terms of declining number of applicants to law school remains to be seen.  While the future may be brightening, the "news" in the coming weeks will be the report on employment outcomes for 2013 graduates nine months after graduation.  As noted above, that may be somewhat uninspiring because any increase in the number of full-time, long-term Bar Passage Required jobs may be masked by the larger number of graduates in 2013 compared to 2012.  As a result, potential law school applicants may remain reluctant to make the commitment of time and money that law school requires because the "good news" message regarding future employment prospects for law graduates may fail to gain traction if the messages about employment outcomes for recent law school graduates continue to be less than encouraging.

March 1, 2014 in Data on legal education, Data on the profession, Structural change | Permalink | Comments (8)

Tuesday, February 4, 2014

If We Make Legal Education More Experiential, Would it Really Matter?

I think the answer is yes.  But, unfortunately, in virtually all of the debate surrounding legal education, there is a tremendous lack of clarity and precision about how we assess improvements in quality.  And equally relevant, if a gain is real, was it worth the cost?

The purpose of this essay is to chip away at this serious conceptual gap.  Until this gap is filled, experiential education will fall significantly short of its potential. 

Is Experiential Legal Education Better?  And if so, at What Cost?

Many legal educators believe that if we had more clinics, externships, and skills courses in law school, legal education would be better.  Why?  Because this more diversified curriculum would become more "experiential."  

Inside the legal education echo chamber, we often accept this claim as self-evident. The logic runs something like this.  A competent lawyer needs domain knowledge + practical skills + a fiduciary disposition (i.e., the lawyer’s needs are subservient to the needs of clients and the rule of law).  Since practical skills—and some would argue, a fiduciary disposition—cannot be effectively acquired through traditional Socratic or lecture teaching methods, the ostensible logic is that schools become better by embracing the "learning-by-doing" experiential approach.

That may be true.  I would bet on it.  But the per-unit cost of legal education is probably going up as well.  So, have we really created a viable and sustainable long-term improvement to legal education?  

In my mind, the questions we should be asking instead are the following:  (1) Among experiential teaching methods, which ones are the most effective at accelerating professional development?  And (2) among these options, how much does each cost to operate?  Quality and cost must be assessed simultaneously.  After they are evaluated, then we will be able to make choices and tradeoffs. 

Let's start with quality, which I define as moving lawyers toward their peak effectiveness potential as rapidly and cost-effectively as possible. This is an education design problem, as we are trying to find the right combination of education (building domain knowledge) and experience (acquiring and honing skills through practice).  There is also likely to be an optimal way to sequence the various educational and experiential steps. 

Creating Compelling Evidence of Educational Quality

We legal educators have many ideas on how to improve educational quality, but we make no real progress if employers and students remain unconvinced.  Can it be shown that because of a specific type of experiential curriculum at School X, its graduates are, during the first few years of practice, more capable lawyers than graduates of School Y?  

[Side bar:  If you are skeptical of this market test, it is worth noting that it was the preferences of law firm employers that gave rise to the existing national law school hierarchy.  It happened about 100 years ago when a handful of law schools adopted the case method, required undergraduate education as a prerequisite to admission, and hired scholars as teachers.  As a general matter, this was a far better education than a practitioner reading lecture notes at the local YMCA.  See William Henderson, "Successful Lawyer Skills and Behaviors," in Essential Qualities of the Professional Lawyer ch. 5 (P. Haskins ed., 2013).]

If a law school can produce, on balance, a better caliber of graduates than its competitors, then we are getting somewhere.  As this information diffuses,  employers (who want lawyers who make their lives easier) will preference law schools with the better graduates, and law students (who want more and better career options) will follow suit. Until we have this level of conceptual and empirical clarity, we might as well be debating art or literature.

If students and employers are responding to particular curricula, it is reasonable to assume they are responding to perceived value (i.e., quality as a function of price).   I believe there are three steps needed to create a legal education curriculum that truly moves the market.

1. Clarity on Goals.  We need to understand the knowledge, skills, and behaviors that are highly prized by legal and non-legal employers.  Truth be told, this is tacit knowledge in most workplaces. It is hard intellectual work to translate tacit knowledge into something explicit that can be communicated and taught. But we are educators -- that is our job!  If we think employers are missing something essential, we can add in additional factors. That's our job, too.

2. Designing and Building the Program. Working backwards from our goals, let's design and build curricula that will, overall, accelerate development toward those goals.  This is harder and more rigorous than lesson planning from a casebook.

3. Communicating Value to the Market.  If our program is indeed better, employers and students need to know it.  That requires a crisp, accurate message, a receptive audience, and sustained planning and effort.  That said, if our program truly is producing more effective lawyers, it logically follows that our graduates (i.e., the more effective lawyers) will be the most effective way to communicate that message. 

Regarding point #3, in simple, practical terms, how would this work?  

During the 1L year, we show our law students the roadmap we have developed (step #2) and spend the next two years filling in the knowledge, skills, and behaviors needed to achieve their career goals.  This professional development process would be documented through a portfolio of work.  This would enable students to communicate specific examples of initiative, collaborative learning, problem-solving, or a fiduciary disposition, etc., developed during law school.  Students would also know their weaknesses, and have a clear plan for their future professional development. In a word, they'd stand out from other law graduates because, as a group, they would be much more intentional and self-directed (i.e., they'd know where they are going and how to get there). 

With such a curriculum in place, our law school would collaborate with employers to assess the performance of our graduates.  By implication, the reference point for assessing quality would be graduates from other law schools.  When our graduates fare better, future graduates will be more heavily recruited.  Why?  Because when an employer hires from our school, they would be more likely to get a lawyer who helps peers and clients while adding immediate enterprise value.    

I suspect that many of my legal academic colleagues would argue the best law schools are not trade schools -- I 100% agree.  But I am not talking about a trade school model.  Rather, a world-class law school creates skilled problem-solvers who combine theory with practice and a fiduciary disposition. Graduates of a world-class law school would be reliably smart, competent, and trustworthy.  This is a very difficult endeavor. It takes time, planning, collaboration, creativity and hard work.  But the benefits are personal, organizational, and societal.  

At a practical level, I think few law schools have targeted this goal with a full, unbridled institutional commitment.  But the opportunity exists.

Applied Research 

When I got tenure in 2009, I decided that I was going to spend the next several years doing applied research. I am a fact guy.  Rather than argue that something is, or is not, better, I prefer to spend my time and effort gathering evidence and following the data.  I am also a practical guy.  The world is headed in this direction, thanks to the ubiquity of data in the digital age.  And, on balance, that is a good thing because it has the potential to reduce conflict. 

I have pursued applied work in two ways:  (1) building stuff (curricula, selection systems, lawyer development tools, datasets for making strategic decisions, etc.) and assessing how well it works, and (2) observing and measuring the work of others.

A Law School Curriculum Worth Measuring

A couple of years ago, a really unique applied research opportunity fell into my lap.  I had a series of lengthy discussions on the future of legal education with Emily Spieler, who was then serving as dean of Northeastern University School of Law in Boston, a position she held for over a decade.  One of the raps on legal education is that it is more alike than it is different.  In fact, this very point was just made by the ABA Taskforce on Legal Education.  See ABA Task Force On The Future Of Legal Education, Report And Recommendations (Jan. 2014) at 2.

Emily, in contrast, said her school was unique -- that the curriculum better prepared students for practice and enabled them to make better career planning decisions.  Also, Emily stated that Northeastern students were more sensitized to the needs of clients and the privilege and burden of being a lawyer--specifically, that Northeastern grads become aware, before graduation, that their own lack of competency and diligence has real-world consequences for real-world people. And that reality weighed on students' minds.  

Tall claims.  But if Northeastern could deliver those outcomes more effectively than the traditional unstructured law school curriculum, I wanted to know about it.  

On a purely structural level, Northeastern Law is definitely unique.  Most law schools are organized on either quarters (University of Chicago, my alma mater) or semesters (Indiana University, where I teach). Northeastern, however, has both.  The 1L year curriculum at Northeastern is the traditional two semester model.  But after that, the school flips to quarters -- one quarter in law school, and one quarter in a cooperative placement with a legal employer, such as a judge, prosecutor’s office, a law firm, a corporate legal department, or a public interest organization.  

This classroom/coop sequence occurs four times over eight quarters.  Because the cooperative placement is not viewed as part of Northeastern's ABA-required course work -- all the contact hours are packed into two 1L semesters and four 2L/3L quarters -- students can be paid during cooperative placements.  And in any given semester, roughly 30 to 40% are getting paid. 

This system has been up and running for 45 years--over 5,000 students have become lawyers through this program.  What an amazing research opportunity! 

Now imagine the faculty meeting where the law professors get together to discuss and deliberate over whether to adopt the Northeastern model.  At Northeastern, "summer" means summer quarter, not summer vacation.  

How did this unique curricular structure come into being?  That is quite an interesting story. During the 1950s, the law school at Northeastern was shuttered.  Yet, reflecting the zeitgeist of the times, a group of Northeastern law alumni and young lawyers who were skeptical of their own legal education (at elite national law schools) petitioned Northeastern to reopen the law school and feature a more progressive, forward-looking curriculum.  The university administration agreed to reopen the law school on the condition that the school adopt the signature cooperative education model.  So this crucial decision was essentially made at the birth of the law school over four decades ago.  Once up and running, Northeastern Law implemented other innovations, such as the narrative grading policy--i.e., no letter grades and no GPA.  This was done in order to mitigate competition and encourage a focus on collaboration and skills development. 

The Outcomes Assessment Project

Back in 2011, my conversations with Emily Spieler eventually led me to make a two-day pilgrimage to Boston to talk with Northeastern Law faculty, students, administrators, and coop employers.  Suffice it to say, I was surprised by what I witnessed -- a truly differentiated legal education with a substantial alumni/ae base spanning 45 years.  

That pilgrimage eventually led to my involvement in Northeastern Law's Outcomes Assessment Project (OAP), which is something akin to The After the JD Project, but limited in scope to Northeastern -- although Northeastern will provide all of the project tools and templates to other law schools interested in studying their own alumni.  From the outset, the OAP has been set up to scale to other law schools. 

There are lots of tricky methodological issues with Northeastern.  For example,

  • It has a longstanding public interest tradition; Northeastern Law is overrepresented in government service, public interest, and non-profit sectors (including a sizeable contingent of law professors and legal clinicians). See Research Bulletin No 1.
  • Its student body was over 50% female almost from the outset, nearly 20 years before legal education as a whole. 
  • Because of its progressive roots, GLBT law students have long been drawn to Northeastern Law -- again, nearly two decades before it was deemed safe to be out.

Because of this distinctive profile, we have to worry that any differences in graduates are primarily due to a selection effect (who applied and enrolled) versus a treatment effect (they got a different type of education).  That said, the admissions data show that Northeastern Law students are, like other law students, strongly influenced by the US News rankings.   If a student gets admitted to Northeastern Law and BC, BU, or Harvard Law, Northeastern seldom wins.  

Over the coming months, I am going to use OAP data to try to bring some analytical and empirical clarity to the questions surrounding experiential education.  Preliminary data from our Research Bulletin No 3 suggest that the coop program does remarkably well in developing the three apprenticeships identified by the Carnegie Report.  More on that later. 

Print version of this essay at JD Supra.

February 4, 2014 in Data on legal education, Important research, Innovations in legal education, Scholarship on legal education | Permalink | Comments (4)

Sunday, November 24, 2013

Understanding Trends in Demographics of Law Students – Part Three

Why the Difference in Response to Market Signals?

In Part One, I analyzed how analysis of changes in applicants from LSAC’s Top 240 Feeder Schools demonstrates that graduates of more elite colleges and universities have abandoned legal education at a rate greater than graduates of less elite colleges and universities. 

In Part Two, I analyzed how the pool of applicants to law school has shifted, with a greater decrease among applicants with high LSATs than among applicants with low LSATs, resulting in a corresponding increase in the number and percentage of matriculants with LSATs of <150.

What might explain why applicants to law school are down more significantly among graduates of more elite colleges and universities than among graduates of less elite colleges and universities?  What might explain why applicants to law school are down more significantly among those with LSATs of 165+ than among those with LSATs of <150?  Is there some relationship between these data points?

There likely is some relationship between these data points.  Many of the more elite schools in the LSAC’s list of the Top 240 Feeder Schools have historically been schools whose graduates on average have higher LSAT scores compared with graduates from less elite schools.  The LSAC’s 1995 publication, Legal Education at the Close of the Twentieth Century:  Descriptions and Analyses of Students, Financing, and Professional Expectations and Attitudes, authored by Linda F. Wightman, discusses the characteristics of the population of students who entered law school in the fall of 1991.  Roughly 31% of the students scoring in the top quarter in terms of LSAT came from very highly selective undergraduate schools, roughly 31% from highly selective undergraduate schools, and only 17% from the least selective undergraduate schools.  Id. at page 38, Table 20. Thus, it is very likely that these two data points are related – that the greater decline among applicants from more elite colleges and universities is correlated directly with the greater decline among applicants with LSAT scores of 165+.

I want to offer three possible explanations for this differential response to market signals among different populations of prospective law students.  The first two focus on the possibility that market signals are communicated differently to different populations.  The third focuses on how different populations of prospective law students simply might respond to the same market signals in markedly different ways.

Different Pre-Law Advising Resources May Mean Market Signals Penetrate Some Populations of Prospective Law Students More Deeply Than Other Populations of Prospective Law Students.  Focusing first on the nature of the feeder schools, one possibility is that access to pre-law advising resources differs across these categories of feeder schools, resulting in different messages being communicated to applicants from less elite colleges and universities than to applicants from more elite colleges and universities regarding the cost of legal education and the diminished employment prospects for law school graduates in recent years.  Perhaps there are more robust pre-law advising programs among the elite colleges and universities than among the less elite colleges and universities, with pre-law advisors who really have their finger on the pulse of what is happening in legal education and the legal employment market.  Perhaps these more robust pre-law advising programs are engaging in programming and advising that communicates more effectively to prospective law students the significant costs of legal education and the ways in which the challenging employment reality for law graduates in recent years makes that cost problematic.  As a result, perhaps larger percentages of prospective law students at more elite colleges and universities are getting more information about the increasing costs and diminished employment prospects for law graduates and are deciding to wait to apply to law school or are deciding to pursue a different career completely. 

Alternatively, pre-law advisors may have different responses to market signals in thinking about their role in advising students.  Perhaps pre-law advisors at more elite colleges and universities are more directive about discouraging students from considering law school while pre-law advisors at less elite colleges and universities are more inclined simply to support student interest in pursuing law school.

There clearly are disparate allocations of resources to pre-law advising across various colleges and universities, different levels of engagement among pre-law advisors and different perspectives on how directive one should be in advising students considering law school.  That said, I am not sure these differences necessarily can be delineated in relation to the extent to which a college or university is considered an elite college or university or a less elite college or university.  Moreover, with so much information now available on the internet, it is not clear that pre-law advisors are the primary source of information for prospective law students. 

These hypotheses would benefit from being explored empirically.  What are the relative pre-law advising resources at the schools down more than 30% in applicants between 2010 and 2012 relative to the pre-law advising resources at the schools down less than 10%? Are pre-law advisors at the colleges and universities down more than 30% in applicants between 2010 and 2012 more inclined to affirmatively discourage students from considering law school than pre-law advisors at  colleges and universities down less than 10%?  Were prospective students at these two categories of schools really receiving different messages about the employment situation for law graduates and the cost of law school?

Different Social Network Signals and Influences --- Another possibility might involve social network signals and influences.  Significant empirical data indicates that on average different socio-economic populations attend different types of colleges and universities.  Among those entering law school in fall 1991 from very highly selective undergraduate schools, nearly three times as many were from families from upper socio-economic status as from lower-middle socio-economic status.  Legal Education at the Close of the Twentieth Century:  Descriptions and Analyses of Students, Financing, and Professional Expectations and Attitudes, at page 38, Table 20.  By contrast, among those entering law school in fall 1991 from the least selective undergraduate schools, nearly twice as many were from lower-middle socio-economic status as from upper socio-economic status.  Id.  Similarly, there is fairly significant empirical data indicating that different socio-economic populations generally attend different tiers of law schools, with more of the socio-economically elite at higher-ranked law schools and fewer of the socio-economically elite at lower-ranked law schools.  Id. at pages 30-31, Table 15 and Figure 7; Richard H. Sander and Jane R. Bambauer, The Secret of My Success:  How Status, Eliteness and School Performance Shape Legal Careers, 9 J. Empirical Legal Stud. 893, Table 2 (2012) (analysis of the After the JD dataset looking at a representative sample of law school graduates who took the bar in 2000). 

Given this background, it would seem plausible that graduates of more elite colleges and universities on average represent more of an upper-income socio-economic population who may know more lawyers than graduates of less elite colleges and universities who may on average represent more of a middle class socio-economic population.  The parents of graduates of more elite colleges and universities may be more likely to be lawyers and/or have friends who are lawyers.  Thus, it is possible that graduates of more elite colleges and universities may be more likely to have received negative signals about the rising cost of legal education and the diminished employment prospects for law school graduates in recent years from family and friends than did their peers from less elite colleges and universities.  This hypothesis also would benefit from being explored empirically.

Different Decision Matrices Based on Socio-Economic Status and Opportunity – Another possibility is that regardless of whether students across different types of feeder schools really are getting different messages about the costs of legal education and the challenging employment prospects for law school graduates, they simply may be making different decisions in response to that information.  This hypothesis builds on the possibility that different populations of prospective law students may have different motivations for considering law school or may evaluate the value of a legal education using different parameters given different sets of options that might be available to them.  It is possible that the market signals regarding employment of law graduates are more nuanced than we might generally appreciate.  

For example, it may be that graduates of elite colleges and universities, who also tend to be among the socio-economic elite, have a variety of employment options coming out of college that are more attractive than law school at the moment given the diminished job prospects for law graduates in recent years.  If these students generally value a law degree primarily because of the status associated with acquiring a "prestigious" job in a big firm upon graduating from law school, then the significant decline in big firm jobs might frame their analysis of the value-proposition of law school.  Changes in the legal employment marketplace, particularly significant declines in the number of positions with "prestigious" big firms, may have made the legal profession less attractive to the socio-economic elite, who may be able to pursue job opportunities in finance, investment banking, consulting, or technology, or meaningful public interest opportunities such as Teach for America, that are viewed favorably within their social network. 

By contrast, for graduates of less elite colleges and universities, who are generally not from the socio-economic elite, fewer opportunities may be available in finance, investment banking, consulting, and technology.  In addition, they may lack the financial flexibility to make Teach for America or other public interest opportunities viable.  Moreover, this set of prospective law students may be motivated simply by becoming a lawyer and acquiring the status that comes with being a lawyer (even if they are not going to become a big firm lawyer, but are instead going to be a family law attorney, a public defender, or a worker's comp attorney).  This population may be less focused on big firm options, less concerned about the lack of jobs in that niche within the market, and may see any position within the legal profession as a path toward financial security and social status, despite the increasing costs of legal education and the diminished employment prospects of law graduates. 

These hypotheses also may merit more empirical assessment.  What are the graduates of more elite colleges and universities choosing to do in greater numbers as significantly smaller numbers apply to law school?  Are there different motivations for pursuing law school among different socio-economic populations?

Regardless of the explanation for the current changes in application patterns, it would appear that the population of law students not only is shrinking, but may be going through a modest demographic transformation, with a somewhat smaller percentage of law students representing the socio-economic elite and a somewhat larger percentage of law students from lower on the socio-economic scale.  First-year students in 2013 may be slightly less “blue blood” and slightly more “blue collar” than they were in 1991.  Whether this is a short-term trend or a longer term reality remains to be seen.  What it might mean for legal education and the legal profession over time also remains to be seen.

November 24, 2013 in Data on legal education, Data on the profession, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (0)

Thursday, October 17, 2013

Understanding Trends in Demographics of Law Students -- Part Two

Trends in LSAT Profiles of Applicants and Matriculants

In looking at trends over the last 12 years, there are two relevant time frames due to changes in how LSAC reported data.  Between 2002 and 2009, the LSAC’s annual National Decision Profiles were based on the average LSAT scores of applicants and matriculants.  From 2010 to the present, the National Decision Profiles were based on the highest LSAT scores of applicants and matriculants.  This post compares trends in LSAT profiles between 2002 and 2009 with trends between 2010 and 2013, noting that the latter period not only has seen a decline in enrollment but also has seen a significant weakening of the overall LSAT profile of first-years.

Changes in LSAT Profiles from 2002-2009 Using Average LSAT

The following chart shows the difference in LSAT composition of first-years in three cycles between 2001-02 and 2008-09.

Matriculants by LSAT Category (Reflecting Average LSAT) 2002-2009

              165+        150-164      <150        Total

2001-02       5,889       30,100       9,097       45,086

2004-05       7,447       32,007       6,036       45,490

2008-09       7,652       31,991       8,943       48,586

In the three years between 2002 and 2005, applications grew by roughly 5,000, to roughly 95,000, with growth among those with an average LSAT of 165+ and an average LSAT of 150-164, and a modest decline among those with an average LSAT of <150.  Law schools matriculated only 400 more first-years in 2005 than in 2002, but there were roughly 3,050 fewer first-year students with average LSATs <150, with 1,900 more first years with average LSATs of 150-164 and roughly 1,550 more with average LSATs of 165+.  This three-year period saw strengthening of the LSAT profile of first-year students.

Four years later, with an applicant pool that had declined to nearly 87,000, however, law schools enrolled over 3,000 additional first-year students, 2,900 of whom had average LSATs of <150.  Virtually all of the growth in first-years between 2005 and 2009, therefore, was comprised of students at the lower end of the LSAT profile. 

Nonetheless, in comparison with the 2002 first-years, the 2009 first-years included slightly fewer students with an average LSAT of <150 (down 154 – 1.7%) and larger populations of students with average LSATs of 165+ (up 1,763 – nearly 30% more) and with average LSATs of 150-164 (up 1,891 – or roughly 6.3% more).  In 2009, therefore, the average LSAT profile of all first-years, while less robust than in 2005, was still more robust than in 2002.

Between 2004 and 2008, the ABA approved nine new law schools (with fall 2009 first-year enrollment in parentheses) – Atlanta’s John Marshall (211) and Western State (188) in 2005, Liberty (119), Faulkner (150) and Charleston (241) in 2006, Phoenix (272) in 2007, and Elon (121), Drexel (156) and Charlotte (276) in 2008.  The first-year enrollment of these nine schools in Fall 2009 totaled 1,734, roughly 60% of the growth in matriculants with average LSATs of < 150 between 2005 and 2009.  While many of the first-year students at these schools had LSATs of greater than 150, these schools took students who might have gone to other schools and increased the overall demand for applicants with average LSATs of <150.

Changes in LSAT Profiles from 2010-2013

The following chart focuses on the last three admissions cycles and the current admission cycle, covering the period in which the LSAC National Decision Profiles were based on each applicant’s highest LSAT score. 

Applicants and Matriculants Across Three LSAT Categories Based on Highest LSAT from 2010 to 2013

Adm. Cycle    Total Apps.   Total Mat.*   Apps. 165+   Mat. 165+   Apps. 150-164   Mat. 150-164   Apps. <150   Mat. <150

Fall 2010     87,912        49,719        12,177       9,477       47,722          32,862         26,548       7,013

Fall 2011     78,474        45,616        11,190       8,952       41,435          29,220         24,396       7,101

Fall 2012     67,925        41,422        9,196        7,571       34,653          25,425         22,089       7,906

Fall 2013**   59,426        38,900        7,496        6,300       30,263          24,000         20,569       8,200

*Note that the total matriculants number is greater than the sum of the matriculants across the three categories in any given year because the total matriculants number includes non-standard test-takers and those without an LSAT.

**The Fall 2013 numbers represent estimates based on the number of applicants in each category and an assumption that 2013 saw another slight increase in the percentage of applicants from each LSAT category who matriculated (consistent with increases in the two previous years in response to the decreasing applicant pool).

During this period, the number of applicants declined by 28,000, or over 32%, but the number of applicants with a highest LSAT of 165+ declined by 38%, and the number with a highest LSAT of 150-164 declined by 36.5%, while the number with a highest LSAT of <150 declined by only 22.5%.  Thus, the pool of applicants is not only smaller in the 2012-13 admissions cycle as compared to 2009-10, but it is “weaker” in terms of the LSAT profile.  

The number of matriculants in the top two LSAT categories also declined significantly between Fall 2010 and Fall 2012, while the number of matriculants in the bottom LSAT category actually grew. 

The number of matriculants whose highest LSAT score was 165+ fell from 9,477 in 2010 to 7,571 in 2012, a decline of over 20%, while the percentage of applicants in this category who became matriculants increased from 78% to 80% to 82% over that period.  If we estimate that 84% of the 2013 applicants with a highest LSAT of 165+ matriculate, then we can anticipate roughly 6300 matriculants for Fall 2013 with a highest LSAT of 165+, a drop of nearly 33% since 2010. 

The number of matriculants whose highest LSAT score was 150-164 fell from 32,862 in 2010 to 25,425 in 2012, a decline of nearly 23%, while the percentage of applicants in this category who became matriculants increased from 69% to 70.5% to 73% over that period. If we estimate that roughly 79% of the applicants with a highest LSAT of 150-164 matriculate, then we can anticipate roughly 24,000 matriculants for Fall 2013 with an LSAT of 150-164, a decline of roughly 27% since Fall 2010. 

Meanwhile, the number of matriculants whose highest LSAT score was <150 grew from roughly 7,000 to over 7,900, an increase of roughly 13%, while the percentage of applicants in this category who became matriculants increased from 26% to 29% to 36% over that period. If we estimate that roughly 40% of the applicants with a highest LSAT of <150 matriculate, then we can anticipate roughly 8,200 matriculants with an LSAT of <150 for Fall 2013, an increase of roughly 17% since Fall 2010. 
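
The three Fall 2013 projections above all follow the same recipe: multiply each category's known applicant count by an assumed yield (matriculation) rate. A minimal sketch of that arithmetic; the yield rates are the assumptions stated in the text, not published figures:

```python
# Project Fall 2013 matriculants per LSAT category by applying an assumed
# yield rate to the known applicant counts (rounded to the nearest hundred).
applicants_2013 = {"165+": 7496, "150-164": 30263, "<150": 20569}
assumed_yield = {"165+": 0.84, "150-164": 0.79, "<150": 0.40}

estimates = {
    band: round(count * assumed_yield[band], -2)
    for band, count in applicants_2013.items()
}
print(estimates)  # {'165+': 6300.0, '150-164': 23900.0, '<150': 8200.0}
```

(The 150-164 estimate lands at 23,900 here; the text rounds the same figure to "roughly 24,000.")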

Percentage of First-Years from Each LSAT Category Using Highest LSAT -- 2010-2013*

            165+      150-164      <150

2010       19.1%       66.1%      14.1%

2011       19.6%       64.1%      15.6%

2012       18.3%       61.4%      19.1%

2013       16.2%       61.7%      21.1%

*The sum of the percentages in any given year will be slightly less than 1.00 because the denominator -- total matriculants -- includes matriculants with non-standard LSAT and those with no LSAT.

This table shows that, if my estimates for 2013 are roughly accurate, between Fall 2010 and Fall 2013 the percentage of matriculants whose highest LSAT score was 165+ declined by roughly 16% (from 19% to 16%), the percentage whose highest LSAT was 150-164 declined by roughly 6% (from 66% to 62%), and the percentage whose highest LSAT was <150 increased by 50% (from 14% to 21%).
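
Note that these are relative changes in each category's share of the class, not percentage-point changes. A quick sketch of that calculation, using the rounded shares from the text:

```python
# Relative (not percentage-point) change between two class shares.
def relative_change(start_share, end_share):
    return (end_share - start_share) / start_share

print(f"{relative_change(0.19, 0.16):+.0%}")  # 165+ share:    -16%
print(f"{relative_change(0.66, 0.62):+.0%}")  # 150-164 share: -6%
print(f"{relative_change(0.14, 0.21):+.0%}")  # <150 share:    +50%
```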

Adjusting from Highest LSAT to Average LSAT to Compare 2002 and 2013

The change in the 2009-10 admissions cycle to using highest LSAT rather than average LSAT resulted in an increase in matriculants with scores of 165+ of roughly 1,800 between Fall 2009 and Fall 2010.  Given that there had been a modest increase in the number of matriculants with an average LSAT of 165+ between 2008 and 2009 (an increase of roughly 600, from 7,023 to 7,652), it might be fair to assume that there would have been another modest increase in the number of matriculants with an average LSAT of 165+ between 2009 and 2010 given the challenging economic environment at the time and the continued growth in applications between 2009 and 2010.  Assume then that of the 1,800 additional matriculants with scores of 165+, 400 would have been included in the category if we were still using an average LSAT of 165+ rather than the highest LSAT of 165+.  That would suggest that to estimate the number of matriculants with an average LSAT of 165+ in 2010, it might make sense to subtract 1,400 matriculants from the number of matriculants with a highest LSAT of 165+ in 2010 and then for the next three years apply the same percentage reduction as reflected in the number of those with a highest LSAT of 165+ over those three years.

The change to highest LSAT rather than average LSAT also resulted in a drop in the number of matriculants with an LSAT <150 between 2009 and 2010 of roughly 1,900 matriculants. Notably, the number of applicants and matriculants with an average LSAT <150 had grown slightly between 2007 and 2009 (applicants from 29,123 to 29,926, matriculants from 7,013 to 7,906).  Nonetheless, to err on the conservative side, assume that the number of matriculants with an average LSAT <150 actually may have declined in Fall 2010 from Fall 2009 rather than continuing to increase modestly.  Assume it would have declined by roughly 5% or 400 (rather than 1,900).  That would mean that to estimate the number of matriculants with an average LSAT of <150 in Fall 2010, we would need to add to the number with a highest LSAT of <150 roughly 1,500 more matriculants and then for the next three years apply the same percentage increase as reflected in the number of those with a highest LSAT of <150 over those three years.

Using these assumptions, the estimated number of first-years with an average LSAT of 165+ would fall to roughly 5,400 as of Fall 2013, while the estimated number of first-years with an average LSAT of <150 would rise to over 9,800 in Fall 2013. 
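
Both figures follow mechanically from the assumptions in the two preceding paragraphs: shift the 2010 highest-LSAT count by the assumed adjustment (subtract 1,400 for the 165+ category, add 1,500 for the <150 category), then scale by the same 2010-to-2013 trajectory observed in the highest-LSAT numbers. A sketch:

```python
# Estimate average-LSAT matriculants from highest-LSAT data: apply the
# assumed 2010 adjustment, then scale by the observed 2010->2013 trend.
def avg_lsat_estimate(highest_2010, adjustment, highest_2013):
    baseline = highest_2010 + adjustment  # assumed average-LSAT count in 2010
    trend = highest_2013 / highest_2010   # same trajectory as highest-LSAT counts
    return baseline * trend

print(round(avg_lsat_estimate(9477, -1400, 6300), -1))  # 5370.0, i.e. "roughly 5,400"
print(round(avg_lsat_estimate(7013, +1500, 8200), -1))  # 9950.0, i.e. "over 9,800"
```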

If the estimates above are close to accurate, then the number of Fall 2013 matriculants with an average LSAT score of 165+ represents roughly 14% of Fall 2013 matriculants (a slightly higher percentage than in Fall 2002), while the number of Fall 2013 matriculants with an average LSAT of <150 represents over 25% of Fall 2013 matriculants (a much higher percentage than in Fall 2002).  The following chart shows the percentage of matriculants for the period from 2002-2013 taking into account the estimates set forth in the preceding paragraph regarding the number of matriculants with an average LSAT in each range over the period from 2010-2013. 

  [Chart: Percentage of matriculants with an average LSAT of 165+ and with an average LSAT of <150, 2002-2013]

This graph shows that the percentage of matriculants with an average LSAT of 165+ varied between roughly 13% and roughly 17% over the period from 2002-2013, and appears to have returned in Fall 2013 to a level only slightly higher than in Fall 2002.  By contrast, the percentage of matriculants with an average LSAT of <150 varied between roughly 13% and roughly 19% until the Fall 2012 and Fall 2013 classes, when it increased to roughly 22% (in 2012) and over 25% (in 2013).  While the graph does not include the percentage of matriculants with average LSATs of 150-164, one can infer that percentage as the difference between 100% and the sum of the 165+ and <150 percentages.  Between 2002 and 2011, that middle share generally hovered between 65% and 70%, but in the last two years it has fallen closer to 60%.

This shift in LSAT profile is further evidenced by changes in LSAT profiles among first-year entering classes between 2010 and 2013.  For Fall 2010, there were only nine law schools with a median LSAT of 149 or lower (using highest LSAT for reporting purposes).  For Fall 2011, there were 14 law schools with a median LSAT of 149 or lower.  For Fall 2012, there were 21 law schools with a median LSAT of 149 or lower.  That number may grow to nearly 30 when official data is published next spring on the Fall 2013 entering class.  

If one uses the LSAT profile as an indicator of the “strength” of a given class of first-year students, and uses the framework set forth above for looking at the LSAT profile, then in the last three years we not only have seen first-year enrollment shrink by roughly 10,000 students, but also have seen a significant “weakening” of the LSAT profile.  In terms of LSAT profile, the Fall 2013 entering class is almost certainly the weakest of any class going back to Fall 2002. This may impact the classroom experience at some law schools and may impact bar passage results when the Fall 2013 entering class graduates in 2016.

Why the Differential Response to Market Signals by Different Populations of Prospective Law Students?

What might explain the extent to which different populations of prospective law students have responded to market signals in such different ways, with those from elite colleges and universities and those with higher LSATs turning away from law school more than those from less elite colleges and universities and those with lower LSATs?  In Part Three I will explore some possible explanations.

October 17, 2013 in Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (0)

Tuesday, October 15, 2013

What Does a JD-Advantaged Job Look Like? Job Posting for a "Legal Solutions Architect"

Below is a job posting for a new type of job called a "legal solutions architect."  

The job post just appeared on the website of Seyfarth Shaw, a large law firm based in Chicago.  Seyfarth was one of the first to embrace the movement toward technology and process.  See Six Sigma at Seyfarth Shaw, Legal Professions Blog, April 14, 2010.  

Before getting to the text of the ad, a few observations about what this posting tells us about legal education and the emerging legal job market:

  • This is a pure JD-advantaged job: "Juris Doctor or MBA with legal industry experience strongly preferred" (emphasis in original). It is a full-time, long-term job in downtown Chicago, and it is not reviewing documents. This is a good professional job doing very sophisticated and challenging work.
  • The job is not partner-track.  But in terms of economic potential and job security, does that matter?  In the years to come, folks who understand the overlay between law, technology, and process are going to be in great demand and have a lot of options.
  • Undergraduate education matters, but the majors are far from typical among traditional law students:  finance, business administration, computer science, or "other technical discipline."
  • It is easier to get this job if an applicant has familiarity with "extranets, intranets, document assembly, enterprise search, relational databases and workflow."  Also, it is "a plus" to have "familiarity with Agile and Scrum [two software development methodologies]." We don't teach any of this stuff in law school.  Perhaps we should.
  • The required skills are a blend of technical skills and knowledge plus higher-order professional abilities that, frankly, are not explicitly taught in law school.  Law schools need to take notice, as this is an order any decent professional school should be able to fill.

Now the actual job posting:

Requisition Number 

13-0191

Post Date 

10/10/2013

Title

Legal Solutions Architect

City

Chicago

State

IL

Seyfarth Shaw is one of the most progressive, forward-thinking law firms in the world. Seyfarth’s commitment to delivering legal services in a new way through its SeyfarthLean program - with an emphasis on value and continuous improvement - has been praised by the Association of Corporate Counsel (ACC) as being “five years ahead of every other AmLaw 200 firm.”

Legal Solutions Architects anticipate, identify, sell and drive innovative business solutions. Through an understanding of technology, knowledge management, business analysis, process improvement and project management, this role provides solutions that enhance the client experience. These multidisciplinary resources are aligned with Firm strategy and play an important role in driving the Firm’s innovative approach to the practice of law and the delivery of legal services.

This position will report to the Director of the Legal Technology Innovations Office. Seyfarth Shaw recently received awards for 2013 Innovative Law Firm of the Year and Innovative Project of the Year, and the efforts of the Legal Technology Innovations Office played a significant role in earning those recognitions.

Essential Functions

  • Partner with clients, Seyfarth legal teams and legal project managers to enhance the delivery and effectiveness of services provided within legal engagements
  • Translate stated and inferred needs of clients and attorneys into specific technologies and methods
  • Synthesize the needs of multiple engagements and create requirements for systematic solutions that underpin Seyfarth’s varied legal practices
  • Team with the Application Development Group to design and plan for custom solutions and oversee the construction and implementation of these systems
  • Manage multiple projects concurrently, juggling priorities, deadlines and essential duties for each project
  • Collaborate with other Firm departments, including Legal Project Management Office, Practice Management, Finance, Marketing and Professional Development to provide comprehensive solutions
  • Act as an effective change manager – keeping client and Firm culture, group behavior and individual habits in mind in order to best circumnavigate roadblocks and pitfalls for solution adoption
  • Provide presentations to individuals, small groups and large audiences of clients and Seyfarth attorneys in a persuasive and encouraging manner
  • Contribute to continuous improvement, promote the use of technology solutions and help improve the awareness of the impact of the solutions on the business
  • Perform vendor due diligence and serve as a point of contact for third-party technologies leveraged by the Firm
  • Conduct market, external and internal research and convey results to forward assigned projects and to aid projects led by teammates, other groups and other departments
  • Proactively research and maintain knowledge of emerging technologies and service delivery models and possible applications to the business

Skills:

  • Highly motivated self-starter with an entrepreneurial bent
  • Uses intelligence, creativity and persistence to solve varied, non-routine problems
  • Possesses an understanding of knowledge management, process improvement and legal project management and an appreciation of the benefits to law firms employing these approaches
  • Passion for legal technology, including technical platforms, specific technical applications and their impact on the practice of law
  • Keen grasp of project management, flexible in project execution and able to meet aggressive deadlines
  • Strong business analysis approach
  • Visualizes how raw data can be converted into useful information for client and Firm decision-makers
  • Pays attention to detail but still maintains focus on the bigger picture
  • Comfortable working both independently and in diverse teams
  • Excellent written and verbal communicator that is able to distill complex concepts into simple messages
  • Familiar with the software development cycle
  • Capable of managing and motivating up, down and across the organization
  • Appreciation for user interface and user experience design
  • Embraces change and seeks to create order from chaos

Requirements

  • Bachelor’s degree, preferably in finance, business administration, computer science or other technical discipline
  • Juris Doctor or MBA with legal industry experience strongly preferred
  • Experience working within a large law firm preferred but not required
  • Familiarity with extranets, intranets, document assembly, enterprise search, relational databases and workflow preferred
  • Familiarity with Agile and Scrum a plus

Seyfarth Shaw is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the employment process, please call (312) 460-6545 and let us know the nature of your request and your contact information. We offer an outstanding benefit package which includes: medical/dental, 401k with employer contribution; life insurance; transportation fringe benefit program; generous paid time off policy; and long-term and short-term disability policies. Equal Opportunity Employer M/F/D/V


October 15, 2013 in Current events, Data on legal education, Innovations in law, Law Firms, New and Noteworthy, Structural change | Permalink | Comments (0)

Friday, October 11, 2013

Understanding Trends in Demographics of Law Students – Part One

Analysis of Differential Declines in Law School Applicants Among Top-240 Feeder Schools

Some people recently have noted the decline in applications to law school from graduates of relatively elite colleges and universities - here and here.  This suggests that different populations of potential applicants to law school are responding differently to market signals about the cost of legal education and the diminished employment prospects for law school graduates in recent years.  

In this blog posting, I analyze the changes in applications among the LSAC's Top 240 Feeder Schools between 2010 and 2012, documenting the extent to which the response to market signals about legal education has been different among graduates of elite colleges and universities when compared with graduates of less elite colleges and universities.  In Part Two, I will look at a different set of data regarding changes in LSAT profiles of applicants.  In Part Three, I will offer some possible explanations for the different responses to market signals among different groups of applicants.

Overview

Between 2010 and 2012, the total number of applicants from the Top 240 Feeder Schools fell from 55,818 to 42,825.  In both years, the Top 240 Feeder Schools were responsible for roughly 63% of the total pool of applicants (63.5% of 87,900 in 2010 and 63.1% of 67,900 in 2012).  But the decline in applications was not uniform across all of the Top 240 Feeder Schools.  There are a few different ways one can look at this information to get a sense of the different responses among different populations of potential applicants. 

Differential Declines Among Feeder Schools with Law Schools Ranked in Different Tiers

First, one can look at declines across the Top 240 Feeder Schools that have law schools.

One might surmise that potential applicants who graduated from colleges and universities with a law school would be particularly well aware of the increasing cost of legal education and the challenging employment environment for recent law school graduates, and might therefore expect that feeder schools with law schools would generally see similar declines in applications.  In fact, however, the percentage decline in applications between 2010 and 2012 varied significantly by the ranking of the law school at the feeder school.

Among feeder schools with law schools ranked 1-50 in the most recent USNews rankings, the average percentage decline in applicants between Fall 2010 and Fall 2012 was 28.08%.  Among feeder schools with law schools ranked 51-100, the average percentage decline was 20.27%.  Among feeder schools with law schools ranked 101-146, the average percentage decline was 18.14%.  But among feeder schools with law schools that are ranked alphabetically, the average percentage decline was only 3.31%.   

Given that most of the top-ranked law schools are at colleges and universities that also are considered elite, and most of the alphabetically ranked law schools are at colleges and universities that are not, this analysis suggests that graduates of elite colleges and universities are responding to the market signals regarding legal education differently than graduates of less elite colleges and universities.  (This may seem particularly paradoxical: the percentage decline in applicants generally is greater at colleges and universities with more highly ranked law schools, whose graduates generally experience more promising employment outcomes, while the decline is lowest at colleges and universities with less highly ranked law schools, whose graduates generally experience less promising employment outcomes.)

Comparisons of Outlier Schools – Those Schools More than One Standard Deviation from the Mean

Second, one can look at “outlier” schools and see how negative outliers compare to positive outliers.  The average percentage decline in applicants across the Top 240 Feeder Schools between 2010 and 2012 was 19.76%.  The standard deviation was 18.67%.  How do those schools more than one standard deviation from the mean compare with each other?

There are a total of 13 schools that saw a decline in applicants between 2010 and 2012 putting them below the mean by more than one standard deviation – schools with a decline in applications greater than 38.44%.  There are a total of 26 schools that saw an increase in applications or such a modest decline in applications that their increase/decline was more than one standard deviation above the mean – a decline of less than 1.09% or an increase.  How do these schools compare? 
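
The two cutoffs are simply the mean plus or minus one standard deviation of the per-school percentage declines. A sketch using only the summary statistics above (the underlying per-school data are not reproduced here):

```python
mean_decline = 19.76  # average % decline in applicants, 2010-2012
std_dev = 18.67       # standard deviation across the Top 240 Feeder Schools

# Declines greater than this cutoff are more than one SD beyond the mean.
steep_cutoff = round(mean_decline + std_dev, 2)  # 38.43 (text: 38.44, from unrounded inputs)
# Declines smaller than this cutoff (or increases) are more than one SD the other way.
mild_cutoff = round(mean_decline - std_dev, 2)   # 1.09
print(steep_cutoff, mild_cutoff)
```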

Eight of the 13 feeder schools that saw the most significant declines in applications had a law school; the average rank of those eight law schools was 69.  (These schools include NYU (6), Virginia (7), Cornell (13), George Mason (41), Marquette (94), Akron (119), Loyola (New Orleans) (126), and Univ. of San Fran. (144). Four of the eight were top-50 law schools, while none were alphabetically ranked.) 

Thirteen of the 26 feeder schools that saw the least significant declines in applications (or saw increases in applications) had a law school, including four that were ranked alphabetically.  Among just the nine law schools in this category that are ranked, the average rank is 104.  (These schools include Denver (64), UNLV (68), Loyola (Chicago) (76), Rutgers (91), Florida International (105), Wyoming (113), CUNY (132), Southern Illinois (140), and Suffolk (144), along with Florida A & M, North Carolina Central, Nova Southeastern, and Southern (all alphabetical).  Notably, only four of the thirteen were ranked in the top-100 law schools (none in the top-50).)

Again, in this analysis, with a few exceptions, those feeder schools that saw significant declines in applicants generally represent a more elite slice of American colleges and universities, while those with the most nominal declines in applicants (or increases in applicants) generally represent a less elite slice of American colleges and universities.

Outliers More Broadly – Comparing Schools with Declines Greater than 30% and Less than 10%

Third, if one wanted to look at a broader pool of feeder schools at the bottom and the top, one could look at all schools down 30% or more in applicants and all schools that were down 10% or less in applicants between 2010 and 2012 (roughly 10% above and below the mean), two sets that account for nearly half of the Top 240 Feeder Schools.

There were 68 schools down 30% or more in applicants, 46 of which had a law school, of which 29 were ranked in the top-50, with only one school ranked alphabetically.  The average rank of the 45 numerically ranked law schools was 48.  The other 22 feeder schools in this category include several highly regarded schools – including, for example, Rice, Vassar, Miami University, Brown, Amherst, Johns Hopkins and Princeton.

There were 51 schools with a decrease in applicants of 10% or less, 25 of which had law schools, only two of which were ranked in the top-50, with six schools ranked alphabetically.  The average rank of the 19 numerically ranked law schools was 94.  The other 26 feeder schools in this category include mostly less elite colleges and universities – including, for example, Kennesaw State University, University of Texas at San Antonio, and Florida Gulf Coast University, along with University of Phoenix and Kaplan University.

Conclusion

All three approaches to analyzing the changes in applicants among the Top-240 Feeder Schools point in the same direction.  Graduates of elite colleges and universities are opting not to apply to law school at a greater rate than graduates of less elite colleges and universities.  One might suppose that this translates to a greater decline in the number of applicants and matriculants with really high LSATs (165 or above) as compared to those with relatively low LSATs (149 and below).  In Part 2, I explore whether this supposition is accurate.

Posted by Jerry Organ

October 11, 2013 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Wednesday, October 2, 2013

Enduring Hierarchies in American Legal Education

Because the U.S. News & World Report ranking era has been associated with so much turmoil and bad behavior, many of us in legal education tend to think of the magazine as the source of our woes.  In fact, the evidence compiled in a new paper on SSRN, "Enduring Hierarchies in American Legal Education," suggests that our desire (or propensity) to establish a legal education pecking order predates the U.S. News rankings by a century or so.  Vanity of vanities, all is vanity -- at least that is what the data seem to suggest.

 My brilliant and industrious colleagues, Funmi Arewa and Andy Morriss, led the charge on this.  For many, a major contribution of this research will be the detailed 40+ tables compiled at the end of the article.  Now that all that fact-gathering work is done, others can use it.   Below is the paper's abstract:

Although much attention has been paid to U.S. News & World Report’s rankings of U.S. law schools, the hierarchy it describes is a long-standing one rather than a recent innovation. In this Article, we show the presence of a consistent hierarchy of U.S. law schools from the 1930s to the present, provide a categorization of law schools for use in research on trends in legal education, and examine the impact of U.S. News’s introduction of a national, ordinal ranking on this established hierarchy. The Article examines the impact of such hierarchies for a range of decision-making in law school contexts, including the role of hierarchies in promotion, tenure, publication, and admissions, for employers in hiring, and for prospective law students in choosing a law school. This Article concludes with suggestions for ways the legal academy can move beyond existing hierarchies and at the same time address issues of pressing concern in the legal education sector. Finally, the Article provides a categorization of law schools across time that can serve as a basis for future empirical work on trends in legal education and scholarship.

Posted by Bill Henderson

October 2, 2013 in Data on legal education, New and Noteworthy, Scholarship on legal education | Permalink | Comments (0)

Monday, September 16, 2013

The Trend Toward Legal Onshoring -- What It Tells Us about the Future of Law

The trend toward outsourcing of legal work to India may be giving way to "onshoring."  What is the attraction of moving legal jobs back to the US?  The wage gap between India and the US is closing, but more importantly, innovation and continuous improvement are significantly aided by proximity. 

I heard this perspective from a friend of mine who was part of the management team of a successful LPO that was sold (at a substantial profit) to a much larger legal conglomerate.  Indeed, he contemplated getting back into the business, but this time running an onshoring operation. 

This identical perspective is on display in a recent Minneapolis StarTribune story on Black Hills IP, a 2.0 legal process outsourcer that provides various types of managed services for all things related to intellectual property.   According to its website, Black Hills IP is a "US-based IP paralegal service that is faster, more accurate and more cost-effective than in house departments and off-shore providers."  The company appears to be growing, as it did a PR blitz to commemorate its 100th client.  The company was originally started in Rapid City, South Dakota, but has since expanded to Minneapolis.

What makes this story especially interesting is that many of the folks who started Black Hills IP were sophisticated Minneapolis corporate lawyers who created a company in the early 2000s called Intellevate, a 1.0 LPO that was sending legal work to India.  In 2006, Intellevate became part of CPA Global, a much larger LPO.  In other words, the folks at Black Hills IP are industry players with much better information than the rest of us, and they are making bets with their own money. 

Unlike traditional law firms, these types of legal vendors are growing rapidly.  Their secret sauce appears to be combining high-quality processes with capable, motivated paraprofessional talent. 

The challenge for law schools and many practicing lawyers is getting our heads around the fact that, from a pure market perspective, bright legal minds may be less valuable than well-designed and well-executed legal processes and systems.  This state of affairs is just as much an opportunity as it is a threat.

One last interesting note suggesting that companies like Black Hills IP are part of the same ecosystem as traditional law firms and law schools: The CEO of Black Hills IP is Ann McCrackin, a former professor of law at Franklin Pierce (now University of New Hampshire School of Law), where she was director of the Patent Prosecution and Procedure Program.  Prior to that, McCrackin was a shareholder in Schwegman, Lundberg & Woessner, a large patent law firm based in Minneapolis that specializes in high technology.

posted by Bill Henderson

September 16, 2013 in Current events, Data on legal education, Data on the profession, Innovations in law, New and Noteworthy, Structural change | Permalink | Comments (1)

Wednesday, July 3, 2013

Conditional Scholarships and Scholarship Retention for 2011-12

             As a result of the ABA’s revisions to Standard 509, Consumer Information, there is now a much greater universe of publicly available information about law school scholarship programs, specifically conditional scholarship programs and scholarship retention.  Based on a review of law school websites conducted between March 19 and May 29, 2013, I have compiled a complete list of schools with conditional scholarship programs, with only one-year scholarships, with good standing (or guaranteed) scholarships and with only need-based scholarships. 

            The availability of this data now gives each admitted scholarship recipient some meaningful basis for assessing the likelihood that any given scholarship will be renewed.   (That said, within a given cohort of conditional scholarship recipients at a given school, those at the top end of the entering class profile likely retain their scholarships at a higher percentage than reflected in the law school's overall data while those further down the class profile likely retain their scholarships at a lower percentage than reflected in the law school's overall data.)

            What do we know about the conditional scholarship programs in place for students entering law school in 2011-12?  There were 140 schools with conditional scholarship programs.  The average retention rate across all law schools was 69%.  In total, 12,735 students who entered law school in the fall of 2011 and continued into their second year of law school at the same school entered with conditional scholarships and 4,387 students lost those scholarships, a retention rate across individual students of 66%. Across the 194 law schools on which I compiled data, the Fall 2011 entering first-year class totaled 46,233, so roughly 27.5% of the students in the Fall 2011 entering first-year class were on conditional scholarships and roughly 9.5% of the students in the Fall 2011 entering first-year class failed to retain their conditional scholarship as they moved into the second year of law school.
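
The 69% school-level figure and the 66% student-level figure differ because one averages per-school rates while the other pools all students. The pooled, per-student percentages can be reproduced from the totals above:

```python
entered_with_conditional = 12_735  # Fall 2011 1Ls entering with conditional scholarships
lost_scholarship = 4_387
total_first_years = 46_233         # across the 194 schools compiled

retained = entered_with_conditional - lost_scholarship
print(f"{retained / entered_with_conditional:.0%}")           # 66% per-student retention
print(f"{entered_with_conditional / total_first_years:.1%}")  # 27.5% of 1Ls on conditional scholarships
print(f"{lost_scholarship / total_first_years:.1%}")          # 9.5% of 1Ls lost their scholarships
```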

            The distribution of scholarship retention rates by deciles across all 140 schools reporting conditional scholarship programs is set forth in Table 1.  Table 1 shows the largest number of law schools grouped around the overall average retention rate, with 30 law schools in the 60-69% range and 24 law schools in the 70-79% range; nearly 40 percent of law schools with conditional scholarships fall in these two ranges.  Interestingly, the decile range of 90% or better is the second largest decile range, with 26 law schools (nearly half of which are ranked 50 or better in the USNEWS ranking).  Notably, 23 law schools had scholarship retention rates of less than 50%.

 Table 1: Number of Law Schools Reporting Retention Rates by Decile Range 

Retention Rate    Number   Brief Description
Less than 40%     8        Four of the eight were law schools ranked alphabetically
40-49%            15       Eight of the 15 were law schools ranked between 50 and 99
50-59%            20       16 of the 20 were ranked 100 or lower, while only two were in the top 50
60-69%            30       23 of the 30 were ranked 100 or lower, while only one was in the top 50
70-79%            24       13 of the 24 were ranked in the top 100, but only three of those were in the top 50
80-89%            17       12 of the 17 were law schools ranked between 50 and 145
90% or better     26       12 of the 26 were law schools ranked in the top 50

             As shown in Table 2, law schools ranked in the top 50 in the U.S. News 2012 rankings had the smallest percentage of schools with conditional scholarship programs: only 20 law schools (40%) had such programs, directly affecting only 1,674 students with conditional scholarships (12.8% of the 13,109 first-year students at these schools), of whom only 192 failed to retain their scholarships (11.5% of the 1,674 conditional scholarship recipients and only 1.5% of the 13,109 first-years).   By contrast, across the balance of law schools, over 80% had conditional scholarship programs, with 11,061 of the 33,124 first-year students (33.4%) holding conditional scholarships and 4,195 (37.9% of those on scholarship, and 12.7% of first-years at those schools) losing their scholarships after the first year of law school.

 Table 2: Number and Percentage of First-Year Students in 2011 Having Conditional Scholarships and Losing Conditional Scholarships by US News Rankings Categories 

                                          Top 50        51-100         101-146        Alphabetical
Total number of law schools               50            50             46             48
Schools with conditional programs         20 (40%)      40 (80%)       36 (78.3%)     43 (89.6%)
Total first-years at these schools        13,109        11,592         9,293          12,239
First-years with conditional
scholarships (% of first-years)           1,674 (12.8%) 4,176 (36%)    2,754 (29.6%)  4,131 (33.6%)
Recipients NOT retaining scholarships
(% of recipients / % of first-years)      192           1,454          1,044          1,697
                                          (11.5%/1.5%)  (34.8%/12.5%)  (37.9%/11.2%)  (41%/13.7%)
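The Table 2 percentages can likewise be recomputed from the raw counts. The sketch below (added for illustration; the "balance" grouping combines schools ranked 51-100, 101-146, and alphabetically, as in the paragraph above) reproduces the reported shares:

```python
# Recompute the Table 2 percentages from the raw counts reported above.
groups = {
    "top 50":  {"first_years": 13109, "conditional": 1674, "lost": 192},
    "balance": {"first_years": 11592 + 9293 + 12239,  # 33,124 first-years
                "conditional": 4176 + 2754 + 4131,    # 11,061 on scholarship
                "lost": 1454 + 1044 + 1697},          # 4,195 lost scholarships
}

results = {}
for label, d in groups.items():
    results[label] = {
        "conditional share":    d["conditional"] / d["first_years"],
        "lost, of recipients":  d["lost"] / d["conditional"],
        "lost, of first-years": d["lost"] / d["first_years"],
    }

for label, r in results.items():
    print(label, {k: f"{v:.1%}" for k, v in r.items()})
```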

            A number of law schools switched to non-conditional scholarship programs for 2012-13 or will be switching for the 2013-14 academic year. As a result, for the 2013-14 academic year there will be 131 law schools with conditional scholarship programs, five with non-renewable one-year scholarships, four that offer only need-based scholarships, and 54 with good-standing (or guaranteed) scholarships.  Of the 194 schools on which I gathered information, therefore, as of the 2013-14 academic year, 70% will have conditional or one-year scholarship programs (136/194), nearly 28% will have good-standing (or guaranteed) scholarships (54/194), and 2% (4/194) will offer only need-based scholarship assistance. (Note that some law schools with conditional scholarship programs also offer some scholarships on a non-conditional basis and/or some need-based assistance.)

            Those interested in a more detailed analysis of conditional scholarship programs may want to look at the draft article I have posted on SSRN, Better Understanding the Scope of Conditional Scholarship Programs in American Law Schools.

[posted by Jerry Organ]

July 3, 2013 in Data on legal education, New and Noteworthy, Scholarship on legal education | Permalink | Comments (0)