Wednesday, December 10, 2014

BETTER UNDERSTANDING THE TRANSFER MARKET

What do we know about the transfer student market in legal education? 

Not enough.  But that will begin to change in the coming weeks.

NUMBER/PERCENTAGE OF TRANSFER STUDENTS HAS INCREASED MODESTLY

Up until this year, the ABA Section of Legal Education and Admissions to the Bar only asked law schools to report the number of transfer students "in" and the number of transfer students "out."  This allowed us to understand roughly how many students transfer and gave us some idea of where they go and where they come from, but without any direct "matching" of exits and entrances.

Has the number and percentage of transfer students changed in recent years?

In 2010, Jeff Rensberger published an article in the Journal of Legal Education in which he analyzed much of the then-available data regarding the transfer market and evaluated some of the issues associated with transfer students.  He noted that from 2006 to 2009 the number of transfer students had remained within a range representing roughly 5% of the rising second-year class (after accounting for other attrition): 2,265 in summer 2006, 2,324 in summer 2007, 2,400 in summer 2008, and 2,333 in summer 2009.

Using data published in the law school Standard 509 reports, the number of transfers increased only marginally in 2011, 2012 and 2013, from 2,427 to 2,438 to 2,501, but, given the declining number of law students, transfers increased as a percentage of the preceding year's first-year class, from 4.6% to 5.6%.  Thus, there is a sense in which the transfer market is growing, even if not growing dramatically.
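For readers who want to check the arithmetic, the percentages are simply transfers divided by the prior year's first-year enrollment. Here is a minimal Python sketch (my own illustration, not anything published by the ABA or LSAC) that reproduces the figures shown in the table below:

```python
# Transfers as a share of the preceding year's first-year class,
# using the figures from the table below.
transfers = {2006: 2265, 2007: 2324, 2008: 2400, 2011: 2427, 2012: 2438, 2013: 2501}
prior_year_1ls = {2006: 48100, 2007: 48900, 2008: 49100, 2011: 52500, 2012: 48700, 2013: 44500}

for year in sorted(transfers):
    share = transfers[year] / prior_year_1ls[year]
    print(f"{year}: {transfers[year]:,} transfers = {share:.1%} of the prior year's 1L class")
```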

Numbers of Transfer Students 2006-2008 and 2011-2013

 

                                    | 2006   | 2007   | 2008   | 2011   | 2012   | 2013
Number of Transfers                 | 2,265  | 2,324  | 2,400  | 2,427  | 2,438  | 2,501
Previous Year First-Year Enrollment | 48,100 | 48,900 | 49,100 | 52,500 | 48,700 | 44,500
% of Previous First-Year Total      | 4.7%   | 4.8%   | 4.9%   | 4.6%   | 5.0%   | 5.6%

 

SOME SCHOOLS DOMINATE THE TRANSFER MARKET

In 2008, Bill Henderson and Brian Leiter highlighted issues associated with transfer students.   Henderson and Leiter were discussing the data from the summer of 2006.  Brian Leiter posted a list of the top ten law schools for net transfer students as a percentage of the first year class.  Bill Henderson noted the distribution of transfer students across tiers of law schools (with the law schools in the top two tiers generally having positive net transfers and the law schools in the bottom two tiers generally having negative net transfers), something Jeff Rensberger also noted in his 2010 article.   

Things haven’t changed too much since 2006.  In 2012, there were 118 law schools with fewer than 10 “transfers in” representing a total of 485 transfers – slightly less than 20% of all transfers.  On the other end, there were 21 schools with 30 or more “transfers in” totaling 996 transfers -- nearly 41% of all transfers. Thus, roughly 10% of the law schools occupied 40% of the market (increasing to nearly 44% of the market in 2013).

We also know who the leading transfer schools have been over the last three years.  The following two charts list the top 20 transfer schools in Summer 2011 (fall 2010 entering class), Summer 2012 (fall 2011 entering class) and Summer 2013 (fall 2012 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first year class.

Largest Law Schools by Number of Transfers in 2011, 2012 and 2013

(* indicates presence on the list in all three years)

 

School (2011)     | Number | School (2012)     | Number | School (2013)     | Number
George Wash.*     | 104    | Florida State*    | 89     | Georgetown*       | 122
Georgetown*       | 71     | Georgetown*       | 85     | George Wash.*     | 93
Florida St.*      | 57     | George Wash.*     | 63     | Florida St.*      | 90
New York Univ.*   | 56     | Columbia*         | 58     | Emory             | 75
American*         | 53     | Michigan State    | 54     | Arizona State*    | 73
Michigan State    | 52     | New York Univ.*   | 53     | American*         | 68
Columbia*         | 46     | American*         | 49     | Texas             | 59
Cardozo*          | 45     | Cardozo*          | 48     | Columbia*         | 52
Loyola Marymount  | 44     | Loyola Marymount  | 46     | New York Univ.*   | 47
Washington Univ.  | 42     | Rutgers - Camden  | 42     | Minnesota*        | 45
Cal. Los Angeles* | 40     | Minnesota*        | 42     | Arizona           | 44
Michigan          | 39     | Arizona State*    | 42     | Northwestern*     | 44
Northwestern*     | 39     | Cal. Berkeley     | 41     | Cal. Los Angeles* | 41
Rutgers - Camden  | 36     | Emory             | 41     | Cardozo*          | 38
San Diego         | 35     | Cal. Los Angeles* | 39     | Southern Cal.     | 37
Arizona State*    | 34     | Northwestern*     | 38     | Utah              | 34
Brooklyn          | 33     | Florida           | 37     | Harvard*          | 34
Cal. Hastings     | 32     | Maryland          | 34     | Florida           | 33
Minnesota*        | 31     | Michigan          | 33     | Cal. Berkeley     | 32
Lewis & Clark     | 30     | SMU               | 31     | Washington Univ.  | 31
Harvard*          | 30     | Harvard*          | 31     |                   |

Largest Law Schools by Transfers as a Percentage of Previous First Year Class

(* indicates presence on the list in all three years)

 

School (2011)      | % of 2010 1Ls | School (2012)      | % of 2011 1Ls | School (2013)      | % of 2012 1Ls
Florida St.*       | 28.6          | Florida St.*       | 44.5          | Florida State*     | 48.1
George Wash.*      | 19.9          | Arizona State*     | 24.6          | Arizona State*     | 48.0
Utah*              | 19.7          | Michigan State     | 17.5          | Utah*              | 34.7
Arizona State*     | 17.8          | Utah*              | 17.5          | Emory              | 29.6
Michigan State     | 17.4          | Minnesota*         | 17.1          | Arizona            | 28.9
Washington and Lee | 15.3          | Emory              | 16.5          | Minnesota*         | 22.0
Washington Univ.   | 15.2          | Cal. Berkeley      | 16.2          | George Wash.*      | 21.8
Loyola Marymount   | 15.1          | Rutgers - Camden*  | 14.9          | Georgetown*        | 21.2
Northwestern*      | 14.2          | Georgetown*        | 14.7          | Rutgers - Camden*  | 20.7
Richmond           | 13.7          | Southern Cal.      | 14.7          | Southern Cal.      | 19.7
Rutgers - Camden*  | 13.4          | Northwestern*      | 14.4          | Texas              | 19.1
Cal. Los Angeles   | 13.0          | Cincinnati         | 14.3          | Cincinnati         | 17.5
Cal. Davis         | 12.8          | Columbia*          | 14.3          | Northwestern*      | 17.1
Lewis & Clark      | 12.1          | Buffalo            | 14.2          | Washington Univ.   | 15.4
Georgetown*        | 12.0          | Arizona            | 14.0          | Univ. Washington   | 15.3
Minnesota*         | 11.9          | Cardozo            | 13.8          | Columbia*          | 14.2
New York Univ.     | 11.8          | SMU                | 13.4          | American           | 13.8
Cardozo            | 11.8          | Florida            | 12.7          | SMU                | 13.3
Columbia*          | 11.4          | Chicago            | 12.6          | Cal. Los Angeles   | 13.3
Buffalo            | 11.0          | George Wash.*      | 12.5          | Chicago            | 13.0

 

Note that in these two charts, the "repeat players" are marked with an asterisk: those schools in the top 20 for all three years, 2011, 2012 and 2013.  (Four of the top ten schools Leiter highlighted from the summer of 2006 remain in the top ten as of the summer of 2013, with four others still in the top 20.)  In addition, it is worth noting some significant changes between 2011 and 2013.  For example, the number of schools with 50 or more transfers increased from six to eight; only two schools had more than 70 transfers in 2011 and in 2012, but five schools had more than 70 transfers in 2013.

Leiter's top ten law schools took in a total of 482 transfers, representing 21.3% of the 2,265 transfers that summer.  The top ten law schools in 2011 totaled 570 transfers, representing 23.5% of the 2,427 transfer students that summer.  The top ten law schools in 2012 totaled 587 transfers, representing 24.1% of the 2,438 transfers that summer.  The top ten law schools in 2013, however, totaled 724 students, representing 28.9% of the 2,501 transfers in 2013, demonstrating an increasing concentration in the transfer market between 2006 and 2013, and even more so between 2012 and 2013.
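The concentration figures in the preceding paragraph reduce to a simple ratio. A short sketch of that calculation (my own illustration), using the top-ten totals quoted above:

```python
# Share of all transfers captured by the ten largest transfer-in schools.
top_ten_transfers = {2006: 482, 2011: 570, 2012: 587, 2013: 724}
all_transfers = {2006: 2265, 2011: 2427, 2012: 2438, 2013: 2501}

for year in sorted(top_ten_transfers):
    share = top_ten_transfers[year] / all_transfers[year]
    print(f"{year}: top ten schools took {share:.1%} of all transfers")
```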

In addition, three of the top four schools with the highest number of transfers were the same all three years: Georgetown welcomed 71 in the summer of 2011, 85 in the summer of 2012, and 122 in the summer of 2013; George Washington welcomed 104 in the summer of 2011, 63 in the summer of 2012, and 93 in the summer of 2013; and Florida State welcomed 57 in the summer of 2011, 89 in the summer of 2012 and 90 in the summer of 2013.  (Notably, Georgetown and Florida State were the two top schools for transfers in 2006, with 100 and 59 transfers in, respectively.)

Similarly, three of the top four schools with the highest “percentage of transfers” were the same all three years, with Utah at 19.7% in 2011, 17.5% in 2012 and 34.7% in 2013, Arizona State at 17.8% in 2011, 24.6% in 2012 and 48% in 2013, and Florida State at 28.6% in 2011, 44.5% in 2012 and 48.1% in 2013.  The top five schools on the “percentage of transfers” chart all increased the “percentage” of transfer students they welcomed between 2011 and 2013, some significantly, which also suggests greater concentration in the transfer market between 2011 and 2013.

More specifically, there are several schools that have really “played” the transfer game in the last two years – increasing their engagement by a significant percentage.  These eight schools had 10.2% of the transfer market in 2011, but garnered 22.2% of the transfer market in 2013.
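As a check on the "Percentage Increase" column in the table that follows, here is a short sketch of the calculation (values are taken from the table; the rounding may differ from the table by a point or two):

```python
# Growth in transfers-in, summer 2011 to summer 2013, for the eight schools in the table.
transfers_2011_2013 = {
    "Texas": (6, 59), "Arizona": (6, 44), "Emory": (19, 75), "Arizona State": (34, 73),
    "Georgetown": (71, 122), "Florida State": (57, 90), "Southern Cal": (24, 37), "Minnesota": (31, 45),
}

for school, (y2011, y2013) in transfers_2011_2013.items():
    increase = (y2013 - y2011) / y2011
    print(f"{school}: {increase:.0%} increase")
```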

Schools with Significant Increases in Transfers 2011-2013

School        | 2011 | 2012 | 2013 | Percentage Increase
Texas         | 6    | 9    | 59   | 883%
Arizona       | 6    | 24   | 44   | 633%
Emory         | 19   | 41   | 75   | 295%
Arizona State | 34   | 42   | 73   | 115%
Georgetown    | 71   | 85   | 122  | 70%
Florida State | 57   | 89   | 90   | 58%
Southern Cal  | 24   | 29   | 37   | 54%
Minnesota     | 31   | 42   | 45   | 45%
Totals        | 248  | 371  | 555  | 124%

 

REGIONAL MARKETS

There appear to be "regional" transfer markets.  In the Southeast in 2013, for example, three schools -- Florida State, Florida and Emory -- had a combined net inflow of 180 transfer students, while Stetson and Miami were essentially flat (43 transfers in and 42 transfers out, combined), and eight other schools from the region -- Florida Coastal, Charlotte, Charleston, Atlanta's John Marshall, St. Thomas University, Ave Maria, Florida A&M, and Nova Southeastern -- had a combined net outflow of 303.  It seems reasonable to assume that many of the transfers out of these schools found their way to Emory, Florida and Florida State (and perhaps to Miami and Stetson, to the extent that Miami and Stetson lost students to Emory, Florida and Florida State).
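A regional net-flow figure like the ones above is just the sum of transfers in minus transfers out across a group of schools, as reported in each school's Standard 509 report. A minimal sketch, with invented numbers purely for illustration:

```python
def regional_net_flow(schools):
    """Sum of (transfers in - transfers out) across a group of schools.

    schools: iterable of (transfers_in, transfers_out) pairs, e.g. taken from
    each school's Standard 509 report.
    """
    return sum(t_in - t_out for t_in, t_out in schools)

# Hypothetical figures, not any school's actual 509 numbers:
print(regional_net_flow([(75, 3), (37, 6), (90, 2)]))   # a net-importing group of schools
print(regional_net_flow([(2, 60), (1, 48), (0, 55)]))   # a net-exporting group of schools
```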

NEW DATA – NEW INSIGHTS

Starting this fall, the ABA Section of Legal Education and Admissions to the Bar is requiring schools to report not only the number of students who have transferred in, but also the schools from which they came (indicating the number from each school), along with the 75th, 50th and 25th percentile first-year law school GPAs of the pool of students who transferred in to a given school (provided that at least five students transferred in).  As a result, we will be able to delineate the regional transfer markets (as well as identify those schools with more of a national transfer market).

Notably, even though the Section of Legal Education and Admissions to the Bar is not requiring the gathering and publication of the 75th, 50th and 25th percentile LSAT and UGPA figures for transfer students, one thing we are very likely to learn is that for many schools, the LSAT/UGPA profile of transfers in is almost certainly lower than the LSAT/UGPA profile of the first-year matriculants in the prior year, a point that both Henderson and Rensberger highlight in their analyses.

Just look at the schools in the Southeast as an example.  Assume Emory, Florida State and Florida (large "transfer in" schools) are, in fact, admitting a significant number of transfer students from other schools in the Southeast region, such as Miami and Stetson, and schools like Florida Coastal, St. Thomas University, Charlotte, Atlanta's John Marshall and Ave Maria (large "transfer out" schools in the Southeast).  Even if they are taking only students who came from the top quarter of the entering classes at those schools, the incoming transfers would have a significantly less robust LSAT/UGPA profile when compared with the entering class profile at Emory, Florida State or Florida in the prior year.  Virtually every student who might be transferring in to Emory, Florida or Florida State from one of these transfer-out schools (other than Miami and perhaps Stetson) is likely to be in the bottom quarter of the entering class LSAT profile at Emory, Florida, and Florida State.

Comparison of Relative Profiles of Southeast Region Transfer In/Out Schools

TRANSFER IN SCHOOLS | 2012 LSAT (75/50/25) | 2012 UGPA (75/50/25) | TRANSFER OUT SCHOOLS    | 2012 LSAT (75/50/25) | 2012 UGPA (75/50/25)
Emory               | 166/165/161          | 3.82/3.70/3.35       | Miami                   | 159/156/155          | 3.57/3.36/3.14
Florida             | 164/161/160          | 3.73/3.59/3.33       | Stetson                 | 157/157/152          | 3.52/3.28/3.02
Florida State       | 162/160/157          | 3.72/3.54/3.29       | St. Thomas (FL)         | 150/148/146          | 3.33/3.10/2.83
                    |                      |                      | Florida Coastal         | 151/146/143          | 3.26/3.01/2.71
                    |                      |                      | Charlotte               | 150/146/142          | 3.32/2.97/2.65
                    |                      |                      | Atlanta's John Marshall | 153/150/148          | 3.26/2.99/2.60
                    |                      |                      | Ave Maria               | 153/148/144          | 3.48/3.10/2.81

 

This raises an interesting question about LSAT and UGPA profile data.  If we assume that LSAT and UGPA profile data are used not only by law schools as predictors of performance, but also by third parties as evidence of the "strength" of the student body, and ultimately the graduates, of a given law school (for example, by U.S. News in its rankings and by employers in their assessment of the quality of schools at which to interview), what can we surmise about the impact of significant numbers of transfers?  For those law schools with a significant number or percentage of "transfers in" from law schools whose entering class profiles are seemingly much weaker, the entering class profile presently published in each school's Standard 509 disclosure report arguably fails to accurately reflect the LSAT and UGPA quality of the graduating class.  Similarly, if the "transfers out" from a given school come disproportionately from the top half of its entering class, then for those schools as well the published entering class profile arguably fails to accurately reflect the LSAT and UGPA quality of the graduating class.

Using the chart above, if Emory, Florida and Florida State are drawing a significant number of transfers from the regional transfer out schools, and if they had to report the LSAT and UGPA profile of their second-year class rather than their first-year class, their LSAT and UGPA profiles almost certainly would decline.   (The same likely would be true for other law schools with large numbers of transfers.)

STILL MANY UNKNOWNS

Even with more granular data available in the near future to delineate more clearly the transfer pathways between transfer out schools and transfer in schools, there still will be a significant number of unknowns relating to transfer students, regarding employment outcomes, the demographics of transfers, the experience of transfers and the motivation for transfers.

First, with respect to the employment outcomes of transfer students, how do they compare with the employment outcomes for students who started at a law school as first-years? Do the employment outcomes for transfer students track that of students who started at a law school as first-years, or is the employment market for transfer students less robust than it is for students who started at a law school as first-years?  Are the employment outcomes nonetheless better than they might have been at the school from which they transferred? These are important questions given the perception that many students transfer “up” in the rankings to improve their employment opportunities. 

Second, with respect to demographics, do students of color and women participate proportionately in the transfer market or is the market disproportionately occupied by white males?

Third, with respect to the experience of transfers, the Law School Survey of Student Engagement gathered some data from participating law schools in 2005 regarding the experience of transfers but more could be done to better understand how integrated transfer students are in the life of the learning community into which they transfer.

Fourth, with respect to the motivations of transfers, it is generally assumed that transfers are “climbing” the rankings, and Henderson’s data broadly suggests movement from lower-ranked schools to higher-ranked schools, but what percentage of transfers are doing so partly or primarily for geographic reasons – to be near family or a future career location?  How many are transferring for financial reasons because they lost a conditional scholarship after their first year of law school?  How many truly are transferring to get a JD from a higher ranked law school?  How many of those believe their job opportunities will be better at the school to which they are transferring?

We will have answers to some of these questions soon, but many others will remain unanswered.

December 10, 2014 in Data on legal education | Permalink | Comments (7)

Tuesday, December 2, 2014

The Market for Law School Applicants -- A Milestone to Remember

In early 2013, Michael Moffitt, the dean of Oregon Law, was interviewed by the New York Times about the tumult affecting law schools. Moffitt, who is a very thoughtful guy, responded, "I feel like I am living a business school case study."

I think the analogy to the business school case study is a good one.  In the nearly two years since that story was published, the market for law school applicants has actually gotten worse.

Yesterday's Dealbook column in the New York Times featured Northwestern Law Dean Dan Rodriguez (who also serves as President of the AALS) speaking candidly about the meltdown dynamics that have taken hold.  See Elizabeth Olson, "Law School is Buyer's Market, with Top Students in Demand," New York Times, Dec. 1, 2014.

"It's insane," said Rodriguez. "We're in hand-to-hand combat with other schools." The trendlines are indeed terrible.  Year-over-year, LSAT test-taker volume is down another 8.7%.  See Organ, LWB, Nov 11, 2014.  So we can expect the situation to get worse, at least in the near term.

I applaud Dan Rodriguez for his leadership instincts.  He is being transparent and honest.  Several years ago the leadership of the AALS went to great lengths to avoid engagement with the media. Dan has gone in the opposite direction, inviting the press into our living room and kitchen.

Want to know what leadership and judgment look like?  They look like Dan's interview with Elizabeth Olson.  Dan's words did not solve anyone's problem, but his honesty and candor made it more likely that we will help ourselves.  Because it's Northwestern, and Dan is president of the AALS (something the story did not mention but most of us know), and this was reported by Elizabeth Olson in the New York Times, the substance and tenor of discussions within law school faculties are bound to shift, at least slightly and in the direction favoring change.

What is the de facto plan at most law schools these days?  Universities are not going to backstop law schools indefinitely. I think the sign below is not far off the mark.  

[Image: "Outrun the Bear" sign]

We are indeed living through a business school case study, which is both bad and good.   At many schools -- likely well more than half --  hard choices need to be made to ensure survival.  (And for the record, virtually all schools, regardless of rank, are feeling uncomfortable levels of heat.)   A law school needs cash to pay its expenses.  But it also needs faculty and curricula to attract students. The deeper a law school cuts, the less attractive it becomes to students.  Likewise, pervasive steep discounts on tuition reflect a classic collective action problem. Some schools may eventually close, but a huge proportion of survivors are burning through their financial reserves.  

Open admissions, which might pay the bills today, will eventually force the ABA and DOE to do something neither really wants to do -- aggressively regulate legal education.  This is not a game that is likely to produce many winners.  Rather than letting this play out, individual law schools would be much better off pursuing a realistic strategic plan that can actually move the market.

The positive side of the business school case study is that a few legal academics are finding their voice and learning -- for the first time in several generations -- how to lead.  Necessity is a wonderful tutor.  Law is not an industry on the decline -- far from it.  The only thing on the decline is the archetypal artisan lawyer that law schools are geared to churn out.  Indeed, back in 2013 when Dean Moffitt commented about living through a business school case study, he was not referencing imminent failure.   Sure, Moffitt did not like the hand he was being dealt, but as the 2013 article showed, his school was proving to be remarkably resourceful in adapting.

The good news resides on the other side of a successful change effort.  The process of change is painful, yet the effects of change can be transformative and make people truly grateful for the pain that made it all possible.  In our case, for the first time in nearly a century, what we teach, and how we teach it, is actually going to matter.  If we believe serious publications like The Economist, employers in law, business, and government need creative problem solvers who are excellent communicators, adept at learning new skills, and comfortable collaborating across multiple disciplines -- this is, in fact, a meaningful subset of the growing JD-Advantage job market.

In the years to come, employers will become more aggressive in looking for the most reliable sources of talent, in part because law schools are going to seek out preferred-provider relationships with high-quality employers.  Hiring based on school prestige is a remarkably ineffective way to build a world-class workforce -- Google discovered this empirically.

From an employer perspective, the best bet is likely to be three years of specialized training, ideally where applicants are admitted based on motivation, aptitude, and past accomplishments. The LSAT/UGPA grid method misses this by a wide margin. After that, the design and content of curricula are going to matter.  It is amazing how much motivated students can learn and grow in three years. And remarkably, legal educators control the quality of the soil.  It brings to mind that seemingly trite Spiderman cliche about great power.

For those of us working in legal education, the next several years could be the best of times or the worst of times.  We get to decide.  Yesterday's article in the Times made it a little more likely that we actually have the difficult conversations needed to get to the other side. 

December 2, 2014 in Current events, Data on legal education, Innovations in legal education, New and Noteworthy, Structural change | Permalink | Comments (4)

Wednesday, November 26, 2014

The Michael S. Maurer Crossword Puzzle

Apropos of not much at all, I noticed when printing my New York Times crossword puzzle this morning that its constructor was the same Michael S. Maurer who is the named benefactor of Bill Henderson's school.

I knew Mickey Maurer when I was in Indianapolis and he ran the Indiana Economic Development office.  Stand up guy.  He told me that he wasn't very good at doing crosswords, even though he was a regular contributor to the New York Times.

N.B.:  Will Shortz is also an Indiana U. grad.  I don't know if that's a coincidence.

November 26, 2014 | Permalink | Comments (2)

Tuesday, November 11, 2014

What Might Have Contributed to an Historic Year-Over-Year Decline In the MBE Mean Scaled Score?

The National Conference of Bar Examiners (NCBE) has taken the position that the historic drop in the MBE Mean Scaled Score of 2.8 points between the July 2013 administration of the bar exam (144.3) and the July 2014 administration of the bar exam (141.5) is solely attributable to a decline in the quality of those taking a bar exam this July.  Specifically, in a letter to law school deans, the NCBE stated that:  “Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results.  All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013.”

Notably, the NCBE does not indicate what other “indicators” it looked at “to challenge the results.”  Rather, the NCBE boldly asserts that the only fact that explains an historic 2.8 point drop in the MBE Mean Scaled Score is “that the group that sat in July 2014 was less able than the group that sat in July 2013."

I am not persuaded.   

(Neither is Brooklyn Law School Dean Nicholas Allard, who has responded by calling the letter “offensive” and by asking for a “thorough investigation of the administration and scoring of the July 2014 exam.”  Nor is Derek Muller, who earlier today posted a blog suggesting that the LSAT profile of the class of 2014 did not portend the sharp drop in MBE scores.)

I can’t claim to know how the NCBE does its scaled scoring, so for purposes of this analysis, I will take the NCBE at its word that it has “double-checked” all of its calculations and found that there are no errors in its scoring.

If we accept the premise that there are no scoring issues, then the historic decline in the MBE Mean Scaled Score is attributable either to a “less able” group taking the MBE in July 2014 or to issues associated with the administration of the exam or to some combination of the two.

The NCBE essentially has ignored the possibility that issues associated with the administration of the exam might have contributed to the historic decline in the MBE Mean Scaled Score and gone “all in” on the “less able” group explanation for the historic decline in the MBE Mean Scaled Score.  The problem for the NCBE is that it will be hard-pressed to demonstrate that the group that sat in July 2014 was sufficiently “less able” to explain the historic decline in the MBE Mean Scaled Score.

If one looks at the LSAT distribution of the matriculants in 2011 (who became the graduating class of 2014) and compares it with the LSAT distribution of the matriculants in 2010 (who became the graduating class of 2013), the NCBE probably is correct in noting that the group that sat in July 2014 is slightly “less able” than the group that sat in July 2013.  But for the reasons set forth below, I think the NCBE is wrong to suggest that this alone accounts for the historic drop in the MBE Mean Scaled Score.

Rather, a comparison of the LSAT profile of the Class of 2014 with the LSAT profile of the Class of 2013 would suggest that one could have anticipated a modest drop in the MBE Mean Scaled Score of perhaps .5 to 1.0.  The modest decrease in the LSAT profile of the Class of 2014 when compared with the Class of 2013, by itself, does not explain the historic drop of 2.8 reported in the MBE Mean Scaled Score between July 2013 and July 2014.

THINKING ABOUT GROUPS

The “group” that sat in July 2014 is comprised of two subgroups of takers – first-time takers and those who failed a bar exam and are retaking the bar exam.  I am not sure the NCBE has any basis to suggest that those who failed a bar exam and are “retaking” the bar exam in 2014 were a less capable bunch than a comparable group that was “retaking” the bar exam in 2013 (or in some other year).

What about “first-time takers”?  That group actually consists of two subgroups as well – those literally taking the exam for the first time and those who passed an exam in one jurisdiction and are taking the exam for the “first-time” in another jurisdiction.  Again, I am not sure the NCBE has any basis to suggest that those who passed a bar exam and are taking a bar exam in another jurisdiction in 2014 were a less capable bunch than a comparable group that was taking a second bar exam in 2013.

So who’s left?  Those who actually were taking a bar exam for the very first time in July 2014 – the graduates of the class of 2014.  If we accept the premise that the "retakers" in 2014 were not demonstrably different from the "retakers" in 2013, then the group that was "less capable" in 2014 has to be the graduates of 2014, who the NCBE asserts are "less capable" than the graduates of 2013.

COMPARING LSAT PROFILES

The objective criteria of the class that entered law school in the fall of 2011 (class of 2014) are slightly less robust than those of the class that entered law school in the fall of 2010 (class of 2013).  The question, however, is whether the drop in quality between the class of 2013 and the class of 2014 is large enough that we could anticipate it would yield an historic drop in the MBE Mean Scaled Score of 2.8 points.

The answer to that is no.

The difference in profile between the class of 2014 and the class of 2013 does not reflect an “historic” drop in quality and would seem to explain only some of the drop in MBE Mean Scaled Score, not a 2.8 point drop in MBE Mean Scaled Score.

To understand this better, let’s look at how the trends in student quality have related to changes in the MBE Mean Scaled Score over the last decade. 

Defining "student quality" can be a challenge.  A year ago, I noted changes over time in three "groups" of matriculants – those with LSATs at or above 165, those with LSATs of 150-164, and those with LSATs below 150.  Between 2010 and 2013, the number at or above 165 declined significantly while the number below 150 actually grew, resulting in a smaller percentage of the entering class with LSATs at or above 165 and a larger percentage with LSATs below 150.

While the relatively simplistic calculations described above would provide some basis for anticipating declines in bar passage rates by 2016, they would not explain what is going on this year without more refinement.

In his blog posting earlier today, Derek Muller attempts to look at the strength of each class by calculating "projected MBE" scores drawing on an article from Susan Case and then comparing those to the actual MBE scores, showing some close relationship over time (until this year). I come to a similar conclusion using a different set of calculations of the "strength" of the graduating classes over the last several years based on the LSAT distribution profile of the matriculating classes three years earlier.

To develop this more refined analysis of the strength of the graduating classes over the last nine years, I used the LSAC’s National Decisions Profiles to identify the distribution of matriculants in ten five-point LSAT ranges – descending from 175-180 down to 130-134.  To estimate the “strength” of the respective entering classes, I applied a prediction of bar passage rates by LSAT scores to each five point grouping and came up with a “weighted average” bar passage prediction for each class. 

(In his article, Unpacking the Bar: Of Cut Scores, Competence and Crucibles, Professor Gary Rosin of the South Texas College of Law developed a statistical model for predicting bar passage rates for different LSAT scores.  I used his bar passage prediction chart to assess the "relative strength" of each entering class from 2001 through 2013.

LSAT Range | Predicted Bar Passage (based on lowest LSAT in range)
175-180    | .98
170-174    | .97
165-169    | .95
160-164    | .91
155-159    | .85
150-154    | .76
145-149    | .65
140-144    | .50
135-139    | .36
130-134    | .25

Please note that for the purposes of classifying the relative strength of each class of matriculants, the precise accuracy of the bar passage predictions is less important than the fact of differential anticipated performance across groupings which allows for comparisons of relative strength over time.)

One problem with this approach is that the LSAC (and law schools) changed how they reported the LSAT profile of matriculants beginning with the entering class in the fall of 2010.  Up until 2009, the LSAT profile data reflected the average LSAT score of those who took the LSAT more than once.  Beginning with matriculants in fall 2010, the LSAT profile data reflects the highest LSAT score of those who took the LSAT more than once.  This makes direct comparisons between fall 2009 (class of 2012) and years prior and fall 2010 (class of 2013) and years subsequent difficult without some type of “adjustment” of profile in 2010 and beyond.

Nonetheless, the year over year change in the 2013-2014 time frame can be compared with year over year changes in the 2005-2012 time frame.

Thus, having generated these “weighted average” bar passage projections for each entering class starting with the class that began legal education in the fall of 2002 (class of 2005), we can compare these with the MBE Mean Scaled Score for each July in which a class graduated, particularly looking at the relationship between the change in relative strength and the change in the corresponding MBE Mean Scaled Score.  Those two lines are plotted below for the period from 2005-2012.  (To approximate the MBE Mean Scaled Score for graphing purposes, the strength of each graduating class is calculated by multiplying the weighted average predicted bar passage percentage, which has ranged from .801 to .826, times 175.)
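Here is a minimal sketch of the weighted-average calculation described above; the example class distribution is invented solely for illustration (the actual counts come from the LSAC's National Decision Profiles):

```python
# Rosin's predicted bar passage rate for each five-point LSAT band (from the chart above).
ROSIN_PREDICTION = {
    (175, 180): .98, (170, 174): .97, (165, 169): .95, (160, 164): .91,
    (155, 159): .85, (150, 154): .76, (145, 149): .65, (140, 144): .50,
    (135, 139): .36, (130, 134): .25,
}

def class_strength(matriculants_by_band):
    """Enrollment-weighted bar passage prediction for an entering class, scaled by 175.

    matriculants_by_band: dict mapping an LSAT band (tuple) to the number of
    matriculants in that band.
    """
    total = sum(matriculants_by_band.values())
    weighted = sum(ROSIN_PREDICTION[band] * n for band, n in matriculants_by_band.items())
    predicted_pass_rate = weighted / total      # roughly .80 to .83 for recent classes
    return predicted_pass_rate * 175            # scaled for comparison with the MBE mean

# Invented distribution for illustration only:
example_class = {(165, 169): 6000, (160, 164): 10000, (155, 159): 12000,
                 (150, 154): 10000, (145, 149): 6000, (140, 144): 3000}
print(round(class_strength(example_class), 1))   # ~141.5 under these made-up counts
```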

Comparison of Class Strength Based on Weighted Average Class Strength (Weighted Average Bar Passage Prediction x 175) with the MBE Mean Scaled Score for 2005-2012


What this graph highlights is that between 2005 and 2012, year to year changes in the MBE Mean Scaled Score largely “tracked” year to year changes in the “quality” of the graduating classes.  But perhaps most significantly, the degree of change year over year in “quality” generally is reflected in the “degree” of change year over year in MBE Mean Scaled Scores.  From 2008 to 2009, the drop in “quality” of 1.5 from 144.6 to 143.1 actually was reflected in a drop in MBE Mean Scaled Scores from 145.6 to 144.7, a drop of 0.9 points.  Similarly, from 2009 to 2010, the drop in “quality” of 1.1 from 143.1 to 142 actually was reflected in a drop in the MBE Mean Scaled Scores from 144.7 to 143.6, a drop of 1.1 points.  This two-year drop in quality of 2.6 points from 144.6 to 142 corresponded to a two-year drop in MBE Mean Scaled Scores of 2.0 points from 145.6 to 143.6.

How does this help us understand what has happened in 2014 relative to 2013?  The decrease in quality of the class of 2014 relative to the class of 2013 using the “Weighted Average Bar Passage Projection” methodology above reflects a change from 145.1 to 144.2 – a drop of 0.9 (less than the year over year changes in 2009 and 2010).  Accordingly, one might anticipate a decline in MBE Mean Scaled Scores, but probably a decline slightly smaller than the declines experienced in 2009 and 2010 – declines of .9 and 1.1 point, respectively. 

Does the decline in quality between the Class of 2013 and the Class of 2014 explain some of the decline in MBE Mean Scaled Scores?  Certainly.  This analysis suggests a decline comparable to or slightly less than the declines in 2009 and 2010 should have been expected.

But that is not what we have experienced.  We have experienced an historic decline of 2.8 points.  Yet, the NCBE tells us that in looking at other indicators “all point to the fact that the group that sat in July 2014 is less able than the group that sat in July 2013.” 

THE EXAMSOFT DEBACLE

What the NCBE fails to discuss, or even mention, is that there is one other “indicator” that was a distinctive aspect of the bar exam experience for the group that sat in July 2014 that the group that sat in July 2013 did not experience – the ExamSoft Debacle

For many of those in one of the many jurisdictions that used ExamSoft in July 2014, the evening between the essay portion of the bar exam and the MBE portion of the bar exam was spent in needless anxiety and stress associated with not being able to upload the essay portion of the exam.  This stress and anxiety were compounded by messaging suggesting that failure to upload in a timely manner would mean failing the bar exam, messaging that was corrected only late in the evening in some jurisdictions.

In these ExamSoft jurisdictions, I can only imagine that some number of those taking the MBE on the second day of the exam were doing so with much less sleep and much less focus than might have been the case if there had not been issues with uploading the essay portion of the exam the night before.  If this resulted in “underperformance” on the MBE of just 1%-2% (perhaps missing two to four additional questions out of 200), this might have been enough to trigger a larger than expected decline in the MBE Mean Scaled Score.

ONE STATE’S EXPERIENCE BELIES THE NCBE STORY

It will be hard to assess the full reality of the July 2014 bar exam experience in historical context until 2015, when the NCBE releases its annual statistical analysis with state-by-state analyses of first-time bar passage rates.  It is very difficult to make comparisons across jurisdictions regarding the July 2014 bar exam at the present time because there is no standardized format among states for reporting results – some states report overall bar passage rates, some disaggregate first-time bar passage rates, and some report school-specific bar passage rates.  To make meaningful comparisons year over year focused on the experience of each year's graduates, the focus should be on first-time bar passage (even though, as noted above, that group also is a little over-inclusive).

Nonetheless, the experience of one state, Iowa, casts significant doubt on the NCBE “story.”

The historical first-time bar passage rates in Iowa from 2004 to 2013 ranged from a low of 86% in 2005 to a high of 93% in 2009 and again in 2013.  In the nine-year period between 2005 and 2013, the year to year “change” in first-time bar passage rates never exceeded 3% and was plus or minus one or two percent in eight of the nine years.  In 2014, however, the bar passage rate fell to a new low of 84%, a decline of 9% -- more than four times the largest previous year-over-year decline in bar passage rates since 2004-2005.

Year                        | 2004 | 2005 | 2006 | 2007 | 2008 | 2009 | 2010 | 2011 | 2012 | 2013 | 2014
First-Time Bar Passage Rate | 87%  | 86%  | 88%  | 89%  | 90%  | 93%  | 91%  | 90%  | 92%  | 93%  | 84%
Change from Prior Year      |      | -1   | +2   | +1   | +1   | +3   | -2   | -1   | +2   | +1   | -9

 

The NCBE says that all indicators point to the fact that the group that sat in 2014 was “less able” than the group that sat in 2013.  But here is the problem for the NCBE.

Iowa is one of the states that used ExamSoft in which test-takers experienced problems uploading the exam.  The two schools that comprise the largest share of bar exam takers in Iowa are Drake and Iowa.  In July 2013, those two schools had 181 first-time takers (out of 282 total takers) and 173 passed the Iowa bar exam (95.6% bar passage rate).  In 2014, those two schools had 158 first-time takers (out of 253 total) and 135 passed the Iowa bar exam (85.4% bar passage rate), a drop of 10.2% year over year. 
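The Drake/Iowa comparison is straightforward arithmetic on the figures just quoted; a short sketch:

```python
# Combined Drake/Iowa first-time results on the Iowa bar exam, from the figures above.
first_time = {2013: (173, 181), 2014: (135, 158)}   # year: (passed, first-time takers)

rates = {year: passed / takers for year, (passed, takers) in first_time.items()}
for year in sorted(rates):
    print(f"July {year}: {rates[year]:.1%} first-time pass rate")

# Difference of the rounded rates, matching the 10.2-point drop noted above.
drop = round(rates[2013] * 100, 1) - round(rates[2014] * 100, 1)
print(f"Year-over-year drop: {drop:.1f} percentage points")
```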

Unfortunately for the NCBE, there is no basis to claim that the Drake and Iowa graduates were “less able” in 2014 than in 2013 as there was no statistical difference in the LSAT profile of their entering classes in 2010 and in 2011 (the classes of 2013 and 2014, respectively).  In both years, Iowa had a profile of 164/161/158.  In both years, Drake had a profile of 158/156/153.  This would seem to make it harder to argue that those in Iowa who sat in July 2014 were “less able” than those who sat in 2013, yet their performance was significantly poorer, contributing to the largest decline in bar passage rate in Iowa in over a decade.  The only difference between 2013 and 2014 for graduates of Drake and Iowa taking the bar exam for the first time in Iowa is that the group that sat in July 2014 had to deal with the ExamSoft debacle while the group that sat in July 2013 did not.

TIME WILL TELL

This analysis does not “prove” that the ExamSoft debacle was partly responsible for the historic decline in the MBE Mean Scaled Score between 2013 and 2014.  What I hope it does do is raise a serious question about the NCBE’s assertion that the “whole story” of the historic decline in the MBE Mean Scaled Score is captured by the assertion that the class of 2014 is simply “less able” than the class of 2013.

When the NCBE issues its annual report on 2014 sometime next year, we will be able to do a longitudinal analysis on a jurisdiction-by-jurisdiction basis to see whether jurisdictions which used ExamSoft had higher rates of anomalous results regarding year-over-year changes in bar passage rates for first-time takers.  When the NCBE announces next fall the MBE Mean Scaled Score for July 2015, we will be able to assess whether the group that sits for the bar exam in July 2015 (which is even more demonstrably "less able" than the class of 2014 using the weighted average bar passage prediction outlined above) generates another historic decline or whether it "outperforms" its indicators by perhaps performing in a manner comparable to the class of 2014 (suggesting that something odd happened with the class of 2014).

It remains to be seen whether law school deans and others will have the patience to wait until 2015 to analyze all of the compiled data regarding bar passage in July 2014 across all jurisdictions.  In the meantime, there is likely to be a significant disagreement over bar pass data and how it should be interpreted.

November 11, 2014 in Data on legal education, Data on the profession, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (3)

Monday, October 20, 2014

What Law Schools Can Learn from Dental Schools in the 1980s Regarding the Consequences of a Decline in Applicants

For four consecutive years we have seen a decline in the number of applicants to law school and a corresponding decline in the number of matriculating first-year students.  Over the last year or two, some have suggested that as a result of this “market adjustment” some law schools would end up closing.  Most recently, the former AALS President, Michael Olivas, in response to the financial challenges facing the Thomas Jefferson Law School, was quoted as stating that he expects several law schools to close. 

To date, however, no law schools have closed (although the Western Michigan University Thomas M. Cooley Law School recently announced the closure of its Ann Arbor branch).  

Have law schools found ways to cut costs and manage expenses in the face of declining revenues such that all will remain financially viable and remain in operation?  Is it realistic to think that no law schools will close?

Although there may be a number of people in the legal academy who continue to believe that somehow legal education is “exceptional” – that market forces may impose financial challenges for law schools in the near term, but will not result in the closing of any law schools -- this strikes me as an unduly optimistic assessment of the situation. 

To understand why, I think those in legal education can learn from the experience of those in dental education in the 1980s.

The Dental School Experience from 1975-1990

In the 1980s, dental school deans, along with provosts and presidents at their host universities, had to deal with the challenge of a significant decline in applicants to dental school. 

At least partially in response to federal funding to support dental education, first-year enrollment at the country’s dental schools grew throughout the 1970s to a peak in 1979 of roughly 6,300 across roughly 60 dental schools.  Even at that point, however, for a number of reasons -- improved dental health from fluoridation, reductions in federal funding, high tuition costs and debt loads -- the number of applicants had already started to decline from the mid-1970s peak of over 15,000. 

By the mid-1980s, applicants had fallen to 6,300 and matriculants had fallen to 5,000.  As of 1985, no dental schools had closed.  But by the late 1980s and early 1990s there were fewer than 5000 applicants and barely 4000 first-year students – applicants had declined by more than two-thirds and first-year enrollment had declined by more than one-third from their earlier peaks. (Source – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author).)

How did dental schools and their associated universities respond to this changing market?  Between 1986 and 1993, six private universities closed their dental schools: Oral Roberts University, Tulsa, Oklahoma (1986); Emory University, Atlanta, Georgia (1988); Georgetown University, Washington, D.C. (1990); Fairleigh Dickinson University, Rutherford, New Jersey (1990); Washington University, St. Louis, Missouri (1991); and Loyola University, Chicago, Illinois (1993). (Source: Dental Education at the Crossroads:  Challenges and Change, Table 1.1 (Institute of Medicine 1995)).  According to a New York Times article from October 29, 1987, “Georgetown, formerly the nation's largest private dental school, decided to close after a Price Waterhouse study found that the school would have a $3.6 million deficit by 1992.” (Source: Tamar Lewin, Plagued by Falling Enrollment, Dental Schools Close or Cut Back, New York Times, Oct. 29, 1987).

Some of the primary factors contributing to the closing of dental schools were described as follows:

Financial issues were repeatedly described as critical. Dental education was cited as an expensive enterprise that is or may become a drain on university resources. On average, current-year expenditures for the average dental school are about $1 million more than current revenues. … The declining size and quality of the applicant pool during the 1980s played a role in some closures by threatening the tuition base and prestige on which private schools rely. Faculty and alumni resistance to change may feed impatience among university administrators. In some institutions, the comparative isolation of dental schools within the university has provided them with few allies or at least informed colleagues and has left them ill-prepared to counter proposals for "downsizing." (Source: Dental Education at the Crossroads:  Challenges and Change, at 202-203 (Institute of Medicine 1995)). 

The Law School Experience from 2004-2014

In terms of applicants and enrollment over the last decade, the trends law schools have experienced look remarkably comparable to the experience of dental schools in the 1970s and 1980s.  According to the LSAC Volume Summary, applicants to law schools peaked in 2004 with 100,600 applicants (and roughly 48,200 first-year students).  By 2010, applicants had fallen to roughly 87,600, but first-year enrollment peaked at 52,500.  Over the last four years, applicants have fallen steadily to roughly 54,700 for fall 2014, with a projected 37,000 first-years matriculating this fall, the smallest number since 1973-74, when there were 40 fewer law schools and over one thousand fewer law professors.  (Source - ABA Statistics)(For the analysis supporting this projection of 37,000 first-years, see my blog post on The Legal Whiteboard from March 18, 2014.)  
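The declines-from-peak quoted above are easy to compute. A small sketch using the approximate figures cited in this post (the dental numbers are rounded, as in the text):

```python
# Percentage declines from peak, using the approximate figures quoted above.
def decline_from_peak(peak, current):
    return (peak - current) / peak

series = {
    "Dental applicants (mid-1970s peak to ~1990)": (15_000, 5_000),
    "Dental first-years (1979 to ~1990)": (6_300, 4_000),
    "Law applicants (2004 to 2014)": (100_600, 54_700),
    "Law first-years (2010 to 2014 projection)": (52_500, 37_000),
}

for label, (peak, current) in series.items():
    print(f"{label}: down {decline_from_peak(peak, current):.0%}")
```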

The two charts below compare the dental school experience from 1975 to 1990 with the law school experience in the last decade.  One chart compares dental school applicants with law school applicants and one chart compares dental school first-years with law school first-years.  (Note that for purposes of easy comparison, the law school numbers are presented as one-tenth of the actual numbers.)

[Chart: Applicants, dental schools 1975-1990 vs. law schools 2004-2014]

[Chart: First-Years, dental schools 1975-1990 vs. law schools 2004-2014]

(Sources – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author) and the LSAC’s Volume Summary  (with my own estimates for 2014 based on the LSAC’s Current Volume Summary)).

The Law School Experience 2014-2019

Notably, these charts do not bode well for law schools.  The law school experience tracks pretty closely the dental school experience over the first ten years reflected in the charts.  For law schools, 2014 looks a lot like 1985 did for dental schools.

There might be any number of reasons why the law school experience over the next several years might be different from the dental school experience in the late 1980s and early 1990s, such that the next several years do not continue as a downward trend in applicants and matriculants.  The market forces associated with changes in the dental profession and dental education in the 1980s are not the same as the market forces associated with changes in the legal profession and legal education in the 2010s and the cost structures for dental education and legal education are not exactly the same.

The problem for law schools, however, is that without an upward trend law schools will continue to face significant financial pressures for the next few years just as dental schools did in the late 1980s.  There might be some encouraging news on the employment front over the next few years as the decreasing number of matriculants will mean a decreasing number of graduates in 2015, 2016 and 2017.  Even without any meaningful growth in the employment market for law graduates, this decline in the number of graduates should mean significant increases in the percentage of graduates finding full-time, long-term employment in bar passage required jobs.  Over time, this market signal may begin to gain traction among those considering law school such that the number of applicants to law school stops declining and perhaps starts increasing modestly. 

But the near term remains discouraging.  The number of people taking the June 2014 LSAT was down roughly 9% compared to June 2013 and the anticipation is that the number of test-takers in the most recent administration in late September was down as well compared to October 2013.  Thus, applicants well might be down another 5-8% in the 2014-15 admissions cycle, resulting in perhaps as few as 51,000 applicants and perhaps as few as 35,000 matriculants in fall 2015.  Even if things flatten out and begin to rebound modestly in the next few years, it would appear to be unlikely that the number of matriculants will climb back near or above 40,000 before the fall of 2017 or 2018.

Moreover, if current trends continue, the matriculants in 2015 also are going to have a significantly less robust LSAT/GPA profile than the matriculants in fall 2010.   As I noted in a blog posting on March 2, 2014, between 2010 and 2013, the number of law schools with a median LSAT less than 150 grew from 9 to 32, and the number with a median LSAT of 145 or below grew from 1 to 9.

What Does this Mean for the Average Law School?

Assume you are the Dean at a hypothetical private law school that had 600 students (200 in each class) and a budget based on $18 million in JD tuition revenue in 2010-11.  (This reflects a net tuition of $30,000 from each student – with nominal tuition set at $40,000 but with a discount rate of 25%.)  Further assume that with this budget, your law school was providing $2.0 million annually to the university with which it is affiliated.  As of 2010-11, your entering class profile reflected a median LSAT of 155 and a median GPA of 3.4.

Assume first-year enrollment declined to 170 in 2011, to 145 in 2012, and to 125 in 2013, a cumulative decrease in first-year enrollment since 2010 of 37%.  As you tried to balance enrollment and profile, the law school managed to maintain its median LSAT and GPA in 2011, but saw its LSAT and GPA medians decline to 153 and 3.35 in 2012 and to 152 and 3.30 in 2013.

This means that for the 2013-14 academic year, the law school had only 440 students, a decrease of roughly 27% from its total enrollment of 600 in 2010, with a much less robust entering class profile in comparison with the entering class profile in 2010. (Note that this assumes no attrition and no transfers in or out, so if anything, it likely overstates total enrollment).  (For comparison purposes, the National Jurist recently listed 25 law schools with enrollment declines of 28% or more between 2010-11 and 2013-14.)

Assume further that the law school had to increase its scholarships to attract even this smaller pool of students with less robust LSAT/GPA profiles, such that the net tuition from each first-year student beginning in fall 2012 has been only $25,500 (with nominal tuition now set at $42,500, but with a discount rate of 40%). 

For the 2013-14 academic year, therefore, you were operating with a budget based on $12,411,000 in JD tuition revenue, a decrease in JD tuition revenue of over $5.5 million since the 2010-11 academic year, over 30%.  (170 x $32,500 for third years ($5.525 million), 145 x $25,500 for second years ($3.698 million), and 125 x $25,500 for first-years ($3.188 million)).

What does this mean?  This means you have been in budget-cutting mode for over three years.  Of course, this has been a challenge for the law school, given that a significant percentage of its costs are for faculty and staff salaries and associated fringe benefits.  Through the 2013-14 academic year, however, assume you cut costs by paring the library budget, eliminating summer research stipends for faculty, finding several other places to cut expenditures, cutting six staff positions and using the retirement or early retirement of ten of your 38 faculty members as a de facto “reduction in force,” resulting in net savings of $3.59 million.  In addition, assume you have gotten the university to agree to waive any “draw” saving another $2 million (based on the “draw” in 2010-2011).  Thus, albeit in a significantly leaner state, you managed to generate a “balanced” budget for the 2013-14 year while generating no revenue for your host university.    

The problem is that the worst is yet to come, as the law school welcomes a class of first-year students much smaller than the class of third-years that graduated in May.  With the continued decline in the number of applicants, the law school has lower first-year enrollment again for 2014-15, with only 120 first-year students and a median LSAT and GPA that have declined again, to 151 and 3.2.  Projections for 2015-16 (based on the decline in June and October 2014 LSAT takers) suggest that the school should expect no more than 115 matriculants and may see a further decline in profile.  That means the law school has only 390 students in 2014-15 and may have only 360 students in 2015-16 (an enrollment decline of 40% since 2010-11). Assuming net tuition for first-year students also remains at $25,500 due to the competition on scholarships to attract students (and this may be a generous assumption), the JD tuition revenue for 2014-15 and 2015-16 is estimated to be $9,945,000 and $9,180,000, respectively (a decline in revenue of nearly 50% from the 2010-11 academic year).
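The revenue figures above follow directly from the hypothetical's cohort sizes and net-tuition assumptions. A back-of-the-envelope sketch (all numbers are the hypothetical's, not any real school's; summing the exact cohort figures gives $12,410,000 for 2013-14, while the $12,411,000 above reflects rounding each cohort's revenue to the nearest thousand before summing):

```python
# JD tuition revenue for the hypothetical law school: each enrolled cohort pays
# the net tuition assumed for its entering year.
net_tuition = {2010: 30_000, 2011: 32_500, 2012: 25_500, 2013: 25_500, 2014: 25_500, 2015: 25_500}
cohort_size = {2010: 200, 2011: 170, 2012: 145, 2013: 125, 2014: 120, 2015: 115}

def jd_revenue(fall_year):
    """Revenue for the academic year starting in the given fall: 1Ls, 2Ls and 3Ls."""
    cohorts = (fall_year, fall_year - 1, fall_year - 2)
    return sum(cohort_size[y] * net_tuition[y] for y in cohorts)

for fall in (2013, 2014, 2015):
    print(f"{fall}-{(fall + 1) % 100}: ${jd_revenue(fall):,}")
```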

In reality, then, the “balanced” budget for the 2013-2014 academic year based on revenues of $12,411,000, now looks like a $2,500,000 budget shortfall in 2014-15 and a $3,200,000 budget shortfall for the 2015-16 academic year, absent significant additional budget cuts or new revenue streams (with most of the “low hanging fruit” in terms of budget cuts already “picked”). 

While you may be able to make some extraordinary draws on unrestricted endowment reserves to cover some of the shortfall (assuming the law school has some endowment of its own), and may be creative in pursuing new sources of revenue (a certificate program or a Master of Laws), even if you come up with an extra $400,000 annually in extraordinary draws on endowment and an extra $400,000 annually in terms of non-JD revenue you still are looking at losses of at least $1,700,000 in 2014-15 and at least $2,400,000 in 2015-16 absent further budget cuts.  Even with another round of early retirement offers to some tenured faculty and/or to staff (assuming there are still some that might qualify for early retirement), or the termination of untenured faculty and/or of staff, the budget shortfall well might remain in the $1,000,000 to $1,700,000 range for this year and next year (with similar projections for the ensuing years).  This means the law school may need subsidies from the university with which it is affiliated, or may need to make even more draconian cuts than it has contemplated to date.  (For indications that these estimates have some relation to reality, please see the recent stories about budget issues at Albany, Minnesota and UNLV.)

Difficult Conversations -- Difficult Decisions

This situation will make for some interesting conversations between you as the Dean of the law school and the Provost and President of the university.  As noted above in the discussion of dental schools, the provost and president of a university with a law school likely will be asking:  How “mission critical” is the law school to the university when the law school has transformed from a “cash cow” into a “money pit” and when reasonable projections suggest it may continue to be a money pit for the next few years?  How "mission critical" is the law school when its entering class profile is significantly weaker than it was just a few years ago, particularly if that weaker profile begins to translate into lower bar passage rates and even less robust employment outcomes?   How “mission critical” is the law school to the university if its faculty and alumni seem resistant to change and if the law school faculty and administration are somewhat disconnected from their colleagues in other schools and departments on campus?

Some universities are going to have difficult decisions to make (as may the Boards of Trustees of some of the independent law schools).  As of 1985, no dental schools had closed, but by the late 1980s and early 1990s, roughly ten percent of the nation's dental schools had closed in response to significant declines in the number and quality of applicants and the corresponding financial pressures.  When faced with having to invest significantly to keep dental schools open, several universities decided that dental schools were no longer “mission critical” aspects of the university. 

I do not believe law schools should view themselves as so exceptional that they will be more immune to these market forces than dental schools were in the 1980s.  I do not know whether ten percent of law schools will close, but just as some universities decided dental schools were no longer “mission critical” to the university, it is not only very possible, but perhaps even likely, that some universities will now decide that law schools requiring subsidies of $1 million or $2 million or more for a number of years are no longer “mission critical” to the university. 

(I am grateful to Bernie Burk and Derek Muller for their helpful comments on earlier drafts of this blog posting.)

 

October 20, 2014 in Cross industry comparisons, Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (5)

Sunday, October 12, 2014

Is the Legal Profession Showing Its Age?

The figure below suggests that a growing number of students are attending law school but not going on to become lawyers.  This conclusion requires some explanation, which I will supply below.  Alternative explanations are also welcome, as I’d like to find a plausible narrative that foreshadows a brighter future for the licensed bar. [PDF version of this essay]

[Figure: Share of licensed lawyers under age 35, 1980-2005]

I generated this figure based on data from various editions of The Lawyer Statistical Report, which is periodically compiled by the American Bar Foundation (ABF).  The ABF gets the underlying data from Martindale-Hubbell, which is a comprehensive directory of the licensed bar.  As of 2005, the sample was roughly 1 million lawyers who work in law firms, solo practice, in-house legal departments, government, and the judiciary.

The big surprise here is that the proportion of young lawyers (under age 35) has been declining for several decades.  And not by a little, but by a lot.  During this period, the median age went from 39 in 1980, to 41 in 1991, to 45 in 2000, to 49 in 2005.  See ABA Market Research Department.

I would be tempted to attribute a demographic shift of this magnitude to a computational error.  But that is unlikely because the underlying data were calculated at four different points in time, yet the results come together to produce a single, steady trendline -- a trendline that shows a licensed bar that is steadily aging.  

Another possible factor to consider is whether there are any significant data collection or sampling issues that skew the data in a manner that dramatically undercounts younger lawyers. For example, Martindale-Hubbell is largely irrelevant to today's younger lawyers.  So, in solo and small firm practices, where younger lawyers are making the business decisions, we might expect plummeting subscription rates.  But subscribing and requesting the publication of additional biographical information (in the hope of garnering referral business) is not the same thing as being listed. Martindale-Hubbell attempts to track lawyers who did not subscribe to the directory, as the near-universe level of inclusion increases the directory's value.  

To illustrate this point, consider that in 2000, the Lawyer Statistical Report (which relies on Martindale-Hubbell data) counted 909,000 lawyers.  According to the ABA, the total number of lawyers licensed in the US (compiled from state bar rolls) was 1,022,000, and that figure almost certainly includes some double counting of lawyers licensed in more than one state.  While I have no doubt that younger lawyers are becoming harder to hunt down because of cell phones and home-based offices, the gap of missing lawyers is just not big enough to fully account for the sharp drop-off in younger lawyers. 

I have shown this chart to various law firms, legal departments, law faculty and bar association audiences.  Through this process, I have developed two working theories that are not mutually exclusive:

  1. Increased exits from law practice based on gender integration
  2. Slowing absorption of law graduates into the licensed bar

Theory 1: Gender Integration

One explanation is gender integration.  In short, over the last 40 years, more women have entered the legal profession; and as an empirical matter, they are much more likely to exit the workforce in order to focus on childcare.  Thus, more gender integration over time would cause a proportional decline in the younger lawyer cohort.

So let's examine the data.  According to the figure below, which shows the number of male and female 1Ls enrolling each year at ABA-accredited law schools, the high water mark for male 1L enrollment occurred over 40 years ago -- in 1971!   The high water mark for female enrollment in percentage terms was 2000 (49.4%). In absolute numbers, the high was the class entering in the fall of 2009 (24,305).  

[Figure: Male and female 1L enrollment at ABA-accredited law schools, by year]

Under this theory, the higher the percentage of female graduates, the lower the percentage of lawyers under the age of 35.  In 1968, a 22-year-old female 1L, if she graduated from law school and stayed in the legal profession, would be part of the younger lawyer cohort in 1980. Yet, her 1L cohort included only 1,179 females (7.4% of all 1Ls).  By 1993 (12 years before 2005), the number of female 1Ls had increased to 19,059 (nearly 44%). So exits based on childcare factors would likely be increasing.  

I can readily accept gender integration as a partial, but not a complete, explanation.  Why? Because female exits are likely to be siphoning off a substantial portion of the over 35 cohort, as this group is still having and raising children.   It seems implausible that female lawyers are leaving in droves before age 35 (reducing the younger lawyer cohort) yet returning in droves thereafter (swelling the over 35 cohort).  Further, according to the figure above, the absolute number of law school graduates is increasing during this entire period.  Sheer numbers are likely a partial counterbalance to the impact of gender-related exits.

Theory 2:  Slowing Absorption of Younger Lawyers

It is important to keep in mind the magnitude of the overall slide in younger lawyers -- from 36% in 1980 to 13% in 2005.  One would think the trendline would be moving in the exact opposite direction -- that larger graduating classes would be replacing the much smaller number of law school graduates from 40 years earlier who were retiring or passing away.  But such a youth movement does not appear to be happening, at least based on data through 2005.

I think the most likely explanation is that the rate of absorption of law school graduates into the licensed bar has been steadily declining over time.  This explanation, which would affect men and women equally, is directionally consistent with the percentage of entry-level jobs in private practice, which has been declining since the late 1980s. See figure below.

[Figure: Share of entry-level jobs in private practice, by year]

The slower absorption theory is also directionally consistent with the shifting demographics of large law firms, which now have more partners than associates.  See figure below.

[Figure: Large law firm demographics (NLJ data): partners versus associates]

Despite the higher number of partners compared to associates, it is worth noting that large law firms are not becoming more generous in sharing the partnership pie.   

Rather, the real sea change is the decline in the number of traditional law firm associates, who have been slowly supplanted by staff attorneys, permanent of counsel lawyers, and nonequity partners. Indeed, over 40% of large law firm partners (defined as partners at AmLaw 200 / NLJ 250 firms) are nonequity. Three decades ago, this category of partner was relatively rare.  See Henderson, An Empirical Study of Single-Tier Versus Two-Tier Partnerships in the Am Law 200, 84 NC L Rev 1691 (2006).  The growth of nonequity partners reflects a new kind of law firm leverage that relies on senior lawyers. The annual ALM/Major, Lindsey & Africa study of partner compensation reveals that equity partners make dramatically higher incomes than nonequity partners and that the size of the pay gap is widening over time. See Ross Todd, A Widening Partner Pay Gap, American Lawyer, Sept 29, 2014.  

The primary advantage of nonequity partners and other senior lawyers, like permanent counsel, is that training costs fall to near zero. Cf. Elizabeth Olson, Corporations Drive Drop in Law Firms’ Use of Starting Lawyers, Study Finds, New York Times, Oct. 10, 2014 (showing drop over time in use of first year associates because clients are refusing to pay for training costs).

To my mind, however, the most persuasive support for the lower absorption theory is the simple delta between the growth in the licensed bar--which has clearly hit a plateau--and the size of graduating classes from ABA-accredited law schools--which, until recently, had been steadily increasing. The figure below shows these macro-level trendlines.

[Figure: Growth of the licensed bar versus size of ABA law school graduating classes, over time]

If younger lawyers were replacing older lawyers and also growing in number to keep pace with the broader economy, the under-35 cohort would be getting bigger or at least remaining relatively constant in size.  But instead, as the first figure in this essay showed, the younger lawyer cohort has gotten smaller.  Arguably, the simplest explanation for these patterns is that it has gotten much harder over time to parlay a JD degree into paid employment as a licensed lawyer.  So, faced with a saturated legal market, law school graduates have been pursuing careers outside of law.  

What Does This Mean?

The analysis above suggests that the JD Advantage / JD Preferred employment market started to take shape several decades ago, long before these terms were put in place by the ABA and NALP.  Yet, we really don't know much about these careers.  To construct a more useful, informative narrative, we'd have to systematically study the career paths of our alumni.  That task is long overdue.  

I started teaching at Indiana Law in 2003.  Since I first saw the declining trendline for the young lawyer cohort, I have been thinking about the roughly 1,600 students who have taken my Corporations, Securities Regulation, Business Planning, Project Management, Law Firms as a Business Organization, and Legal Professions classes.  

  • What percentage are working as licensed lawyers?  
  • For those doing something different, where are they working?  
  • Has their legal education opened doors for them? 
  • Did those doors lead to interesting and remunerative work?

The After the JD Study is based on law school graduates who passed the bar in the year 2000.  The Wave III results provide some clues to how at least one cohort of younger lawyers fared during their first ten years in practice.  

  • Roughly a quarter of the class of 2000 is no longer practicing law (remember the base sample excluded those who never took or passed the bar).
  • The migration out of practice is generally in the direction of private sector business.  
  • Ten years out, the median pay for full-time work is more than $100,000 for both men and women.  No tears need to be shed here.
  • Roughly three-quarters report being satisfied with their decision to attend law school. 

These statistics are generally encouraging, but some caution is in order, as the entry-level legal economy was quite different in 2000.  

Because of the law school transparency movement, we lack commensurable data between 2000 and 2013.  That is an important piece of information right there, as changes in collection and reporting standards were caused by student protests, including several lawsuits surrounding allegedly misleading employment data. Yet, we can cobble together some potentially useful comparisons:

Even if NALP's full-time legal employment category in 2000 is more expansive than the ABA's full-time bar-passage-required category in 2013, the gap is startling -- over 20 percent!  Further, we have additional evidence of a major shift in the job market, as law firm summer associate classes have shrunk by more than 50% since the early 2000s. See Henderson, Sea Change in the Legal Market, NALP Bulletin (Aug 2013).   Between 2008 and 2013, there has also been a drop in median starting salaries, from $72,000 to $62,500. See NALP, Employment for the Class of 2013 – Selected Findings

Demand Drops, but Supply Marches On

Cumulatively, the trendlines presented in this essay suggest that we are on the tail end of a multi-decade structural shift in the legal economy.  So what comes next?

Law schools were recently taken to task in an editorial by the Young Lawyers Board of the Philadelphia-based Legal Intelligencer.  See If Unchanged, Legal Education will Remain a Business in Decline, Legal Intelligencer, Sept 25, 2014.  According to the young lawyers, "One reason graduates have difficulty obtaining employment is that most of them need to be trained in how to practice law, and clients are unwilling to pay for training new lawyers. Law schools need to step up and train students on how to practice law."

I am very sympathetic to the young lawyers, but I think they are missing something essential.  A law school that improves the quality of its skills training reduces the training costs to prospective employers.  That is a good thing, but it does not change the underlying demand for legal services. And it appears that that demand is eroding on several fronts:  (a) wealthy corporations are balking at the price of outside counsel and looking for credible substitutes, (b) ordinary citizens are struggling to afford a lawyer at all, and (c) a new segment of the legal economy is emerging that is financed by nonlawyers and heavily focused on data, process, and technology, which taps into skill sets not traditionally taught in law school. See Henderson, A Counterpoint to "The most robust legal market that ever existed in this country", Legal Whiteboard, Mar 17, 2014.  

Conclusion

My own conclusion is that neither the organized bar nor the legal academy has a firm grip on the changes that are occurring in the legal marketplace.  This uncertainty and confusion is understandable in light of the magnitude of the shift.

Nonetheless, these market shifts create special urgency for legal educators because we can't teach what we don't understand.  The thesis of the Young Lawyers Board is surely right -- if unchanged, legal education will remain a business in decline.  Much of legal education today is premised on a 20th century professional archetype--an archetype that is, based on the data, becoming less and less relevant with each passing day.  Thus, we are under-serving our students.  And frankly, they are figuring that out.  

Change is hard for people and the organizations they work in.  And law professors and law schools are no different.  The retooling of legal education will likely be a slow, painful process that will take the better part of a full generation to complete.  I am trying to do my part.

Yet, the brunt of the demographic shift falls on the licensed bar, which is getting older and thus weaker with each passing year.  This is a problem that belongs to the ABA, the state bars, and the state supreme courts, not the legal academy.   [PDF version of this essay]

October 12, 2014 in Data on the profession, Structural change | Permalink | Comments (1)

Tuesday, October 7, 2014

Does Cooperative Placement Accelerate Law Student Professional Development?

The title of an earlier essay posed a threshold question for legal ed reform: "If We Make Legal Education More Experiential, Would it Really Matter?" (Legal Whiteboard, Feb 2014) (PDF). I answered "yes" but admitted it was only my best guess.  Thus, to be more rigorous, I outlined the conditions necessary to prove the concept.

The essay below is a companion to the first essay.  It is a case study on how one type and brand of experiential education -- cooperative placements at Northeastern Law -- appears to accelerate the professional development of its law students. The outcome criteria are the three apprenticeships of Educating Lawyers (2007) (aka The Carnegie Report) -- cognitive skills, practice skills, and professional identity.

The better outcomes flow from Northeastern's immersive, iterative, and integrative approach. First, students are immersed in full-time coops that last a standard 11 weeks. Second, students move through four iterations of coops interspersed with four quarters of upper-level classes. Third, this experiential approach is integrated into the Law School's value system -- i.e., the experiential component is perceived as central rather than marginal to the School's educational mission.

Northeastern's coop model asks more of faculty and students, and thus may be hard to replicate. Yet, there is evidence that such an approach does in fact accelerate professional development in ways that ought to please law school critics and reformers. The benefits may be well worth the costs. 

[PDF version at JD Supra]

[The text below was originally published as the Northeastern Law Outcomes Assessment Project (OAP) Research Bulletin No. 3]

Immersive, Iterative and Integrative:
Does Cooperative Placement Accelerate Law Student  Professional Development?

A steep decline in the job prospects for entry-level lawyers has been followed by a sharp drop in law school applications. Media stories criticize traditional legal education for being too expensive while producing graduates unprepared for practice. Throughout the country, legal educators and administrators at law schools are trying to formulate an effective response.

A common thread running through many new law school initiatives is greater emphasis on experiential education. Fundamentally, experiential education is learning by doing, typically by assuming the role of the lawyer in an in-class simulation, law school clinic, externship or cooperative placement. As law schools seek to add hands-on opportunities to their curricular offerings, empirical evidence on experiential education’s impact on law student professional development becomes invaluable.

Northeastern University School of Law’s Outcomes Assessment Project (OAP) is an evidence-based approach to understanding experiential learning in the law school curriculum. A focal point of the OAP is Northeastern’s Cooperative Legal Education Program, an integral part of the school’s curriculum since the late 1960s. After completing a mostly traditional first year of law school, Northeastern students enter a quarter system in which 11-week cooperative placements alternate with 11-week upper-level courses. Through the four co-op placements during the 2L and 3L years, every Northeastern student gains the functional equivalent of nearly one year of full-time legal experience, typically across a diverse array of practice areas.

The Learning Theory of Cooperative Placement

Northeastern’s Cooperative Legal Education Program is based on a learning theory with three interconnected elements: immersion, iteration and integration.

  • Immersion: Immersion in active legal work in a real-world setting enables students to feel the weight and responsibility of representing real-world clients and exercising professional judgment.
  • Iteration: Iterative movement between the classroom and co-op placements provides students with concrete opportunities to connect theory with practice and understand the role of reflection and adjustment in order to improve one’s skill and judgment as a lawyer.
  • Integration: Integrating experiential learning into the law school curriculum signals its high value to the law school mission — when 50 percent of the upper-level activities involve learning by doing, practice skills are on par with doctrinal learning.

The purpose of the OAP Research Bulletin No. 3 is to use preliminary project data to explore whether the immersion-iteration-integration approach to legal education has the effect of accelerating the professional development of law students.

Three Effects of Co-op Placements

The findings in Research Bulletin No. 3 are based on surveys and focus groups conducted with 2L and 3L Northeastern law students and a small number of Northeastern law graduates, who served as facilitators. In our conversations with these students and alumni, we identified three ways that co-op is impacting the professional development of students.

Continue reading

October 7, 2014 in Data on legal education, Important research, Scholarship on legal education | Permalink | Comments (0)

Thursday, September 18, 2014

Cassidy on Reforming the Law School Curriculum From the Top Down

Mike Cassidy at the Boston College Law School has an interesting essay on curricular reform forthcoming in the Journal of Legal Education.  Here's the abstract:

With growing consensus that legal education is in turmoil if not in crisis, law schools need to take advantage of industry upheaval to catalyze innovation in the way they train their students. Curriculum reform, long the “third rail” of faculty politics, is now essential if some law schools are going to survive the present tsunami of low enrollments and stagnant hiring. One cautiously optimistic note within this doomsday symphony is that law school deans are now in extremely strong bargaining positions with their faculties and boards of trustees with respect to curriculum innovation.

In this essay, the author proposes a pivotal reform to the third year curriculum involving team-taught “Advanced Legal Problem Solving” workshops in subject specific areas, and describes the precise structure, content and staffing of such capstone courses. He argues that such workshops would significantly enhance the preparation of law students for entry into the profession, and would create an efficient and cost-effective route for law schools to satisfy rigorous new ABA accreditation standards regarding experiential learning and outcomes assessment.

September 18, 2014 | Permalink | Comments (0)

Thursday, September 4, 2014

Artificial Intelligence and the Law

Plexus, a NewLaw law firm based in Australia, has just released a new legal product that purports to apply artificial intelligence to a relatively common, discrete legal issue -- determining whether a proposed trade promotion (advertisement in US parlance) is in compliance with applicable law. 

In the video below, Plexus Managing Partner Andrew Mellett (who is an MBA, not a lawyer) observes that this type of legal work would ordinarily take four to six weeks to complete and cost several thousand dollars.  Mellett claims that the Plexus product can provide "a legal solution in 10 minutes" at 20% to 30% of the cost of the traditional consultative method -- no lawyer required, albeit Plexus lawyers were the indispensable architects for the underlying code. 

From the video, it is unclear whether the innovation is an expert system -- akin to what Neota Logic or KM Standards are creating -- or artificial intelligence (AI) in the spirit of the machine learning used in some of the best predictive coding algorithms or IBM's Watson applied to legal problems.   Back when Richard Susskind published his PhD dissertation in 1987, Expert Systems In Law, an expert system was viewed as artificial intelligence--there was no terminology to speak of because the application of technology to law was embryonic.  Now we are well past birth, as dozens of companies in the legal industry are in the toolmaking business, some living on venture or angel funding and others turning a handsome profit.
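
For readers who want a concrete sense of what a rule-based expert system looks like (as distinct from machine learning), here is a toy sketch in Python. The rules, thresholds, and jurisdiction names are invented for illustration only; they are not real trade promotion law and bear no relation to how the Plexus product actually works.

```python
# Toy rule-based "expert system" for a fictional trade-promotion compliance check.
# The rules below are invented for illustration; they are not real law and are
# not how Plexus's product works.

def check_promotion(jurisdiction, prize_value, game_of_chance):
    """Return a list of (made-up) compliance steps triggered by simple if-then rules."""
    steps = []
    if game_of_chance and jurisdiction == "State A":
        steps.append("Obtain a trade promotion permit before launch.")
    if prize_value > 10_000:
        steps.append("Publish full terms and conditions and a winners list.")
    if game_of_chance and prize_value > 50_000:
        steps.append("Escalate to counsel for manual review.")
    if not steps:
        steps.append("No additional steps required under these (fictional) rules.")
    return steps

print(check_promotion("State A", prize_value=25_000, game_of_chance=True))
# ['Obtain a trade promotion permit before launch.',
#  'Publish full terms and conditions and a winners list.']
```

The contrast with machine learning is the point: an expert system encodes lawyers' judgment as explicit if-then rules written in advance, while a machine-learning system infers its rules statistically from labeled examples.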

My best guess is that Plexus's new innovation is an expert system.  But frankly, the distinction does not matter very much because both expert systems and AI as applied to law are entering the early toddler stage.   Of course, that suggests that those of us now working in the legal field will soon be grappling with the growth spurt of legal tech adolescence.  For law and technology, it's Detroit circa 1905.  

September 4, 2014 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, New and Noteworthy, Structural change, Video interviews | Permalink | Comments (2)

Sunday, August 24, 2014

Ahead of the Curve: Three Big Innovators in BigLaw

Nashville, TN.  It is time to put down the broad brush used to paint BigLaw as inefficient and out of touch.  At least for me, that is the big takeaway from the 2014 International Legal Technology Association (ILTA) conference, which took place this past week at the Gaylord Opryland Hotel in Nashville and included nearly 2,000 lawyers, administrators, staff, and vendors from around the world.

My takeaway is based on what I saw during the presentation session for the ILTA Most Innovative Law Firm Award.  The three finalists all qualify as big:  Bryan Cave (985 lawyers), Seyfarth Shaw (779 lawyers), and Littler Mendelson (1002 lawyers). Presenters from each firm had 15 minutes to share their innovations followed by 5 minutes of Q&A.  Afterwards, ILTA members in attendance cast ballots for first, second, and third place.

Kudos to Bryan Cave, Seyfarth Shaw, and Littler Mendelson for publicly sharing their innovations, as it demonstrates a commitment to the broader legal profession.

In this post, I will describe the salient points of each innovation. I will err on the side of detail because, when it comes to innovation in the legal space, there is a short supply of “guts of the operations” commentary.  I will then offer some macro-level observations.  As it turns out, BigLaw has on balance a surprisingly good hand to play.  Many will thrive, but largely by taking market share from the rest.


Bryan Cave

Presenter: John Alber, Strategic Technology Partner

Bryan Cave has developed an ingenious and highly efficient way to educate its lawyers on the economics of its business.  Prior to the presentation, I was familiar with the firm’s investment in a rigorous cost accounting system to guide the firm’s strategy and operations.

Yet, to get the full benefit out of such a system, the understanding needs to filter down to the individual lawyer-timekeeper level so that each lawyer-timekeeper can use the superior data to allocate time and effort in ways that strengthen the enterprise.  Even in the year 2014, many successful and skilled BigLaw lawyers confuse revenues with profit. And the confusion is understandable because portable books of business, which tend to be measured in terms of revenue, drive the valuation of lateral partners.  See Henderson & Zorn, Of Partners and Peacocks, Am. Law., February 2014.

Based on what I saw at ILTA, such confusion appears to have been substantially eliminated at Bryan Cave.

The core Bryan Cave innovation is a simple dashboard that tracks a variety of statistics at the lawyer, practice group, and firm level.  What is most striking about the Bryan Cave initiative is the sensitivity shown to the large percentage of lawyers who are not comfortable processing numbers (“arithmophobia” was the term used in the presentation).  The Bryan Cave innovation team dealt with this constraint in two ways.

1. The Octagon.  The Octagon is a data visualization technique that communicates eight key metrics in an octagon-shaped graphic.  Wondering what the term "data visualization" means? It's finding graphical ways to communicate complex multivariable data in a format that requires the end user, such as a lawyer, to have very little technical training.  The Octagon is a textbook example. It uses colors and distance from the center of the graphic to convey essential information related to origination, client relationships, matter management, days to bill, days to collect, hours billed, leverage, and profit margins. (There may be other octagons containing other metrics--the one we were shown appeared to be geared toward partners.)

Each lawyer each month gets a new updated Octagon; and that graphic communicates, through its shape, the lawyer’s relative contributions to the firm.  Specifically, there are distinctive patterns well known within the firm that tend to signal rainmaker, service partner, project manager, technical specialist, or some blend thereof.  The features of the Octagon also communicate how well a lawyer is performing in his or her various roles relative to his or her peers.  So, on a monthly basis, self-image confronts hard numbers.

This type of transparency is bound to have a profound effect on behavior.  (During another ILTA session I heard, from another Bryan Cave presenter, that since the introduction of the Octagon a couple of years ago, the average days to collect has fallen from 60 to 44.)
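
To make the idea of communicating several metrics through a single shape more concrete, here is a minimal sketch of a generic eight-metric radar chart using matplotlib. The metric names are the eight mentioned above, but the scores are invented placeholders; Bryan Cave's actual Octagon is a proprietary internal tool and no doubt looks different.

```python
# Generic eight-metric "radar" chart, a rough analogue to an octagon-style dashboard.
# Metric names follow the post; the scores are placeholders, not Bryan Cave data.
import numpy as np
import matplotlib.pyplot as plt

metrics = ["Origination", "Client relationships", "Matter management", "Days to bill",
           "Days to collect", "Hours billed", "Leverage", "Profit margin"]
scores = [0.7, 0.5, 0.8, 0.4, 0.6, 0.9, 0.5, 0.7]   # normalized 0-1, invented

angles = np.linspace(0, 2 * np.pi, len(metrics), endpoint=False).tolist()
angles += angles[:1]                 # repeat the first angle to close the polygon
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(metrics, fontsize=8)
ax.set_ylim(0, 1)
ax.set_title("Illustrative eight-metric lawyer dashboard")
plt.show()
```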

2. The Rosetta. Some lawyers are bound to prefer a story rather than a picture.  For these lawyers, the firm has created a narrative, referred to as the Rosetta, that translates the numbers into a diagnostic story of strengths, weaknesses, and, most importantly, specific prescriptive advice on how to improve.

But there is an interesting catch—the stories are all written with a computer algorithm.  How is this possible?  It’s a technology pioneered by a company called Narrative Science.  Note that computers that are fed nothing but a traditional baseball scoring sheet now routinely write sports stories that summarize the game for the local sports page.  This narrative summary accompanying the Octagon removes any lingering ambiguity regarding what the diagram means.  Further, all report generation, including practice-group level Octagon and Rosetta reports, has been entirely automated.

I am told that the Octagon and Rosetta programs can handle, and properly incentivize, work that is done on either a billable or alternative fee arrangement basis. If this is true, Bryan Cave has an innovation designed for the legal market of the future.

Some readers may be turned off that the Bryan Cave innovation may seem, on the surface anyway, entirely focused on law firm financial performance.  I am not. To my mind, this type of technology is valuable for communicating the fundamentals of the business.  This reduces the myths and false narratives that routinely take hold in data-poor environments.  This innovation is also timely because it is getting harder to give clients superior value while also delivering a strong return to the firm's owners -- the best of whom could lateral to another firm tomorrow.

The challenge of every BigLaw firm is getting all of the firm's stakeholders to row in the same direction. The combination of the Dashboard, Octagon, and Rosetta is a breakthrough in lawyer communication and, by extension, change management.  Bryan Cave attorneys have the information they need to both build their practices while also advancing the broader goals of the enterprise.

Seyfarth Shaw

Presenters: Kathy Perrelli, Chair of Litigation Practice; Kim Craig, Global Director of Legal Project Management.

Seyfarth Shaw’s innovation is the creation of a true Research & Development Department staffed by lawyers, project managers, technologists, and software developers.  The charge of Seyfarth’s R&D Department is to build solutions in advance of perceived client needs.  As the presenters mentioned, “we are not doing this because our clients are asking for these solutions; we are doing this because our clients will ask.”

Continue reading

August 24, 2014 in Blog posts worth reading, Current events, Data on the profession, Innovations in law, Law Firms | Permalink | Comments (1)

Monday, July 28, 2014

Conditional Scholarship Retention Update for the 2012-2013 Academic Year

In comparing the conditional scholarship universe between the 2011-12 academic year and the 2012-13 academic year (with a brief look at 2013-14) there are a handful of things worth noting.

First, as shown in Table 1, the number of law schools with conditional scholarships declined between 2011-12 and 2012-13 from 144 law schools to 136 law schools, and declined again for the 2013-14 academic year to 128 law schools.  The number of law schools that do not have conditional scholarships grew from 49 in 2011-12 to 58 in 2012-13 to 66 in 2013-14.  In addition, the number of schools with just one-year scholarships declined from five in 2011-12 to four in 2012-13, where it remained for 2013-14.

 Table 1:  Changes in Number of Law Schools with Conditional Scholarship Programs

Category                                                              2011-12   2012-13   2013-14 (indications)
Law Schools with Conditional Scholarship Programs                       144       136       128
Law Schools with One-Year Scholarships                                    5         4         4
Law Schools with Scholarships that are not Conditional Scholarships      49        58        66

Second, as shown in Table 2, the number of students receiving conditional scholarships in 2012-13 declined slightly from 2011-12, from 12,786 to 12,470, but the percentage of first-years with conditional scholarships actually increased from 27.3% to 29.2% (given the smaller number of first-years in 2012-13 compared to 2011-12).  That said, the number of students whose scholarships were reduced or eliminated declined from 4,359 to 3,712, meaning that the percentage of first-years whose scholarships were reduced or eliminated dropped from 9.3% to 8.7%.

Table 2: Overall Comparisons Between 2011-12 and 2012-13

Category                                                                   2011-12                         2012-13
First-years*                                                               46,778                          42,769
First-years with Conditional Scholarships**                                12,786 (27.3% of first-years)   12,470 (29.2% of first-years)
First-years whose conditional scholarships were reduced or eliminated**    4,359 (9.3% of first-years)     3,712 (8.7% of first-years)
Average Renewal Rate (across law schools)                                  69%                             71%
Overall Renewal Rate Among Scholarship Recipients                          65.9%                           70.2%

*Drawn from first-year enrollment at the 198 law schools included in this analysis (excluding the law schools in Puerto Rico and treating Widener as one law school for these purposes) based on information published in the Standard 509 reports.
** Based on information published in the mandated Conditional Scholarship Retention charts by each law school with a conditional scholarship program.
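
For readers who want to reproduce the percentages in Table 2, here is a minimal sketch in Python using the counts reported above. It is nothing more than division, but it makes the denominators explicit. (The school-by-school average renewal rate in the table cannot be recomputed from these totals; it requires the per-school retention charts.)

```python
# Reproduce the Table 2 percentages from the reported counts.
data = {
    "2011-12": {"first_years": 46_778, "conditional": 12_786, "reduced_or_eliminated": 4_359},
    "2012-13": {"first_years": 42_769, "conditional": 12_470, "reduced_or_eliminated": 3_712},
}

for year, d in data.items():
    pct_conditional = 100 * d["conditional"] / d["first_years"]
    pct_reduced = 100 * d["reduced_or_eliminated"] / d["first_years"]
    renewal_among_recipients = 100 * (1 - d["reduced_or_eliminated"] / d["conditional"])
    print(f"{year}: {pct_conditional:.1f}% of first-years on conditional scholarships, "
          f"{pct_reduced:.1f}% of first-years reduced or eliminated, "
          f"{renewal_among_recipients:.1f}% overall renewal among recipients")

# 2011-12: 27.3%, 9.3%, 65.9%
# 2012-13: 29.2%, 8.7%, 70.2%
```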

Third, the distribution of conditional scholarship programs across tiers of law schools is even more pronounced in 2012-13 than it was in 2011-12.  Using the USNews rankings from March 2014, only 16 law schools ranked in the top 50 had conditional scholarship programs in 2012-13, and eight of those 16 had a renewal rate of 97% or higher.  Three of these law schools also eliminated their conditional scholarship programs as of the fall 2013 entering class.  (Moreover, only six law schools in the top 25 had conditional scholarship programs, five of which had a renewal rate of 97% or higher.)

As you move further down the rankings, conditional scholarship programs become more common and manifest lower scholarship retention rates on average.

Of the 53 law schools ranked between 51 and 100 (with three tied at 100), 37 law schools (nearly 70%) had conditional scholarship programs, of which two eliminated their conditional scholarship programs as of fall 2013.  Notably, of the 37 law schools with conditional scholarship programs, eight had a renewal rate of 91% or better (nearly 22%), while seven had a renewal rate of 65% or less (nearly 19%), with the other 22 (nearly 60%) having renewal rates between 67% and 88%.

For law schools ranked between 104 and 146 (44 law schools in total), 35 law schools (nearly 80%) had conditional scholarship programs, of which three eliminated their conditional scholarship programs as of fall 2013.   Notably, of the 35 law schools with conditional scholarship programs, six had a renewal rate of 93% or better (roughly 17%), while 16 had a renewal rate of 65% or less (nearly 46%), with the other 13 (roughly 37%) having renewal rates between 67% and 88%.

Finally, among the unranked schools, 47 of 51 had conditional scholarship programs – over 92% – only five of which had a renewal rate of 91% or better (nearly 11%), while 23 had a renewal rate of 65% or less (nearly 49%), with the other 19 (roughly 40%) having renewal rates between 66% and 88%.

Tables 3 and 4 present comparative data across law schools in different USNews rankings categories.  Table 3 describes the number of law schools with conditional scholarship programs and the distribution of scholarship retention rates among law schools.  Table 4 describes the total number of students within each USNews rankings category along with the number of students on conditional scholarships and the number of students who had their conditional scholarship reduced or eliminated.

 Table 3: Scholarship Retention Rates by USNews Ranking Categories

Category                                         Top 50   51-100 (n=53)   104-146 (n=44)   Unranked (n=51)
Schools with Conditional Scholarship Programs      16          37               35               47
Retention Rates of 90% or More                      8           8                6                5
Retention Rates of 66%-88%                          4          22               13               19
Retention Rates of 65% or Less                      4           7               16               23

 Table 4: Number and Percentage of First-Year Students in 2012 by USNews Rankings Categories Having Conditional Scholarships and Having Conditional Scholarships Reduced or Eliminated

Category                                                       Top 50 (n=50)     51-100 (n=53)     104-146 (n=44)      Ranked Alphabetically (n=51)
Number (%) of Law Schools with Conditional Scholarship Programs   16 (32%)          37 (70%)          35 (79.5%)          47 (92%)
Total First-Years at These Law Schools                            11,862            10,937            7,611               12,180
Number (%) of First-Years with Conditional Scholarships           1,587 (13.4%)     3,192 (29.2%)     3,247 (42.7%)       4,444 (36.5%)
Number of Conditional Scholarship Recipients Whose
Scholarships were Reduced or Eliminated                           154 (9.7% / 1.3%) 734 (23% / 6.7%)  1,124 (34.6% / 14.8%)  1,700 (38.3% / 14%)

(Last row percentages: share of conditional scholarship recipients / share of all first-years at these law schools.)

Overall, as shown in Table 5, the distribution of retention rates across law schools was as follows for the 2012-13 academic year:  18 law schools had retention rates less than 50%, 20 law schools had retention rates between 50% and 59.99%, 25 law schools had retention rates between 60% and 69.99%, 21 law schools had retention rates between 70% and 79.99%, 25 law schools had retention rates between 80% and 89.99%, and 27 law schools had retention rates of 90% or better. 

 Table 5 – Number of Law Schools with Conditional Scholarship Renewal Rates in Different Deciles

Renewal Category    Number of Schools
90% or More         27 (16 of which were ranked in top 100)
80%-89.9%           25 (12 of which were ranked in top 100)
70%-79.9%           21 (10 of which were ranked in top 100)
60%-69.9%           25 (8 of which were ranked in top 100)
50%-59.9%           20 (5 of which were ranked in top 100)
Less than 50%       18 (2 of which were ranked in top 100)

Notably, of the 52 law schools ranked in the top 100 with conditional scholarship programs, only two (four percent) had retention rates that were less than 50%, while 16 (nearly 31%) had retention rates of 90% or better.  By contrast, of the 82 (of 95) law schools ranked 104 or lower with conditional scholarship programs, 16 (nearly 20%) had retention rates of less than 50%, while only 11 (roughly 13%) had retention rates of 90% or better.

In sum then, with several schools eliminating their conditional scholarship programs as of fall 2013, less than 50% of the law schools ranked in the top 100 (47 of 103 – nearly 46%) still had conditional scholarship programs, and of those, more than 27% (13 of 47) had retention rates for the 2012-13 academic year of 90% or better while less than 22% (10 of 47) had retention rates of 65% or less.

By contrast, as of fall 2013, more than 80% of the schools ranked below the top 100 (79 of 95 – roughly 83%) still had conditional scholarship programs, and of those, less than 12% (9 of 79) had retention rates for the 2012-13 academic year of 90% or better and nearly half (39 of 79 – roughly 49%) had retention rates of 65% or less.

July 28, 2014 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Tuesday, July 8, 2014

A General Counsel's Advice to A Law Firm - Circa 2004

Nothing promotes de-cluttering one's office like a move or new furniture.  A colleague is retiring; I bought his table and standup desk, and gave up the humungous thing they gave me when I showed up.  It meant tossing lots and lots of stuff I never look at anymore (and goodbye hundreds of reprints - may you recycle into something far more valuable).

I found the notes from a talk I gave in Chicago to a 2004 meeting of the firm then called Piper Rudnick - a combination of Piper Marbury of Baltimore and Rudnick & Wolfe of Chicago soon to absorb Gray Cary, and thereafter to merge with Dibb Lupton of Great Britain to become the behemoth DLA Piper.

At the time I was the general counsel of Great Lakes Chemical Corporation.  Piper had done a lot of our work under the various EPA-administered statutes that regulated household and other chemicals - TSCA, FIFRA (the one dealing with rodenticides and fungicides, not the one dealing with religion), etc.  It had succeeded in securing more work through a "Preferred Provider Program" our terrific Associate GC, Joanne Smith, organized.  In Chicago, I was on a panel with the general counsel of AON, a senior lawyer from Boeing, and one other I can't recall now.  I do remember it was a big room with a lot of people in the audience.

Ten years later, there isn't much here that I'd change - other than I wouldn't have notes on lined paper but would instead have used the Speeches app on my iPad.  A reconstruction of the talk from my notes follows the break.

[Cross-posted at PrawfsBlawg and Legal Profession Blog]

Continue reading

July 8, 2014 | Permalink | Comments (0)

Tuesday, June 17, 2014

So, A Rough Cartoon Always Says Much....*

Via Facebook, I saw that my friend Joan Heminway had some interesting things to say about the private-public distinction in securities law over at our sister blog, Business Law Prof.  I heartily recommend it.

But with my usual instinct for avoiding the import of a serious presentation and jumping immediately to the trivial and irrelevant, it dawned on me I had never known that the Crowdfund Act of 2012 was really the CROWDFUND Act of 2012.  

My cartooning skills are not up to what they were in my productive peak thirty-five to forty years ago during my brief stints at the Michigan Daily and the Stanford Law Journal,** but I was inspired to grab my crutches, hop up to the second floor,* get a sharp pencil and some paper and sketch this:

[Cartoon: hand-drawn acronym sketch]

* I suffered a complete rupture of my Achilles tendon pretending I was a lot younger than I am and find that I now have a lot of time on my hands.

** This was the student newspaper, not the law review, something I noted on my resume for many years.

June 17, 2014 | Permalink | Comments (1)

Obama, Law School, and Interdisciplinarity

Dan Rodriguez, Northwestern University Law School dean and current president of the AALS, has some typically thoughtful commentary at PrawfsBlawg on the news that President Obama is not recommending law school as the grad school of choice, suggesting instead that we need more STEM professionals than lawyers.  

If I may characterize Dan's comment, he's not taking issue with the President's comment as much as suggesting that the real name of the game is interdisciplinarity, and that lawyers and law degrees have something to contribute on that score.

I commented on his post over at Prawfs, but thought I'd reprint it here as well.

The business world is at least twenty (and probably more, given the structural and institutional problems with interdisciplinary work in academia) years ahead of universities in this. It's not to say that disciplinary and functional silos don't exist out there; they still do. But the fundamental insight of lean enterprise and continuous improvement was that research, engineering, manufacturing, and marketing all had to talk to each other from the outset, or you ended up with designs nobody could build, or products nobody wanted.

So who teaches interdisciplinarity? Louis Menand's "The Marketplace of Ideas" and Michele Lamont's "How Professors Think" are about as good as it gets in nailing what my casual empiricism tells me: there's a tension between dilettantism and disciplinary rigor every time you venture out into the space between disciplines (which by the way are something WE create and don't necessarily or even contingently cut fields of knowledge at the joints). The paradox, of course, is that once you establish peer review or other disciplinary standards in the new space you've replicated the original disciplinary problem.

So in academia, what you have to do is forge ahead notwithstanding the cautious naysayers (i.e. risk being called a dilettante, which ain't easy if you are pre-tenure) but at the same time do the best you can in finding like minded souls from the other discipline to afford you some check on rigor, mix it all together and hope for the best.

Finally, the particular hallmark of disciplinary rigor among both academic and practicing lawyers is attribution of blame as the focus of cause-and-effect in the world (from Honore & Hart to Moore to "all you say is 'no'"). That's usually one of the first things that effective business lawyers manage to shed, oftentimes to the dismay of lawyers' lawyers.

June 17, 2014 | Permalink | Comments (0)

Sunday, June 1, 2014

NewLaw, Innovation, and the Importance of Failure

Jordan Furlong is one of the first-rate commentators on the legal industry. He is an excellent observer, a deep thinker, and a skilled and stylish communicator.  

Over at Law 21, Jordan has written a set of companion essays that explain the ferment that is now taking hold in the legal industry.  Check them out if you need or want the seemingly complex made simple.

The first essay is a highly useful reference guide to NewLaw (#NewLaw), a category coined by the Australian consultant George Beaton.   Jordan modestly titled the essay "An Incomplete Inventory of NewLaw," but its alleged incompleteness does not detract from its usefulness.  Complicated things like new business models need to be organized and simplified before we can get our heads around them.  Here, Jordan creates an elegant typology and fills it out with example after example.  Before Jordan's essay, few of us could be sure we were discussing the same ideas or concepts.

One of Jordan's most noteworthy observations is that the talent side of NewLaw (new models of organizing and delivering legal services and content) appears to be growing faster in the UK, while the US seems to be getting the most traction in legal tech.  The former is likely due to the liberalization of regulations that flows from the UK's Legal Services Act of 2007, and the latter to the proximity of venture funding.  To have similar legal ecosystems developing in different ways is bound to trigger consequences and interactions that we cannot fully anticipate. 

Jordan's second post is on the failure of legal innovation, which he points out is nothing more than the precursor to long-term success.  See  "The Failure of Legal Innovation," Law 21, May 29, 2014.  I definitely agree.  When I look at the legal innovation space in 2014 -- and my frame of reference is LegalTech, LexRedux, ReInvent Law, some of the ABA Legal Rebels, and a lot of shoe-leather research on my part -- I think of Detroit in 1905.  There were roughly 125 car manufacturers in Detroit and hundreds more in other parts of the country, as Detroit was not yet the car capital of the world.  All of those business owners were right about one thing:  The car is the future.  But they were wistful optimists about something else -- that their car company was the future. 

A start-up is like a sapling in the woods -- the odds are against it ever growing to the treeline. Fortunately, in the start-up ecosystem good ideas and talented entrepreneurs never really lose.  Instead, they are rolled up into competitors to form the types of companies that can truly shape an entire new industry.  Along these lines, if I were working in investment banking these days, I would be trying to specialize in the legal sector, as the roll-ups in this space are going to be fast and furious in the years to come.  

Let's fasten our seatbelts.  The next several years are going to be a time of great transformation.

June 1, 2014 in Blog posts worth reading, Cross industry comparisons, Important research, Innovations in law, New and Noteworthy, Structural change | Permalink | Comments (1)

Tuesday, May 27, 2014

Another Datapoint for the Laptops Debate

In my inbox this morning was the HBS Daily Stat with the title, "You'll Absorb More if You Take Notes Longhand."  Here is the accompanying explanation:

College students who take notes on laptop computers are more likely to record lecturers’ words verbatim and are thus less likely to mentally absorb what’s being said, according to a series of experiments by Pam A. Mueller of Princeton and Daniel M. Oppenheimer of UCLA. In one study, laptop-using students recorded 65% more of lectures verbatim than did those who used longhand; a half-hour later, the laptop users performed significantly worse on conceptual questions such as “How do Japan and Sweden differ in their approaches to equality within their societies?” Longhand note takers learn by reframing lecturers’ ideas in their own words, the researchers say.

SOURCE: The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking (emphasis in the original)

Wouldn't the same analysis almost surely apply to law students?  Experience tells me that many law students would argue that they are in the minority who learn better through computer transcription.  But what if, given a choice, over half decide to use laptops?  It would be likely that many, if not most, would be making the wrong tradeoff.

Data rarely changes hearts and minds.  As a result, there is likely a gap between maximum learning/knowledge worker productivity and what we are able to accomplish in an education or workplace setting.  Why? People like what they are used to and rationalize why data does not apply to them.  There is a solution to this dilemma, I suspect.  We just have not found it yet. 

May 27, 2014 in Blog posts worth reading, Cross industry comparisons, Data on legal education, Fun and Learning in the classroom, New and Noteworthy | Permalink | Comments (2)

Thursday, May 22, 2014

New Study Tool - "SeRiouS" Learning Platform


Most of us would agree that the essence of lawyering consists of seeing the issues embedded in the narrative, and translating the narrative into legal argument.  That's a hard thing to teach.

But you still have to know the language into which you are translating the narrative.  And learning language - whether we like it or not - involves cramming a lot of stuff into our memories.  Particularly before the bar exam.

My friend and colleague at Suffolk, Gabe Teninbaum, has created a new learning platform with which law students and bar preppers can address this latter issue.  He calls the product “SeRiouS,” which stands for Spaced Repetition Systems.  Gabe is making the program available for free in beta mode at SpacedRepetition.com (at least) through the July 2014 bar exam.

Although the very thought of cramming sends chills down my spine, sometimes you gotta do it (I took the MPRE to be admitted to the Massachusetts bar seven years ago and it was an unpleasant flashback!).  Crammers know what the studies bear out:  you forget most of what you crammed (66%) within 24 hours, and almost all of it (79%) within a month.

Gabe's claim (give it a try!) is that SeRiouS improves the memory retention rate to 92% for as long as the student is using the system, and takes less time than traditional methods.

To slow the rate at which users forget, SeRiouS shows them online flashcards and, after each one, prompts the user to report how well he or she knew the answer after flipping it over.  If the user knew it well, the card won't reappear for a longer time; if the user struggled to remember, SeRiouS will show it again sooner. Based on these answers, SeRiouS’s algorithm customizes itself to the user’s personal rate of forgetting, and then uses that information to prompt studying at just the right time. 
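
To give a flavor of how a scheduler of this kind behaves, here is a minimal sketch in Python. The interval rules below are simplified placeholders invented for illustration; they are not SeRiouS's actual algorithm.

```python
# Minimal spaced-repetition scheduler: easy cards come back later, hard cards sooner.
# The interval rules are simplified placeholders, not the SeRiouS algorithm.
from datetime import date, timedelta

def next_review(interval_days, self_rating):
    """self_rating: 0 = forgot, 1 = struggled, 2 = knew it well."""
    if self_rating == 0:
        return 1                        # start over: see it again tomorrow
    if self_rating == 1:
        return max(1, interval_days)    # repeat at roughly the same spacing
    return interval_days * 2            # knew it well: double the spacing

interval = 1
for rating in [2, 2, 1, 2]:             # one card rated across four review sessions
    interval = next_review(interval, rating)
    print(f"rated {rating} -> next review in {interval} day(s), "
          f"on {date.today() + timedelta(days=interval)}")
```

The design idea is simply that cards you know well recede on an expanding schedule, while cards you miss snap back to short intervals, which concentrates review time where forgetting is fastest.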


With spaced repetition, as with any other cramming, it’s “garbage in, garbage out.”  In other words, if the content of the flash cards stinks, so will the memorized result.  Currently, SeRiouS has 600+ law professor-created flashcards on the topics most likely to be tested on the Multistate Bar Exam and in core law school courses. 

The system works on any device as long as there’s internet access.  SeRiouS updates constantly based on users’ work, and individual users’ data is stored in the cloud.  

I don't endorse commercial products, but this one is free for the time being.  You can also reach Gabe with questions, comments or feedback.

May 22, 2014 | Permalink | Comments (1)

Sunday, May 4, 2014

"Verbatim: What is a Photocopier?"

The New York Times is publishing a new series of short documentary films called Op-Docs.  The Op-Doc below is a dramatization of a deposition, albeit the script is a verbatim rendition of an actual deposition transcript. The plaintiff's lawyer is trying to establish whether the witness's office (which happens to be the Recorder for Cuyahoga County Court of Common Pleas) has a photocopy machine.  Simple question, right?

The video is quite funny, but suffice it to say the verbatim transcript does not cast litigation in a favorable light.  The fact that the Ohio judiciary is the defendant is even more troubling.  A medium like a short documentary on the Times website seems like a promising change catalyst. 

May 4, 2014 in Blog posts worth reading, Current events, New and Noteworthy | Permalink | Comments (0)

Thursday, May 1, 2014

What Ails the Large Law Firm? Will the Real FutureFirm Please Stand Up

Five years ago this April, I helped organize a novel experiment on how to reengineer the modern law firm.  The occasion was FutureFirm 1.0, a collaborative competition in which teams of law firm partners, associates, and in-house lawyers created a strategic plan for the fictional firm of Marbury & Madison (M&M).  The goal was a new business model that would enable the firm "to survive and thrive over the next 20 years."  See M&M Fact Pattern.   

We planned FutureFirm 1.0 in the fall of 2008, but by April 2009, things looked pretty unstable.  Deal flow had ground to a halt, and corporations were reluctant to fund noncrucial litigation.  Law firms in turn were rescinding offers to thousands of law students.  Further, the specter of law firm failure hung in the air.  Suffice it to say, the timing was not right for sharing the results of FutureFirm.  As a result, my analysis of the event, "What Ails the Large Law Firm?  Will the Real FutureFirm Please Stand Up," was never published or circulated.  

With the five-year anniversary of FutureFirm 1.0, I decided to uncork my time-in-a-bottle essay and post it on SSRN and JDSupra.  

Having not read this essay for five years, I am surprised at how well the FutureFirm analysis holds up.  Yet, the biggest takeaway from my FutureFirm experience is not the specifics of the analysis, but acclimating myself to the permanence of the new change dynamic, much of which I can see through the participants of FutureFirm 1.0.   

  • Two law firm partners subsequently left to start their own boutiques, one of which is aggressively moving into managed services in South Africa.  
  • Another law firm partner became a judge in King County, Washington (Seattle).
  • Several summer associates joined BigLaw only to leave within three or four years to become sophisticated in-house lawyers who are themselves driving change.
  • Several people in all roles have switched over to the business side.  Indeed, new legal businesses are actively being planned.

In the spring of 2014, the new normal is here to stay, and it has no froth.  FutureFirm was probably a fringe activity back in 2009.  Now, an event like FutureFirm would be one of the key places to go for answers.  Indeed, I have very serious senior in-house lawyers at Fortune 100 companies who want to run this type of collaborative competition to help better design tomorrow's legal departments.  So stay tuned for that.

I hope you are sufficiently curious to do a bit of time travel and give "What Ails the Large Law Firm?" a read.  I would welcome your thoughts and feedback. 

May 1, 2014 in Data on the profession, Innovations in law, Law Firms, Legal Departments, Structural change | Permalink | Comments (0)

Monday, April 28, 2014

Critiquing Law Schools -- Some Perspective

Every few months, whether I like it or not, I get served a slice of humble pie.   I thought those tiring of the steady stream of law school critiques might find this slice particularly tasty, as someone else (me) is ingesting it.

Over the last few years, I have begun reading books on management and leadership.  My interest in this topic is driven partly by my belief that law schools will be tooling up in this area in the years to come; and partly by a desire to learn about, and acquire, what I hope to teach. 

The finest resource I have found on this topic is Management and Leadership: A Group of Letters to an Industrial Organization.  This book was originally published in 1948 by Carl Braun, a prominent industrialist of the early and mid-20th century.  Braun wrote this book, and several others, for the benefit of his managers at C.F. Braun & Co., which was an engineering company that designed and built many of the nation's oil refineries.  I was drawn to Braun because his company was such a spectacular and enduring success.  

The success, however, was not merely financial.  What made C.F. Braun so successful for so long was Braun's relentless drive to maximize the potential of every person in his organization.  

Now let's think about that -- reaping large profits by putting your people first.  For Braun, this was not an abstraction.  It was, in fact, the company's track record over a period of several decades.  In 1989, 35 years after Carl Braun's death, C.F. Braun & Co. was sold to what is now Kellogg Brown & Root (KBR).  And today, people who worked there are still reminiscing over the positive impact the company had on their lives and the lives of their families.  If you think I am exaggerating, check out the C.F. Braun Alumni Group on LinkedIn.

I have read Management and Leadership several times.  Without exception, each time I put the book down I feel both challenged and inspired.  Well, this last time, I read the following passages in one sitting -- and suffice it to say, the contrast hit me pretty hard.  You be the judge.

Below, in a section titled "The Doers Must Teach," Braun implores his managers to accept their role as teachers because, in his view, our nation's schools, including colleges and universities, lack the practical orientation of modern industry.

Our field of endeavor, industry, unlike medicine for instance, is one of those fields in which the teachers are not the doers. Our teachers, whether in grade-school, high school, or college, seldom have had industrial experience. Few have had even slight contact with industry. And fewer still have current contacts. Not understanding industry, they too often judge it by its worst members, and so develop for it an active disrespect.

The result is that, with rare exceptions, teachers do not find out from industry what industry needs from them. Nor do they seek from industry the teaching-methods that the better industrialists have developed.  The gap is enormous between the abstract teachings of our schools and the concrete needs of industrial man. It is this gap that we industrial leaders must fill. We must fill in what's missing. And we must make the whole a living growing thing.

Well, as I am reading the above passage, my mind is quickly drawing a parallel between Braun and the authors of numerous incisive critiques of legal education.  See, e.g., Legal Education's Ninety-Five Theses, Legal Whiteboard, Feb 1, 2012; Harry Edwards, The Growing Disjunction Between Legal Education and the Legal Profession, 91 Mich L Rev 34 (1992).   Yes, I thought to myself, these authors, like Braun, really know what they are talking about.

But that was not what Braun had in mind.  About 20 pages later, Braun focuses on legal education and the case system as a beacon that will lead us to a better way. 

The law schools have the right idea. They used to bore law students by droning at them from Blackstone - that encyclopedic treatise on law theory. But now they teach from concrete cases - and they've done it for eighty years. The student studies adjudicated cases - cases that are real, typical, modern. From these cases, with the help of his teacher, the student builds up the guiding rules.

This is the right method. Let it be our method. Let's shake ourselves free of the horrible methods we have been brought up on in our schools.  Let's have no dogmatic rules in our teaching.  Let's have no silly and artificial examples that nobody ever uses. Let's be sure that in all our teaching we start with concrete cases -- cases that are real, that are applicable to our purposes, and that preferably are within the practical experience of our learner.

Of course, in 1947, the year before Braun would extol law schools to his audience of engineers, the influential legal realist, Jerome Frank, published an incisive critique that called for the near complete overthrow of the 80-year tradition. See Frank, A Plea for Lawyer-Schools, 56 Yale L J 1301 (1947).

Alas, we humans often find the deepest faults with what is close and intimate, and the greatest virtue with what is mythical and far away.  How often a sense of accurate proportion eludes us.   After several years traveling the country discussing legal education reform, I have gradually concluded that if I want to maximize my influence on change, I need to build and encourage, not criticize and debate.   

April 28, 2014 in Blog posts worth reading, Cross industry comparisons | Permalink | Comments (0)