Monday, April 13, 2015

PROJECTIONS FOR LAW SCHOOL ENROLLMENT FOR FALL 2015

          This blog posting is designed to do three things.  First, following up on recent discussions regarding trends in applicants by Al Brophy at The Faculty Lounge and Derek Muller at Excess of Democracy, I provide a detailed analysis to project the likely total applicant pool we can expect at the end of the cycle based on trends from March through the end of the cycle in 2013 and 2014.  Second, using the likely total pool of applicants, I estimate the number of admitted students and matriculants, but also question whether the estimates might be too high given the decline in quality of the applicant pool in this cycle.  Third, building on the second point, I suggest that law schools in the lower half of the top tier are likely to see unusual enrollment/profile pressure that may then have a ripple effect down through the rankings.

1. ESTIMATES OF THE TOTAL NUMBER OF APPLICANTS

Reviewing the 2013 and 2014 Cycles to Inform the 2015 Cycle

2013 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 25, 2013 | 30,098 | 56% | 53,750
Mar. 8, 2013  | 46,587 | 84% | 55,460
May 17, 2013  | 55,764 | 95% | 58,700
End of Cycle  |        |     | 59,400

2014 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 31, 2014  | 29,638 | 58% | 51,110
Mar. 7, 2014   | 42,068 | 79% | 53,250
April 25, 2014 | 48,698 | 89% | 54,720
End of Cycle   |        |     | 55,700

2015 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 30, 2015 | 26,702 | 54% | 49,450
Mar. 6, 2015  | 39,646 | 76% | 52,160
April 3, 2015 | 45,978 | 87% | 52,848
End of Cycle  |        |     | 54,000 (Estimate)

        In each of the last two years, a modest surge in late applicants meant the final count exceeded the March/April projections by a couple thousand.  That would suggest that the current projection (for just under 53,000) likely understates the end of cycle applicant pool, which I am now estimating conservatively at 54,000 (down about 3% from 2014).  (In 2014, the amount by which the final pool total exceeded the early March projection was nearly 2,500.  With an estimated pool of 54,000 applicants, I am estimating that the final pool in 2015 will exceed the early March projection by roughly 2,000.)  (That said, if the employment results for 2014 graduates, which will be released shortly, show modest improvement over 2013, I anticipate that even more people might come off the fence and perhaps apply late for the fall 2015 class.)
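For readers who want to reproduce the projection arithmetic, here is a minimal sketch (Python).  The straight-line projection simply divides applicants-to-date by the share of the cycle completed; the roughly 2,000-applicant late-surge allowance is the assumption discussed above, not an LSAC figure.

```python
# Minimal sketch of the applicant-pool projection arithmetic (Python 3).
# Counts and cycle percentages come from the volume summaries quoted above;
# the ~2,000 late-surge allowance is the assumption discussed in the text.

def straight_line_projection(applicants_to_date, share_of_cycle):
    """Projected final pool = applicants to date / share of cycle completed."""
    return applicants_to_date / share_of_cycle

march_2015 = straight_line_projection(39_646, 0.76)   # ~52,166 (table shows 52,160)
april_2015 = straight_line_projection(45_978, 0.87)   # ~52,848

# In 2013 and 2014 the final pool beat the early-March projection by roughly
# 2,000-2,500 applicants, so add a similar allowance for late applicants.
late_surge = 2_000
estimate_2015 = march_2015 + late_surge   # ~54,166; rounded down to 54,000 in the text

print(round(march_2015), round(april_2015), round(estimate_2015))
```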

2. ESTIMATES FOR ADMITTED APPLICANTS AND MATRICULANTS  

        The chart below shows the number of applicants, admitted students and matriculants over the last three years along with an estimate for fall 2015 based on the assumption above that we have a total of 54,000 applicants this cycle.  With 1,700 fewer applicants, I am assuming 1,000 fewer admitted students (a slight increase in the percentage admitted from 2014), and then assuming the number of matriculants will reflect the three-year average for the percentage of admitted students who matriculate – 87%.  This would yield a first-year entering class of 36,975, down about 2.5% from 2014.   

Estimates of Admitted Students and Matriculants for 2015 Based on Trends in 2012-2014

 

Year | Applicants | Admitted Students | Percent of Applicants Admitted | Matriculants | Percent of Admitted Matriculating
2012 | 67,900 | 50,600 | 74.5% | 44,481 | 87.9%
2013 | 59,400 | 45,700 | 76.9% | 39,675 | 86.8%
2014 | 55,700 | 43,500 | 78.1% | 37,924 | 87.2%
2015 (est.) | 54,000 | 42,500 | 78.7% | 36,975 | 87%
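Here is a short sketch (Python) of how the 2015 row of the chart is derived.  The 1,000-admit reduction and the 87% yield are the assumptions stated above, not reported figures.

```python
# Sketch of the fall 2015 estimates in the chart above (Python 3).
# The 1,000-admit reduction and the 87% yield are assumptions from the text.

applicants_2015 = 54_000                 # estimated total applicant pool
admitted_2015 = 43_500 - 1_000           # 2014 admits minus the assumed reduction
admit_rate = admitted_2015 / applicants_2015        # ~78.7% of applicants admitted
yield_rate = 0.87                        # three-year average matriculation yield
matriculants_2015 = round(admitted_2015 * yield_rate)   # 36,975

print(f"{admit_rate:.1%}", matriculants_2015)   # 78.7% 36975
```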

Why These Estimates for Admitted Students and Matriculants Might be Too High

        a.      Significant Decline in Applicants with LSATs of 165+

        Because of changes in the nature of the applicant pool in 2015, however, the estimates of the number of admitted students and number of matriculants in the chart above may be too high.  In 2014, almost all of the decrease in applicants came among those with LSATs of <165.  The pool of applicants with LSATs of 165+ in 2014 was only slightly smaller than in 2013 (7,477 compared with 7,496). Indeed, as a percentage of the applicant pool, those with LSATs of 165+ increased from 12.6% in 2013 to 13.4% in 2014.  This resulted in a slight increase in the number of matriculants with LSATs of 165+ in 2014 compared to 2013 (6,189 compared with 6,154).

        In the current cycle, however, the number of applicants with LSATs of 165+ was only 6,320 as of March 6, 2015. In 2013, there were 7,228 on March 8, 2013 (of a final total of 7,496).  In 2014, there were 7,150 on March 7 (of a final total of 7,477).  Thus, the average increase in applicants with LSATs of 165+ between early March and the end of the cycle was only about 4%.  That would suggest that we could anticipate having roughly 6,585 applicants with LSATs of 165+ at the end of the cycle – down nearly 900, or roughly 12%, from 2014.

Estimate of Number of Total Applicants for 2015 with LSATs of 165+ Based on Trends in 2013 and 2014

 

Date | Applicants at 165+ | Date | Applicants at 165+ | # Increase to End of Cycle | % Increase to End of Cycle
March 8, 2013 | 7,228 | End of Cycle 2013 | 7,496 | 268 | 3.7%
March 7, 2014 | 7,150 | End of Cycle 2014 | 7,477 | 327 | 4.6%
March 6, 2015 | 6,320 | End of Cycle 2015 (est.) | 6,585 | 265 | 4.2%

        On a longer term basis, if the estimates in the preceding paragraphs are accurate, the entering class in fall of 2015 will again extend the slide in the number and percentage of first-year students with LSATs of 165+ that has been underway since the class that entered in fall of 2010.

Five-Year Trend in Applicants and Matriculants with LSATs of 165+  and Estimates for 2015

 

Year | Applicants with LSATs of 165+ | Matriculants with LSATs of 165+ | Percent of Applicants Matriculating
2010 | 12,177 | 9,477 | 77.8%
2011 | 11,190 | 8,952 | 80%
2012 | 9,196 | 7,571 | 82.3%
2013 | 7,496 | 6,154 | 82.1%
2014 | 7,477 | 6,189 | 82.8%
2015 (est.) | 6,585 | 5,420 | 82.4%

        Given that on average over the last three years roughly 82.4% of applicants with LSATs of 165+ actually matriculated, one could expect that the 6,585 applicants would translate into roughly 5,420 matriculants with LSATs of 165+ for fall 2015, a decline of nearly 770 from 2014.  Notably, this would represent a 45.9% drop in applicants with LSATs of 165+ since 2010 and a 42.8% drop in matriculants with LSATs of 165+ since 2010.
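The 165+ arithmetic can be reproduced in a few lines (Python).  The late-cycle growth rate and the matriculation rate below are simple averages of the 2013-2014 and 2012-2014 figures quoted above, and should be read as assumptions rather than LSAC projections.

```python
# Sketch of the 165+ projection arithmetic (Python 3). The growth and yield
# rates are averages of the figures quoted above (assumptions, not LSAC data).

march_2015 = 6_320
late_growth = ((7_496 / 7_228 - 1) + (7_477 / 7_150 - 1)) / 2   # ~4.1%
applicants_165plus = round(march_2015 * (1 + late_growth))      # ~6,582 (text uses ~6,585)

yield_165plus = (0.823 + 0.821 + 0.828) / 3                     # ~82.4% (2012-2014 average)
matriculants_165plus = round(applicants_165plus * yield_165plus)  # ~5,424 (text uses ~5,420)

drop_in_applicants_since_2010 = 1 - applicants_165plus / 12_177     # ~45.9%
drop_in_matriculants_since_2010 = 1 - matriculants_165plus / 9_477  # ~42.8%

print(applicants_165plus, matriculants_165plus,
      f"{drop_in_applicants_since_2010:.1%}", f"{drop_in_matriculants_since_2010:.1%}")
```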

        b. Modest Decrease Among Applicants with LSATs <150

        On the other end of the LSAT distribution, it is a completely different story. Although the number of applicants with LSATs <150 also has declined, the decline has been more modest than among those with LSATs of 165+.  Moreover, those with LSATs of <150 are much more likely to apply late in the cycle.  In the last two years there has been significant growth among applicants with LSATs of <150 between early March and the end of the cycle.   As a result, I would estimate that we would have 18,350 applicants with LSATs of <150 by the end of this cycle, a decline of only about 4.5%.

Estimate of Number of Total Applicants for 2015 with LSATs of <150 Based on Trends in 2013 and 2014

 

Date | Applicants with LSATs of <150 | Date | Applicants with LSATs of <150 | # Increase | % Increase
March 8, 2013 | 13,364 | End of Cycle 2013 | 20,706 | 6,642 | 49.7%
March 7, 2014 | 11,662 | End of Cycle 2014 | 19,239 | 7,577 | 65%
March 6, 2015 | 11,467 | End of Cycle 2015 (est.) | 18,350 | 6,880 | 60%

        With applicants with LSATs <150 making up a larger percentage of the declining applicant pool, the number of matriculants with LSATs of <150 actually had grown each year up until 2014, when the slight increase in matriculants with LSATs of 165+ was mirrored by a slight decrease in matriculants with LSATs <150. 

Five-Year Trend in Applicants and Matriculants with LSATs of <150 and Estimates for 2015

 

Year | Applicants with LSATs of <150 | Matriculants with LSATs of <150 | Percent of Applicants Matriculating
2010 | 26,548 | 7,013 | 26.4%
2011 | 24,192 | 7,101 | 29.4%
2012 | 22,089 | 7,906 | 35.8%
2013 | 20,706 | 8,482 | 41%
2014 | 19,239 | 8,361 | 43.5%
2015 (est.) | 18,350 | 8,700 | 47.4%

        Given that the percentage of applicants with LSATs <150 who matriculate has increased in each of the last five years, it seems reasonable to expect another increase – to 47.4% – resulting in roughly 8,700 matriculants with LSATs of <150, particularly given the decrease in the number of applicants with LSATs of 165+.  Even so, this increase seems unlikely to make up for the drop of nearly 770 matriculants among those with LSATs of 165+.  Notably, while the pool of applicants with LSATs <150 has decreased by 30.9% since 2010, the number of matriculants has increased by 24.2%.
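The same two-step arithmetic applies to the <150 pool; a brief sketch (Python), again treating the 60% late-cycle growth and the 47.4% matriculation rate as the assumptions described above:

```python
# Sketch of the <150 projection arithmetic (Python 3), using the assumed
# 60% late-cycle growth and 47.4% matriculation rate from the text.

applicants_lt150 = round(11_467 * 1.60)               # ~18,347 (text uses ~18,350)
matriculants_lt150 = round(applicants_lt150 * 0.474)  # ~8,696 (text uses ~8,700)

print(applicants_lt150, matriculants_lt150)
```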

        Thus, while the smaller decline in applicants expected this year might suggest a correspondingly smaller decline in matriculants, the weaker profile of the 2015 applicant pool compared with 2014 makes it quite possible that both the total number of admitted students and the corresponding number of matriculants will be lower than the chart above suggests.

        Phrased differently, if there really is going to be a decline of roughly 770 matriculants just in the group with LSATs of 165+, then the total decline in matriculants may well be greater than the 950 estimated in the chart above.  Between 2013 and 2014, a decline in applicants of 3,700, almost all with LSATs of 164 and below, resulted in a decline in matriculants of 1,750, all with LSATs of 164 and below.  If the decline in applicants is 1,700 this cycle, with over half the decline among those with LSATs of 165+, with a decline of perhaps several hundred with LSATs between 150-164, and with a modest decrease (or possibly a slight increase) among those with LSATs <150, we may well see that the decline in admitted students and in matriculants is slightly larger than estimated in the chart above.

3. PROFILE CHALLENGES AMONG ELITE SCHOOLS

        One interesting side note is that the significant decrease in the number of applicants with LSATs of 165+ is likely to put significant pressure on a number of top-50 law schools as they try to hold their enrollment and their LSAT profiles.  Simply put, there are not enough applicants with LSATs of 165+ to allow all the law schools in the top-50 or so to maintain their profiles and their enrollment. 

        If the estimates above are correct – that there will be roughly 5,420 matriculants with LSATs of 165+ – and if we assume that at least a few hundred of these matriculants are going to be going to law schools ranked 50 or below either due to geography or scholarships or both – and if we assume that the top 15 law schools are likely to leverage rankings prestige (and perhaps scholarships) to hold enrollment and profile – then the decrease of roughly 770 matriculants with LSATs of 165+ is going to be felt mostly among the law schools ranked 16-50 or so. 

        In 2014, the top 15 law schools probably had roughly 3,800 first-year matriculants with LSATs of 165+.  The schools ranked 16-50 likely had another 1,900 or so.  The remaining 500-plus matriculants with LSATs of 165 and above likely were scattered among law schools lower in the rankings.  Let’s assume the top-15 law schools manage to keep roughly 3,700 of the 3,800 they had in 2014, and that law schools ranked 50 and below keep roughly 500.  That means the law schools ranked between 16 and 50 have to get by with roughly 1,220 matriculants with LSATs of 165+, down from roughly 1,900 last year.  While many schools will be wrestling with how to maintain enrollment (and revenue) while holding profile, this likely will be a particularly challenging year for law schools ranked between 16 and 50.  To the extent that those schools look toward applicants with lower LSAT profiles to maintain enrollment, that will have a ripple effect through the law schools lower in the rankings.
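The tier arithmetic is simple subtraction; here is a sketch (Python) in which every tier figure is the rough assumption described in the preceding paragraphs, not reported data.

```python
# Back-of-the-envelope allocation of 165+ matriculants across ranking tiers
# (Python 3). All inputs are the rough assumptions stated in the text.

total_165plus_2015 = 5_420
top_15_keep = 3_700           # assumed, down slightly from ~3,800 in 2014
rank_51_and_below_keep = 500  # assumed, roughly flat with 2014

rank_16_to_50 = total_165plus_2015 - top_15_keep - rank_51_and_below_keep
print(rank_16_to_50)   # 1,220, versus roughly 1,900 in 2014
```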

April 13, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Sunday, April 12, 2015

Another Example of Using Big Data to Improve Odds of Winning in Court

Back in February, I wrote a post on The Early Days of Legal Analytics.  It discussed some of the innovations at Lex Machina, a legal start-up that uses Big Data to value contested patents and develop a litigation strategy designed to maximize value / minimize risk.  I recently came across another company, Premonition, that claims to use artificial intelligence to select lawyers with the best odds of achieving a favorable result. See the Premonition infographic at the bottom of this post.

I spend a lot of time on the road talking to law firm lawyers and legal innovators, including legal start-ups.  Many large firm lawyers tend to dismiss new innovations without stopping to listen to, much less gather, relevant facts.  Likewise, there is a lot of puffery among legal start-ups as they try to land their first few customers.  Thus, I tend to apply a windage factor to accurately interpret what I am being told.

The image below contains Premonition's simple, one-sentence pitch.  I don't know about the product, but the concept is pretty clear.

[Image: Premonition's one-sentence pitch]

The core benefit Premonition appears to offer is a list of lawyers with winning track records in front of specific judges. We don't need artificial intelligence (AI) to make that calculation.  A win-rate is a simple descriptive statistic, even if it has been filtered for a variety of matching criteria.
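To make the point concrete, here is a minimal sketch of that descriptive statistic (Python).  The case records and field names are hypothetical illustrations, not Premonition's actual data model.

```python
# Minimal sketch of a lawyer win-rate filtered by judge (Python 3).
# The records and field names below are hypothetical, not Premonition's schema.
from collections import defaultdict

cases = [
    {"lawyer": "A. Smith", "judge": "Judge Jones", "won": True},
    {"lawyer": "A. Smith", "judge": "Judge Jones", "won": False},
    {"lawyer": "A. Smith", "judge": "Judge Jones", "won": True},
    {"lawyer": "B. Lee",   "judge": "Judge Jones", "won": False},
]

tallies = defaultdict(lambda: [0, 0])          # (lawyer, judge) -> [wins, total]
for c in cases:
    key = (c["lawyer"], c["judge"])
    tallies[key][0] += c["won"]
    tallies[key][1] += 1

for (lawyer, judge), (wins, total) in tallies.items():
    print(f"{lawyer} before {judge}: {wins}/{total} = {wins/total:.0%}")
```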

That said, AI could come in handy in building the requisite data sets.  As explained on Premonition's website, courts don't construct their case management systems so they can be vacuumed out by data mining companies. Indeed, local court officials would likely be hostile to such requests because any resulting statistical model is unlikely to make them look good.  After all, the purpose of the model is, at least in part, to identify and exploit imperfections in the judicial process.  

Because the courts have no incentive to make life easy for Big Data vendors, Premonition's Chief Innovation Officer, Toby Unwin, claims to have tackled the data assembly problem by building a technology that scrapes and buckets the necessary data from the jumbled chaos of web portals for state and local courts.  Such a task, in theory, can be performed by fairly standard machine learning, which qualifies as AI, at least in some circles.  

Assuming Premonition has built a machine that can calculate the win-rates of lawyers, is that information valuable to clients trying to maximize the likelihood of a favorable result?  I don't know, but it's plausible enough to test with data.

Some data skeptics will argue that win-rates, whether high or low, are just artifacts of any normally distributed outcome.  The reasoning runs, "Two, three, and four sigma events occur in the ordinary course of life, but regression to the mean is pulling them back to the center. Thus, they are poor predictors of the future."  This reasoning is why many people buy index funds rather than shares in actively managed mutual funds.  Cutting the other way, the hedge fund industry is premised on the belief that some money managers are a lot better than others. Five-year return rates are aggregated and published in the industry trade press.  Some of the returns may be due to random luck, but some could be attributed to superior skill.  It is absurdly unlikely, for example, that Warren Buffett's success in buying and selling stocks is just a 60-year lucky streak.
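The skeptics' reasoning is easy to illustrate with a toy simulation (a sketch only, using made-up coin-flip data): if outcomes were pure chance, the lawyers with the most impressive early records would still win only about half of their later cases.

```python
# Toy illustration of regression to the mean (Python 3). Every "lawyer" wins
# each case with the same 50% probability, so early hot streaks carry no
# information about later results.
import random

random.seed(0)
lawyers = [[random.random() < 0.5 for _ in range(20)] for _ in range(500)]
hot_starts = [w for w in lawyers if sum(w[:10]) >= 8]   # 80%+ win-rate in first 10 cases
later_rate = sum(sum(w[10:]) for w in hot_starts) / (10 * len(hot_starts))
print(f"{later_rate:.0%}")   # ~50%, despite the impressive early records
```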

In the case of win-rates in court, I can think of at least two plausible non-random factors that could affect outcomes:

  1. Judicial bias or favoritism.  Judges, either consciously or unconsciously, may react differently to the case depending upon the advocate.  One does not have to wade too far into the political science literature to find peer-reviewed empirical studies that reveal that judges are influenced by more than just facts and law. 
  2. The gap between credentials and bona fide skill.  Law has historically been a credence good.  This means the market relies on elite credentials and firm reputation as a proxy for skill.  Yet it is plausible that some lawyers lack the pedigree to get hired by large, elite law firms but go on to develop outstanding legal skills, perhaps because of superior drive, intellectual curiosity, or "early at-bats" as a prosecutor or public defender. If these folks exist, Big Data can likely find them.

I can't vouch for Premonition's technology beyond two statements: (1) it sounds plausible, and (2) it is a waste of time to debate its usefulness because it's an empirical question that the market will answer in the relatively near term.  

Below is one of Premonition's infographics.

[Infographic: Everything You Know About Lawyer Selection Is Wrong]

 

April 12, 2015 in Cross industry comparisons, Current events, Innovations in law, New and Noteworthy | Permalink | Comments (0)

Wednesday, March 18, 2015

Igniting Law Teaching Conference

Registration is now open for LegalED’s Igniting Law Teaching 2015.  The conference is Friday, March 20, 2015 from 9:00 am – 5:00 pm EST at American University Washington College of Law.  It is also available for live viewing by webcast.

The conference will feature talks by 30 law school academics and practitioners from the US, Canada and England in a TEDx-styled conference to share ideas on teaching methodologies.  LegalED’s Teaching Pedagogy video collection includes many of the talks from last year’s conference, which have been viewed collectively more than 5000 times.  The panels for this year include:

  • Law Teaching for the 21st Century
  • Applying Learning Theory to Legal Education
  • The Art and Craft of Law Teaching
  • Using Technological Tools for Legal Education
  • Pathways to Practice

Here is a link to the topics, speakers and schedule.

 The Igniting Law Teaching conference is unlike other gatherings of law professors.  Here, talks will be styled as TEDx Talks, with each speaker on stage alone, giving a well-scripted and well-performed 8-minute talk about an aspect of law school pedagogy.  In the end, we will create a collection of short videos on law school-related pedagogy that will inspire innovation and experimentation by law professors around the country, and the world, to bring more active learning and practical skills training into the law school curriculum.  The videos will be available for viewing by the larger academic community on LegalED, a website developed by a community of law professors interested in using online technologies to facilitate more active, problem-based learning in the classroom, in addition to better assessment and feedback.

 Readers are encouraged to attend the March 20th conference, either live or virtually.

[HT: the tireless and thoughtful Michele Pistone of Villanova Law]

March 18, 2015 in Current events, Fun and Learning in the classroom | Permalink | Comments (0)

Saturday, February 7, 2015

The Early Days of Legal Analytics

There is an interesting story in Forbes on Lex Machina, a legal start-up that provides analytics for use in patent litigation.  See Dan Fisher, Stanford-Bred Startup Uses Moneyball Stats to Handicap Judges, Forbes, Feb. 2, 2015.  The company was created by faculty at Stanford Computer Science and Stanford Law.  As the company emerged from the University, the reins were handed to Josh Becker, a Stanford JD-MBA.  To date, the company has raised $8 million in start-up funding.  According to the Forbes article, the company's clients include some of the nation's large technology companies plus one-third of the AmLaw 100.

What makes Lex Machina so interesting is that the company is not a NewLaw service provider trying to take market share. Instead, Lex Machina is a toolmaker.  It is a true Big Data company that provides analytics to (a) value contested patents and (b) protect/maximize that value through a litigation strategy that is informed by data.  

The impact of Lex Machina is hard to decipher, primarily because if it does provide an edge, the customers are unlikely to be too vocal. Just like a hedge fund with an effective trading strategy, why advertise the ingredients of your secret sauce? Indeed, compared to other toolmakers (e.g., predictive coding, expert systems), Lex Machina's benefits are less about efficiency and more about affecting the outcomes of cases -- who wins and by how much.  If Lex Machina is truly delivering, it will eventually touch off a Big Data legal analytics arms race akin to the quant revolution on Wall Street.  Dan Katz frequently makes this point, and I think he is right.  The Forbes article makes the point that Lex Machina is already moving into adjacent areas of IP law and general commercial litigation.  

The broader legal industry is unlikely to notice Lex Machina until it has a substantial liquidity event -- i.e., it's acquired or goes public, making its founders far richer than the BigLaw partners and in-house lawyers they currently serve.  

If we are looking for early signs of a tipping point for legal analytics, one marker may be the number of Stanford Law grads who are turning down entry-level opportunities in BigLaw to pursue legal start-ups.  In recent years, Stanford Law grads fresh out of law school have gone on to found other venture-backed legal start-ups like Ravel Law, Judicata, and Law Gives.  Back in 2013, The Stanford Lawyer (SLS alumni magazine) had an extensive write-up with several examples.  See Sharon Driscoll, A Positive Disruption, June 4, 2013.  In 2014, Stanford's CSO offered a program titled, An Alternative to BigLaw -- Startups.

The legal world isn't going away; it's just changing.

February 7, 2015 in Cross industry comparisons, Current events, Data on the profession, Innovations in law, New and Noteworthy, Structural change | Permalink | Comments (0)

Saturday, January 17, 2015

Stark on Curriculum Reform (and Law School Pre-reqs)

Tina Stark, a pioneer in transactional skills education, has an interesting and highly readable take on how to organize a modern law school curriculum, including the first year.  It's all insightful, but I think the most powerful suggestion is the creation of three foundational modules - litigation, transactions, and legislation/regulation - in which the offerings would combine theory, doctrine, and skills.  

The piece is a transcript of her keynote address, "What Cornell Veterinary School Taught Me About Legal Education," at the Fourth Biennial Conference of Emory Law School's Center for Transactional Law and Education.

What stirred Tina's reflections was her son's transition from the classroom to the clinical portion of his veterinary education.  I posted similar thoughts some time ago reflecting on my son Matthew's (now Dr. Matthew Lipshaw, resident in pediatrics, Yale-New Haven Children's Hospital) med school experience.  I'm still persuaded that there are two things medical education has over legal education (I can't speak to veterinary): (a) the reimbursement system in health care provides resources for skills training not likely to be available any time soon for legal education, and (b) with all due respect to law students (and I know whereof I speak because I was one), even with a semester or a year in a clinical offering, med students work a hell of a lot harder than law students.  We'd have no problem getting both doctrinal and skills training into our students if they were at it six or seven days a week, twelve or thirteen hours a day.

If you download Tina's piece, there's a lagniappe at the end - some thoughts on why there's no law school pre-req for English composition, yet there is in med and engineering school!

 

January 17, 2015 | Permalink | Comments (2)

Tuesday, January 6, 2015

The Variable Affordability of Law School – How Geography and LSAT Profile Impact Tuition Costs

I have posted to SSRN the PowerPoint slides I presented yesterday at the AALS Conference session sponsored by the Section on Law School Administration and Finance.  The presentation was entitled The Variable Affordability of Law School – How Geography and LSAT Impact Tuition Cost.   (I am very grateful to my research assistant, Kate Jirik, and her husband, Sam, for awesome work on the spreadsheet that supported the data I presented.)

The presentation begins with two slides summarizing data presented in my article Reflections on the Decreasing Affordability of Legal Education showing the extent to which average public school and private school tuition increased between 1985 and 2011 relative to law school graduate income.  While many have observed that law school has become increasingly expensive over the last few decades, this "macro" discussion fails to highlight the extent to which differences in tuition exist at a “micro” level either based on geography or on LSAT score.

Using 2012 tuition data, the first set of slides focuses on geographic differences – noting some states where legal education generally is very expensive, some states where legal education generally is very affordable and the balance of states in which tuition costs are in the middle or have a mix of affordable and expensive. 

Following those slides, there is a set of slides describing the process I used to calculate net tuition costs (after accounting for scholarships) for all entering first-year students at the 195 fully accredited and ranked law schools in fall 2012, in an effort to allocate all students into a five-by-five grid with five LSAT categories (165+, 160-164, 155-159, 150-154 and <150) and five cost categories ($0-$10,000, $10,000-$20,000, $20,000-$30,000, $30,000-$40,000, and $40,000+).  A further set of slides summarizes this data and tries to explain what we can learn from how students are allocated across the five-by-five grid, including slides showing the average rank of the schools at which students in each LSAT/cost cell are enrolled.
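As a rough illustration of that allocation step, here is a minimal sketch (Python); the bin edges mirror the five LSAT and five cost categories described above, and the student records are invented for illustration.

```python
# Sketch of allocating students into the 5x5 LSAT/net-cost grid described
# above (Python 3). Bin edges mirror the categories in the text; the student
# records below are hypothetical.
from collections import Counter

def lsat_band(score):
    if score >= 165: return "165+"
    if score >= 160: return "160-164"
    if score >= 155: return "155-159"
    if score >= 150: return "150-154"
    return "<150"

def cost_band(net_tuition):
    if net_tuition < 10_000: return "$0-$10K"
    if net_tuition < 20_000: return "$10K-$20K"
    if net_tuition < 30_000: return "$20K-$30K"
    if net_tuition < 40_000: return "$30K-$40K"
    return "$40K+"

students = [(168, 5_000), (158, 32_000), (149, 41_000)]   # (LSAT, net cost)
grid = Counter((lsat_band(l), cost_band(c)) for l, c in students)
print(grid)
```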

The concluding slide sets forth a couple of short observations about the data. There was a robust discussion with some great questions following the presentation of this data.

Here are four of the slides to give you a flavor for the presentation on net cost generally and then net cost relative to LSAT categories:

[Four slides from the presentation]

January 6, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Sunday, January 4, 2015

Size of the US Legal Market by Type of Client

Washington, DC.  The AALS Section on Professional Responsibility hosted a vigorous discussion today on the evolving ethical duty of competency, a topic partially inspired by the recent changes to Model Rule 1.1 cmt. 8 (requiring lawyers to stay abreast of the "benefits and risks associated with relevant technology").  As part of this panel, I showed a chart on the size of the US legal market, which was promptly tweeted by CALI's Director of Community Development, Sarah Glassmeyer, a law librarian who is a total data subversive in a style and manner I fully support.

Well, despite a less-than-optimal photo angle, the chart was retweeted and favorited, so I figured I ought to just post the actual chart here. [Click to enlarge]

[Chart: Size of the US Legal Market by Type of Client]

In a competitive market, the threshold question, asked by potential entrants and those who might finance them, is often the same: "what is the size of the available (or addressable) market?" Because lawyers and law schools are feeling unprecedented economic pressure, I thought it would be worthwhile to run this exercise for the U.S. legal industry and break it down by type of client.

The figures above are estimates of 2014 receipts going to organizations and individuals in the business of providing legal services.  My calculations are derived from US Census Bureau data. They exclude the cost of in-house and government lawyers.  More granular calculation details will be laid out in a forthcoming publication.

At today's AALS Professional Responsibility session, technology was framed as an ethical issue. And that is certainly right:  technology can deliver enormous cost and quality benefits to clients, so we have both a fiduciary and professional duty to be up-to-date.  Yet, there is a flip-side here that is crucially important -- to ignore or fall behind on technology is to run the risk of commercial ruin. This axiom applies to lawyers in private practice and to law schools that want employers to hire their graduates. 

Building upon that theme, I used the Market Size chart to make two points today, one based on the high-end corporate market (right side of chart) and the other directed toward the individual consumer market (left side of chart). 

Re the corporate side, the data show that a relatively small roster of large corporations are spending vast sums each year on legal services -- more than $10 million per year for a publicly held company.  Because large national and international corporations are awash in a sea of growing legal complexity, they are turning to technology, process, and data to keep legal costs in line with overall company revenues.  From the perspective of a large corporate client, the typical junior law firm associate has little to offer.  A more seasoned partner or counsel is a better value, but this is by virtue of experience rather than technology or process.  As a result, law firm hiring remains stagnant, and more legal work is being taken in-house or given to LPOs or New Law legal service providers like Axiom, Elevate, or Novus Law.  It may take a generation for the law school--law firm--legal department supply chain to come into a reasonable alignment.  Right now, it's broken.

Re the individual retail market, the $232 annual legal spend per citizen means that there is not enough money to go around to pay for all the legal need.   If a middle-class professional couple with kids has a contested divorce, that could easily chew up $50,000 to $100,000 in legal fees.  A DUI is likely to cost $1,500.  A worker's comp claim might be 30% of an award.  Probate work runs well into the thousands.  In reality, most citizens go without.  One of our co-panelists today, retired US Magistrate Judge John Facciola, made the claim that 83% of Americans never talk to a lawyer to help them with a legal problem.  "The middle class is largely gone from federal court."  To my mind, technology is the only vehicle for tapping into a large latent market for legal services.  LegalZoom, Rocket Lawyer, Modria, Shake, and many other legal technology companies all see the potential here. And so do the venture capital and private equity firms that are funding them. 

 Today's panel was one of the most lively I have ever attended at AALS, owing in part to my excellent co-panelists but also to an audience that asked some great, tough questions.  Many thanks to Andy Perlman (Suffolk Law) for organizing a terrific session and Natasha Martin (Seattle) for her skillful moderation of the panel.

January 4, 2015 in Current events, Data on the profession, Legal Departments, New and Noteworthy, Structural change | Permalink | Comments (2)

Monday, December 29, 2014

The Composition of Graduating Classes of Law Students -- 2013-2016 -- Part One

PART ONE -- Analyzing the LSAT Profile/Composition of Entering First-Years from 2010 to 2013 and 2014

In the fall of 2013, I had a series of blog postings about the changing demographics of law students.  In the first, I noted that fewer students were coming to law school from elite colleges and universities.  In the second, I noted that between 2010 and 2013 there had been a decline in the number of matriculants with high LSATs and an increase in the number of matriculants with low LSATs such that the “composition” of the class that entered law school in the fall of 2013 was demonstrably less robust (in terms of LSAT profile) than the “composition” of the class that entered law school in the fall of 2010.  In describing this phenomenon, I noted that when the entering class in fall 2013 graduates in 2016, it might encounter greater problems with bar passage than previous classes. 

In light of the significant decline in the median MBE scaled score in July, which Derek Muller has discussed here and here, and which I have discussed here, and a significant decline in first-time bar passage rates in many jurisdictions this year, it seems like an appropriate time to look more closely at changing class profiles and the likely impact on bar passage in the next few years.

This is the first of two blog posts regarding the changing composition of entering classes and the changing composition of graduating classes.  In Part I, I analyze the distribution of LSAT scores across categories based on the LSAC’s National Decision Profiles for the years 2009-2010 through 2012-2013, and then analyze the distribution of law school median LSATs and the 25th percentile LSATs across ranges of LSAT scores.  In Part II, I will analyze how attrition trends have changed since 2010 to assess what that might tell us about the composition of graduating classes three years after entering law school as a way of thinking about the likely impact on bar passage over time.

Tracking Changes Based on National Decision Profiles – 2010-2013

The following discussion summarizes data in the LSAC’s National Decision Profiles from the 2009-10 admission cycle (fall 2010) through the 2012-13 admission cycle (fall 2013).  The National Decision Profile for the 2013-14 admission cycle (fall 2014) has not yet been released.

Let’s start with the big picture.  If you take the matriculants each year and break them into three LSAT categories – 160+, 150-159, and <150 – the following chart and graph show the changes in percentages of matriculants in each of these categories over the last four years. 

Percentage of Matriculants in LSAT Categories – 2010-2013

                        2010    2011    2012    2013

160+                40.8     39        36.3     33.4

150-159           45        45.3     44.3     44.1

<150                14.2     15.7     19.3     22.5

[Graph: Percentage of matriculants in LSAT categories, 2010-2013]
Notably, this chart and graph show almost no change in the “middle” category (150-159 -- purple) with most of the change at the top (160+ -- orange -- decreasing from 40.8% to 33.4%) and bottom (<150 -- blue -- increasing from 14.2% to 22.5%).  This chart and graph also show only a modest change between 2010 and 2011 with more significant changes in 2012 and again in 2013 – when the percentage of students with LSATs of 160+ declines more substantially and the percentage of students with LSATs of <150 grows more substantially.

While I think this tells the story pretty clearly, for those interested in more detail, the following charts provide a more granular analysis.

Changes in LSAT Distributions of Matriculants – 2010-2013       

                            2010    2011    2012    2013         Chg in Number     % Chg in Number       

170+                3635    3330    2788    2072                -1563               -43%   

165-169           5842    5622    4783    4082                -1760               -30%   

160-164           10666  8678    7281    6442                -4224               -39.6%

155-159           11570   10657  9700    8459                -3111                -26.9%

150-154           10626  9885    8444    8163                -2463               -23.2%

145-149           5131     5196    5334    5541                 410                  8%      

<145                1869    1888    2564    2930                1061                56.8% 

                        49339  45256  40894  37689 

Note that in terms of percentage change in the number of matriculants in each LSAT category, the five highest LSAT categories are all down at least 20%, with 160-164 down nearly 40% and 170+ down over 40%, while the two lowest LSAT categories are up, with <145 being up over 50%.
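For readers who want to verify the arithmetic, the three-category percentages in the first chart can be recovered directly from the granular matriculant counts above; a minimal sketch (Python):

```python
# Sketch reproducing the three-category chart from the granular counts above
# (Python 3). Counts are the matriculant figures quoted in the post.

counts_2010 = {"170+": 3635, "165-169": 5842, "160-164": 10666,
               "155-159": 11570, "150-154": 10626, "145-149": 5131, "<145": 1869}
counts_2013 = {"170+": 2072, "165-169": 4082, "160-164": 6442,
               "155-159": 8459, "150-154": 8163, "145-149": 5541, "<145": 2930}

def three_buckets(counts):
    total = sum(counts.values())
    hi  = counts["170+"] + counts["165-169"] + counts["160-164"]
    mid = counts["155-159"] + counts["150-154"]
    lo  = counts["145-149"] + counts["<145"]
    return {k: round(100 * v / total, 1) for k, v in
            {"160+": hi, "150-159": mid, "<150": lo}.items()}

print(three_buckets(counts_2010))   # {'160+': 40.8, '150-159': 45.0, '<150': 14.2}
print(three_buckets(counts_2013))   # {'160+': 33.4, '150-159': 44.1, '<150': 22.5}
```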

 

[Line graph: Matriculants by LSAT range, 2010-2013]
Note that in the line graph above, the top two categories have been combined into 165+ while the bottom two categories have been combined into <150.  Perhaps most significantly, in 2010, the <150 group, with 7,000 students, was over 2,400 students smaller than the next smallest category (165+ with 9,477) and more than 4,500 students smaller than the largest category (155-159 with 11,570).  By 2013, however, the <150 category had become the largest category, with 8,471, just surpassing the 155-159 category, with 8,459, and now 2,300 larger than the smallest category, 165+ with only 6,154.

Changes in Percentage of Matriculants in LSAT Ranges – 2010-2013

                        PERCENTAGE OF MATRICULANTS

                        2010    2011    2012    2013    % Chg in %    

>169                0.074   0.074   0.068   0.055   -25.7%

165-169           0.118   0.124   0.117    0.108   -8.5%  

160-164           0.216   0.192   0.178   0.171   -20.8%

155-159           0.235   0.235   0.237   0.224   -4.7%  

150-154           0.215   0.218   0.206   0.217   0.9%   

145-149           0.104   0.115    0.13     0.147   41.3% 

<145                0.038   0.042   0.063   0.078   105.3%                       

In terms of the “composition” of the class, the percentage of matriculants in each LSAT category, as noted above, little has changed in the “middle” – 155-159 and 150-154, but significant changes have occurred at the top and bottom, with declines of 20% or more at 160-164 and 170+ and with increases of 40% at 145-149 and over 100% at <145.

Tracking Changes in Law School Median LSATs by LSAT Category

A different way of looking at this involves LSAT profiles among law schools over this period.  Based on the data law schools reported in their Standard 509 Reports, from 2010 to 2014, the chart below lists the numbers of law schools reporting median LSATs within certain LSAT ranges.  (This chart excludes law schools in Puerto Rico and provisionally-approved law schools.)

Number of Law Schools with LSAT Medians in LSAT Categories – 2010-2014

 

Median LSAT | 2010 | 2011 | 2012 | 2013 | 2014
165+        | 30 | 31 | 26 | 23 | 21
160-164     | 47 | 41 | 39 | 31 | 29
155-159     | 59 | 57 | 56 | 53 | 51
150-154     | 50 | 52 | 53 | 56 | 59
145-149     |  9 | 14 | 22 | 28 | 29
<145        |  0 |  1 |  0 |  5 |  7

 

[Chart: Number of law schools by median LSAT category, 2010-2014]

The chart above pretty clearly demonstrates the changes that have taken place since 2010, with declines in the number of law schools with median LSATs in higher LSAT categories and increases in the number of law schools with median LSATs in the lower LSAT categories.  The number of law schools with median LSATs of 160 or higher has declined from 77 to 50.  By contrast, the number of law schools with median LSATs of <150 has quadrupled, from 9 to 36.   Moreover, the “mode” in 2010 was in the 155-159 category, with nearly 60 law schools, but as of 2014, the “mode” had shifted to the 150-154 category with nearly 60 law schools.

Number of Law Schools with 25th Percentile LSAT in LSAT Categories – 2010-2014

 

25th Percentile LSAT | 2010 | 2011 | 2012 | 2013 | 2014
165+     | 17 | 16 | 11 | 10 | 10
160-164  | 26 | 20 | 21 | 17 | 15
155-159  | 55 | 54 | 49 | 42 | 41
150-154  | 67 | 69 | 59 | 65 | 57
145-149  | 26 | 33 | 46 | 48 | 48
<145     |  4 |  4 | 10 | 14 | 25

 

[Chart: Number of law schools by 25th percentile LSAT category, 2010-2014]

For those who want to focus on the bottom 25th percentile of LSAT profile among law schools, the chart above shows a similar trend when compared with the medians, except that the number of law schools with a 25th  percentile LSAT between 150-154 also declined (as opposed to an increase with respect to medians). The number of law schools with 25th percentile LSATs of 160 or higher has declined from 43 to 25.  Similarly, the number of law schools with 25th percentile LSATs of 150-159 has declined from 122 to 98.  By contrast, the number of law schools with 25th percentile LSATs of 145-149 has nearly doubled from 26 to 48, while the number of law schools with 25th percentile LSATs of <145 has sextupled from 4 to 25. 

One other way of looking at this is just to see how the average first-year LSAT profiles have changed over the last four years. 

Average LSATs of Matriculants at Fully-Accredited ABA Law Schools

            75th Percentile             Median            25th Percentile

2010                160.5               158.1               155.2

2011                160.1               157.8               154.5

2012                159.6               157                  153.6

2013                158.7               156                  152.6

2014                158.2               155.4               151.8

This shows that between 2010 and 2014, the average 75th percentile LSAT has declined by 2.3 points, the average median LSAT has declined by 2.7 points and that the average 25th percentile LSAT has declined by 3.4 points.

Conclusion

If one focuses on the LSAT score as one measure of “quality” of the entering class of law students each year, then the period from 2010-2014 not only has seen a significant decline in enrollment, it also has seen a significant decline in quality.  On an axis with high LSATs to the left and low LSATs to the right, the “composition” of the entering class of law students between 2010 and 2014 has shifted markedly to the right, as shown in the graph below.  Moreover, the shape of the curve has changed somewhat, thinning among high LSAT ranges and growing among low LSAT ranges.  

[Graph: Shift in the LSAT distribution of matriculants, 2010 vs. 2014]

This shift in entering class composition suggests that bar passage rates are likely to continue to decline in the coming years.  But in terms of bar passage, the entering class profile is less meaningful than the graduating class profile.  In part two, I will look at attrition data from 2011 to 2014 to try to quantify the likely “composition” of the classes graduating from 2013 through 2016 (the classes that entered between 2010 and 2013), which will give us a more refined idea of what to expect in terms of trends in bar passage in 2015 and 2016.

(I am grateful to Bernie Burk and Alice Noble-Allgire for helpful comments on earlier drafts.)

December 29, 2014 in Data on legal education, Structural change | Permalink | Comments (5)

Saturday, December 20, 2014

Further Understanding the Transfer Market -- A Look at the 2014 Transfer Data

This blog posting is designed to update my recent blog posting on transfers to incorporate some of the newly available data on the Summer 2014 transfer market.  Derek Muller also has written about some of the transfer data and I anticipate others will be doing so as well.

NUMBERS AND PERCENTAGES OF TRANSFERS – 2006-2008, 2011-2014

While the number of transfers dropped to 2,187 in 2014, down from 2,501 in 2013, the percentage of the previous fall’s entering class that engaged in the transfer market remained roughly steady at 5.5%, down slightly from 5.6% in 2013, but still above the percentages that prevailed from 2006-2008 and in 2011 and 2012.
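That percentage is simply the summer's transfer count divided by the prior fall's first-year enrollment; a one-line check (Python):

```python
# Transfer share = transfers in a given summer / prior fall's 1L enrollment.
transfers_2014 = 2_187
first_years_fall_2013 = 39_700
print(f"{transfers_2014 / first_years_fall_2013:.1%}")   # 5.5%
```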

 

                                    | 2006   | 2007   | 2008   | 2011   | 2012   | 2013   | 2014
Number of Transfers                 | 2,265  | 2,324  | 2,400  | 2,427  | 2,438  | 2,501  | 2,187
Previous Year First-Year Enrollment | 48,100 | 48,900 | 49,100 | 52,500 | 48,700 | 44,500 | 39,700
% of Previous First-Year Total      | 4.7%   | 4.8%   | 4.9%   | 4.6%   | 5%     | 5.6%   | 5.5%

 

SOME SCHOOLS DOMINATE THE TRANSFER MARKET – 2012-2014

The following two charts list the top 20 transfer schools in Summer 2012 (fall 2011 entering class), Summer 2013 (fall 2012 entering class) and Summer 2014 (fall 2013 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first year class.

Largest Law Schools by Number of Transfers from 2012-2014

School           | Number in 2012 | School         | Number in 2013 | School           | Number in 2014
Florida State    | 89 | Georgetown     | 122 | Georgetown       | 113
Georgetown       | 85 | George Wash.   | 93  | George Wash.     | 97
George Wash.     | 63 | Florida St.    | 90  | Arizona St.      | 66
Columbia         | 58 | Emory          | 75  | Idaho            | 57
Mich. State      | 54 | Arizona State  | 73  | Cal. Berkeley    | 55
NYU              | 53 | American       | 68  | NYU              | 53
American         | 49 | Texas          | 59  | Emory            | 50
Cardozo          | 48 | Columbia       | 52  | Columbia         | 46
Loyola Marymount | 46 | NYU            | 47  | American         | 44
Rutgers - Camden | 42 | Minnesota      | 45  | UCLA             | 44
Minnesota        | 42 | Arizona        | 44  | Wash. Univ.      | 44
Arizona State    | 42 | Northwestern   | 44  | Texas            | 43
Cal. Berkeley    | 41 | UCLA           | 41  | Minnesota        | 37
Emory            | 41 | Cardozo        | 38  | Northwestern     | 35
UCLA             | 39 | Southern Cal.  | 37  | Harvard          | 33
Northwestern     | 38 | Utah           | 34  | Mich. State      | 33
Florida          | 37 | Harvard        | 34  | Loyola Marymount | 32
Maryland         | 34 | Florida        | 33  | Florida State    | 31
Michigan         | 33 | Cal. Berkeley  | 32  | Southern Cal.    | 30
SMU              | 31 | Wash Univ.     | 31  | Miami            | 29
Harvard          | 31 |                |     |                  |

Largest Law Schools by Transfers as Percentage of Previous First-Year Class

2012-2014 

School           | % 2012 | School           | % 2013 | School           | % 2014
Florida St.      | 44.5 | Florida State    | 48.1 | Arizona State    | 51.6
Arizona State    | 24.6 | Arizona State    | 48   | Idaho            | 51.4
Michigan State   | 17.5 | Utah             | 34.7 | Washington Univ. | 23.3
Utah             | 17.5 | Emory            | 29.6 | Emory            | 22.9
Minnesota        | 17.1 | Arizona          | 28.9 | Georgetown       | 20.8
Emory            | 16.5 | Minnesota        | 22   | George Wash.     | 20.2
Cal. Berkeley    | 16.2 | George Wash.     | 21.8 | Cal. Berkeley    | 19.4
Rutgers - Camden | 14.9 | Georgetown       | 21.2 | Florida St.      | 18.2
Georgetown       | 14.7 | Rutgers - Camden | 20.7 | Rutgers - Camden | 17.1
Southern Cal.    | 14.7 | Southern Cal.    | 19.7 | Southern Cal.    | 17.1
Northwestern     | 14.4 | Texas            | 19.1 | Minnesota        | 16.7
Cincinnati       | 14.3 | Cincinnati       | 17.5 | Utah             | 15.9
Columbia         | 14.3 | Northwestern     | 17.1 | Northwestern     | 15.3
Buffalo          | 14.2 | Washington Univ. | 15.4 | UCLA             | 15
Arizona          | 14   | Univ. Washington | 15.3 | Seton Hall       | 14.5
Cardozo          | 13.8 | Columbia         | 14.2 | Florida Int.     | 13.9
SMU              | 13.4 | American         | 13.8 | Texas            | 13.5
Florida          | 12.7 | SMU              | 13.3 | Columbia         | 13.1
Chicago          | 12.6 | UCLA             | 13.3 | Richmond         | 12.8
George Wash.     | 12.5 | Chicago          | 13   | Univ. Washington | 12.6
                 |      |                  |      | Houston          | 12.6

 

Note that in these two charts, the “repeat players” -- those schools in the top 20 for all three years -- are bolded.  In  2013 and 2014, nine of the top ten schools for number of transfers repeated.  (The notable newcomer this year is Idaho, which received 55 transfers from the Concordia University School of Law when Concordia did not receive provisional accreditation from the ABA.)  Across all three years, eight of the top ten schools for percentage of transfers repeated.

Top Ten Law Schools as a Percentage of All Transfers

 

                                                       | 2006  | 2011  | 2012  | 2013  | 2014
Total Transfers                                        | 2,265 | 2,427 | 2,438 | 2,501 | 2,187
Transfers to 10 Schools with Most Transfers            | 482   | 570   | 587   | 724   | 625
Transfers to 10 Schools with Most Transfers as % of All Transfers | 21.3% | 23.5% | 24.1% | 28.9% | 28.6%

 

The chart above demonstrates an increasing concentration in the transfer market between 2006 and 2014, and even more so between 2012 and 2014, as the ten law schools with the most students transferring in captured an increasing share of the transfer market. 

NATIONAL AND REGIONAL MARKETS BASED ON NEW DATA

Starting this fall, the ABA Section of Legal Education and Admissions to the Bar began requiring schools with more than five transfers in to report not only the number of students who transferred in, but also the schools from which they came (indicating the number from each school), along with the 75th, 50th and 25th percentile first-year law school GPAs of the pool of students who transferred in to a given school (provided that at least twelve students transferred in).  This allows us to begin to explore the nature of the transfer market by looking at where students are coming from and going to, and at the first-year GPA profile of students transferring in to different law schools. 

Percentage of Transfers from Within Geographic Region and Top Feeder School(s)

USNews Ranking | School | # Transfers | Region | Regional Transfers | Reg. % | Feeder School(s) | #
2  | Harvard          | 33  | NE      | 6  | 18  | Emory - Wash. Univ.            | 3
4  | Columbia         | 46  | NE      | 19 | 41  | Brooklyn                       | 5
6  | NYU              | 50  | NE      | 20 | 40  | Cornell                        | 8
9  | Berkeley         | 55  | CA      | 43 | 78  | Hastings                       | 18
12 | Northwestern     | 35  | MW      | 24 | 69  | DePaul - Chicago Kent - Loyola | 5
13 | Georgetown       | 113 | Mid-Atl | 49 | 43  | American                       | 13
15 | Texas            | 43  | TX      | 27 | 63  | Baylor                         | 5
16 | UCLA             | 44  | CA      | 31 | 70  | Loyola Marymount               | 8
18 | Wash. Univ.      | 44  | MW      | 20 | 45  | SLU                            | 4
19 | Emory            | 53  | SE      | 40 | 75  | Atlanta’s John Marshall        | 20
20 | GWU              | 97  | Mid-Atl | 78 | 80  | American                       | 54
20 | Minnesota        | 37  | MW      | 21 | 57  | William Mitchell               | 6
20 | USC              | 30  | CA      | 22 | 73  | Southwestern                   | 5
31 | Arizona St.      | 66  | SW      | 51 | 77  | Arizona Summit                 | 44
45 | Florida St.      | 31  | SE      | 24 | 77  | Florida Coastal                | 9
61 | Miami            | 29  | SE      | 21 | 72  | Florida Coastal                | 5
72 | American         | 44  | Mid-Atl | 14 | 32  | Baltimore - UDC                | 6
87 | Michigan St.     | 33  | MW      | 33 | 100 | Thomas Cooley                  | 31
87 | Loyola Marymount | 32  | CA      | 26 | 81  | Whittier                       | 15

 

For this set of 19 schools with the most transfer students, the vast majority obtained most of the transfers from within the geographic region within which the law school is located.   Only two schools (Harvard and American) had fewer than 40% of their transfers from within the region in which they are located and only four others (Columbia, NYU, Georgetown and Washington University) had fewer than 50% of the transfers from within their regions.  Meanwhile, ten of the 19 schools had 70% or more of their transfers from within the region in which the school is located. 

Moreover, several schools had a significant percentage of their transfers from one particular feeder school.  For Berkeley, roughly 33% of its transfers came from Hastings; for Emory, nearly 40% of its transfers came from Atlanta’s John Marshall Law School; for George Washington, over 55% of its transfers came from American; for Arizona State, 67% of its transfers came from Arizona Summit; for Michigan State nearly 95% of its transfers came from Thomas Cooley; for Loyola Marymount, nearly 50% of its transfers came from Whittier; and for Idaho, over 95% of its transfers came from Concordia.

 Percentage of Transfers from Different Tiers of School(s)

Along With First-Year Law School GPA 75th/50th/25th

USNews Ranking | School | # of Trans. | Top 50 (# -- %) | 51-99 (# -- %) | 100-146 (# -- %) | Unranked (# -- %) | GPA 75th | GPA 50th | GPA 25th
2  | Harvard      | 33  | 23 -- 70 | 10 -- 30 | 0 -- 0   | 0 -- 0   | 3.95 | 3.9  | 3.83
4  | Columbia     | 46  | 29 -- 63 | 14 -- 30 | 3 -- 7   | 0 -- 0   | 3.81 | 3.75 | 3.69
6  | NYU          | 50  | 41 -- 82 | 7 -- 14  | 2 -- 4   | 0 -- 0   | 3.74 | 3.62 | 3.47
9  | Berkeley     | 55  | 17 -- 31 | 27 -- 33 | 6 -- 11  | 5 -- 9   | 3.9  | 3.75 | 3.68
12 | Northwestern | 35  | 16 -- 46 | 12 -- 34 | 6 -- 17  | 1 -- 3   | 3.73 | 3.56 | 3.4
13 | Georgetown   | 113 | 27 -- 24 | 38 -- 34 | 17 -- 15 | 31 -- 27 | 3.77 | 3.67 | 3.55
15 | Texas        | 43  | 17 -- 40 | 13 -- 30 | 9 -- 21  | 4 -- 9   | 3.62 | 3.45 | 3.11
16 | UCLA         | 44  | 15 -- 34 | 23 -- 52 | 2 -- 5   | 4 -- 9   | 3.73 | 3.58 | 3.44
18 | Wash. Univ.  | 44  | 3 -- 7   | 25 -- 57 | 1 -- 2   | 15 -- 34 | 3.43 | 3.2  | 3.06
19 | Emory        | 53  | 3 -- 6   | 7 -- 13  | 8 -- 15  | 35 -- 66 | 3.42 | 3.27 | 2.93
20 | GWU          | 97  | 13 -- 13 | 73 -- 75 | 11 -- 11 | 0 -- 0   | 3.53 | 3.35 | 3.21
20 | Minnesota    | 37  | 4 -- 11  | 12 -- 32 | 18 -- 49 | 3 -- 8   | 3.3  | 3.1  | 2.64
20 | USC          | 30  | 1 -- 3   | 11 -- 37 | 6 -- 20  | 12 -- 40 | 3.71 | 3.59 | 3.44
31 | Arizona St.  | 66  | 4 -- 6   | 5 -- 8   | 8 -- 12  | 49 -- 74 | 3.51 | 3.23 | 2.97
45 | Florida St.  | 31  | 2 -- 6   | 4 -- 13  | 3 -- 10  | 22 -- 71 | 3.29 | 3.1  | 2.9
61 | Miami        | 29  | 1 -- 3   | 4 -- 14  | 6 -- 21  | 18 -- 62 | 3.3  | 3.07 | 2.87
72 | American     | 44  | 2 -- 5   | 14 -- 32 | 3 -- 7   | 25 -- 57 | 3.25 | 2.94 | 2.78
87 | Michigan St. | 33  | 0 -- 0   | 0 -- 0   | 1 -- 3   | 32 -- 97 | 3.19 | 3.05 | 2.83
87 | Loyola Mary. | 32  | 0 -- 0   | 0 -- 0   | 1 -- 3   | 31 -- 97 | 3    | 3    | 3

 

        The chart above shows the tiers of law schools from which the largest schools in the transfer market received their transfer students.  Thirteen of the top 19 schools for transfers are ranked in the top 20 in USNews, but of those 13, only six had 80% or more of their transfers from schools ranked between 1 and 99 in the USNews rankings – Harvard, Columbia, NYU, Northwestern, UCLA and George Washington.  Three additional schools had at least 50% of their transfers from schools ranked between 1 and 99 – Berkeley, Georgetown and Washington University.  The other ten schools had at least half of their transfer students from schools ranked 100 or lower, with some schools drawing a significant percentage of their transfers from unranked schools (which USNews lists alphabetically).  This data largely confirms the analysis of Bill Henderson and Jeff Rensberger regarding the rankings migration of transfers – from lower ranked schools to higher ranked schools.

In addition, as you move down the rankings of transfer schools, the general trend in first-year law school GPA shows a significant decline, with several highly-ranked schools taking a number of transfers with first-year GPAs below a 3.0, including Emory, Minnesota, Arizona State, and Florida State.

STILL MANY UNKNOWNS

This new data should be very helpful to prospective law students and to current law students who are considering transferring.  This data gives them at least a little better idea of what transfer opportunities might be available to them depending upon where they go to law school as a first-year student.

Even with this more granular data now available, however, as I noted in my earlier posting on transfer students, there still are a significant number of unknowns relating to transfer students.  These unknowns cover several different points.  

        First, what is the acceptance rate for transfers?  We now know how many transfers came from different schools, and we have some idea of the first-year GPA ranges for those admitted as transfers, but we do not know the acceptance rate on transfer applications.  Are a significant percentage of would-be transfers not admitted, or are most students interested in transferring finding a new home someplace?

        Second, what are the motivations of transfers and what are the demographics of transfers?  Are transfers primarily motivated by better employment opportunities perceived to be available at the higher-ranked law school?  Is some subset of transfers primarily motivated by issues of family or geography (with rankings and employment outcomes as secondary concerns)?

Third, how do the employment outcomes of transfer students compare with the employment outcomes of students who started at a given law school?  Does the data support the perception that those who transfer, in fact, have better employment outcomes by virtue of transferring?

Fourth, what are the social/educational experiences of transfers in their new schools and what is the learning community impact on those schools losing a significant number of students to the transfer market?

For those interested in these issues, it might make sense to design some longitudinal research projects that could help find answers to some of these questions.

December 20, 2014 in Current events, Data on legal education | Permalink | Comments (0)

Wednesday, December 10, 2014

BETTER UNDERSTANDING THE TRANSFER MARKET

What do we know about the transfer student market in legal education? 

Not enough.  But that will begin to change in the coming weeks.

NUMBER/PERCENTAGE OF TRANSFER STUDENTS HAS INCREASED MODESTLY

Up until this year, the ABA Section of Legal Education and Admissions to the Bar only asked law schools to report the number of transfer students “in” and the number of transfer students “out.”  This allowed us to understand roughly how many students are transferring and gave us some idea of where they are going, and where they are coming from, but not with any direct “matching” of exit and entrance.

Has the number and percentage of transfer students changed in recent years?

In 2010, Jeff Rensberger published an article in the Journal of Legal Education in which he analyzed much of the then-available data regarding the transfer market and evaluated some of the issues associated with transfer students.  He noted that from 2006 to 2009 the number of transfer students had remained within a range that represented roughly 5% of the rising second-year class (after accounting for other attrition) – 2,265 in summer 2006, 2,324 in summer 2007, 2,400 in summer 2008, and 2,333 in summer 2009.

Using data published in the law school Standard 509 reports, the number of transfers in 2011, 2012 and 2013 has increased only marginally, from 2427 to 2438 to 2501, but, given the declining number of law students, it has increased as a percentage of the preceding year’s first-year “class,” from 4.6% to 5.6%.  Thus, there is a sense in which the transfer market is growing, even if not growing dramatically.

Numbers of Transfer Students 2006-2008 and 2011-2013

                                       2006     2007     2008     2011     2012     2013
Number of Transfers                   2,265    2,324    2,400    2,427    2,438    2,501
Previous Year First-Year Enrollment  48,100   48,900   49,100   52,500   48,700   44,500
% of Previous First-Year Total         4.7%     4.8%     4.9%     4.6%     5.0%     5.6%
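As a quick illustration of the arithmetic behind the last row, here is a minimal Python sketch using only the figures from the table above (it simply reproduces the percentages; no new data is involved).

```python
# Reproduce the "% of Previous First-Year Total" row from the table above.
transfers = {2006: 2265, 2007: 2324, 2008: 2400, 2011: 2427, 2012: 2438, 2013: 2501}
prior_year_first_years = {2006: 48100, 2007: 48900, 2008: 49100,
                          2011: 52500, 2012: 48700, 2013: 44500}

for year, count in transfers.items():
    share = count / prior_year_first_years[year]
    print(f"{year}: {count:,} transfers = {share:.1%} of the prior year's first-year class")
```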

 

SOME SCHOOLS DOMINATE THE TRANSFER MARKET

In 2008, Bill Henderson and Brian Leiter highlighted issues associated with transfer students.   Henderson and Leiter were discussing the data from the summer of 2006.  Brian Leiter posted a list of the top ten law schools for net transfer students as a percentage of the first year class.  Bill Henderson noted the distribution of transfer students across tiers of law schools (with the law schools in the top two tiers generally having positive net transfers and the law schools in the bottom two tiers generally having negative net transfers), something Jeff Rensberger also noted in his 2010 article.   

Things haven’t changed too much since 2006.  In 2012, there were 118 law schools with fewer than 10 “transfers in,” representing a total of 485 transfers – slightly less than 20% of all transfers.  On the other end, there were 21 schools with 30 or more “transfers in,” totaling 996 transfers -- nearly 41% of all transfers.  Thus, roughly 10% of the law schools accounted for roughly 40% of the market (increasing to nearly 44% of the market in 2013).

We also know who the leading transfer schools have been over the last three years.  The following two charts list the top 20 transfer schools in Summer 2011 (fall 2010 entering class), Summer 2012 (fall 2011 entering class) and Summer 2013 (fall 2012 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first year class.

Largest Law Schools by Number of Transfers in 2011, 2012 and 2013

(BOLD indicates presence on list all three years)

Summer 2011 (transfers in) | Summer 2012 (transfers in) | Summer 2013 (transfers in)
George Wash. 104 | Florida State 89 | Georgetown 122
Georgetown 71 | Georgetown 85 | George Wash. 93
Florida St. 57 | George Wash. 63 | Florida St. 90
New York Univ. 56 | Columbia 58 | Emory 75
American 53 | Michigan State 54 | Arizona State 73
Michigan State 52 | New York Univ. 53 | American 68
Columbia 46 | American 49 | Texas 59
Cardozo 45 | Cardozo 48 | Columbia 52
Loyola Marymount 44 | Loyola Marymount 46 | New York Univ. 47
Washington Univ. 42 | Rutgers - Camden 42 | Minnesota 45
Cal. Los Angeles 40 | Minnesota 42 | Arizona 44
Michigan 39 | Arizona State 42 | Northwestern 44
Northwestern 39 | Cal. Berkeley 41 | Cal. Los Angeles 41
Rutgers - Camden 36 | Emory 41 | Cardozo 38
San Diego 35 | Cal. Los Angeles 39 | Southern Cal. 37
Arizona State 34 | Northwestern 38 | Utah 34
Brooklyn 33 | Florida 37 | Harvard 34
Cal. Hastings 32 | Maryland 34 | Florida 33
Minnesota 31 | Michigan 33 | Cal. Berkeley 32
Lewis & Clark 30 | SMU 31 | Washington Univ. 31
Harvard 30 | Harvard 31

Largest Law Schools by Transfers as a Percentage of Previous First-Year Class

(BOLD indicates presence on list all three years)

Summer 2011 (% of fall 2010 1Ls) | Summer 2012 (% of fall 2011 1Ls) | Summer 2013 (% of fall 2012 1Ls)
Florida St. 28.6 | Florida St. 44.5 | Florida State 48.1
George Wash. 19.9 | Arizona State 24.6 | Arizona State 48.0
Utah 19.7 | Michigan State 17.5 | Utah 34.7
Arizona State 17.8 | Utah 17.5 | Emory 29.6
Michigan State 17.4 | Minnesota 17.1 | Arizona 28.9
Washington and Lee 15.3 | Emory 16.5 | Minnesota 22.0
Washington Univ. 15.2 | Cal. Berkeley 16.2 | George Wash. 21.8
Loyola Marymount 15.1 | Rutgers - Camden 14.9 | Georgetown 21.2
Northwestern 14.2 | Georgetown 14.7 | Rutgers - Camden 20.7
Richmond 13.7 | Southern Cal. 14.7 | Southern Cal. 19.7
Rutgers - Camden 13.4 | Northwestern 14.4 | Texas 19.1
Cal. Los Angeles 13.0 | Cincinnati 14.3 | Cincinnati 17.5
Cal. Davis 12.8 | Columbia 14.3 | Northwestern 17.1
Lewis & Clark 12.1 | Buffalo 14.2 | Washington Univ. 15.4
Georgetown 12.0 | Arizona 14.0 | Univ. Washington 15.3
Minnesota 11.9 | Cardozo 13.8 | Columbia 14.2
New York Univ. 11.8 | SMU 13.4 | American 13.8
Cardozo 11.8 | Florida 12.7 | SMU 13.3
Columbia 11.4 | Chicago 12.6 | Cal. Los Angeles 13.3
Buffalo 11.0 | George Wash. 12.5 | Chicago 13.0

Note that in these two charts, the “repeat players” are bolded – those schools in the top 20 for all three years – 2011, 2012 and 2013.  (Four of the top ten schools Leiter highlighted from the summer of 2006 remain in the top ten as of the summer of 2013, with four others still in the top 20.)  In addition, it is worth noting some significant changes between 2011 and 2013.  For example, the number of schools with 50 or more transfers increased from six to eight; only two schools had more than 70 transfers in 2011 and in 2012, but five schools had more than 70 transfers in 2013.

Leiter’s top ten law schools took in a total of 482 transfers, representing 21.3% of the 2,265 transfers that summer.  The top ten law schools in 2011 totaled 570 transfers, representing 23.5% of the 2,427 transfer students that summer.  The top ten law schools in 2012 totaled 587 transfers, representing 24.1% of the 2,438 transfers that summer.  The top ten law schools in 2013, however, totaled 724 students, representing 28.9% of the 2,501 transfers in 2013, demonstrating an increasing concentration in the transfer market between 2006 and 2013, and even more so between 2012 and 2013.
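As a rough check, the concentration figures in the preceding paragraph can be reproduced with a few lines of Python; the top-ten totals and overall transfer counts below are simply those cited above.

```python
# Top-ten transfer totals versus all transfers, from the figures cited above.
top_ten_share = {
    2006: (482, 2265),
    2011: (570, 2427),
    2012: (587, 2438),
    2013: (724, 2501),
}

for year, (top_ten, all_transfers) in top_ten_share.items():
    print(f"{year}: top ten schools took {top_ten / all_transfers:.1%} of {all_transfers:,} transfers")
```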

In addition, three of the top four schools with the highest number of transfers were the same all three years, with Georgetown welcoming 71 in the summer of 2011, 85 in the summer of 2012, and 122 in the summer of 2013; George Washington welcoming 104 in the summer of 2011, 63 in the summer of 2012, and 93 in the summer of 2013; and Florida State welcoming 57 in the summer of 2011, 89 in the summer of 2012, and 90 in the summer of 2013.  (Notably, Georgetown and Florida State were the two top schools for transfers in 2006, with 100 and 59 transfers in, respectively.)

Similarly, three of the top four schools with the highest “percentage of transfers” were the same all three years, with Utah at 19.7% in 2011, 17.5% in 2012 and 34.7% in 2013, Arizona State at 17.8% in 2011, 24.6% in 2012 and 48% in 2013, and Florida State at 28.6% in 2011, 44.5% in 2012 and 48.1% in 2013.  The top five schools on the “percentage of transfers” chart all increased the “percentage” of transfer students they welcomed between 2011 and 2013, some significantly, which also suggests greater concentration in the transfer market between 2011 and 2013.

More specifically, there are several schools that have really “played” the transfer game in the last two years – increasing their engagement by a significant percentage.  These eight schools had 10.2% of the transfer market in 2011, but garnered 22.2% of the transfer market in 2013.

Schools with Significant Increases in Transfers 2011-2013

School            2011   2012   2013   Percentage Increase
Texas                6      9     59                  883%
Arizona              6     24     44                  633%
Emory               19     41     75                  295%
Arizona State       34     42     73                  115%
Georgetown          71     85    122                   70%
Florida State       57     89     90                   58%
Southern Cal        24     29     37                   54%
Minnesota           31     42     45                   45%
Totals             248    371    555                  124%

REGIONAL MARKETS

There appear to be “regional” transfer markets.  In the Southeast in 2013, for example, three schools -- Florida State, Florida and Emory -- had a combined net inflow of 180 transfer students, while Stetson and Miami were essentially flat (a combined 43 transfers in and 42 transfers out), and eight other schools from the region -- Florida Coastal, Charlotte, Charleston, Atlanta’s John Marshall, St. Thomas University, Ave Maria, Florida A&M, and Nova Southeastern -- had a combined net outflow of 303.  It seems reasonable to assume that many of the transfers out of these schools found their way to Emory, Florida and Florida State (and perhaps to Miami and Stetson, to the extent that Miami and Stetson lost students to Emory, Florida and Florida State).

NEW DATA – NEW INSIGHTS

Starting this fall, the ABA Section of Legal Education and Admissions to the Bar is requiring schools to report not only the number of students who have transferred in, but also the schools from which they came (indicating the number from each school), along with the 75th, 50th and 25th percentile first-year law school GPAs of the pool of students who transferred in to a given school (provided that at least five students transferred in to the school).  As a result, we will be able to delineate the regional transfer markets (as well as those schools with more of a national transfer market).

Notably, even though the Section of Legal Education and Admissions to the Bar is not requiring the gathering and publication of the 75th, 50th and 25th percentile LSAT and UGPA of transfer students, one thing we are very likely to learn is that, for many schools, the LSAT/UGPA profile of transfers in is lower than the LSAT/UGPA profile of the prior year’s first-year matriculants, a point that both Henderson and Rensberger highlight in their analyses.

Just look at the schools in the Southeast as an example.  Assume Emory, Florida State and Florida (large “transfer in” schools) are, in fact, admitting a significant number of transfer students from other schools in the Southeast region, such as Miami and Stetson, and schools like Florida Coastal, St. Thomas University, Charlotte, Atlanta’s John Marshall and Ave Maria (large “transfer out” schools in the Southeast).  Even if they are taking students who only came from the top quarter of the entering classes at those schools, the incoming transfers would have a significantly less robust LSAT/UGPA profile when compared with the entering class profile at Emory, Florida State or Florida in the prior year.  Virtually every student who might be transferring in to Emory, Florida or Florida State from one of these transfer out schools (other than Miami and perhaps Stetson) is likely to be in the bottom quarter of the entering class LSAT profile at Emory, Florida, and Florida State.

Comparison of Relative Profiles of Southeast Region Transfer In/Out Schools

TRANSFER IN SCHOOLS   2012 LSAT     2012 UGPA        TRANSFER OUT SCHOOLS      2012 LSAT     2012 UGPA
Emory                 166/165/161   3.82/3.70/3.35   Miami                     159/156/155   3.57/3.36/3.14
Florida               164/161/160   3.73/3.59/3.33   Stetson                   157/157/152   3.52/3.28/3.02
Florida State         162/160/157   3.72/3.54/3.29   St. Thomas (FL)           150/148/146   3.33/3.10/2.83
                                                     Florida Coastal           151/146/143   3.26/3.01/2.71
                                                     Charlotte                 150/146/142   3.32/2.97/2.65
                                                     Atlanta’s John Marshall   153/150/148   3.26/2.99/2.60
                                                     Ave Maria                 153/148/144   3.48/3.10/2.81

This raises an interesting question about LSAT and UGPA profile data.  If we assume that LSAT and UGPA profile data are used not only by law schools as predictors of performance, but that third parties also use this data as evidence of the “strength” of the student body, and ultimately the graduates, of a given law school (for example, USNEWS in its rankings and employers in their assessment of the quality of schools at which to interview), what can we surmise about the impact from significant numbers of transfers?  For those law schools with a significant number/percentage of “transfers in” from law schools whose entering class profiles are seemingly much weaker, the entering class profile presently published in the Standard 509 disclosure report for each school arguably fails to accurately reflect the LSAT and UGPA quality of the graduating class.  Similarly, if the “transfers out” from a given school happen to come from the top half of the entering class profile, then for these schools as well the entering class profile presently published in the Standard 509 disclosure report for each school arguably fails to accurately reflect the LSAT and UGPA quality of the graduating class. 

Using the chart above, if Emory, Florida and Florida State are drawing a significant number of transfers from the regional transfer out schools, and if they had to report the LSAT and UGPA profile of their second-year class rather than their first-year class, their LSAT and UGPA profiles almost certainly would decline.   (The same likely would be true for other law schools with large numbers of transfers.)
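To make this concrete, here is a minimal, purely hypothetical sketch of how a second-year LSAT median could shift once transfers in and out are taken into account.  The score lists are invented for illustration only and do not reflect any school's actual data.

```python
import statistics

# Hypothetical first-year LSAT scores at a "transfer in" school (invented for illustration).
first_year_scores = [166, 165, 164, 163, 162, 161, 160, 159, 158, 157]

# Hypothetical movement: two students from the middle of the class transfer out,
# and four transfers arrive with scores resembling a regional "transfer out" school's profile.
transfers_out = {162, 161}
transfers_in = [153, 151, 150, 148]

second_year_scores = [s for s in first_year_scores if s not in transfers_out] + transfers_in

print("Median LSAT of the entering (first-year) class:", statistics.median(first_year_scores))
print("Median LSAT of the resulting second-year class:", statistics.median(second_year_scores))
```

On these invented numbers, the median slips from 161.5 to 158.5 once the transfer traffic is counted, which is the kind of gap the published first-year profile never captures.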

STILL MANY UNKNOWNS

Even with more granular data available in the near future to delineate more clearly the transfer pathways between transfer out schools and transfer in schools, there still will be a significant number of unknowns relating to transfer students, regarding employment outcomes, the demographics of transfers, the experience of transfers and the motivation for transfers.

First, with respect to the employment outcomes of transfer students, how do they compare with the employment outcomes for students who started at a law school as first-years? Do the employment outcomes for transfer students track that of students who started at a law school as first-years, or is the employment market for transfer students less robust than it is for students who started at a law school as first-years?  Are the employment outcomes nonetheless better than they might have been at the school from which they transferred? These are important questions given the perception that many students transfer “up” in the rankings to improve their employment opportunities. 

Second, with respect to demographics, do students of color and women participate proportionately in the transfer market or is the market disproportionately occupied by white males?

Third, with respect to the experience of transfers, the Law School Survey of Student Engagement gathered some data from participating law schools in 2005 regarding the experience of transfers, but more could be done to better understand how well transfer students are integrated into the life of the learning community into which they transfer.

Fourth, with respect to the motivations of transfers, it is generally assumed that transfers are “climbing” the rankings, and Henderson’s data broadly suggests movement from lower-ranked schools to higher-ranked schools, but what percentage of transfers are doing so partly or primarily for geographic reasons – to be near family or a future career location?  How many are transferring for financial reasons because they lost a conditional scholarship after their first year of law school?  How many truly are transferring to get a JD from a higher ranked law school?  How many of those believe their job opportunities will be better at the school to which they are transferring?

We will have answers to some of these questions soon, but many others will remain unanswered.

December 10, 2014 in Data on legal education | Permalink | Comments (10)

Tuesday, December 2, 2014

The Market for Law School Applicants -- A Milestone to Remember

In early 2013, Michael Moffitt, the dean of Oregon Law, was interviewed by the New York Times about the tumult affecting law schools. Moffitt, who is a very thoughtful guy, responded, "I feel like I am living a business school case study.”  

I think the analogy to the business school case study is a good one.  In the nearly two years since that story was published, the market for law school applicants has actually gotten worse.

Yesterday's Dealbook column in the New York Times featured Northwestern Law Dean Dan Rodriguez (who also serves as President of the AALS) speaking candidly about the meltdown dynamics that have taken hold.  See Elizabeth Olson, "Law School is Buyer's Market, with Top Students in Demand," New York Times, Dec. 1, 2014. 

"It's insane," said Rodriguez.  "We’re in hand-to-hand combat with other schools." The trendlines are indeed terrible.  Year-over-year, LSAT test-taker volume is down another 8.7%.  See Organ, LWB, Nov 11, 2014.  So we can expect the situation to get worse, at least in the near term.      

I applaud Dan Rodriguez for his leadership instincts.  He is being transparent and honest.  Several years ago the leadership of the AALS went to great lengths to avoid engagement with the media. Dan has gone in the opposite direction, inviting the press into our living room and kitchen.  

Want to know what leadership and judgment look like?  They look like Dan's interview with Elizabeth Olson.  Dan's words did not solve anyone's problem, but his honesty and candor made it more likely that we help ourselves.  Because it's Northwestern, and Dan is president of the AALS (something the story did not mention but most of us know), and this was reported by Elizabeth Olson in the New York Times, the substance and tenor of discussions within law school faculties are bound to shift, at least slightly and in the direction favoring change.   

What is the de facto plan at most law schools these days?  Universities are not going to backstop law schools indefinitely. I think the sign below is not far off the mark.  

[Image: Outrun-the-bear]

We are indeed living through a business school case study, which is both bad and good.   At many schools -- likely well more than half --  hard choices need to be made to ensure survival.  (And for the record, virtually all schools, regardless of rank, are feeling uncomfortable levels of heat.)   A law school needs cash to pay its expenses.  But it also needs faculty and curricula to attract students. The deeper a law school cuts, the less attractive it becomes to students.  Likewise, pervasive steep discounts on tuition reflect a classic collective action problem. Some schools may eventually close, but a huge proportion of survivors are burning through their financial reserves.  

Open admissions, which might pay the bills today, will eventually force the ABA and DOE to do something neither really wants to do -- aggressively regulate legal education.  This is not a game that is likely to produce many winners.  Rather than letting this play out, individual law schools would be much better off pursuing a realistic strategic plan that can actually move the market. 

The positive side of the business school case study is that a few legal academics are finding their voice and learning -- for the first time in several generations -- how to lead.  Necessity is a wonderful tutor.  Law is not an industry on the decline -- far from it.  The only thing on the decline is the archetypal artisan lawyer that law schools are geared to churn out.  Indeed, back in 2013 when Dean Moffitt commented about living through a business school case study, he was not referencing imminent failure.   Sure, Moffitt did not like the hand he was being dealt, but as the 2013 article showed, his school was proving to be remarkably resourceful in adapting.

The good news resides on the other side of a successful change effort.  The process of change is painful, yet the effects of change can be transformative and make people truly grateful for the pain that made it all possible.  In our case, for the first time in nearly a century, what we teach, and how we teach it, is actually going to matter.  If we believe serious publications like The Economist, employers in law, business, and government need creative problem solvers who are excellent communicators, adept at learning new skills, and comfortable collaborating across multiple disciplines -- this is, in fact, a meaningful subset of the growing JD-Advantage job market.

In the years to come, employers will become more aggressive in looking for the most reliable sources of talent, in part because law schools are going to seek out preferred-provider relationships with high-quality employers.  Hiring based on school prestige is a remarkably ineffective way to build a world-class workforce -- Google discovered this empirically.  

From an employer perspective, the best bet is likely to be three years of specialized training, ideally where applicants are admitted based on motivation, aptitude, and past accomplishments. The LSAT/UGPA grid method misses this by a wide margin. After that, the design and content of curricula are going to matter.  It is amazing how much motivated students can learn and grow in three years. And remarkably, legal educators control the quality of the soil.  It brings to mind that seemingly trite Spiderman cliche about great power.

For those of us working in legal education, the next several years could be the best of times or the worst of times.  We get to decide.  Yesterday's article in the Times made it a little more likely that we actually have the difficult conversations needed to get to the other side. 

December 2, 2014 in Current events, Data on legal education, Innovations in legal education, New and Noteworthy, Structural change | Permalink | Comments (4)

Wednesday, November 26, 2014

The Michael S. Maurer Crossword Puzzle

Apropos of not much at all, I noticed when printing my New York Times crossword puzzle this morning that its constructor was the same Michael S. Maurer who is the named benefactor of Bill Henderson's school.

I knew Mickey Maurer when I was in Indianapolis and he ran the Indiana Economic Development office.  Stand up guy.  He told me that he wasn't very good at doing crosswords, even though he was a regular contributor to the New York Times.

N.B.:  Will Shortz is also an Indiana U. grad.  I don't know if that's a coincidence.

November 26, 2014 | Permalink | Comments (2)

Tuesday, November 11, 2014

What Might Have Contributed to an Historic Year-Over-Year Decline In the MBE Mean Scaled Score?

The National Conference of Bar Examiners (NCBE) has taken the position that the historic drop in the MBE Mean Scaled Score of 2.8 points between the July 2013 administration of the bar exam (144.3) and the July 2014 administration of the bar exam (141.5) is solely attributable to a decline in the quality of those taking a bar exam this July.  Specifically, in a letter to law school deans, the NCBE stated that:  “Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results.  All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013.”

Notably, the NCBE does not indicate what other “indicators” it looked at “to challenge the results.”  Rather, the NCBE boldly asserts that the only fact that explains an historic 2.8 point drop in the MBE Mean Scaled Score is “that the group that sat in July 2014 was less able than the group that sat in July 2013."

I am not persuaded.   

(Neither is Brooklyn Law School Dean Nicholas Allard, who has responded by calling the letter “offensive” and by asking for a “thorough investigation of the administration and scoring of the July 2014 exam.”  Nor is Derek Muller, who earlier today posted a blog suggesting that the LSAT profile of the class of 2014 did not portend the sharp drop in MBE scores.)

I can’t claim to know how the NCBE does its scaled scoring, so for purposes of this analysis, I will take the NCBE at its word that it has “double-checked” all of its calculations and found that there are no errors in its scoring.

If we accept the premise that there are no scoring issues, then the historic decline in the MBE Mean Scaled Score is attributable either to a “less able” group taking the MBE in July 2014 or to issues associated with the administration of the exam or to some combination of the two.

The NCBE essentially has ignored the possibility that issues associated with the administration of the exam might have contributed to the historic decline in the MBE Mean Scaled Score and gone “all in” on the “less able” group explanation for the historic decline in the MBE Mean Scaled Score.  The problem for the NCBE is that it will be hard-pressed to demonstrate that the group that sat in July 2014 was sufficiently “less able” to explain the historic decline in the MBE Mean Scaled Score.

If one looks at the LSAT distribution of the matriculants in 2011 (who became the graduating class of 2014) and compares it with the LSAT distribution of the matriculants in 2010 (who became the graduating class of 2013), the NCBE probably is correct in noting that the group that sat in July 2014 is slightly “less able” than the group that sat in July 2013.  But for the reasons set forth below, I think the NCBE is wrong to suggest that this alone accounts for the historic drop in the MBE Mean Scaled Score.

Rather, a comparison of the LSAT profile of the Class of 2014 with the LSAT profile of the Class of 2013 would suggest that one could have anticipated a modest drop in the MBE Mean Scaled Score of perhaps .5 to 1.0.  The modest decrease in the LSAT profile of the Class of 2014 when compared with the Class of 2013, by itself, does not explain the historic drop of 2.8 reported in the MBE Mean Scaled Score between July 2013 and July 2014.

THINKING ABOUT GROUPS

The “group” that sat in July 2014 is composed of two subgroups of takers – first-time takers and those who failed a bar exam and are retaking the bar exam.  I am not sure the NCBE has any basis to suggest that those who failed a bar exam and are “retaking” the bar exam in 2014 were a less capable bunch than a comparable group that was “retaking” the bar exam in 2013 (or in some other year).

What about “first-time takers”?  That group actually consists of two subgroups as well – those literally taking the exam for the first time and those who passed an exam in one jurisdiction and are taking the exam for the “first-time” in another jurisdiction.  Again, I am not sure the NCBE has any basis to suggest that those who passed a bar exam and are taking a bar exam in another jurisdiction in 2014 were a less capable bunch than a comparable group that was taking a second bar exam in 2013.

So who’s left?  Those who actually were taking a bar exam for the very first time in July 2014 – the graduates of the class of 2014.  If we accept the premise that the “retakers” in 2014 were not demonstrably different from the “retakers” in 2013, then the group that was “less capable” in 2014 has to be the graduates of 2014, who the NCBE asserts are “less capable” than the graduates of 2013.

COMPARING LSAT PROFILES

The objective credentials of the class that entered law school in the fall of 2011 (class of 2014) are slightly less robust than those of the class that entered law school in the fall of 2010 (class of 2013).  The question, however, is whether the drop in quality between the class of 2013 and the class of 2014 is large enough that we could have anticipated an historic drop of 2.8 points in the MBE Mean Scaled Score. 

The answer to that is no.

The difference in profile between the class of 2014 and the class of 2013 does not reflect an “historic” drop in quality and would seem to explain only some of the drop in MBE Mean Scaled Score, not a 2.8 point drop in MBE Mean Scaled Score.

To understand this better, let’s look at how the trends in student quality have related to changes in the MBE Mean Scaled Score over the last decade. 

Defining “student quality” can be a challenge.  A year ago, I noted changes over time in three “groups” of matriculants – those with LSATs at or above 165, those with LSATs of 150-164, and those with LSATs below 150, noting that between 2010 and 2013, the number at or above 165 has declined significantly while the number below 150 has actually grown, resulting in a smaller percentage of the entering class with LSATs at or above 165 and a larger percentage of the entering class with LSATs below 150. 

While the relatively simplistic calculations described above would provide some basis for anticipating declines in bar passage rates by 2016, they would not explain what is going on this year without more refinement.

In his blog posting earlier today, Derek Muller attempts to look at the strength of each class by calculating "projected MBE" scores drawing on an article from Susan Case and then comparing those to the actual MBE scores, showing some close relationship over time (until this year). I come to a similar conclusion using a different set of calculations of the "strength" of the graduating classes over the last several years based on the LSAT distribution profile of the matriculating classes three years earlier.

To develop this more refined analysis of the strength of the graduating classes over the last nine years, I used the LSAC’s National Decisions Profiles to identify the distribution of matriculants in ten five-point LSAT ranges – descending from 175-180 down to 130-134.  To estimate the “strength” of the respective entering classes, I applied a prediction of bar passage rates by LSAT scores to each five point grouping and came up with a “weighted average” bar passage prediction for each class. 

(In his article, Unpacking the Bar: Of Cut Scores, Competence and Crucibles, Professor Gary Rosin of the South Texas College of Law developed a statistical model for predicting bar passage rates for different LSAT scores.  I used his bar passage prediction chart to assess the “relative strength” of each entering class from 2001 through 2013. 

LSAT Range   Prediction of Success on the Bar Exam (Based on Lowest LSAT in Range)
175-180      .98
170-174      .97
165-169      .95
160-164      .91
155-159      .85
150-154      .76
145-149      .65
140-144      .50
135-139      .36
130-134      .25

Please note that for the purposes of classifying the relative strength of each class of matriculants, the precise accuracy of the bar passage predictions is less important than the fact of differential anticipated performance across groupings, which allows for comparisons of relative strength over time.)

One problem with this approach is that the LSAC (and law schools) changed how they reported the LSAT profile of matriculants beginning with the entering class in the fall of 2010.  Up until 2009, the LSAT profile data reflected the average LSAT score of those who took the LSAT more than once.  Beginning with matriculants in fall 2010, the LSAT profile data reflects the highest LSAT score of those who took the LSAT more than once.  This makes direct comparisons between fall 2009 (class of 2012) and years prior and fall 2010 (class of 2013) and years subsequent difficult without some type of “adjustment” of profile in 2010 and beyond.

Nonetheless, the year over year change in the 2013-2014 time frame can be compared with year over year changes in the 2005-2012 time frame.

Thus, having generated these “weighted average” bar passage projections for each entering class starting with the class that began legal education in the fall of 2002 (class of 2005), we can compare these with the MBE Mean Scaled Score for each July in which a class graduated, particularly looking at the relationship between the change in relative strength and the change in the corresponding MBE Mean Scaled Score.  Those two lines are plotted below for the period from 2005-2012.  (To approximate the MBE Mean Scaled Score for graphing purposes, the strength of each graduating class is calculated by multiplying the weighted average predicted bar passage percentage, which has ranged from .801 to .826, times 175.)

Comparison of Class Strength Based on Weighted Average Class Strength (Weighted Average Bar Passage Prediction x 175) with the MBE Mean Scaled Score for 2005-2012

[Chart omitted]
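For readers who want to see the mechanics, here is a minimal sketch of the weighted-average calculation described above.  The prediction table is Rosin's, reproduced from above; the distribution of matriculants across the LSAT bands is invented purely for illustration (the actual analysis uses the LSAC National Decision Profiles).

```python
# Rosin's predicted bar passage rates by five-point LSAT band (from the table above).
bar_passage_prediction = {
    "175-180": 0.98, "170-174": 0.97, "165-169": 0.95, "160-164": 0.91,
    "155-159": 0.85, "150-154": 0.76, "145-149": 0.65, "140-144": 0.50,
    "135-139": 0.36, "130-134": 0.25,
}

# Hypothetical matriculant counts per band for one entering class (illustration only;
# the actual analysis uses the LSAC National Decision Profiles).
matriculants = {
    "175-180": 400, "170-174": 1800, "165-169": 4800, "160-164": 9500,
    "155-159": 11000, "150-154": 9000, "145-149": 5500, "140-144": 2200,
    "135-139": 600, "130-134": 100,
}

total = sum(matriculants.values())
weighted_average = sum(bar_passage_prediction[band] * n for band, n in matriculants.items()) / total

# Scale to approximate the MBE Mean Scaled Score axis, as described in the text.
print(f"Weighted-average bar passage prediction: {weighted_average:.3f}")
print(f"Approximate class 'strength' (x 175):    {weighted_average * 175:.1f}")
```

On these invented numbers, the weighted average comes out around .81 (roughly 142 when scaled by 175), which sits within the .801-.826 range noted above.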

What this graph highlights is that between 2005 and 2012, year to year changes in the MBE Mean Scaled Score largely “tracked” year to year changes in the “quality” of the graduating classes.  But perhaps most significantly, the degree of change year over year in “quality” generally is reflected in the “degree” of change year over year in MBE Mean Scaled Scores.  From 2008 to 2009, the drop in “quality” of 1.5 from 144.6 to 143.1 actually was reflected in a drop in MBE Mean Scaled Scores from 145.6 to 144.7, a drop of 0.9 points.  Similarly, from 2009 to 2010, the drop in “quality” of 1.1 from 143.1 to 142 actually was reflected in a drop in the MBE Mean Scaled Scores from 144.7 to 143.6, a drop of 1.1 points.  This two-year drop in quality of 2.6 points from 144.6 to 142 corresponded to a two-year drop in MBE Mean Scaled Scores of 2.0 points from 145.6 to 143.6.

How does this help us understand what has happened in 2014 relative to 2013?  The decrease in quality of the class of 2014 relative to the class of 2013 using the “Weighted Average Bar Passage Projection” methodology above reflects a change from 145.1 to 144.2 – a drop of 0.9 (less than the year over year changes in 2009 and 2010).  Accordingly, one might anticipate a decline in MBE Mean Scaled Scores, but probably a decline slightly smaller than the declines experienced in 2009 and 2010 – declines of .9 and 1.1 point, respectively. 

Does the decline in quality between the Class of 2013 and the Class of 2014 explain some of the decline in MBE Mean Scaled Scores?  Certainly.  This analysis suggests a decline comparable to or slightly less than the declines in 2009 and 2010 should have been expected.

But that is not what we have experienced.  We have experienced an historic decline of 2.8 points.  Yet the NCBE tells us that the other indicators it looked at “all point to the fact that the group that sat in July 2014 is less able than the group that sat in July 2013.” 

THE EXAMSOFT DEBACLE

What the NCBE fails to discuss, or even mention, is that there is one other “indicator” that was a distinctive aspect of the bar exam experience for the group that sat in July 2014 that the group that sat in July 2013 did not experience – the ExamSoft Debacle.

For many of those in one of the many jurisdictions that used ExamSoft in July 2014, the evening between the essay portion of the bar exam and the MBE portion of the bar exam was spent in needless anxiety and stress associated with not being able to upload the essay portion of the exam.  This stress and anxiety were compounded by messaging that suggested the failure to upload in a timely manner would mean failing the bar exam (which messaging was only corrected late in the evening in some jurisdictions). 

In these ExamSoft jurisdictions, I can only imagine that some number of those taking the MBE on the second day of the exam were doing so with much less sleep and much less focus than might have been the case if there had not been issues with uploading the essay portion of the exam the night before.  If this resulted in “underperformance” on the MBE of just 1%-2% (perhaps missing two to four additional questions out of 200), this might have been enough to trigger a larger than expected decline in the MBE Mean Scaled Score.

ONE STATE’S EXPERIENCE BELIES THE NCBE STORY

It will be hard to assess the full reality of the July 2014 bar exam experience in historical context until 2015, when the NCBE releases its annual statistical analysis with state-by-state analyses of first-time bar passage rates.  It is very difficult to make comparisons across jurisdictions regarding the July 2014 bar exam at the present time because there is no standardized format among states for reporting results – some states report overall bar passage rates, some disaggregate first-time bar passage rates, and some report school-specific bar passage rates.  To make meaningful comparisons year-over-year focused on the experience of each year’s graduates, the focus should be on first-time bar passage (even though, as noted above, that also is a little over-inclusive).

Nonetheless, the experience of one state, Iowa, casts significant doubt on the NCBE “story.”

The historical first-time bar passage rates in Iowa from 2004 to 2013 ranged from a low of 86% in 2005 to a high of 93% in 2009 and again in 2013.  In the nine-year period between 2005 and 2013, the year-to-year change in first-time bar passage rates never exceeded three percentage points and was plus or minus one or two percentage points in eight of the nine years.  In 2014, however, the bar passage rate fell to a new low of 84%, a decline of nine percentage points -- more than four times the largest previous year-over-year decline in bar passage rates since 2004-2005.

YEAR                           2004   2005   2006   2007   2008   2009   2010   2011   2012   2013   2014
First-Time Bar Passage Rate     87%    86%    88%    89%    90%    93%    91%    90%    92%    93%    84%
Change from Prior Year                  -1     +2     +1     +1     +3     -2     -1     +2     +1     -9

The NCBE says that all indicators point to the fact that the group that sat in 2014 was “less able” than the group that sat in 2013.  But here is the problem for the NCBE.

Iowa is one of the states that used ExamSoft and in which test-takers experienced problems uploading the exam.  The two schools that comprise the largest share of bar exam takers in Iowa are Drake and Iowa.  In July 2013, those two schools had 181 first-time takers (out of 282 total takers) and 173 passed the Iowa bar exam (a 95.6% bar passage rate).  In 2014, those two schools had 158 first-time takers (out of 253 total) and 135 passed the Iowa bar exam (an 85.4% bar passage rate), a drop of roughly 10 percentage points year over year. 

Unfortunately for the NCBE, there is no basis to claim that the Drake and Iowa graduates were “less able” in 2014 than in 2013 as there was no statistical difference in the LSAT profile of their entering classes in 2010 and in 2011 (the classes of 2013 and 2014, respectively).  In both years, Iowa had a profile of 164/161/158.  In both years, Drake had a profile of 158/156/153.  This would seem to make it harder to argue that those in Iowa who sat in July 2014 were “less able” than those who sat in 2013, yet their performance was significantly poorer, contributing to the largest decline in bar passage rate in Iowa in over a decade.  The only difference between 2013 and 2014 for graduates of Drake and Iowa taking the bar exam for the first time in Iowa is that the group that sat in July 2014 had to deal with the ExamSoft debacle while the group that sat in July 2013 did not.
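The Drake/Iowa comparison can be reproduced directly from the counts above; this short sketch uses only the taker and passer figures already cited.

```python
# First-time takers and passers from Drake and Iowa combined, as reported above.
drake_iowa = {
    2013: {"takers": 181, "passed": 173},
    2014: {"takers": 158, "passed": 135},
}

for year, counts in drake_iowa.items():
    rate = counts["passed"] / counts["takers"]
    print(f"July {year}: {counts['passed']} of {counts['takers']} first-time takers passed ({rate:.1%})")
```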

TIME WILL TELL

This analysis does not “prove” that the ExamSoft debacle was partly responsible for the historic decline in the MBE Mean Scaled Score between 2013 and 2014.  What I hope it does do is raise a serious question about the NCBE’s assertion that the “whole story” of the historic decline in the MBE Mean Scaled Score is captured by the assertion that the class of 2014 is simply “less able” than the class of 2013.

When the NCBE issues its annual report on 2014 sometime next year, we will be able to do a longitudinal analysis on a jurisdiction by jurisdiction basis to see whether jurisdictions which used ExamSoft had higher rates of anomalous results regarding year-over-year changes in bar passage rates for first-time takers.  When the NCBE announces next fall the MBE Mean Scaled Score for July 2015, we will be able to assess whether the group that sits for the bar exam in July 2015 (which is even more demonstrably “less able” than the class of 2014 using the weighted average bar passage prediction outlined above), generates another historic decline or whether it “outperforms” its indicators by perhaps performing in a manner comparable to the class of 2014 (suggesting that something odd happened with the class of 2014).

It remains to be seen whether law school deans and others will have the patience to wait until 2015 to analyze all of the compiled data regarding bar passage in July 2014 across all jurisdictions.  In the meantime, there is likely to be a significant disagreement over bar pass data and how it should be interpreted.

November 11, 2014 in Data on legal education, Data on the profession, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (4)

Monday, October 20, 2014

What Law Schools Can Learn from Dental Schools in the 1980s Regarding the Consequences of a Decline in Applicants

For four consecutive years we have seen a decline in the number of applicants to law school and a corresponding decline in the number of matriculating first-year students.  Over the last year or two, some have suggested that as a result of this “market adjustment” some law schools would end up closing.  Most recently, the former AALS President, Michael Olivas, in response to the financial challenges facing the Thomas Jefferson Law School, was quoted as stating that he expects several law schools to close. 

To date, however, no law schools have closed (although the Western Michigan University Thomas M. Cooley Law School recently announced the closure of its Ann Arbor branch).  

Have law schools found ways to cut costs and manage expenses in the face of declining revenues such that all will remain financially viable and remain in operation?  Is it realistic to think that no law schools will close?

Although there may be a number of people in the legal academy who continue to believe that somehow legal education is “exceptional” – that market forces may impose financial challenges for law schools in the near term, but will not result in the closing of any law schools -- this strikes me as an unduly optimistic assessment of the situation. 

To understand why, I think those in legal education can learn from the experience of those in dental education in the 1980s.

The Dental School Experience from 1975-1990

In the 1980s, dental school deans, along with provosts and presidents at their host universities, had to deal with the challenge of a significant decline in applicants to dental school. 

At least partially in response to federal funding to support dental education, first-year enrollment at the country’s dental schools grew throughout the 1970s to a peak in 1979 of roughly 6,300 across roughly 60 dental schools.  Even at that point, however, for a number of reasons -- improved dental health from fluoridation, reductions in federal funding, high tuition costs and debt loads -- the number of applicants had already started to decline from the mid-1970s peak of over 15,000. 

By the mid-1980s, applicants had fallen to 6,300 and matriculants had fallen to 5,000.  As of 1985, no dental schools had closed.  But by the late 1980s and early 1990s there were fewer than 5000 applicants and barely 4000 first-year students – applicants had declined by more than two-thirds and first-year enrollment had declined by more than one-third from their earlier peaks. (Source – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author).)

How did dental schools and their associated universities respond to this changing market?  Between 1986 and 1993, six private universities closed their dental schools: Oral Roberts University, Tulsa, Oklahoma (1986); Emory University, Atlanta, Georgia (1988); Georgetown University, Washington, D.C. (1990); Fairleigh Dickinson University, Rutherford, New Jersey (1990); Washington University, St. Louis, Missouri (1991); and Loyola University, Chicago, Illinois (1993). (Source: Dental Education at the Crossroads:  Challenges and Change, Table 1.1 (Institute of Medicine 1995)).  According to a New York Times article from October 29, 1987, “Georgetown, formerly the nation's largest private dental school, decided to close after a Price Waterhouse study found that the school would have a $3.6 million deficit by 1992.” (Source: Tamar Lewin, Plagued by Falling Enrollment, Dental Schools Close or Cut Back, New York Times, Oct. 29, 1987).

Some of the primary factors contributing to the closing of dental schools were described as follows:

Financial issues were repeatedly described as critical. Dental education was cited as an expensive enterprise that is or may become a drain on university resources. On average, current-year expenditures for the average dental school are about $1 million more than current revenues. … The declining size and quality of the applicant pool during the 1980s played a role in some closures by threatening the tuition base and prestige on which private schools rely. Faculty and alumni resistance to change may feed impatience among university administrators. In some institutions, the comparative isolation of dental schools within the university has provided them with few allies or at least informed colleagues and has left them ill-prepared to counter proposals for "downsizing." (Source: Dental Education at the Crossroads:  Challenges and Change, at 202-203 (Institute of Medicine 1995)). 

The Law School Experience from 2004-2014

In terms of applicants and enrollment over the last decade, the trends law schools have experienced look remarkably comparable to the experience of dental schools in the 1970s and 1980s.  According to the LSAC Volume Summary, applicants to law schools peaked in 2004 with 100,600 applicants (and roughly 48,200 first-year students).  By 2010, applicants had fallen to roughly 87,600, but first-year enrollment peaked at 52,500.  Over the last four years, applicants have fallen steadily to roughly 54,700 for fall 2014, with a projected 37,000 first-years matriculating this fall, the smallest number since 1973-74, when there were 40 fewer law schools and over one thousand fewer law professors.  (Source - ABA Statistics)(For the analysis supporting this projection of 37,000 first-years, see my blog post on The Legal Whiteboard from March 18, 2014.)  

The two charts below compare the dental school experience from 1975 to 1990 with the law school experience in the last decade.  One chart compares dental school applicants with law school applicants and one chart compares dental school first-years with law school first-years.  (Note that for purposes of easy comparison, the law school numbers are presented as one-tenth of the actual numbers.)

[Chart: Applicants – dental schools (1975-1990) vs. law schools (2004-2014)]

[Chart: First-Years – dental schools (1975-1990) vs. law schools (2004-2014)]

(Sources – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author) and the LSAC’s Volume Summary  (with my own estimates for 2014 based on the LSAC’s Current Volume Summary)).

The Law School Experience 2014-2019

Notably, these charts do not bode well for law schools.  The law school experience tracks pretty closely the dental school experience over the first ten years reflected in the charts.  For law schools, 2014 looks a lot like 1985 did for dental schools.

There might be any number of reasons why the law school experience over the next several years might differ from the dental school experience in the late 1980s and early 1990s, such that the next several years do not continue the downward trend in applicants and matriculants.  The market forces associated with changes in the dental profession and dental education in the 1980s are not the same as the market forces associated with changes in the legal profession and legal education in the 2010s, and the cost structures for dental education and legal education are not exactly the same.

The problem for law schools, however, is that without an upward trend law schools will continue to face significant financial pressures for the next few years just as dental schools did in the late 1980s.  There might be some encouraging news on the employment front over the next few years as the decreasing number of matriculants will mean a decreasing number of graduates in 2015, 2016 and 2017.  Even without any meaningful growth in the employment market for law graduates, this decline in the number of graduates should mean significant increases in the percentage of graduates finding full-time, long-term employment in bar passage required jobs.  Over time, this market signal may begin to gain traction among those considering law school such that the number of applicants to law school stops declining and perhaps starts increasing modestly. 

But the near term remains discouraging.  The number of people taking the June 2014 LSAT was down roughly 9% compared to June 2013 and the anticipation is that the number of test-takers in the most recent administration in late September was down as well compared to October 2013.  Thus, applicants well might be down another 5-8% in the 2014-15 admissions cycle, resulting in perhaps as few as 51,000 applicants and perhaps as few as 35,000 matriculants in fall 2015.  Even if things flatten out and begin to rebound modestly in the next few years, it would appear to be unlikely that the number of matriculants will climb back near or above 40,000 before the fall of 2017 or 2018.
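To show where the fall 2015 applicant estimate comes from, here is a minimal sketch applying the assumed 5-8% decline to the roughly 54,700 applicants for fall 2014 cited above (the decline range is the assumption stated in the text, not a forecast of my own).

```python
# Rough projection of fall 2015 applicants from the fall 2014 figure cited above,
# assuming a further 5-8% decline as described in the text.
applicants_2014 = 54_700
for decline in (0.05, 0.08):
    print(f"{decline:.0%} decline -> roughly {applicants_2014 * (1 - decline):,.0f} applicants")
```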

Moreover, if current trends continue, the matriculants in 2015 also are going to have a significantly less robust LSAT/GPA profile than the matriculants in fall 2010.   As I noted in a blog posting on March 2, 2014, between 2010 and 2013, the number of law schools with a median LSAT less than 150 grew from 9 to 32, and the number with a median LSAT of 145 or below grew from 1 to 9.

What Does this Mean for the Average Law School?

Assume you are the Dean at a hypothetical private law school that had 600 students (200 in each class) and a budget based on $18 million in JD tuition revenue in 2010-11.  (This reflects a net tuition of $30,000 from each student – with nominal tuition set at $40,000 but with a discount rate of 25%.)  Further assume that with this budget, your law school was providing $2.0 million annually to the university with which it is affiliated.  As of 2010-11, your entering class profile reflected a median LSAT of 155 and a median GPA of 3.4.

Assume first-year enrollment declined to 170 in 2011, to 145 in 2012, and to 125 in 2013, a cumulative decrease in first-year enrollment since 2010 of 37%.  As you tried to balance enrollment and profile, the law school managed to maintain its median LSAT and GPA in 2011, but saw its LSAT and GPA medians decline to 153 and 3.35 in 2012 and to 152 and 3.30 in 2013.

This means that for the 2013-14 academic year, the law school had only 440 students, a decrease of roughly 27% from its total enrollment of 600 in 2010, with a much less robust entering class profile in comparison with the entering class profile in 2010. (Note that this assumes no attrition and no transfers in or out, so if anything, it likely overstates total enrollment).  (For comparison purposes, the National Jurist recently listed 25 law schools with enrollment declines of 28% or more between 2010-11 and 2013-14.)

Assume further that the law school had to increase its scholarships to attract even this smaller pool of students with less robust LSAT/GPA profiles, such that the net tuition from each first-year student beginning in fall 2012 has been only $25,500 (with nominal tuition now set at $42,500, but with a discount rate of 40%). 

For the 2013-14 academic year, therefore, you were operating with a budget based on $12,411,000 in JD tuition revenue, a decrease in JD tuition revenue of over $5.5 million since the 2010-11 academic year, over 30%.  (170 x $32,500 for third years ($5.525 million), 145 x $25,500 for second years ($3.698 million), and 125 x $25,500 for first-years ($3.188 million)).

What does this mean?  This means you have been in budget-cutting mode for over three years.  Of course, this has been a challenge for the law school, given that a significant percentage of its costs are for faculty and staff salaries and associated fringe benefits.  Through the 2013-14 academic year, however, assume you cut costs by paring the library budget, eliminating summer research stipends for faculty, finding several other places to cut expenditures, cutting six staff positions and using the retirement or early retirement of ten of your 38 faculty members as a de facto “reduction in force,” resulting in net savings of $3.59 million.  In addition, assume you have gotten the university to agree to waive any “draw” saving another $2 million (based on the “draw” in 2010-2011).  Thus, albeit in a significantly leaner state, you managed to generate a “balanced” budget for the 2013-14 year while generating no revenue for your host university.    

The problem is that the worst is yet to come, as the law school welcomes a class of first-year students much smaller than the class of third-years that graduated in May.  With the continued decline in the number of applicants, the law school has lower first-year enrollment again for 2014-15, with only 120 first-year students with a median LSAT and GPA that has declined again to 151 and 3.2.  Projections for 2015-16 (based on the decline in June and October 2014 LSAT takers) suggest that the school should expect no more than 115 matriculants and may see a further decline in profile.  That means that the law school has only 390 students in 2014-15 and may have only 360 students in 2015-16 (an enrollment decline of 40% since 2010-11). Assuming net tuition for first-year students also remains at $25,500 due to the competition on scholarships to attract students (and this may be a generous assumption) – the JD tuition revenue for 2014-15 and 2015-16 is estimated to be $9,945,000, and $9,180,000, respectively (a decline in revenue of nearly 50% from the 2010-11 academic year). 
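A minimal sketch of the JD tuition revenue arithmetic behind these figures, using only the hypothetical enrollment and net-tuition assumptions stated above (not any actual school's data), follows.

```python
# Hypothetical entering-class sizes and the net tuition each cohort pays,
# following the assumptions in the text (not any actual school's data).
entering_class_size = {2010: 200, 2011: 170, 2012: 145, 2013: 125, 2014: 120, 2015: 115}
net_tuition = {2010: 30000, 2011: 32500, 2012: 25500, 2013: 25500, 2014: 25500, 2015: 25500}

def jd_revenue(fall_year):
    """Sum net tuition across the three cohorts enrolled in the academic year starting that fall."""
    cohorts = (fall_year - 2, fall_year - 1, fall_year)  # 3Ls, 2Ls, 1Ls
    return sum(entering_class_size[c] * net_tuition[c] for c in cohorts)

for fall in (2013, 2014, 2015):
    print(f"{fall}-{fall + 1}: ${jd_revenue(fall):,} in JD tuition revenue")
```

(The 2013-14 figure comes out to $12,410,000 here; the $12,411,000 in the text reflects rounding each cohort subtotal to the nearest thousand.)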

In reality, then, the “balanced” budget for the 2013-2014 academic year based on revenues of $12,411,000, now looks like a $2,500,000 budget shortfall in 2014-15 and a $3,200,000 budget shortfall for the 2015-16 academic year, absent significant additional budget cuts or new revenue streams (with most of the “low hanging fruit” in terms of budget cuts already “picked”). 

While you may be able to make some extraordinary draws on unrestricted endowment reserves to cover some of the shortfall (assuming the law school has some endowment of its own), and may be creative in pursuing new sources of revenue (a certificate program or a Master of Laws), even if you come up with an extra $400,000 annually in extraordinary draws on endowment and an extra $400,000 annually in terms of non-JD revenue you still are looking at losses of at least $1,700,000 in 2014-15 and at least $2,400,000 in 2015-16 absent further budget cuts.  Even with another round of early retirement offers to some tenured faculty and/or to staff (assuming there are still some that might qualify for early retirement), or the termination of untenured faculty and/or of staff, the budget shortfall well might remain in the $1,000,000 to $1,700,000 range for this year and next year (with similar projections for the ensuing years).  This means the law school may need subsidies from the university with which it is affiliated, or may need to make even more draconian cuts than it has contemplated to date.  (For indications that these estimates have some relation to reality, please see the recent stories about budget issues at Albany, Minnesota and UNLV.)

Difficult Conversations -- Difficult Decisions

This situation will make for some interesting conversations between you as the Dean of the law school and the Provost and President of the university.  As noted above in the discussion of dental schools, the provost and president of a university with a law school likely will be asking:  How “mission critical” is the law school to the university when the law school has transformed from a “cash cow” into a “money pit” and when reasonable projections suggest it may continue to be a money pit for the next few years?  How "mission critical" is the law school when its entering class profile is significantly weaker than it was just a few years ago, particularly if that weaker profile begins to translate into lower bar passage rates and even less robust employment outcomes?   How “mission critical” is the law school to the university if its faculty and alumni seem resistant to change and if the law school faculty and administration are somewhat disconnected from their colleagues in other schools and departments on campus?

Some universities are going to have difficult decisions to make (as may the Boards of Trustees of some of the independent law schools).  As of 1985, no dental schools had closed, but by the late 1980s and early 1990s, roughly ten percent of the dental schools were closed in response to significant declines in the number and quality of applicants and the corresponding financial pressures.  When faced with having to invest significantly to keep dental schools open, several universities decided that dental schools no longer were “mission critical” aspects of the university. 

I do not believe law schools should view themselves as so exceptional that they will have more immunity to these market forces than dental schools did in the 1980s.  I do not know whether ten percent of law schools will close, but just as some universities decided dental schools were no longer “mission critical” to the university, it is not only very possible, but perhaps even likely, that some universities now will decide that law schools that may require subsidies of $1 million or $2 million or more for a number of years are no longer “mission critical” to the university. 

(I am grateful to Bernie Burk and Derek Muller for their helpful comments on earlier drafts of this blog posting.)

 

October 20, 2014 in Cross industry comparisons, Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (5)

Sunday, October 12, 2014

Is the Legal Profession Showing Its Age?

The figure below suggests that a growing number of students are attending law school but not going on to become lawyers.  This conclusion requires some explanation, which I will supply below.  Alternative explanations are also welcome, as I’d like to find a plausible narrative that foreshadows a brighter future for the licensed bar. [PDF version of this essay]

[Figure: Lawyers under age 35 as a share of the licensed bar, 1980-2005]

I generated this figure based on data from various editions of The Lawyer Statistical Report, which is periodically compiled by the American Bar Foundation (ABF).  The ABF gets the underlying data from Martindale-Hubbell, a comprehensive directory of the licensed bar.  As of 2005, the sample was roughly 1 million lawyers working in law firms, solo practice, in-house legal departments, government, and the judiciary.

The big surprise here is that the proportion of young lawyers (under age 35) has been declining for several decades.  And not by a little, but by a lot.  During this period, the median age went from 39 in 1980, to 41 in 1991, to 45 in 2000, to 49 in 2005.  See ABA Market Research Department.

I would be tempted to attribute a demographic shift of this magnitude to a computational error.  But that is unlikely because the underlying data were calculated at four different points in time, yet the results come together to produce a single, steady trendline -- a trendline that shows a licensed bar that is steadily aging.  

Another possible factor is whether data collection or sampling issues skew the sample in a way that dramatically undercounts younger lawyers.  For example, Martindale-Hubbell is largely irrelevant to today's younger lawyers, so in solo and small firm practices, where younger lawyers make their own business decisions, we might expect plummeting subscription rates.  But subscribing and requesting the publication of additional biographical information (in the hope of garnering referral business) is not the same thing as being listed: Martindale-Hubbell attempts to track lawyers who do not subscribe to the directory, because near-universal inclusion is what makes the directory valuable.

To illustrate this point, consider that in 2000 the Lawyer Statistical Report (which relies on Martindale-Hubbell data) counted 909,000 lawyers.  According to the ABA, the total number of lawyers licensed in the US (compiled from state bar rolls) was 1,022,000, and that figure almost certainly includes some double counting of lawyers licensed in more than one state.  While I have no doubt that younger lawyers are becoming harder to hunt down because of cell phones and home-based offices, the gap of missing lawyers is just not big enough to fully account for the sharp drop-off in younger lawyers.
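A back-of-the-envelope upper bound makes the point.  Suppose, as generously as possible, that every lawyer missed by Martindale-Hubbell in 2000 was under 35; the sketch below (using only the figures just cited) shows the maximum boost that could give the under-35 share, which falls well short of the 36-to-13 percent slide discussed below.

```python
# Upper-bound check: can directory undercounting alone explain the missing young lawyers?
abf_count = 909_000      # lawyers in the 2000 Lawyer Statistical Report (Martindale-Hubbell based)
aba_count = 1_022_000    # ABA count from state bar rolls (includes some double counting)

gap = aba_count - abf_count      # lawyers the directory may have missed
max_boost = gap / aba_count      # assume every one of them is under 35

print(f"Potentially uncounted lawyers: {gap:,}")
print(f"Maximum possible boost to the under-35 share: {max_boost:.1%}")
# Roughly 11 percentage points at most -- nowhere near enough to bridge the
# decline from 36% (1980) to 13% (2005) shown in the first figure.
```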

I have shown this chart to various law firms, legal departments, law faculty and bar association audiences.  Through this process, I have developed two working theories that are not mutually exclusive:

  1. Increased exits from law practice based on gender integration
  2. Slowing absorption of law graduates into the licensed bar

Theory 1: Gender Integration

One explanation is gender integration.  In short, over the last 40 years, more women have entered the legal profession; and as an empirical matter, they are much more likely than men to exit the workforce in order to focus on childcare.  Thus, more gender integration over time would cause a proportional decline in the younger lawyer cohort.

So let's examine the data.  According to the figure below, which shows the number of male and female 1Ls enrolling each year at ABA-accredited law schools, the high water mark for male 1L enrollment occurred over 40 years ago -- in 1971!   The high water mark for female enrollment in percentage terms was 2000 (49.4%).  In absolute numbers, the high was the class entering in the fall of 2009 (24,305).

[Figure: Male and female first-year (1L) enrollment at ABA-accredited law schools, by year]

Presumably, the higher the percentage of female graduates, the lower the percentage of lawyers under the age of 35.  In 1968, a 22-year-old female 1L, if she graduated from law school and stayed in the legal profession, would be part of the younger lawyer cohort in 1980.  Yet her 1L cohort included only 1,179 women (7.4% of all 1Ls).  By 1993 (12 years before 2005), the number of female 1Ls had increased to 19,059 (nearly 44% of all 1Ls).  So exits based on childcare factors would likely be increasing over time.

I can readily accept gender integration as a partial, but not a complete, explanation.  Why?  Because female exits are likely to be siphoning off a substantial portion of the over-35 cohort, as this group is still having and raising children.  It seems implausible that female lawyers are leaving in droves before age 35 (reducing the younger lawyer cohort) yet returning in droves thereafter (swelling the over-35 cohort).  Further, according to the figure above, the absolute number of entering law students was increasing during this entire period.  Sheer numbers are likely a partial counterbalance to the impact of gender-related exits.

Theory 2:  Slowing Absorption of Younger Lawyers

It is important to keep in mind the magnitude of the overall slide in younger lawyers -- from 36% in 1980 to 13% in 2005.  One would think the trendline would be moving in the exact opposite direction -- that larger graduating classes would be replacing the much smaller number of law school graduates from 40 years earlier who were retiring or passing away.  But such a youth movement does not appear to be happening, at least based on data through 2005.

I think the most likely explanation is that the rate of absorption of law school graduates into the licensed bar has been steadily declining over time.  This explanation, which would affect men and women equally, is directionally consistent with the percentage of entry-level jobs in private practice, which has been declining since the late 1980s.  See figure below.

[Figure: Share of entry-level jobs in private practice, declining since the late 1980s]

The slower absorption theory is also directionally consistent with the shifting demographics of large law firms, which now have more partners than associates.  See figure below.

[Figure: Large law firm demographics (NLJ data): partners now outnumber associates]

Despite the higher number of partners compared to associates, it is worth noting that large law firms are not becoming more generous in sharing the partnership pie.   

Rather, the real sea change is the decline in the number of traditional law firm associates, who have been slowly supplanted by staff attorneys, permanent of counsel lawyers, and nonequity partners.  Indeed, over 40% of large law firm partners (defined as the AmLaw 200 / NLJ 250) are nonequity.  Three decades ago, this category of partner was relatively rare.  See Henderson, An Empirical Study of Single-Tier Versus Two-Tier Partnerships in the Am Law 200, 84 NC L Rev 1691 (2006).  The growth of nonequity partners reflects a new kind of law firm leverage that relies on senior lawyers.  The annual ALM/Major, Lindsey & Africa study of partner compensation reveals that equity partners make dramatically higher incomes than nonequity partners and that the size of the pay gap is widening over time.  See Ross Todd, A Widening Partner Pay Gap, American Lawyer, Sept 29, 2014.

The primary advantage of nonequity partners and other senior lawyers, like permanent counsel, is that training costs fall to near zero.  Cf. Elizabeth Olson, Corporations Drive Drop in Law Firms' Use of Starting Lawyers, Study Finds, New York Times, Oct. 10, 2014 (showing a drop over time in the use of first-year associates because clients are refusing to pay for training costs).

To my mind, however, the most persuasive support for the lower absorption theory is the simple delta between the growth in the licensed bar--which has clearly hit a plateau--and the size of graduating classes from ABA-accredited law schools--which, until recently, had been steadily increasing. The figure below shows these macro-level trendlines.

[Figure: Growth of the licensed bar versus annual graduating classes from ABA-accredited law schools]

If younger lawyers were replacing older lawyers and also growing in number to keep pace with the broader economy, the under-35 cohort would be getting bigger, or at least remaining relatively constant in size.  But instead, as the first figure in this essay showed, the younger lawyer cohort has gotten smaller.  Arguably, the simplest explanation for these patterns is that it has gotten much harder over time to parlay a JD degree into paid employment as a licensed lawyer.  So, faced with a saturated legal market, law school graduates have been pursuing careers outside of law.

What Does This Mean?

The analysis above suggests that the JD Advantage / JD Preferred employment market started to take shape several decades ago, long before the ABA and NALP adopted those terms.  Yet we really don't know much about these careers.  To construct a more useful, informative narrative, we'd have to systematically study the career paths of our alumni.  That task is long overdue.

I started teaching at Indiana Law in 2003.  Since I first saw the declining trendline for the young lawyer cohort, I have been thinking about the roughly 1,600 students who have taken my Corporations, Securities Regulation, Business Planning, Project Management, Law Firms as a Business Organization, and Legal Professions classes.  

  • What percentage are working as licensed lawyers?  
  • For those doing something different, where are they working?  
  • Has their legal education opened doors for them? 
  • Did those doors lead to interesting and remunerative work?

The After the JD Study is based on law school graduates who passed the bar in the year 2000.  The Wave III results provide some clues to how at least one cohort of younger lawyers fared during their first ten years in practice.

  • Roughly a quarter of the class of 2000 is no longer practicing law (remember the base sample excluded those who never took or passed the bar).
  • The migration out of practice is generally in the direction of private sector business.  
  • Ten years out, the median pay for full-time work is more than $100,000 for both men and women.  No tears need to be shed here.
  • Roughly three-quarters report being satisfied with their decision to attend law school. 

These statistics are generally encouraging, but some caution is in order, as the entry-level legal economy was quite different in 2000.  

Because the law school transparency movement changed how employment outcomes are collected and reported, we lack commensurable data between 2000 and 2013.  That is an important piece of information in itself, as the changes in collection and reporting standards grew out of student protests, including several lawsuits over allegedly misleading employment data.  Yet we can cobble together some potentially useful comparisons:

Even if NALP's full-time legal positions in 2000 is a more expansive category than the ABA's full-time bar-passage jobs in 2013, the gap is startling -- over 20 percent!  Further, we have additional evidence of a major shift in the job market, as law firm summer associate positions have declined by more than 50% since the early 2000s.  See Henderson, Sea Change in the Legal Market, NALP Bulletin (Aug 2013).  Between 2008 and 2013, there was also a drop in median starting salaries, from $72,000 to $62,500.  See NALP, Employment for the Class of 2013 – Selected Findings.
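For scale, the salary figures just cited work out to roughly a 13 percent nominal decline over five years, before any adjustment for inflation; a quick check:

```python
# Nominal decline in median starting salary, 2008 vs. 2013 (NALP figures cited above)
salary_2008, salary_2013 = 72_000, 62_500
decline = salary_2008 - salary_2013
print(f"Decline: ${decline:,} ({decline / salary_2008:.1%})")   # $9,500, about 13.2%
```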

Demand Drops, but Supply Marches On

Cumulatively, the trendlines presented in this essay suggest that we are on the tail end of a multi-decade structural shift in the legal economy.  So what comes next?

Law schools were recently taken to task in an editorial by the Young Lawyers Board of Philadelphia Legal Intelligencer.  See If Unchanged, Legal Education will Remain a Business in Decline, Legal Intelligencer, Sept 25, 2014.  According to the young lawyers, "One reason graduates have difficulty obtaining employment is that most of them need to be trained in how to practice law, and clients are unwilling to pay for training new lawyers. Law schools need to step up and train students on how to practice law."

I am very sympathetic to the young lawyers, but I think they are missing something essential.  A law school that improves the quality of its skills training reduces the training costs to prospective employers.  That is a good thing, but it does not change the underlying demand for legal services. And it appears that that demand is eroding on several fronts:  (a) wealthy corporations are balking at the price of outside counsel and looking for credible substitutes, (b) ordinary citizens are struggling to afford a lawyer at all, and (c) a new segment of the legal economy is emerging that is financed by nonlawyers and heavily focused on data, process, and technology, which taps into skill sets not traditionally taught in law school. See Henderson, A Counterpoint to "The most robust legal market that ever existed in this country", Legal Whiteboard, Mar 17, 2014.  

Conclusion

My own conclusion is that neither the organized bar nor the legal academy has a firm grip on the changes that are occurring in the legal marketplace.  This uncertainty and confusion is understandable in light of the magnitude of the shift.

Nonetheless, these market shifts create special urgency for legal educators because we can't teach what we don't understand.  The thesis of the Young Lawyers Board is surely right -- if unchanged, legal education will remain a business in decline.  Much of legal education today is premised on a 20th century professional archetype--an archetype that is, based on the data, becoming less and less relevant with each passing day.  Thus, we are under-serving our students.  And frankly, they are figuring that out.  

Change is hard for people and for the organizations they work in, and law professors and law schools are no different.  The retooling of legal education will likely be a slow, painful process that will take the better part of a full generation to complete.  I am trying to do my part.

Yet, the brunt of the demographic shift falls on the licensed bar, which is getting older and thus weaker with each passing year.  This is a problem that belongs to the ABA, the state bars, and the state supreme courts, not the legal academy.   [PDF version of this essay]

October 12, 2014 in Data on the profession, Structural change | Permalink | Comments (1)

Tuesday, October 7, 2014

Does Cooperative Placement Accelerate Law Student Professional Development?

The title of an earlier essay posed a threshold question for legal ed reform: "If We Make Legal Education More Experiential, Would it Really Matter?" (Legal Whiteboard, Feb 2014) (PDF). I answered "yes" but admitted it was only my best guess.  Thus, to be more rigorous, I outlined the conditions necessary to prove the concept.

The essay below is a companion to the first essay.  It is a case study on how one type and brand of experiential education -- cooperative placements at Northeastern Law -- appears to accelerate the professional development of its law students.  The outcome criteria are the three apprenticeships of Educating Lawyers (2007) (aka The Carnegie Report): cognitive skills, practice skills, and professional identity.

The better outcomes flow from Northeastern's immersive, iterative, and integrative approach.  First, students are immersed in full-time co-ops that last a standard 11 weeks.  Second, students move through four iterations of co-ops interspersed with four quarters of upper-level classes.  Third, this experiential approach is integrated into the Law School's value system -- i.e., the experiential component is perceived as central rather than marginal to the School's educational mission.

Northeastern's co-op model asks more of faculty and students, and thus may be hard to replicate.  Yet there is evidence that such an approach does in fact accelerate professional development in ways that ought to please law school critics and reformers.  The benefits may well be worth the costs.

[PDF version at JD Supra]

[The text below was originally published as the Northeastern Law Outcomes Assessment Project (OAP) Research Bulletin No. 3]

Immersive, Iterative and Integrative:
Does Cooperative Placement Accelerate Law Student  Professional Development?

A steep decline in the job prospects for entry-level lawyers has been followed by a sharp drop in law school applications. Media stories criticize traditional legal education for being too expensive while producing graduates unprepared for practice. Throughout the country, legal educators and administrators at law schools are trying to formulate an effective response.

A common thread running through many new law school initiatives is greater emphasis on experiential education. Fundamentally, experiential education is learning by doing, typically by assuming the role of the lawyer in an in-class simulation, law school clinic, externship or cooperative placement. As law schools seek to add hands-on opportunities to their curricular offerings, empirical evidence on experiential education’s impact on law student professional development becomes invaluable.

Northeastern University School of Law’s Outcomes Assessment Project (OAP) is an evidence-based approach to understanding experiential learning in the law school curriculum. A focal point of the OAP is Northeastern’s Cooperative Legal Education Program, an integral part of the school’s curriculum since the late 1960s. After completing a mostly traditional first year of law school, Northeastern students enter a quarter system in which 11-week cooperative placements alternate with 11-week upper-level courses. Through the four co-op placements during the 2L and 3L years, every Northeastern student gains the functional equivalent of nearly one year of full-time legal experience, typically across a diverse array of practice areas.

The Learning Theory of Cooperative Placement

Northeastern’s Cooperative Legal Education Program is based on a learning theory with three interconnected elements: immersion, iteration and integration.

  • Immersion: Immersion in active legal work in a real-world setting enables students to feel the weight and responsibility of representing real-world clients and exercising professional judgment.
  • Iteration: Iterative movement between the classroom and co-op placements provides students with concrete opportunities to connect theory with practice and understand the role of reflection and adjustment in order to improve one’s skill and judgment as a lawyer.
  • Integration: Integrating experiential learning into the law school curriculum signals its high value to the law school mission — when 50 percent of the upper-level activities involve learning by doing, practice skills are on par with doctrinal learning.

The purpose of the OAP Research Bulletin No. 3 is to use preliminary project data to explore whether the immersion-iteration-integration approach to legal education has the effect of accelerating the professional development of law students.

Three Effects of Co-op Placements

The findings in Research Bulletin No. 3 are based on surveys and focus groups conducted with 2L and 3L Northeastern law students and a small number of Northeastern law graduates, who served as facilitators. In our conversations with these students and alumni, we identified three ways that co-op is impacting the professional development of students.

Continue reading

October 7, 2014 in Data on legal education, Important research, Scholarship on legal education | Permalink | Comments (0)

Thursday, September 18, 2014

Cassidy on Reforming the Law School Curriculum From the Top Down

Mike Cassidy at the Boston College Law School has an interesting essay on curricular reform forthcoming in the Journal of Legal Education.  Here's the abstract:

With growing consensus that legal education is in turmoil if not in crisis, law schools need to take advantage of industry upheaval to catalyze innovation in the way they train their students. Curriculum reform, long the “third rail” of faculty politics, is now essential if some law schools are going to survive the present tsunami of low enrollments and stagnant hiring. One cautiously optimistic note within this doomsday symphony is that law school deans are now in extremely strong bargaining positions with their faculties and boards of trustees with respect to curriculum innovation.

In this essay, the author proposes a pivotal reform to the third year curriculum involving team-taught “Advanced Legal Problem Solving” workshops in subject specific areas, and describes the precise structure, content and staffing of such capstone courses. He argues that such workshops would significantly enhance the preparation of law students for entry into the profession, and would create an efficient and cost-effective route for law schools to satisfy rigorous new ABA accreditation standards regarding experiential learning and outcomes assessment.

September 18, 2014 | Permalink | Comments (0)

Thursday, September 4, 2014

Artificial Intelligence and the Law

Plexus, a NewLaw law firm based in Australia, has just released a new legal product that purports to apply artificial intelligence to a relatively common, discrete legal issue -- determining whether a proposed trade promotion (advertisement in US parlance) is in compliance with applicable law.

In the video below, Plexus Managing Partner Andrew Mellett (who holds an MBA and is not a lawyer) observes that this type of legal work would ordinarily take four to six weeks to complete and cost several thousand dollars.  Mellett claims that the Plexus product can provide "a legal solution in 10 minutes" at 20% to 30% of the cost of the traditional consultative method -- no lawyer required, although Plexus lawyers were the indispensable architects of the underlying code.

From the video, it is unclear whether the innovation is an expert system -- akin to what Neota Logic or KM Standards are creating -- or artificial intelligence (AI) in the spirit of the machine learning used in some of the best predictive coding algorithms or IBM's Watson applied to legal problems.  Back when Richard Susskind published his PhD dissertation in 1987, Expert Systems In Law, an expert system was viewed as artificial intelligence -- there was no terminology to speak of because the application of technology to law was embryonic.  Now we are well past birth, as dozens of companies in the legal industry are in the toolmaking business, some living on venture or angel funding and others turning a handsome profit.

My best guess is that Plexus's new product is an expert system.  But frankly, the distinction does not matter very much, because both expert systems and AI as applied to law are entering the early toddler stage.  Of course, that suggests that those of us now working in the legal field will soon be grappling with the growth spurt of legal tech adolescence.  For law and technology, it's Detroit circa 1905.

September 4, 2014 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, New and Noteworthy, Structural change, Video interviews | Permalink | Comments (2)

Sunday, August 24, 2014

Ahead of the Curve: Three Big Innovators in BigLaw

Nashville, TN.  It is time to put down the broad brush used to paint BigLaw as inefficient and out of touch.  At least for me, that is the big takeaway from the 2014 International Legal Technology Association (ILTA) conference, which took place this past week at the Gaylord Opryland Hotel in Nashville and included nearly 2,000 lawyers, administrators, staff, and vendors from around the world.

My takeaway is based on what I saw during the presentation session for the ILTA Most Innovative Law Firm Award.  The three finalists all qualify as big:  Bryan Cave (985 lawyers), Seyfarth Shaw (779 lawyers), and Littler Mendelson (1,002 lawyers). Presenters from each firm had 15 minutes to share their innovations, followed by 5 minutes of Q&A.  Afterwards, ILTA members in attendance cast ballots for first, second, and third place.

Kudos to Bryan Cave, Seyfarth Shaw, and Littler Mendelson for publicly sharing their innovations, as it demonstrates a commitment to the broader legal profession.

In this post, I will describe the salient points of each innovation. I will err on the side of detail because, when it comes to innovation in the legal space, there is a short supply of “guts of the operations” commentary.  I will then offer some macro-level observations.  As it turns out, BigLaw has, on balance, a surprisingly good hand to play.  Many firms will thrive, but by taking market share from the rest.


Bryan Cave

Presenter: John Alber, Strategic Technology Partner

Bryan Cave has developed an ingenious and highly efficient way to educate its lawyers on the economics of its business.  Prior to the presentation, I was familiar with the firm’s investment in a rigorous cost accounting system to guide the firm’s strategy and operations.

Yet, to get the full benefit out of such a system, the understanding needs to filter down to the individual lawyer-timekeeper level so that each lawyer-timekeeper can use the superior data to allocate time and effort in ways that strengthen the enterprise.  Even in the year 2014, many successful and skilled BigLaw lawyers confuse revenues with profit. And the confusion is understandable because portable books of business, which tend to be measured in terms of revenue, drive the valuation of lateral partners.  See Henderson & Zorn, Of Partners and Peacocks, Am. Law., February 2014.

Based on what I saw at ILTA, such confusion appears to have been substantially eliminated at Bryan Cave.

The core Bryan Cave innovation is a simple dashboard that tracks a variety of statistics at the lawyer, practice group, and firm level.  What is most striking about the Bryan Cave initiative is the sensitivity shown to the large percentage of lawyers who are not comfortable processing numbers (“arithmophobia” was the term used in the presentation).  The Bryan Cave innovation team dealt with this constraint in two ways.

1. The Octagon.  The Octagon is a data visualization technique that communicates eight key metrics in an octagon-shaped graphic.  Wondering what the term "data visualization" means? It's finding graphical ways to communicate complex multivariable data in a format that requires the end user, such as a lawyer, to have very little technical training.  The Octagon is a textbook example. It uses colors and distance from the center of the graphic to convey essential information related to origination, client relationships, matter management, days to bill, days to collect, hours billed, leverage, and profit margins. (There may be other octagons containing other metrics--the one we were shown appeared to be geared toward partners.)

Each lawyer each month gets a new updated Octagon; and that graphic communicates, through its shape, the lawyer’s relative contributions to the firm.  Specifically, there are distinctive patterns well known within the firm that tend to signal rainmaker, service partner, project manager, technical specialist, or some blend thereof.  The features of the Octagon also communicate how well a lawyer is performing in his or her various roles relative to his or her peers.  So, on a monthly basis, self-image confronts hard numbers.

This type of transparency is bound to have a profound effect on behavior.  (During another ILTA session I heard, from another Bryan Cave presenter, that since the introduction of the Octagon a couple of years ago, the average days to collect has fallen from 60 to 44.)

2. The Rosetta. Some lawyers are bound to prefer a story rather than a picture.  For these lawyers, the firm has created a narrative, referred to as the Rosetta, that translates the numbers into a diagnostic story of strengths, weaknesses, and, most importantly, specific prescriptive advice on how to improve.

But there is an interesting catch—the stories are all written with a computer algorithm.  How is this possible?  It’s a technology pioneered by a company called Narrative Science.  Note that computers that are fed nothing but a traditional baseball scoring sheet now routinely write sports stories that summarize the game for the local sports page.  This narrative summary accompanying the Octagon removes any lingering ambiguity regarding what the diagram means.  Further, all report generation, including practice-group level Octagon and Rosetta reports, has been entirely automated.

I am told that the Octagon and Rosetta programs can handle, and properly incentivize, work that is done on either a billable or alternative fee arrangement basis. If this is true, Bryan Cave has an innovation designed for the legal market of the future.

Some readers may be turned off that the Bryan Cave innovation may seem, on the surface anyway, entirely focused on law firm financial performance.  I am not. To my mind, this type of technology is valuable for communicating the fundamentals of the business.  This reduces the myths and false narratives that routinely take hold in data-poor environments.  This innovation is also timely because it is getting harder to give clients superior value while also delivering a strong return to the firm's owners -- the best of whom could lateral to another firm tomorrow.

The challenge of every BigLaw firm is getting all of the firm's stakeholders to row in the same direction. The combination of the Dashboard, Octagon, and Rosetta is a breakthrough in lawyer communication and, by extension, change management.  Bryan Cave attorneys have the information they need to build their practices while also advancing the broader goals of the enterprise.

Seyfarth Shaw

Presenters: Kathy Perrelli, Chair of Litigation Practice; Kim Craig, Global Director of Legal Project Management.

Seyfarth Shaw’s innovation is the creation of a true Research & Development Department staffed by lawyers, project managers, technologists, and software developers.  The charge of Seyfarth’s R&D Department is to build solutions in advance of perceived client needs.  As the presenters mentioned, “we are not doing this because our clients are asking for these solutions; we are doing this because our clients will ask.”

Continue reading

August 24, 2014 in Blog posts worth reading, Current events, Data on the profession, Innovations in law, Law Firms | Permalink | Comments (1)

Monday, July 28, 2014

Conditional Scholarship Retention Update for the 2012-2013 Academic Year

In comparing the conditional scholarship universe between the 2011-12 academic year and the 2012-13 academic year (with a brief look at 2013-14), there are a handful of things worth noting.

First, as shown in Table 1, the number of law schools with conditional scholarships declined between 2011-12 and 2012-13 from 144 law schools to 136 law schools, and declined again for the 2013-14 academic year to 128 law schools.  The number of law schools that do not have conditional scholarships grew from 49 in 2011-12 to 58 in 2012-13 to 66 in 2013-14.  In addition, the number of schools with just one-year scholarships declined from five in 2011-12 to four in 2012-13, where it remained for 2013-14.

 Table 1:  Changes in Number of Law Schools with Conditional Scholarship Programs

Category                                                             | 2011-12 | 2012-13 | 2013-14 (indications)
Law Schools with Conditional Scholarship Programs                    |   144   |   136   |   128
Law Schools with One-Year Scholarships                               |     5   |     4   |     4
Law Schools with Scholarships that are not Conditional Scholarships  |    49   |    58   |    66
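As a rough consistency check, the three rows of Table 1 appear to partition the 198 law schools included in this analysis (see the note under Table 2); assuming the categories are mutually exclusive, each year's column should sum to 198, and it does:

```python
# Consistency check on Table 1 (assumes the three categories are mutually exclusive).
table1 = {
    "2011-12": {"conditional": 144, "one_year_only": 5, "not_conditional": 49},
    "2012-13": {"conditional": 136, "one_year_only": 4, "not_conditional": 58},
    "2013-14": {"conditional": 128, "one_year_only": 4, "not_conditional": 66},
}
for year, counts in table1.items():
    print(year, sum(counts.values()))   # 198 law schools each year
```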

 

Second, as shown in Table 2, the number of students receiving conditional scholarships in 2012-13 declined slightly from 2011-12, from 12,786 to 12,470, but the percentage of first-years with conditional scholarships actually increased from 27.3% to 29.2% (given the smaller number of first-years in 2012-13 compared to 2011-12).  That said, the number of students whose scholarships were reduced or eliminated declined from 4,359 to 3,712, meaning that the percentage of first-years whose scholarships were reduced or eliminated dropped from 9.3% to 8.7%.

Table 2: Overall Comparisons Between 2011-12 and 2012-13

Category                                                                 | 2011-12                        | 2012-13
First-years*                                                             | 46,778                         | 42,769
First-years with Conditional Scholarships**                              | 12,786 (27.3% of first-years)  | 12,470 (29.2% of first-years)
First-years whose conditional scholarships were reduced or eliminated**  | 4,359 (9.3% of first-years)    | 3,712 (8.7% of first-years)
Average Renewal Rate (across law schools)                                | 69%                            | 71%
Overall Renewal Rate Among Scholarship Recipients                        | 65.9%                          | 70.2%

*Drawn from first-year enrollment at the 198 law schools included in this analysis (excluding the law schools in Puerto Rico and treating Widener as one law school for these purposes) based on information published in the Standard 509 reports.
** Based on information published in the mandated Conditional Scholarship Retention charts by each law school with a conditional scholarship program.
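Most of the percentages in Table 2 can be reproduced directly from the underlying counts; the sketch below shows the arithmetic.  (The "average renewal rate across law schools" of 69% and 71% is an unweighted school-level average and cannot be recomputed from these aggregates.)

```python
# Reproducing the aggregate percentages in Table 2 from the underlying counts.
years = {
    "2011-12": {"first_years": 46_778, "conditional": 12_786, "reduced_or_eliminated": 4_359},
    "2012-13": {"first_years": 42_769, "conditional": 12_470, "reduced_or_eliminated": 3_712},
}
for year, d in years.items():
    pct_conditional = d["conditional"] / d["first_years"]                 # 27.3%, 29.2%
    pct_reduced     = d["reduced_or_eliminated"] / d["first_years"]       # 9.3%, 8.7%
    overall_renewal = 1 - d["reduced_or_eliminated"] / d["conditional"]   # 65.9%, 70.2%
    print(f"{year}: {pct_conditional:.1%} of first-years on conditional scholarships, "
          f"{pct_reduced:.1%} reduced or eliminated, {overall_renewal:.1%} overall renewal")
```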

Third, the distribution of conditional scholarship programs across tiers of law schools is even more pronounced in 2012-13 than it was in 2011-12.  Using the USNews rankings from March 2014, only 16 law schools ranked in the top 50 had conditional scholarship programs in 2012-13, and eight of those 16 had a renewal rate of 97% or higher.  Three of these law schools also eliminated their conditional scholarship programs as of the fall 2013 entering class.  (Moreover, only six law schools in the top 25 had conditional scholarship programs, five of which had a renewal rate of 97% or higher.)

As you move further down the rankings, conditional scholarship programs become more common and manifest lower scholarship retention rates on average.

Of the 53 law schools ranked between 51 and 100 (with three tied at 100), 37 law schools (nearly 70%) had conditional scholarship programs, of which two eliminated their conditional scholarship programs as of fall 2013.  Notably, of the 37 law schools with conditional scholarship programs, eight had a renewal rate of 91% or better (nearly 22%), while seven had a renewal rate of 65% or less (nearly 19%), with the other 22 (nearly 60%) having renewal rates between 67% and 88%.

For law schools ranked between 104 and 146 (44 law schools in total), 35 law schools (nearly 80%) had conditional scholarship programs, of which three eliminated their conditional scholarship programs as of fall 2013.  Notably, of the 35 law schools with conditional scholarship programs, six had a renewal rate of 93% or better (roughly 17%), while 16 had a renewal rate of 65% or less (nearly 46%), with the other 13 (roughly 37%) having renewal rates between 67% and 88%.

Finally, among the unranked schools, 47 of 51 (over 92%) had conditional scholarship programs, only five of which had a renewal rate of 91% or better (nearly 11%), while 23 had a renewal rate of 65% or less (nearly 49%), with the other 19 (roughly 40%) having renewal rates between 66% and 88%.

Tables 3 and 4 present comparative data across law schools in different USNews rankings categories.  Table 3 describes the number of law schools with conditional scholarship programs and the distribution of scholarship retention rates among law schools.  Table 4 describes the total number of students within each USNews rankings category along with the number of students on conditional scholarships and the number of students who had their conditional scholarship reduced or eliminated.

 Table 3: Scholarship Retention Rates by USNews Ranking Categories

Category                                       | Top 50 | 51-100 (n=53) | 104-146 (n=44) | Unranked (n=51)
Schools with Conditional Scholarship Programs  |   16   |      37       |       35       |       47
Retention Rates of 90% or More                 |    8   |       8       |        6       |        5
Retention Rates of 66%-88%                     |    4   |      22       |       13       |       19
Retention Rates of 65% or Less                 |    4   |       7       |       16       |       23

 Table 4: Number and Percentage of First-Year Students in 2012 by USNews Rankings Categories Having Conditional Scholarships and Having Conditional Scholarships Reduced or Eliminated

Category                                                          | Top 50 (n=50)  | Ranked 51-100 (n=53) | Ranked 104-146 (n=44) | Ranked Alphabetically (n=51)
Number (%) of Law Schools with Conditional Scholarship Programs   | 16 (32%)       | 37 (70%)             | 35 (79.5%)            | 47 (92%)
Total First-Years at These Law Schools                            | 11,862         | 10,937               | 7,611                 | 12,180
Number (%) of First-Years with Conditional Scholarships           | 1,587 (13.4%)  | 3,192 (29.2%)        | 3,247 (42.7%)         | 4,444 (36.5%)
Number (%) of Conditional Scholarship Recipients Whose Scholarships were Reduced or Eliminated | 154 (9.7% of recipients; 1.3% of first-years) | 734 (23% of recipients; 6.7% of first-years) | 1,124 (34.6% of recipients; 14.8% of first-years) | 1,700 (38.3% of recipients; 14% of first-years)
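The last row of Table 4 reports each count against two denominators, conditional-scholarship recipients and all first-years in the tier; the sketch below reproduces both sets of percentages from the counts in the table.

```python
# Reproducing the dual percentages in the last row of Table 4.
tiers = {
    "Top 50":                {"first_years": 11_862, "conditional": 1_587, "reduced": 154},
    "Ranked 51-100":         {"first_years": 10_937, "conditional": 3_192, "reduced": 734},
    "Ranked 104-146":        {"first_years": 7_611,  "conditional": 3_247, "reduced": 1_124},
    "Ranked Alphabetically": {"first_years": 12_180, "conditional": 4_444, "reduced": 1_700},
}
for tier, d in tiers.items():
    pct_of_recipients  = d["reduced"] / d["conditional"]
    pct_of_first_years = d["reduced"] / d["first_years"]
    print(f"{tier}: {pct_of_recipients:.1%} of conditional-scholarship recipients "
          f"({pct_of_first_years:.1%} of all first-years) had awards reduced or eliminated")
```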

Overall, as shown in Table 5, the distribution of retention rates across law schools was as follows for the 2012-13 academic year:  18 law schools had retention rates less than 50%, 20 law schools had retention rates between 50% and 59.99%, 25 law schools had retention rates between 60% and 69.99%, 21 law schools had retention rates between 70% and 79.99%, 25 law schools had retention rates between 80% and 89.99%, and 27 law schools had retention rates of 90% or better. 

 Table 5 – Number of Law Schools with Conditional Scholarship Renewal Rates in Different Deciles

Renewal Category | Number of Schools
90% or More      | 27 (16 of which were ranked in top 100)
80%-89.9%        | 25 (12 of which were ranked in top 100)
70%-79.9%        | 21 (10 of which were ranked in top 100)
60%-69.9%        | 25 (8 of which were ranked in top 100)
50%-59.9%        | 20 (5 of which were ranked in top 100)
Less than 50%    | 18 (2 of which were ranked in top 100)

Notably, of the 52 law schools ranked in the top 100 with conditional scholarship programs, only two (four percent) had retention rates that were less than 50%, while 16 (nearly 31%) had retention rates of 90% or better.  By contrast, of the 82 (of 95) law schools ranked 104 or lower with conditional scholarship programs, 16 (nearly 20%) had retention rates of less than 50%, while only 11 (roughly 13%) had retention rates of 90% or better.

In sum then, with several schools eliminating their conditional scholarship programs as of fall 2013, less than 50% of the law schools ranked in the top 100 (47 of 103 – nearly 46%) still had conditional scholarship programs, and of those, more than 27% (13 of 47) had retention rates for the 2012-13 academic year of 90% or better while less than 22% (10 of 47) had retention rates of 65% or less.

By contrast, as of fall 2013, more than 80% of the schools ranked below 100 (79 of 95 – roughly 83%) still had conditional scholarship programs, and of those, less than 12% (9 of 79) had retention rates for the 2012-13 academic year of 90% or better and nearly half (39 of 79 – roughly 49%) had retention rates of 65% or less.

July 28, 2014 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)