Wednesday, July 22, 2015

What is more important for lawyers: where you go to law school or what you learned? (Part II)

If you're trying to maximize the financial value of an undergraduate degree, it is better to bet on course of study than on college prestige.  Indeed, prestige is largely irrelevant to those who major in engineering, computer science, or math.  In contrast, prestige does matter for arts & humanities grads, though their financial returns are significantly lower than those of their tech counterparts.  

These are some of the takeaways from Part I of this blog post. Part I also presented data showing that law is a mix of both: financial returns have been high (cf. "red" tech majors) and prestige matters (cf. "blue" arts & humanities crowd).  

The goal of Part II is to address whether this pattern of high earnings and prestige sensitivity will change in the future. I think the answer is yes, though most readers would agree that whether law will change is a less interesting and important question than how it will change.  Speed of change is also relevant because, as humans, we want to know whether the change is going to affect us or just the next generation of lawyers.

Shifts in the Legal Market

There are a lot of changes occurring in the legal market, and those changes are altering historical patterns of how legal services are being sold and delivered to clients. In the past, I have thrown around the term structural change, yet not with any clear definition.  To advance the conversation, I need to correct that lack of precision. 

In economics, there is a literature on structural change as applied to national or regional economies (e.g. moving from a developing nation to an industrial nation; or moving from an industrial to a knowledge-based economy).  Investors also focus on structural change within a specific industry because, obviously, large changes can affect investor returns.  When I have used the term structural change on this blog, it has been much closer to investor conceptions.  Investopedia offers a useful definition even if it's somewhat colloquial: 

Definition of 'structural change': An economic condition that occurs when an industry or market changes how it functions or operates. A structural change will shift the parameters of an entity, which can be represented by significant changes in time series data.

Under this definition, the legal industry is certainly undergoing structural change.  The proportion of law graduates getting a job in private practice has been on the decline for 30 years; over the last 35 years, the average age of the licensed lawyer has climbed from 39 to 49 despite record numbers of new law school graduates; the proportion of associates to partners has plummeted since the late 1980s.  See Is the Legal Profession Showing its Age? LWB, October 12, 2014.  Since the early 2000s, long before the great recession, associate-level hiring has been cut in half. See Sea Change in the Legal Market, NALP Bulletin, August 2013.

Likewise, among consumers of legal services, there is a lot of evidence to suggest that lower- and middle-class citizens can't afford a lawyer to solve life's most basic legal problems, leading to a glut of pro se litigants in state courts and many more people who simply go without things like contracts and wills.  This troubling trend line was obscured by a boom in corporate legal practice, though now even rich corporations have become more sensitive to legal costs -- the sheer volume and complexity of legal need is outstripping their budgets.  In response to the lag in lawyer productivity and innovation, a host of investor-backed enterprises are now elbowing their way into the legal industry.  See A Counterpoint to "the most robust legal market that ever existed in this country," LWB, March 17, 2014.  

The impact of all this change -- structural or otherwise -- is now being felt by law schools. Applicants are down to levels not seen since the 1970s, yet we have dozens more law schools than we did then. Many have said that law schools are losing money, though we have no data to quantify the problem.  Based on my knowledge of my own law school and several others I am close to, I am comfortable saying that real changes are afoot that affect how the legal education market "functions or operates."

There is a sense among many lawyers and legal academics that the legal world changed after 2008. Yet none of the "structural" changes I cite above is pegged in any way to the events of that year.  

What did change in 2008, however, was the national conversation about the legal industry, due partly to the news coverage of the mass law firm layoffs, partly to important books by Richard Susskind and later Brian Tamanaha and Steve Harper, and partly to a robust blogosphere.  This change in conversation emboldened corporate legal departments to aggressively use their newfound market power, with "worthless" young associates becoming the biggest casualty. The new conversation in turn exposed some of the risks of attending law school, which depressed law school demand.  But all of this was fallout from deeper shifts in the market that had been building for decades. Let's not blame the messengers.

Dimensions of Change

I am confident that the future of law is going to be a lot different than its past. But I want to make sure I break these changes into more discrete, digestible parts because (a) multiple stakeholders are affected, and (b) the drivers of change are coming from multiple directions.

Dimension 1: basic supply and demand for legal education

To unpack my point regarding multiple dimensions, let's start with legal education. Some of the challenges facing law schools today are entirely within the four corners of our own house.  Yet, legal education also has challenges (and opportunities) that arise from our connection to the broader legal industry.  This can be illustrated by looking at the relationship between the cost of legal education (which law schools control, although we may blame US News or the ABA) and entry level salaries (which are driven largely by the vagaries of a client-driven market).  

The chart below looks at these factors.  My proxy for cost is average student debt (public and private law schools) supplied by the ABA.  My income variables are median entry level salaries from NALP for law firm jobs and all entry level jobs.  2002 is the first year where I have all the requisite data.  But here is my twist:  I plot debt against entry-level salary based on percentage change since 2002.  

[Chart: Percentage change since 2002 in average law school debt (ABA) versus median entry-level salaries (NALP)]

If a business nearly doubles its price during the same period when customer income is flat, demand is going to fall.  Thus, the sluggish entry-level market presents a difficult problem for legal education.  Sure, we can point to the favorable statistics from the AJD or the premium that a JD has historically conferred on lifetime earnings, but law professors are not the people who are signing the loan papers.  The chart above documents a changing risk/reward tradeoff.  To use the frame of Part I, the red dots are sinking into the blue dot territory, or at least that is the way prospective students are likely to view things.

Fortunately, smaller law school classes are going to be a partial corrective to low entry-level salaries.  The biggest law school class on record entered in the fall of 2010 (52,488); in 2014, the entering class had shrunk by over 27% (37,942). When entry-level supply is reduced by 25+%, upward pressure on salaries will build.  Yet, the composition of the legal economy and the nature of legal work is clearly changing.  Further, the rate of absorption of law school graduates into the licensed bar has been slowing for decades.  See Is the Legal Profession Showing its Age? LWB, October 12, 2014. It would be foolhardy to believe that time and fiscal austerity alone are going to solve our business problems. Instead, we need to better understand our role as suppliers to a labor market.
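For what it's worth, the decline is easy to verify from the ABA entering-class headcounts cited above; a minimal check:

```python
# Entering 1L class sizes cited in the text (ABA figures).
class_2010, class_2014 = 52_488, 37_942

# Percentage decline from the 2010 peak to the 2014 entering class.
decline = (class_2010 - class_2014) / class_2010
print(f"{decline:.1%}")  # 27.7%
```

That is, a shrinkage of slightly under 28%, consistent with the "over 27%" and conservative "25+%" figures used above.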

Dimension 2:  The content of legal education

The content of legal education is not necessarily fixed or static.  We could change the content, thus affecting how the market responds.  

To provide a simple example, one of my students is starting work this fall at Kirkland & Ellis.  From a financial perspective, this is a good employment outcome.  He will be moving to Chicago with his girlfriend, who just received her MS in Information Systems from IU's Kelley School of Business.  The MS from Kelley is a very "red" degree.  It can also be completed in one year (30 credit hours).  Well before she graduated, she had competing offers from PwC and Deloitte, both in the $80,000 range.   For many Indiana Law students, an ideal post-grad outcome would be $80K in Chicago at an employer who provides challenging work and high-quality training.  Yet my student's girlfriend got this ideal outcome in 1/3 the time and at likely 1/2 the cost of an Indiana Law grad.  

Perhaps we should consider cross-pollinating these disciplines. A huge portion of the legal profession's economic challenges is attributable to flat lawyer productivity -- customers are struggling to pay for solutions to their legal needs.  Information systems are a huge part of law's productivity puzzle.  Below is a chart I use in many of my presentations on the legal industry.  The chart summarizes the emerging legal ecosystem by plotting the Heinz-Laumann two-hemisphere model against Richard Susskind's bespoke-to-commodity continuum.

[Chart: The emerging legal ecosystem -- Heinz-Laumann two-hemisphere model plotted against Susskind's bespoke-to-commodity continuum]

The key takeaway from this diagram is that the largest area of growth is going to be in the multidisciplinary green zone -- the legally trained working shoulder-to-shoulder with those skilled in information systems, statistics, software development, and computational linguistics, to name but a few.  These are "red" disciplines.  Do law schools want to be part of this movement?  Let me ask this another way -- do law schools want to be relevant to the bulk of the legal market that needs to be rationalized in order to maintain its affordability? Harvard grads will have options on Wall Street for the foreseeable future.  But 98% of law schools operate in a different market.  Further, some HLS grads, or students who might qualify for admission to Harvard, might prefer the big upside rewards that are only available in the green zone.  In short, a new hierarchy is emerging in law that is still very much up for grabs.

If an academic wants to better understand the rapidly changing nature of legal work, I would urge them to visit a large legal department with a substantial legal operations ("legal ops") staff.  These are the professionals who have been empowered by general counsel to find ways to drive up quality and drive down cost using data, process, and technology.  These are the folks who are making build-versus-buy decisions, putting pressure on law firms to innovate in order to hang on to legal work, and experimenting with NewLaw legal vendors. 

I am finishing up a story on legal ops professionals for the ABA Journal.  (By the way, legal ops roles exist in law firms as well as in legal departments and green zone legal vendors. The role is most developed, however, in legal departments.)  My editor flagged the fact that virtually none of the legal ops people in the story graduated from a prestigious law school (or, in some cases, any law school).

My only response is that legal operations people have specialized skills and knowledge (often "red" but sometimes involving EQ) that others lack; without these skills, they can't do the job.  Legal ops people live in a world of outputs and metrics.  For example, are legal expenses and settlement amounts trending down over time -- yes or no? If so, by how much?  How much internal staff time does it take to negotiate a revenue contract? How much of this process can be automated? What will it take to get our staff to accept the new system?

As these examples show, a legal ops person is typically going to be evaluated based on measurable outputs -- do they get results? Where someone went to law school is an input that is likely irrelevant to the question.  The only qualifier is whether the curriculum of that school provided valuable, specialized domain knowledge -- most likely non-legal red skills but also skills related to teams, communication, and collaboration. 

Dimension 3:  The value of pedigree to the customer 

Law has historically been what economists call a “credence good.”  This means that a layperson has a difficult time assessing quality.  As a result, proxies for quality, such as pedigree or prestige, have historically been very important when hiring a lawyer or law firm.  

One of the reasons the field of legal operations is gaining momentum is that it is creating tools and systems that enable clients to look past credentials and obtain information on the things they really care about, such as cost, outcome, and speed of delivery. There are now companies coming into existence that are gathering data on lawyers' win-loss rates. See Another Example of Using Big Data to Improve Odds of Winning in Court, LWB, April 12, 2015.  Sure, apples-to-apples comparisons are very difficult to make -- every case is unique in some respect. But the amount of money at stake is large enough that the data challenges will be surmounted.  When that day arrives, we won't opine on the value of pedigree to legal outcomes; we'll just calculate it. More significantly, clients focused on outcomes will change their buying patterns.  Early returns I have seen suggest that the value of pedigree to legal outcomes may be close to negligible.

Do any of us care where the engineers who designed our smart phones went to college? Not really. We just care how well the smart phone works. 

In this respect, the future of law is likely headed in the direction of Google (a pure red company).  In the early days, the founders of Google favored grads of Caltech, Stanford, and Berkeley.  But over time, the company learned that the prestige of a graduate school was a poor predictor of job success. Because Google lives and dies by its outputs, the company changed its hiring model to attract the most qualified engineers.  See George Anders, The Rare Find: How Great Talent Stands Out 1-5 (2012) (telling the story of how data changed the attitudes of Google's founders regarding elite credentials and altered the Google hiring model).

I have lived long enough to know that the changes I describe above will not be welcomed by many lawyers and law professors.  If a group benefits from a lifelong presumption of merit, it is natural for that group to resist evidence that the presumption is not fully warranted. Indeed, much of the skepticism will be rooted in subconscious emotion.  If the presumption is dashed, those of us in the elite crowd will have to spend our days competing with others and proving ourselves, or even worse, watching our kids soldier through it.  We have little to gain and a lot to lose in the world we are heading into.  Yet, behind the Rawlsian veil of ignorance, how can we complain?

So with the red-blue crosscurrents, is law school still worth the investment?

That is a relevant and reasonable question that many young people are contemplating.  I will offer my opinion, but markets are bound to follow their own logic. 

This is a time of enormous uncertainty for young people. Education clearly opens doors, but tuition is going up much faster than earnings.  Further, competition among knowledge workers is becoming more global, which is a check on wages.  Of course, if you don't invest in education, what are your options?

I am generally on the side of Michael Simkovic and Frank McIntyre that the education provided by a law degree, on average, significantly increases lifetime earnings.  See The Economic Value of a Law Degree (April 2013).  How could it not?  The law is too interconnected with every facet of society not to enhance, on average, a law grad's critical thinking skills. Nearly 15 years out of law school, I regularly use what I learned at Chicago Law to solve problems and communicate solutions, particularly in my applied research work with law firms and legal departments. While my Chicago Law credential has value independent of the skills and knowledge I obtained (the red AJD bar chart in Part I strongly suggests that), I can't deny the additional value of the actual skills and knowledge I obtained for solving real-world business problems. It has been substantial.

In general, I also agree with Deborah Jones Merritt that there is significant evidence that the entry-level market for lawyers is weak and oversaturated.  See What Happened to the Class of 2010? Empirical Evidence of Structural Change in the Legal Profession (April 2015).   The class of 2010 is not faring as well as the class of 2000.  Indeed, the lead economist for Payscale, Katie Bardaro, recently noted that wages are stagnating in many fields, but especially in the legal profession. "More law schools are graduating people than there are jobs for them...There’s an over-saturated labor market right now. That works to drive down the pay rate.” See Susan Adams, The Law Schools Whose Grads Earn the Biggest Paychecks in 2014, Forbes, Mar. 14, 2014. 

In the face of these stiff headwinds, I think law schools have an opportunity to pack more value into three years of education. See Dimension 2 above.  To be more specific, if you are a protege of Dan Katz at Chicago-Kent, you will have a lot of career options.  Ron Staudt, also at Chicago-Kent, has quietly built a pipeline into the law and technology space.  Oliver Goodenough and his colleagues at Vermont Law are making rapid progress with a tech law curriculum.  And at Georgetown Law, Tanina Rostain and Ed Walters (CEO of Fastcase) offer cutting-edge courses.  

But absent these types of future-oriented instruction, what is the value of a JD as it is commonly taught today? That value is clearly positive; I would even call it high.  But whether the value is sufficient to cover the cost of attendance is likely to vary from law grad to law grad.  Lord knows, in a world of variable tuition driven by merit scholarships -- including merit scholarships that go away after the 1L year -- the swing in cost can be $250K plus interest.

What is killing law school applications these days is the lack of near certainty among prospective students that the time and expense of law school will pay off.  The world looks different than it did in the fall of 1997 when the vast majority of the AJD respondents entered law school. Tuition and debt loads are higher and high paying entry-level jobs are harder to obtain.

So what is the solution?  For students, it's to bargain shop for law schools, which is bad news for law schools.  For law schools, it's to add more value to an already valuable degree.  Some of that value will come in the form of red technical skills that will make lawyers more productive.  In turn, this will prime demand for more legal products and services.

July 22, 2015 in Blog posts worth reading, Data on legal education, Data on the profession, Legal Departments, Structural change

Sunday, July 19, 2015

What is more important for lawyers: where you go to law school or what you learned? (Part I)

The Economist reports a very interesting analysis from Payscale.  The questions being asked are pretty simple: If you want to generate earnings that justify the time and cost of an undergraduate education, what should you study and where should you enroll?

Lots of people have strong opinions on this set of questions, but Payscale has the data to answer them empirically. It turns out that at the undergraduate level, course of study is much more important than the prestige of the college or university you attend.  The hard evidence is shown below.

[Chart: Payscale data -- 20-year annualized ROI by college selectivity, with engineering/CS/math majors in red and arts & humanities majors in blue]

For those working in law or thinking about attending law school, a natural question to ask is whether the legal industry is closer to the blue dot pattern (arts & humanities) or the red dot pattern (engineering/CS/math).  A second, related question is whether the future of law is more blue or more red.

This is a two-part blog post.  Part I tries to answer the first question, starting with a careful analysis of the undergraduate chart, which provides a valuable frame of reference that can be discussed more dispassionately (at least among lawyers and law students) than an analysis that questions the value of law school prestige and hierarchy.  

Part II, which I will post on Wednesday, explores the second, future-oriented question.  I will tip my hand now and say that the future of law will be less blue (arts & humanities) and more red (math/CS/engineering).  Within the legal industry, there will be winners and losers; but from the perspective of broader society, this change is a very good thing. 

Undergraduate ROI

In the Payscale chart above, the y-axis (vertical) is 20-year annualized returns from college fees paid.  The x-axis is selectivity, running from under 10 percent to near open admissions.  

The Payscale chart is a very good example of how data visualization can be used to communicate both core facts and useful nuance.  Here, the lede is unmistakable:  the red dots (engineering/CS/math) are overwhelmingly higher on the ROI scale than the blue dots (arts & humanities).  Sure, there are exceptions to this rule, but they don't occur very often. (Observe how rarely a blue dot is above the red fit-line.) This suggests it would be very foolish to get a blue degree and expect a red paycheck unless you have very good information (or skills or talent) that others lack.

The chart conveys another important piece of information -- the red fit-line is flat.  This means that for engineering/CS/math majors, prestige has not been very relevant to their eventual earnings.  I'll add a nuance here that some empirically savvy readers are bound to point out:  it is possible (indeed likely) that fees are higher at more selective schools. So if MIT costs twice as much as a public polytech, and both yield 12% over 20 years, one might wish one had gone to MIT -- the same percentage return on a larger investment produces a larger dollar gain.   Still, the flat trendline is surprising.  As a general matter, lower-ranked schools are not dramatically cheaper than higher-ranked schools, and many public schools are highly selective.  The flat red trendline suggests that there are (or were -- remember, these are historical data) many bargains out there.  If one is trying to maximize financial returns, the goal is to find a school that will, in the future, sit well above the red fit-line (and to avoid those below it).  
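The tuition nuance can be made concrete with a few lines of compound-interest arithmetic. This is only an illustration -- the fee figures below are hypothetical, not taken from the Payscale data:

```python
# With identical annualized ROI, dollar gains scale with the fees paid:
# the same percentage compounds on a bigger base.
annualized, years = 0.12, 20          # 12% annualized over 20 years
fees_polytech, fees_mit = 30_000, 60_000  # hypothetical total fees

def dollar_gain(fees: float) -> float:
    """Total dollar gain over the period at the given annualized return."""
    return fees * ((1 + annualized) ** years - 1)

# The pricier school yields exactly proportionally more in dollars.
print(round(dollar_gain(fees_mit) / dollar_gain(fees_polytech), 1))  # 2.0
```

In other words, equal percentage returns do not mean equal dollar outcomes, which is why the flat red fit-line still leaves room for the fee caveat.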

The flat red fit-line is also surprising because college selectivity is almost certainly highly correlated with ACT or SAT scores, which our society often views as measures of general intelligence. Yet, there we have it -- a flat trendline. Four years of education seem to be more relevant than a standardized test score taken during high school.  That is heartening at many levels.

A third interesting trend -- the blue fit-line slopes downward.  This suggests that in the arts & humanities, selectivity/prestige does have a financial payoff.  I don't think this will surprise many readers, though the prestige payoff is not very large. To use a simple metaphor, if you attend a more selective college or university to get your arts or humanities degree, you are likely to have a better house in the arts & humanities neighborhood.  But on average, you won't be able to afford the same neighborhood as the engineers, computer scientists, and math majors.

What about Law?

Moving on to law: if we want to examine the relationship between earnings and law school attended, the best available evidence is probably the After the JD study (AJD), a large, representative sample of law graduates who took and passed the bar in 2000.  

Data from AJD Wave 3 suggest that the financial returns are relatively strong for all law school graduates -- ten years out, even graduates of Tier 4 schools had median earnings of $100,000 per year. As shown in the chart below, this is akin to shifting the blue dots up into red territory.  

[Chart: AJD Wave 3 median earnings by tier of law school attended]

The downward sloping fit-line remains, but that doesn't seem to matter very much to happiness. Other AJD data shows that regardless of tier of graduating school, AJD respondents show relatively high and uniform satisfaction with (a) the decision to become a lawyer, and (b) the value of the law degree as an investment. By 2010, 48% of respondents had no debt; only 5.1% had more than $100K in educational debt remaining. 

This is all good news.  But is it reasonable to extrapolate forward and assume the past is a fairly accurate barometer of the present and the future? 

One way to address that question is to ascertain what has changed since 2000.  As noted earlier, the AJD sample was composed of law graduates who passed the bar in the year 2000. Figures published by NALP and the ABA show that the percentage of graduates in full-time, bar-passage-required jobs has dropped significantly over the last 13+ years -- from 77.3% for the class of 2000 to 57% for the class of 2013. That is a huge delta.

[Chart: Percentage of graduates in full-time, bar-passage-required jobs, classes of 2000-2013]

One of the reasons law school applications have plummeted is that the career path for JD graduates has become murky.  And that is a good place to start Part II.

July 19, 2015 in Blog posts worth reading, Cross industry comparisons, Data on legal education, Data on the profession, Structural change

Thursday, May 14, 2015

Further Thoughts on the July 2014 Bar Results -- A Response to Erica Moeser

Late last fall, Erica Moeser responded to a letter from Dean Kathryn Rand of the University of North Dakota (on behalf of a large number of law school deans), reiterating that the NCBE had double-checked its scoring of the MBE on the July 2014 bar examination and could find no errors in its calculations.  Erica Moeser also took to the pages of the December 2014 issue of The Bar Examiner to further validate her conclusion that the historic drop in the mean MBE scaled score is attributable solely to the fact that the class that sat for the July 2014 bar exam was “less able” than the class that sat for the July 2013 bar exam.  In January, Dean Stephen Ferruolo of the University of San Diego also wrote to Erica Moeser requesting the release of more information on which to assess the July 2014 bar examination results in comparison with previous years’ results.  In February, Erica Moeser responded to Dean Ferruolo’s request by declining to provide more detailed information and reiterating her belief that the July 2014 scores “represent the first phase of results reflecting the dramatic and continuing downturn in law school applications.”

In an earlier blog posting, I explained why Erica Moeser is partly right (that the Class of 2014 could be understood to be slightly less able than the Class of 2013), but also explained why the decline in “quality” of the Class of 2014 does not explain the historic drop in mean MBE scaled score.  The decline in “quality” between the Class of 2013 and the Class of 2014 was modest, not historic, and would suggest that the decline in the mean MBE scaled score also should have been modest, rather than historic.  Similar declines in “quality” in the 2000s resulted in only modest declines in the MBE, suggesting that more was going on with the July 2014 exam. 

Others have written about these issues as well.  In January, Vikram Amar had a thoughtful reflection on Moeser’s statements and in recent weeks Debby Merritt has written a series of posts -- here, here, and here -- indicating in some detail why she believes, as I do, that the ExamSoft debacle in July could have impacted the MBE scaled scores in jurisdictions that used ExamSoft as well as in other jurisdictions.

I write now to take issue with four statements from Erica Moeser – three from her President’s Page in the December 2014 issue of the Bar Examiner and one from her letter responding to Dean Kathryn Rand.  I remain unpersuaded that the historic decline in the mean MBE scaled score is solely attributable to a decline in quality of the class that sat for the July 2014 bar examination and remain baffled that the NCBE refuses to acknowledge the possibility that issues with test administration may have exacerbated the decline in the performance on the July 2014 MBE.

Item One – Differential Declines in MBE Scores

In her December article, Moeser stated: 

I then looked to two areas for further corroboration. The first was internal to NCBE. Among the things I learned was that whereas the scores of those we know to be retaking the MBE dropped by 1.7 points, the score drop for those we believe to be first-time takers dropped by 2.7 points. (19% of July 2014 test takers were repeaters, and 65% were believed to be first-time takers. The remaining 16% could not be tracked because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.) The decline for retakers was not atypical; however, the decline for first-time takers was without precedent during the previous 10 years. (Emphasis in original.)

Moeser starts by referencing data that is not publicly available to support her case.  This is unfortunate, because it makes the claim hard to understand and critique.  Nevertheless, there are some inferences we can draw from what she does disclose and some questions we can ask.  Moeser asserts that the 19% of MBE "retakers" saw an MBE drop of 1.7 points compared with MBE "retakers" in July 2013, while the 65% believed to be first-time takers saw a drop of 2.7 points compared with first-time takers in July 2013.  It would have been helpful if Erica Moeser had publicly released the declines among MBE retakers and among first-time takers in each of the previous 10 years so that patterns could be assessed, particularly in relation to the changes in class composition for each of those years.  Without that information, it is hard to do much more with Moeser's assertion.  (I find it odd that she would reference this point without providing the underlying data.) 

Nonetheless, this assertion raises other questions.  First, the overall decline in the mean MBE scaled score was 2.8 points. Moeser notes that 19% of takers (MBE retakers) had an average drop of 1.7 points, while 65% of takers (first-time takers) had an average drop of 2.7 points.  Unless there is something I am missing here, that should mean the remaining 16% of test-takers had to have an average decline of 4.51 points!  (This 16% of test-takers represents those whom Moeser notes could not be tracked as first-time takers or MBE retakers "because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.")  (The arithmetic: 2.8 = (0.19 × 1.7) + (0.65 × 2.7) + (0.16 × x), which simplifies to 2.8 = 0.323 + 1.755 + 0.16x; so 0.16x = 0.722, and x ≈ 4.51.)  It would have helped, again, if Moeser had indicated which jurisdictions had these even larger declines in mean MBE scaled scores, as we could then look at the composition of graduates taking the bar in those jurisdictions to see if there was an unusual decline in entering class statistics in 2011 at the law schools from which most bar takers in those states graduated.
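The back-of-the-envelope algebra is easy to check; a minimal sketch using the percentages and point drops Moeser reports:

```python
# Solve for the implied average decline among the untracked 16% of takers,
# given that the overall decline is the weighted sum of the group declines.
overall_drop = 2.8                 # overall mean MBE scaled score decline
retaker_drop, first_time_drop = 1.7, 2.7
w_retaker, w_first, w_untracked = 0.19, 0.65, 0.16

x = (overall_drop - w_retaker * retaker_drop
     - w_first * first_time_drop) / w_untracked
print(round(x, 2))  # 4.51
```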

Item Two – The MPRE

In the December article, Moeser also stated:

I also looked at what the results from the Multistate Professional Responsibility Examination (MPRE), separately administered three times each year, might tell me. The decline in MPRE performance supports what we saw in the July 2014 MBE numbers. In 2012, 66,499 candidates generated a mean score of 97.57 (on a 50–150 scale). In 2013, 62,674 candidates generated a mean score of 95.65. In 2014, a total of 60,546 candidates generated a mean score of 93.57. Because many MPRE test takers are still enrolled in law school when they test, these scores can be seen as presaging MBE performance in 2014 and 2015.

At first blush, this looks like a pretty compelling argument, but Moeser’s selectiveness in looking at the data is troubling, as is her failure to discuss whether the MPRE and MBE are meaningfully comparable test-taking experiences.  Essentially, Moeser is making the following assertion: because the mean MPRE scaled score declined by 1.92 points between 2012 and 2013, we should have expected a large decline in the mean MBE scaled score in July 2014 (and because the mean MPRE scaled score declined another 2.08 points between 2013 and 2014, we should expect another large decline in the mean MBE scaled score in July 2015).

But the “relationship” between changes in the mean MPRE scaled score and changes in the mean MBE scaled score over the last decade does not support this assertion. If one looks at a decade’s worth of data, rather than data for just the last couple of years, the picture looks significantly more complicated and suggests that collective performance on the MPRE may not tell us much at all about likely collective performance on the MBE in the following year.

MPRE Year | Mean MPRE Score | Change | MBE Year | July Mean MBE Scaled Score | Change
2004 | 99.1 |      | 2005 | 141.6 |
2005 | 98.7 | -0.4 | 2006 | 143.3 | +1.7
2006 | 98.0 | -0.7 | 2007 | 143.7 | +0.4
2007 | 98.6 | +0.6 | 2008 | 145.6 | +1.9
2008 | 97.6 | -1.0 | 2009 | 144.5 | -1.1
2009 | 97.4 | -0.2 | 2010 | 143.6 | -0.9
2010 | 96.8 | -0.6 | 2011 | 143.8 | +0.2
2011 | 95.7 | -1.1 | 2012 | 143.4 | -0.4
2012 | 97.6 | +1.9 | 2013 | 144.3 | +0.9
2013 | 95.6 | -2.0 | 2014 | 141.5 | -2.8
2014 | 93.6 | -2.0 | 2015 | ???? | ????

The data Moeser cites from the last two years conveniently makes her point, but it consists of a very small sample.  The data over the last decade looks much more random.  In three of the nine years, the changes are not in the same direction (MPRE 2005, 2006, 2010; MBE 2006, 2007, 2011).  Of the six years in which the changes are in the same direction, there are two in which the MBE change is significantly larger than the MPRE change (MPRE 2007, 2009; MBE 2008, 2010) and two in which the MBE change is significantly smaller than the MPRE change (MPRE 2011, 2012; MBE 2012, 2013).  In only two of the nine years do the changes in the MPRE and MBE roughly approximate each other (MPRE 2008, 2013; MBE 2009, 2014).  Nonetheless, this remains a very small sample, and analysis of data over a longer period might help us better understand how (and whether) changes in mean MPRE scores meaningfully inform changes in mean MBE scores the following year.  At this point, the predictive value seems marginal given the wide range of year-over-year changes.
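The direction-of-change tally above can be reproduced from the table.  A quick sketch (Python), pairing each MPRE-year change with the following July's MBE change:

```python
# Changes from the table: MPRE change for 2005-2013 paired with the
# MBE change for July of the following year (2006-2014).
mpre_change = [-0.4, -0.7, 0.6, -1.0, -0.2, -0.6, -1.1, 1.9, -2.0]
mbe_change  = [ 1.7,  0.4, 1.9, -1.1, -0.9,  0.2, -0.4, 0.9, -2.8]

# A positive product means the two changes moved in the same direction.
same = sum(1 for a, b in zip(mpre_change, mbe_change) if a * b > 0)
opposite = len(mpre_change) - same
print(same, opposite)  # 6 3 -- same direction in six years, opposite in three
```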

Item Three – Mean LSAT Scores

In the December article, Moeser further stated:

Specifically, I looked at what happened to the overall mean LSAT score as reported by the Law School Admission Council for the first-year matriculants between 2010 (the class of 2013) and 2011 (the class of 2014). The reported mean dropped a modest amount for those completing the first year (from 157.7 to 157.4). What is unknown is the extent to which the effect of a change to reporting LSAT scores (from the average of all scores to the highest score earned) has offset what would otherwise have been a greater drop. (LSAC Research Reports indicate that roughly 30% of LSAT takers are repeaters and that this number has increased in recent years.)

This assertion is misguided for purposes of this comparison, a point Vikram Amar made in his post.  If we were comparing the first-year matriculants in 2009 with the first-year matriculants in 2010, the question of the change in reporting from average LSAT score to highest LSAT score would have mattered.  But the 2010 matriculants were the first class for which the mean was reported based on highest LSAT score and the 2011 matriculants were the second class for which the mean was reported based on highest LSAT score.  Thus, there is no “unknown” here.  The reported mean LSAT dropped only a modest amount between the matriculants in 2010 and the matriculants in 2011.  Nonetheless, the mean MBE scaled score in July 2014 decreased by an historic 2.8 points from the mean MBE scaled score in July 2013. 

Item Four – Administration Issues

In her letter to Dean Kathryn Rand, Moeser stated:  "To the extent that the statement you attached referenced both administration and scoring of the July 2014, bar examination, note that NCBE does not administer the exam; jurisdictions do."

This response not only notes that the NCBE is not responsible for administering the bar examination in the various jurisdictions, but also implicitly suggests that administration issues could not have contributed to the historic decline in the mean MBE scaled score.

Were there issues with administration?  Yes.   Could they have contributed to the historic decline in the mean MBE scaled score?  Yes.

Debby Merritt’s recent posts discuss the administration issues and their potential consequences in some detail.  In the more than forty states that used ExamSoft to administer the bar examination, the MBE came on Wednesday, after the essay portion of the exam on Tuesday.  Because of an ExamSoft technical problem, however, tens of thousands of test-takers -- who had been told by their respective state boards of bar examiners that they would FAIL THE EXAM if their essay answers were not uploaded in a timely manner -- spent most of Tuesday night dealing with the profound stress of being unable to upload their exam answers and unable to reach anyone at the board of bar examiners (who were not answering phones) or at ExamSoft (due to the flood of calls and emails from anxious, frustrated test-takers) to figure out what was going on and what they should do.

Given that this “administration” issue caused untold stress and anxiety for thousands of test-takers, who spent Tuesday night trying repeatedly and unsuccessfully to upload their essay answers, should it be a surprise that they might have underperformed somewhat on the MBE on Wednesday?  (If you want a sense of the stress and anxiety, check the Twitter feed for the evening of Tuesday, July 29, 2014.)

The responses of the boards of bar examiners to this administration issue were far from uniform.  Different jurisdictions granted extensions at different times on Tuesday night, July 29, or on Wednesday, July 30, with some granting short extensions and some granting longer ones.  In states that gave notice of an extension earlier on Tuesday, July 29, test-takers may have experienced less stress and anxiety; in states that did not give notice until later (or granted only a short extension), or that may not have communicated about extensions at all, test-takers likely experienced more.  (It would be worth studying exactly when each jurisdiction gave notice of an extension and whether there is any correlation between the timing of that notice and the relative performance of bar takers in those states.)

The NCBE’s unwillingness to acknowledge any issues with administration of the bar examination is all the more surprising at a time when the NCBE is pushing for adoption of the Uniform Bar Examination.  On its webpage, the NCBE states: “[The UBE] is uniformly administered, graded, and scored by user jurisdictions and results in a portable score that can be transferred to other UBE jurisdictions.” (Emphasis added.)  This simply was not true in July 2014.  The Uniform Bar Examination was administered under different exam conditions across jurisdictions.  First, three of the states administering the Uniform Bar Examination in July 2014 -- Arizona, Nebraska, and Wyoming -- did not use ExamSoft, and therefore bar takers in those states had a vastly different “exam administration” experience than bar takers in ExamSoft jurisdictions.  Across ExamSoft jurisdictions, different approaches to extensions also meant different administration experiences.  Given the significance of consistent administration for equating performance on a standardized exam like the bar exam, it strikes me as deeply problematic that the NCBE allows such varied approaches to administering a supposedly “uniform” exam.

Many questions remain unanswered, largely because adequate information has not been made available on which to assess the various factors that might have contributed to the historic decline in the mean MBE scaled score.  With the release of February bar results and the NCBE’s publication of the 2014 statistical report, some additional information is now available to put the results of July 2014 in context.  In my next blog posting regarding the July 2014 bar results, I will delve into some of those statistics to see what they tell us.

(Edited as of May 20 to correct the 2013 MPRE and 2014 MBE change and corresponding discussion.)

May 14, 2015 in Current events, Data on legal education, Data on the profession | Permalink | Comments (0)

Thursday, May 7, 2015

Revisiting Conditional Scholarships

Having been one of the people who brought attention to the issue of conditional scholarships a few years ago, I feel compelled to offer a few insights on a rekindled conversation about conditional scholarships involving Jeremy Telman, Michael Simkovic, and Debby Merritt.

I am not sure what prompted Prof. Telman to write about conditional scholarships, but the first sentence of his initial post seems to be a few years late: 

One of the ways in which law schools are allegedly inadequately transparent is in the award of merit scholarships conditional on the students’ achievement of a certain grade point average. (Emphasis added)

A few years ago, one accurately could have said that law schools were inadequately transparent regarding the awarding and retention of conditional scholarships.  I did say that in an article Prof. Telman describes as “interesting.” 

Today, this is no longer accurate, because we have much greater transparency regarding conditional scholarships given the disclosures mandated pursuant to Standard 509.

Thus, I am not sure anyone is alleging that law schools are inadequately transparent regarding conditional scholarships, and I am not sure why this is once again an item for discussion.  The issue has been well settled, and law schools and prospective law students have adjusted to a new reality.  Indeed, in his follow-up posting, Prof. Telman essentially acknowledges this point:

It seems we are all agreed that the disclosure problems related to conditional scholarships have largely been addressed through the ABA website that enables students to comparison shop among scholarship offers from various schools and know their chances of retaining their conditional scholarships.

That said, given that Prof. Telman got the conversation started, I have a response to one of his assertions and some observations to share.

The general context of his posting (and of Prof. Simkovic’s related posts) is that college students have lived with conditional scholarships without apparent problems, so conditional scholarships shouldn’t present a concern for law students.  In making his case, Prof. Telman relies on my 2011 article to support a proposition that the article actually disproves in some detail.  Specifically, Prof. Telman states:

Professor Organ was able to find information about how scholarships work at 160 law schools.  That means that the information was out there.  Since Professor Organ was able to gather information about 160 law schools, it should not be difficult for students to gather relevant information about the one law school that they are considering attending. 

He further states:  “Why are law students assumed to be incapable of looking into standard grade normalizations curves for the first year?”  Prof. Telman seems to be suggesting that there actually weren’t any disclosure problems because “the information was out there.”  The information was not out there. 

To be more precise, in putting together the article, with the efforts of research assistants as well as my own sleuthing, I was able to find enough information in the NAPLA-SAPLA Book of Lists, the ABA-LSAC Guide, and law school web pages to classify 160 law schools according to whether they had a competitive scholarship program or some other type of scholarship program.  Had Prof. Telman looked carefully at the article, however, he would have noted that “only four of these 160 schools had any information posted on their webpages indicating renewal rates on scholarships.”  (A point Derek Tokarz makes in the comments to Prof. Telman’s post.)

Prospective law students not only need relevant information about one law school, they need relevant and comparable information about the set of three or five or seven law schools they are considering seriously.  Prior to the Standard 509 mandated disclosure of conditional scholarship information, it was profoundly difficult if not impossible for students to gather relevant information from a few or several law schools.  The information simply was not “out there.”

Indeed, two of the primary points of my article were to highlight the information asymmetry between law schools and prospective law students relating to competitive scholarships and to recommend greater disclosure of the number of students receiving competitive scholarships and the number who had them renewed (or had them reduced or eliminated).   

Prof. Merritt discusses in some depth this information asymmetry, noting particularly that college students who have been successful in retaining their conditional scholarships as undergrads do not appreciate the reality of the mandatory curve they will encounter in law school, a point Stephen Lubet also makes cogently in a comment to Prof. Telman’s post. (Indeed, to his credit, Prof. Telman acknowledges that prospective law students also may suffer from optimism bias in assessing their likelihood of retaining their scholarship.)

Regarding the need for greater disclosure, regardless of how savvy and sophisticated we would like to believe prospective law students might have been or might be, the nuances of conditional scholarships and mandatory curves were not clearly understood in the era prior to the mandatory Standard 509 disclosure.  I noted in my article that many students posting on Law School Numbers valued their scholarships based on a three-year total, regardless of whether they were conditional scholarships, suggesting these students failed to appreciate that the “value” should be discounted by the risk of non-renewal.  I also spoke with pre-law advisors around the country regarding conditional scholarships and consistently was told that this information was very helpful because pre-law students (and sometimes pre-law advisors) had not appreciated the realities of conditional scholarships.

While there are other things mentioned by Prof. Telman, Prof. Simkovic and Prof. Merritt to which I could respond, this post is already long enough and I am not interested in a prolonged exchange, particularly given that many of the points to which I would respond would require a much more detailed discussion and more nuance than blog postings sometimes facilitate.  My 2011 article describes my views on competitive scholarship programs and their impact on law school culture well enough.  Accordingly, let me end with one additional set of observations about what has happened with conditional scholarships in an era of increased transparency.

In my follow up article available on SSRN, I analyzed the frequency of conditional scholarships generally and the extent to which conditional scholarships were utilized by law schools in different rankings tiers for the 2011-2012 academic year (the first year following the ABA's mandated disclosure of conditional scholarship retention rates). 

For the entering class in the fall of 2011, I noted that there were 140 law schools with conditional scholarship programs, and 54 law schools with scholarship renewal based only on good academic standing, one-year scholarships, or only need-based scholarship assistance.  I also noted that conditional scholarship programs were much less common among top-50 law schools than among bottom-100 law schools. 

Based on the data reported in fall of 2014 compiled by the ABA for the entering class in the fall of 2013 (the 2013-2014 academic year), the percentage of all entering first-year students with conditional scholarships has increased slightly (from 26.1% in fall 2011 to 29% in fall 2013), while the percentage of all entering first-year students who had their scholarships reduced or eliminated has decreased slightly (from 9% as of summer of 2012 to 8.4% as of summer of 2014).  The average renewal rate across law schools increased from 68.5% to 73%.

More significantly, however, the number of law schools with conditional scholarship programs has declined, while the number with other types of scholarship programs has increased fairly significantly.  By 2013-2014, there were 78 law schools with scholarships renewed based on good academic standing, with one-year scholarships, or with only need-based scholarship assistance -- 24 more law schools than two years earlier, roughly a 44% increase -- significant growth in a short period in the number of law schools moving away from conditional scholarship programs.  This would seem to indicate that at least some law schools have decided conditional scholarships aren’t as good for law schools or for law students.

May 7, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (1)

Monday, April 13, 2015

PROJECTIONS FOR LAW SCHOOL ENROLLMENT FOR FALL 2015

          This blog posting is designed to do three things.  First, following up on recent discussions regarding trends in applicants by Al Brophy at The Faculty Lounge and Derek Muller at Excess of Democracy, I provide a detailed analysis to project the likely total applicant pool we can expect at the end of the cycle based on trends from March through the end of the cycle in 2013 and 2014.  Second, using the likely total pool of applicants, I estimate the number of admitted students and matriculants, but also question whether the estimates might be too high given the decline in quality of the applicant pool in this cycle.  Third, building on the second point, I suggest that law schools in the lower half of the top tier are likely to see unusual enrollment/profile pressure that may then have a ripple effect down through the rankings.

1. ESTIMATES OF THE TOTAL NUMBER OF APPLICANTS

Reviewing the 2013 and 2014 Cycles to Inform the 2015 Cycle

2013 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 25, 2013 | 30,098 | 56% | 53,750
Mar. 8, 2013 | 46,587 | 84% | 55,460
May 17, 2013 | 55,764 | 95% | 58,700
End of Cycle |  |  | 59,400

2014 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 31, 2014 | 29,638 | 58% | 51,110
Mar. 7, 2014 | 42,068 | 79% | 53,250
April 25, 2014 | 48,698 | 89% | 54,720
End of Cycle |  |  | 55,700

2015 Current Volume Summary Date | Applicants | % of Cycle | Projected Total Applicant Pool
Jan. 30, 2015 | 26,702 | 54% | 49,450
Mar. 6, 2015 | 39,646 | 76% | 52,160
April 3, 2015 | 45,978 | 87% | 52,848
End of Cycle |  |  | 54,000 (Estimate)

        In each of the last two years, a modest surge in late applicants meant the final count exceeded the March/April projections by a couple of thousand.  That suggests the current projection (just under 53,000) likely understates the end-of-cycle applicant pool, which I now estimate conservatively at 54,000 (down about 3% from 2014).  (In 2014, the final pool exceeded the early March projection by nearly 2,500.  With an estimated pool of 54,000 applicants, I am estimating that the final 2015 pool will exceed the early March projection by roughly 2,000.)  (That said, if the employment results for 2014 graduates, which will be released shortly, show modest improvement over 2013, even more people might come off the fence and apply late for the fall 2015 class.)
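The projections in the tables above follow from a simple proportion: divide the applicant count to date by the estimated fraction of the cycle completed.  A minimal sketch (Python; the function name is mine):

```python
def project_total_pool(applicants_to_date, fraction_of_cycle):
    """Project the end-of-cycle applicant pool from a mid-cycle count."""
    return round(applicants_to_date / fraction_of_cycle)

# April 3, 2015: 45,978 applicants at an estimated 87% of the cycle
print(project_total_pool(45978, 0.87))  # 52848, as in the 2015 table
```

The 54,000 end-of-cycle estimate then adds a late-surge adjustment on top of this projection, reflecting the pattern from 2013 and 2014.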

2. ESTIMATES FOR ADMITTED APPLICANTS AND MATRICULANTS  

        The chart below shows the number of applicants, admitted students and matriculants over the last three years along with an estimate for fall 2015 based on the assumption above that we have a total of 54,000 applicants this cycle.  With 1,700 fewer applicants, I am assuming 1,000 fewer admitted students (a slight increase in the percentage admitted from 2014), and then assuming the number of matriculants will reflect the three-year average for the percentage of admitted students who matriculate – 87%.  This would yield a first-year entering class of 36,975, down about 2.5% from 2014.   

Estimates of Admitted Students and Matriculants for 2015 Based on Trends in 2012-2014

Year | Applicants | Admitted Students | Percent of Applicants | Matriculants | Percent of Admitted
2012 | 67,900 | 50,600 | 74.5% | 44,481 | 87.9%
2013 | 59,400 | 45,700 | 76.9% | 39,675 | 86.8%
2014 | 55,700 | 43,500 | 78.1% | 37,924 | 87.2%
2015 (est.) | 54,000 | 42,500 | 78.7% | 36,975 | 87%

Why These Estimates for Admitted Students and Matriculants Might be Too High

        a.      Significant Decline in Applicants with LSATs of 165+

        Because of changes in the nature of the applicant pool in 2015, however, the estimates of the number of admitted students and number of matriculants in the chart above may be too high.  In 2014, almost all of the decrease in applicants came among those with LSATs of <165.  The pool of applicants with LSATs of 165+ in 2014 was only slightly smaller than in 2013 (7,477 compared with 7,496). Indeed, as a percentage of the applicant pool, those with LSATs of 165+ increased from 12.6% in 2013 to 13.4% in 2014.  This resulted in a slight increase in the number of matriculants with LSATs of 165+ in 2014 compared to 2013 (6,189 compared with 6,154).

        In the current cycle, however, the number of applicants with LSATs of 165+ was only 6,320 as of March 6, 2015.  In 2013, there were 7,228 on March 8 (of a final total of 7,496); in 2014, there were 7,150 on March 7 (of a final total of 7,477).  Thus, the average increase in applicants with LSATs of 165+ between early March and the end of the cycle is only about 4%.  That suggests we can anticipate roughly 6,585 applicants with LSATs of 165+ at the end of the cycle -- down nearly 900, or about 12%, from 2014.

Estimate of Number of Total Applicants for 2015 with LSATs of 165+ Based on Trends in 2013 and 2014

Early March | Applicants at 165+ | End of Cycle | Applicants at 165+ | # Increase to End of Cycle | % Increase to End of Cycle
March 8, 2013 | 7,228 | 2013 | 7,496 | 268 | 3.7%
March 7, 2014 | 7,150 | 2014 | 7,477 | 327 | 4.6%
March 6, 2015 | 6,320 | 2015 (est.) | 6,585 | 265 | 4.2%

        On a longer term basis, if the estimates in the preceding paragraphs are accurate, the entering class in fall of 2015 will again extend the slide in the number and percentage of first-year students with LSATs of 165+ that has been underway since the class that entered in fall of 2010.

Five-Year Trend in Applicants and Matriculants with LSATs of 165+ and Estimates for 2015

Year | Applicants with LSATs of 165+ | Matriculants with LSATs of 165+ | Percent of Applicants Matriculating
2010 | 12,177 | 9,477 | 77.8%
2011 | 11,190 | 8,952 | 80%
2012 | 9,196 | 7,571 | 82.3%
2013 | 7,496 | 6,154 | 82.1%
2014 | 7,477 | 6,189 | 82.8%
2015 (est.) | 6,585 | 5,420 | 82.4%

        Given that on average over the last three years roughly 82.4% of applicants with LSATs of 165+ actually matriculated, one could expect the estimated 6,585 applicants to translate into roughly 5,420 matriculants with LSATs of 165+ for fall 2015, a decline of nearly 770 from 2014.  Notably, this would represent a 45.9% drop in applicants with LSATs of 165+ since 2010 and a 42.8% drop in matriculants with LSATs of 165+ since 2010.
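Chaining the two estimates gives the matriculant figure to within rounding.  A sketch assuming the roughly 4.2% late-cycle growth and 82.4% matriculation rates discussed above:

```python
march_count = 6320           # applicants with LSATs of 165+ as of March 6, 2015
late_cycle_growth = 0.042    # estimated growth from early March to end of cycle
matriculation_rate = 0.824   # three-year avg share of 165+ applicants matriculating

end_of_cycle = round(march_count * (1 + late_cycle_growth))  # 6,585
matriculants = round(end_of_cycle * matriculation_rate)      # 5,426; the post uses ~5,420
print(end_of_cycle, matriculants)
```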

        b. Modest Decrease Among Applicants with LSATs <150

        On the other end of the LSAT distribution, it is a completely different story. Although the number of applicants with LSATs <150 also has declined, the decline has been more modest than among those with LSATs of 165+.  Moreover, those with LSATs of <150 are much more likely to apply late in the cycle.  In the last two years there has been significant growth among applicants with LSATs of <150 between early March and the end of the cycle.   As a result, I would estimate that we would have 18,350 applicants with LSATs of <150 by the end of this cycle, a decline of only about 4.5%.

Estimate of Number of Total Applicants for 2015 with LSATs of <150 Based on Trends in 2013 and 2014

Early March | Applicants with LSATs of <150 | End of Cycle | Applicants with LSATs of <150 | # Increase | % Increase
March 8, 2013 | 13,364 | 2013 | 20,706 | 6,642 | 49.7%
March 7, 2014 | 11,662 | 2014 | 19,239 | 7,577 | 65%
March 6, 2015 | 11,467 | 2015 (est.) | 18,350 | 6,880 | 60%

        With applicants with LSATs <150 making up a larger percentage of the declining applicant pool, the number of matriculants with LSATs of <150 actually had grown each year up until 2014, when the slight increase in matriculants with LSATs of 165+ was mirrored by a slight decrease in matriculants with LSATs <150. 

Five-Year Trend in Applicants and Matriculants with LSATs of <150 and Estimates for 2015

Year | Applicants with LSATs of <150 | Matriculants with LSATs of <150 | Percent of Applicants Matriculating
2010 | 26,548 | 7,013 | 26.4%
2011 | 24,192 | 7,101 | 29.4%
2012 | 22,089 | 7,906 | 35.8%
2013 | 20,706 | 8,482 | 41%
2014 | 19,239 | 8,361 | 43.5%
2015 (est.) | 18,350 | 8,700 | 47.4%

        Given that the percentage of applicants with LSATs <150 who matriculate has increased in each of the last five years, it seems reasonable to expect another increase -- to 47.4% -- resulting in roughly 8,700 matriculants with LSATs of <150, particularly given the decrease in the number of applicants with LSATs of 165+.  Even so, that seems unlikely to make up for the drop of nearly 770 matriculants among those with LSATs of 165+.  Notably, while the pool of applicants with LSATs <150 has decreased by 30.9% since 2010, the number of matriculants has increased by 24.2%.

        Thus, while the smaller decline in applicants that is expected this year might suggest a correspondingly smaller decline in matriculants, with the weaker profile of the applicant pool in 2015 compared to 2014, it is quite possible that the total number of admitted students will be lower than the chart above suggests and that the corresponding number of matriculants also will be lower than the chart above suggests.

        Phrased differently, if there really is going to be a decline of roughly 770 matriculants just in the group with LSATs of 165+, then the total decline in matriculants may well be greater than the 950 estimated in the chart above.  Between 2013 and 2014, a decline in applicants of 3,700, almost all with LSATs of 164 and below, resulted in a decline in matriculants of 1,750, all with LSATs of 164 and below.  If the decline in applicants is 1,700 this cycle, with over half the decline among those with LSATs of 165+, with a decline of perhaps several hundred with LSATs between 150-164, and with a modest decrease (or possibly a slight increase) among those with LSATs <150, we may well see that the decline in admitted students and in matriculants is slightly larger than estimated in the chart above.

3. PROFILE CHALLENGES AMONG ELITE SCHOOLS

        One interesting side note is that the significant decrease in the number of applicants with LSATs of 165+ is likely to put significant pressure on a number of top-50 law schools as they try to hold their enrollment and their LSAT profiles.  Simply put, there are not enough applicants with LSATs of 165+ to allow all the law schools in the top-50 or so to maintain their profiles and their enrollment. 

        If the estimates above are correct -- that there will be roughly 5,420 matriculants with LSATs of 165+ -- and if we assume that at least a few hundred of these matriculants will attend law schools ranked 50 or below due to geography or scholarships or both, and that the top 15 law schools will leverage rankings prestige (and perhaps scholarships) to hold enrollment and profile, then the decrease of roughly 770 matriculants with LSATs of 165+ is going to be felt mostly among the law schools ranked 16-50 or so.

        In 2014, the top 15 law schools probably had roughly 3,800 first-year matriculants with LSATs of 165+.  The schools ranked 16-50 likely had another 1,900 or so.  The remaining 500-plus matriculants with LSATs of 165 and above likely were scattered among law schools lower in the rankings.  Let’s assume the top-15 law schools manage to keep roughly 3,700 of the 3,800 they had in 2014, and that law schools ranked 50 and below keep roughly 500.  That means the law schools ranked between 16 and 50 would have to get by with roughly 1,220 matriculants with LSATs of 165+ rather than the 1,900 they had last year.  Many schools will face the challenge of maintaining enrollment (and revenue) while trying to hold profile, but this likely will be a particularly difficult year for law schools ranked between 16 and 50.  To the extent that those schools look toward applicants with lower LSAT profiles to maintain enrollment, that will have a ripple effect on law schools lower in the rankings.

April 13, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Tuesday, January 6, 2015

The Variable Affordability of Law School – How Geography and LSAT Profile Impact Tuition Costs

I have posted to SSRN the PowerPoint slides I presented yesterday at the AALS Conference session sponsored by the Section on Law School Administration and Finance.  The presentation was entitled The Variable Affordability of Law School – How Geography and LSAT Impact Tuition Cost.   (I am very grateful to my research assistant, Kate Jirik, and her husband, Sam, for awesome work on the spreadsheet that supported the data I presented.)

The presentation begins with two slides summarizing data presented in my article Reflections on the Decreasing Affordability of Legal Education showing the extent to which average public school and private school tuition increased between 1985 and 2011 relative to law school graduate income.  While many have observed that law school has become increasingly expensive over the last few decades, this "macro" discussion fails to highlight the extent to which differences in tuition exist at a “micro” level either based on geography or on LSAT score.

Using 2012 tuition data, the first set of slides focuses on geographic differences – noting some states where legal education generally is very expensive, some states where legal education generally is very affordable and the balance of states in which tuition costs are in the middle or have a mix of affordable and expensive. 

Following those slides, a set of slides describes the process I used to calculate net tuition costs, after accounting for scholarships, for all entering first-year students at the 195 fully accredited and ranked law schools in fall 2012.  The goal was to allocate all students into a five-by-five grid with five LSAT categories (165+, 160-164, 155-159, 150-154 and <150) and five cost categories ($0-$10,000, $10,000-$20,000, $20,000-$30,000, $30,000-$40,000, and $40,000+).  A further set of slides summarizes this data and tries to explain what we can learn from how students are allocated across the grid, including slides showing the average rank of the schools at which students in each LSAT/cost cell are enrolled.

The concluding slide sets forth a couple of short observations about the data. There was a robust discussion with some great questions following the presentation of this data.

Here are four of the slides to give you a flavor for the presentation on net cost generally and then net cost relative to LSAT categories:

[Four slide images omitted.]

January 6, 2015 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Monday, December 29, 2014

The Composition of Graduating Classes of Law Students -- 2013-2016 -- Part One

PART ONE -- Analyzing the LSAT Profile/Composition of Entering First-Years from 2010 to 2013 and 2014

In the fall of 2013, I had a series of blog postings about the changing demographics of law students.  In the first, I noted that fewer students were coming to law school from elite colleges and universities.  In the second, I noted that between 2010 and 2013 there had been a decline in the number of matriculants with high LSATs and an increase in the number of matriculants with low LSATs such that the “composition” of the class that entered law school in the fall of 2013 was demonstrably less robust (in terms of LSAT profile) than the “composition” of the class that entered law school in the fall of 2010.  In describing this phenomenon, I noted that when the entering class in fall 2013 graduates in 2016, it might encounter greater problems with bar passage than previous classes. 

In light of the significant decline in the median MBE scaled score in July, which Derek Muller has discussed here and here, and which I have discussed here, and a significant decline in first-time bar passage rates in many jurisdictions this year, it seems like an appropriate time to look more closely at changing class profiles and the likely impact on bar passage in the next few years.

This is the first of two blog posts regarding the changing composition of entering classes and the changing composition of graduating classes.  In Part I, I analyze the distribution of LSAT scores across categories based on the LSAC’s National Decision Profiles for the years 2009-2010 through 2012-2013, and then analyze the distribution of law school median LSATs and the 25th percentile LSATs across ranges of LSAT scores.  In Part II, I will analyze how attrition trends have changed since 2010 to assess what that might tell us about the composition of graduating classes three years after entering law school as a way of thinking about the likely impact on bar passage over time.

Tracking Changes Based on National Decision Profiles – 2010-2013

The following discussion summarizes data in the LSAC’s National Decision Profiles from the 2009-10 admission cycle (fall 2010) through the 2012-13 admission cycle (fall 2013).  The National Decision Profile for the 2013-14 admission cycle (fall 2014) has not yet been released.

Let’s start with the big picture.  If you take the matriculants each year and break them into three LSAT categories – 160+, 150-159, and <150 – the following chart and graph show the changes in percentages of matriculants in each of these categories over the last four years. 

Percentage of Matriculants in LSAT Categories – 2010-2013

            2010    2011    2012    2013
160+        40.8    39.0    36.3    33.4
150-159     45.0    45.3    44.3    44.1
<150        14.2    15.7    19.3    22.5

[Graph omitted.]
Notably, this chart and graph show almost no change in the “middle” category (150-159 -- purple) with most of the change at the top (160+ -- orange -- decreasing from 40.8% to 33.4%) and bottom (<150 -- blue -- increasing from 14.2% to 22.5%).  This chart and graph also show only a modest change between 2010 and 2011 with more significant changes in 2012 and again in 2013 – when the percentage of students with LSATs of 160+ declines more substantially and the percentage of students with LSATs of <150 grows more substantially.

While I think this tells the story pretty clearly, for those interested in more detail, the following charts provide a more granular analysis.

Changes in LSAT Distributions of Matriculants – 2010-2013       

            2010     2011     2012     2013     Chg in Number    % Chg in Number
170+        3635     3330     2788     2072         -1563              -43%
165-169     5842     5622     4783     4082         -1760              -30%
160-164     10666    8678     7281     6442         -4224              -39.6%
155-159     11570    10657    9700     8459         -3111              -26.9%
150-154     10626    9885     8444     8163         -2463              -23.2%
145-149     5131     5196     5334     5541           410                8%
<145        1869     1888     2564     2930          1061               56.8%
Total       49339    45256    40894    37689

Note that in terms of percentage change in the number of matriculants in each LSAT category, the five highest LSAT categories are all down at least 20%, with 160-164 down nearly 40% and 170+ down over 40%, while the two lowest LSAT categories are up, with <145 being up over 50%.
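For readers who want to check the arithmetic, here is a minimal Python sketch that recomputes the change columns from the matriculant counts in the chart above:

```python
# Matriculant counts by LSAT band, taken from the chart above
# (LSAC National Decision Profiles, fall 2010 and fall 2013 entering classes).
counts_2010 = {"170+": 3635, "165-169": 5842, "160-164": 10666,
               "155-159": 11570, "150-154": 10626, "145-149": 5131, "<145": 1869}
counts_2013 = {"170+": 2072, "165-169": 4082, "160-164": 6442,
               "155-159": 8459, "150-154": 8163, "145-149": 5541, "<145": 2930}

for band, base in counts_2010.items():
    chg = counts_2013[band] - base
    pct = 100 * chg / base  # percentage change in the number of matriculants
    print(f"{band:>8}: {chg:+6d} ({pct:+.1f}%)")
```

Running this reproduces, for example, the 43% decline at 170+ and the 56.8% increase at <145.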

 

[Line graph omitted.]

Note that in the line graph above, the top two categories have been combined into 165+ while the bottom two categories have been combined into <150.  Perhaps most significantly, in 2010, the <150 group, with 7,000 students, was over 2,400 students smaller than the next smallest category (165+ with 9,477) and more than 4,500 students smaller than the largest category (155-159 with 11,570).  By 2013, however, the <150 category had become the largest category, with 8,471, just surpassing the 155-159 category, with 8,459, and was 2,300 students larger than the smallest category, 165+, with only 6,154.

Changes in Percentage of Matriculants in LSAT Ranges – 2010-2013

                        PERCENTAGE OF MATRICULANTS

            2010     2011     2012     2013     % Chg in %
170+        0.074    0.074    0.068    0.055      -25.7%
165-169     0.118    0.124    0.117    0.108       -8.5%
160-164     0.216    0.192    0.178    0.171      -20.8%
155-159     0.235    0.235    0.237    0.224       -4.7%
150-154     0.215    0.218    0.206    0.217        0.9%
145-149     0.104    0.115    0.130    0.147       41.3%
<145        0.038    0.042    0.063    0.078      105.3%

In terms of the “composition” of the class – the percentage of matriculants in each LSAT category – little has changed in the “middle” (155-159 and 150-154), as noted above, but significant changes have occurred at the top and bottom, with declines of 20% or more at 160-164 and 170+ and with increases of over 40% at 145-149 and over 100% at <145.

Tracking Changes in Law School Median LSATs by LSAT Category

A different way of looking at this involves LSAT profiles among law schools over this period.  Based on the data law schools reported in their Standard 509 Reports, from 2010 to 2014, the chart below lists the numbers of law schools reporting median LSATs within certain LSAT ranges.  (This chart excludes law schools in Puerto Rico and provisionally-approved law schools.)

Number of Law Schools with LSAT Medians in LSAT Categories – 2010-2014

            2010    2011    2012    2013    2014
165+          30      31      26      23      21
160-164       47      41      39      31      29
155-159       59      57      56      53      51
150-154       50      52      53      56      59
145-149        9      14      22      28      29
<145           0       1       0       5       7

 

[Graph omitted.]

The chart above pretty clearly demonstrates the changes that have taken place since 2010, with declines in the number of law schools with median LSATs in higher LSAT categories and increases in the number of law schools with median LSATs in the lower LSAT categories.  The number of law schools with median LSATs of 160 or higher has declined from 77 to 50.  By contrast, the number of law schools with median LSATs of <150 has quadrupled, from 9 to 36.   Moreover, the “mode” in 2010 was in the 155-159 category, with nearly 60 law schools, but as of 2014, the “mode” had shifted to the 150-154 category with nearly 60 law schools.

Number of Law Schools with 25th Percentile LSAT in LSAT Categories – 2010-2014

            2010    2011    2012    2013    2014
165+          17      16      11      10      10
160-164       26      20      21      17      15
155-159       55      54      49      42      41
150-154       67      69      59      65      57
145-149       26      33      46      48      48
<145           4       4      10      14      25

 

[Graph omitted.]

For those who want to focus on the bottom 25th percentile of LSAT profile among law schools, the chart above shows a similar trend when compared with the medians, except that the number of law schools with a 25th percentile LSAT between 150-154 also declined (as opposed to an increase with respect to medians).  The number of law schools with 25th percentile LSATs of 160 or higher has declined from 43 to 25.  Similarly, the number of law schools with 25th percentile LSATs of 150-159 has declined from 122 to 98.  By contrast, the number of law schools with 25th percentile LSATs of 145-149 has nearly doubled, from 26 to 48, while the number of law schools with 25th percentile LSATs of <145 has increased more than sixfold, from 4 to 25. 

One other way of looking at this is just to see how the average first-year LSAT profiles have changed over the last four years. 

Average LSATs of Matriculants at Fully-Accredited ABA Law Schools

            75th Percentile    Median    25th Percentile
2010             160.5          158.1         155.2
2011             160.1          157.8         154.5
2012             159.6          157.0         153.6
2013             158.7          156.0         152.6
2014             158.2          155.4         151.8

This shows that between 2010 and 2014, the average 75th percentile LSAT has declined by 2.3 points, the average median LSAT has declined by 2.7 points and that the average 25th percentile LSAT has declined by 3.4 points.

Conclusion

If one focuses on the LSAT score as one measure of “quality” of the entering class of law students each year, then the period from 2010-2014 not only has seen a significant decline in enrollment, it also has seen a significant decline in quality.  On an axis with high LSATs to the left and low LSATs to the right, the “composition” of the entering class of law students between 2010 and 2014 has shifted markedly to the right, as shown in the graph below.  Moreover, the shape of the curve has changed somewhat, thinning among high LSAT ranges and growing among low LSAT ranges.  

[Graph omitted.]

This shift in entering class composition suggests that bar passage rates are likely to continue to decline in the coming years.  But in terms of bar passage, the entering class profile is less meaningful than the graduating class profile.  In part two, I will look at attrition data from 2011 to 2014 to try to quantify the likely “composition” of the graduating classes from 2010 to 2013, which will give us a more refined idea of what to expect in terms of trends in bar passage in 2015 and 2016.

(I am grateful to Bernie Burk and Alice Noble-Allgire for helpful comments on earlier drafts.)

December 29, 2014 in Data on legal education, Structural change | Permalink | Comments (5)

Saturday, December 20, 2014

Further Understanding the Transfer Market -- A Look at the 2014 Transfer Data

This blog posting updates my recent posting on transfers to incorporate some of the newly available data on the Summer 2014 transfer market.  Derek Muller also has written about some of the transfer data and I anticipate others will be doing so as well.

NUMBERS AND PERCENTAGES OF TRANSFERS – 2006-2008, 2011-2014

While the number of transfers dropped to 2187 in 2014, down from 2501 in 2013, the percentage of the previous fall’s entering class that engaged in the transfer market remained roughly the same at 5.5%, down slightly from 5.6% in 2013, but still above the percentages that prevailed from 2006-2008 and in 2011 and 2012.

 

                                       2006     2007     2008     2011     2012     2013     2014
Number of Transfers                    2265     2324     2400     2427     2438     2501     2187
Previous Year First-Year Enrollment  48,100   48,900   49,100   52,500   48,700   44,500   39,700
% of Previous First-Year Total         4.7%     4.8%     4.9%     4.6%     5.0%     5.6%     5.5%
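The bottom row is simply the transfer count divided by the prior fall's entering class; a quick sketch using the 2013 and 2014 figures from the table:

```python
# Transfers and the previous fall's 1L enrollment, from the table above.
transfers = {2013: 2501, 2014: 2187}
prior_1ls = {2013: 44500, 2014: 39700}

for year in sorted(transfers):
    share = 100 * transfers[year] / prior_1ls[year]
    print(f"{year}: {share:.1f}% of the previous fall's entering class transferred")
```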

 

SOME SCHOOLS DOMINATE THE TRANSFER MARKET – 2012-2014

The following two charts list the top 20 transfer schools in Summer 2012 (fall 2011 entering class), Summer 2013 (fall 2012 entering class) and Summer 2014 (fall 2013 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first year class.

Largest Law Schools by Number of Transfers from 2012-2014

School                 2012    School                2013    School                 2014
Florida State            89    Georgetown             122    Georgetown              113
Georgetown               85    George Wash.            93    George Wash.             97
George Wash.             63    Florida St.             90    Arizona St.              66
Columbia                 58    Emory                   75    Idaho                    57
Mich. State              54    Arizona State           73    Cal. Berkeley            55
NYU                      53    American                68    NYU                      53
American                 49    Texas                   59    Emory                    50
Cardozo                  48    Columbia                52    Columbia                 46
Loyola Marymount         46    NYU                     47    American                 44
Rutgers - Camden         42    Minnesota               45    UCLA                     44
Minnesota                42    Arizona                 44    Wash. Univ.              44
Arizona State            42    Northwestern            44    Texas                    43
Cal. Berkeley            41    UCLA                    41    Minnesota                37
Emory                    41    Cardozo                 38    Northwestern             35
UCLA                     39    Southern Cal.           37    Harvard                  33
Northwestern             38    Utah                    34    Mich. State              33
Florida                  37    Harvard                 34    Loyola Marymount         32
Maryland                 34    Florida                 33    Florida State            31
Michigan                 33    Cal. Berkeley           32    Southern Cal.            30
SMU                      31    Wash. Univ.             31    Miami                    29


Largest Law Schools by Transfers as Percentage of Previous First-Year Class

2012-2014 

School               % 2012    School              % 2013    School               % 2014
Florida St.            44.5    Florida State         48.1    Arizona State          51.6
Arizona State          24.6    Arizona State         48.0    Idaho                  51.4
Michigan State         17.5    Utah                  34.7    Washington Univ.       23.3
Utah                   17.5    Emory                 29.6    Emory                  22.9
Minnesota              17.1    Arizona               28.9    Georgetown             20.8
Emory                  16.5    Minnesota             22.0    George Wash.           20.2
Cal. Berkeley          16.2    George Wash.          21.8    Cal. Berkeley          19.4
Rutgers - Camden       14.9    Georgetown            21.2    Florida St.            18.2
Georgetown             14.7    Rutgers - Camden      20.7    Rutgers - Camden       17.1
Southern Cal.          14.7    Southern Cal.         19.7    Southern Cal.          17.1
Northwestern           14.4    Texas                 19.1    Minnesota              16.7
Cincinnati             14.3    Cincinnati            17.5    Utah                   15.9
Columbia               14.3    Northwestern          17.1    Northwestern           15.3
Buffalo                14.2    Washington Univ.      15.4    UCLA                   15.0
Arizona                14.0    Univ. Washington      15.3    Seton Hall             14.5
Cardozo                13.8    Columbia              14.2    Florida Int.           13.9
SMU                    13.4    American              13.8    Texas                  13.5
Florida                12.7    SMU                   13.3    Columbia               13.1
Chicago                12.6    UCLA                  13.3    Richmond               12.8
George Wash.           12.5    Chicago               13.0    Univ. Washington       12.6
                                                             Houston                12.6


Note that in these two charts, the “repeat players” -- those schools in the top 20 for all three years -- are bolded.  In 2013 and 2014, nine of the top ten schools for number of transfers repeated.  (The notable newcomer this year is Idaho, which received 55 transfers from the Concordia University School of Law when Concordia did not receive provisional accreditation from the ABA.)  Across all three years, eight of the top ten schools for percentage of transfers repeated.

Top Ten Law Schools as a Percentage of All Transfers

 

                                                2006     2011     2012     2013     2014
Total Transfers                                 2265     2427     2438     2501     2187
Transfers to 10 Schools with Most Transfers      482      570      587      724      625
As % of All Transfers                          21.3%    23.5%    24.1%    28.9%    28.6%

 

The chart above demonstrates an increasing concentration in the transfer market between 2006 and 2014, and even more so between 2012 and 2014, as the ten law schools with the most students transferring in captured an increasing share of the transfer market. 
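The concentration figures follow directly from the two count rows; a short sketch reproducing the bottom row of the chart:

```python
# Top-ten transfer counts and total transfers, from the chart above.
top10 = {2006: 482, 2011: 570, 2012: 587, 2013: 724, 2014: 625}
total = {2006: 2265, 2011: 2427, 2012: 2438, 2013: 2501, 2014: 2187}

for year in sorted(top10):
    share = 100 * top10[year] / total[year]
    print(f"{year}: top ten schools took {share:.1f}% of all transfers")
```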

NATIONAL AND REGIONAL MARKETS BASED ON NEW DATA

Starting this fall, the ABA Section of Legal Education and Admissions to the Bar began requiring schools with more than five transfers in to report not only the number of students who transferred in, but also the schools from which they came (indicating the number from each school), along with the 75th, 50th and 25th percentile first-year law school GPAs of the pool of students who transferred in (provided that at least twelve students transferred in to the school).  This allows us to begin to explore the nature of the transfer market by looking at where students are coming from and where they are going, and by looking at the first-year GPA profile of students transferring in to different law schools. 

Percentage of Transfers from Within Geographic Region and Top Feeder School(s)

USNews Rank    School               # Transfers    Region     Regional Transfers    Reg. %    Feeder School(s)                #
2              Harvard                       33    NE                          6        18    Emory-Wash. Univ.               3
4              Columbia                      46    NE                         19        41    Brooklyn                        5
6              NYU                           50    NE                         20        40    Cornell                         8
9              Berkeley                      55    CA                         43        78    Hastings                       18
12             Northwestern                  35    MW                         24        69    DePaul-Chicago Kent-Loyola      5
13             Georgetown                   113    Mid-Atl                    49        43    American                       13
15             Texas                         43    TX                         27        63    Baylor                          5
16             UCLA                          44    CA                         31        70    Loyola Marymount                8
18             Wash. Univ.                   44    MW                         20        45    SLU                             4
19             Emory                         53    SE                         40        75    Atlanta’s John Marshall        20
20             GWU                           97    Mid-Atl                    78        80    American                       54
20             Minnesota                     37    MW                         21        57    William Mitchell                6
20             USC                           30    CA                         22        73    Southwestern                    5
31             Arizona St.                   66    SW                         51        77    Arizona Summit                 44
45             Florida St.                   31    SE                         24        77    Florida Coastal                 9
61             Miami                         29    SE                         21        72    Florida Coastal                 5
72             American                      44    Mid-Atl                    14        32    Baltimore-UDC                   6
87             Michigan St.                  33    MW                         33       100    Thomas Cooley                  31
87             Loyola Marymount              32    CA                         26        81    Whittier                       15

 

For this set of 19 schools with the most transfer students, the vast majority drew most of their transfers from within the geographic region in which the law school is located.  Only two schools (Harvard and American) had fewer than 40% of their transfers from within the region in which they are located, and only four others (Columbia, NYU, Georgetown and Washington University) had fewer than 50% of their transfers from within their regions.  Meanwhile, ten of the 19 schools had 70% or more of their transfers from within the region in which the school is located. 

Moreover, several schools had a significant percentage of their transfers from one particular feeder school.  For Berkeley, roughly 33% of its transfers came from Hastings; for Emory, nearly 40% of its transfers came from Atlanta’s John Marshall Law School; for George Washington, over 55% of its transfers came from American; for Arizona State, 67% of its transfers came from Arizona Summit; for Michigan State nearly 95% of its transfers came from Thomas Cooley; for Loyola Marymount, nearly 50% of its transfers came from Whittier; and for Idaho, over 95% of its transfers came from Concordia.
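The feeder-school shares quoted above come straight from the table; a minimal sketch computing a few of them (feeder count divided by the receiving school's total transfers in):

```python
# (feeder transfers, total transfers in) for selected schools, from the table above.
pairs = {
    "Berkeley (from Hastings)":        (18, 55),
    "GWU (from American)":             (54, 97),
    "Arizona St. (from Ariz. Summit)": (44, 66),
    "Michigan St. (from Cooley)":      (31, 33),
}

for school, (feeder, total) in pairs.items():
    print(f"{school}: {100 * feeder / total:.0f}% of transfers in")
```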

 Percentage of Transfers from Different Tiers of School(s)

Along With First-Year Law School GPA 75th/50th/25th

Rank   School            # of Trans.   Top 50 # -- %   51-99 # -- %   100-146 # -- %   Unranked # -- %   GPA 75th   GPA 50th   GPA 25th
2      Harvard                    33       23 -- 70       10 -- 30         0 -- 0            0 -- 0         3.95       3.90       3.83
4      Columbia                   46       29 -- 63       14 -- 30         3 -- 7            0 -- 0         3.81       3.75       3.69
6      NYU                        50       41 -- 82        7 -- 14         2 -- 4            0 -- 0         3.74       3.62       3.47
9      Berkeley                   55       17 -- 31       27 -- 33         6 -- 11           5 -- 9         3.90       3.75       3.68
12     Northwestern               35       16 -- 46       12 -- 34         6 -- 17           1 -- 3         3.73       3.56       3.40
13     Georgetown                113       27 -- 24       38 -- 34        17 -- 15          31 -- 27        3.77       3.67       3.55
15     Texas                      43       17 -- 40       13 -- 30         9 -- 21           4 -- 9         3.62       3.45       3.11
16     UCLA                       44       15 -- 34       23 -- 52         2 -- 5            4 -- 9         3.73       3.58       3.44
18     Wash. Univ.                44        3 -- 7        25 -- 57         1 -- 2           15 -- 34        3.43       3.20       3.06
19     Emory                      53        3 -- 6         7 -- 13         8 -- 15          35 -- 66        3.42       3.27       2.93
20     GWU                        97       13 -- 13       73 -- 75        11 -- 11           0 -- 0         3.53       3.35       3.21
20     Minnesota                  37        4 -- 11       12 -- 32        18 -- 49           3 -- 8         3.30       3.10       2.64
20     USC                        30        1 -- 3        11 -- 37         6 -- 20          12 -- 40        3.71       3.59       3.44
31     Arizona St.                66        4 -- 6         5 -- 8          8 -- 12          49 -- 74        3.51       3.23       2.97
45     Florida St.                31        2 -- 6         4 -- 13         3 -- 10          22 -- 71        3.29       3.10       2.90
61     Miami                      29        1 -- 3         4 -- 14         6 -- 21          18 -- 62        3.30       3.07       2.87
72     American                   44        2 -- 5        14 -- 32         3 -- 7           25 -- 57        3.25       2.94       2.78
87     Michigan St.               33        0 -- 0         0 -- 0          1 -- 3           32 -- 97        3.19       3.05       2.83
87     Loyola Mary.               32        0 -- 0         0 -- 0          1 -- 3           31 -- 97        3.00       3.00       3.00


 

The chart above shows the tiers of law schools from which the largest schools in the transfer market received their transfer students.  Thirteen of the top 19 schools for transfers are ranked in the top 20 in USNews, but of those 13, only six had 80% or more of their transfers from schools ranked between 1 and 99 in the USNews rankings – Harvard, Columbia, NYU, Northwestern, UCLA and George Washington.  Three additional schools – Berkeley, Georgetown and Washington University – had at least 50% of their transfers from schools ranked between 1 and 99.  The other ten schools had at least half of their transfer students from schools ranked 100 or lower, with some schools drawing a significant percentage of their transfers from unranked schools (those USNews lists alphabetically).  This data largely confirms the analysis of Bill Henderson and Jeff Rensberger regarding the rankings migration of transfers – from lower-ranked schools to higher-ranked schools.

In addition, as you move down the rankings of transfer schools, the general trend in first-year law school GPA shows a significant decline, with several highly-ranked schools taking a number of transfers with first-year GPAs below a 3.0, including Emory, Minnesota, Arizona State, and Florida State.

STILL MANY UNKNOWNS

This new data should be very helpful to prospective law students and to current law students who are considering transferring.  It gives them at least a somewhat better idea of what transfer opportunities might be available to them depending upon where they go to law school as a first-year student.

Even with this more granular data now available, however, as I noted in my earlier posting on transfer students, there still are a significant number of unknowns relating to transfer students.  These unknowns cover several different points.  

First, what is the acceptance rate for transfers?  We now know how many transfers came from different schools and we have some idea of first-year GPA ranges for those admitted as transfers, but we do not know the acceptance rate on transfer applications.  Are a significant percentage of transfer applicants not admitted, or are most students interested in transferring finding a new home someplace?

Second, what are the motivations and demographics of transfers?  Are transfers primarily motivated by better employment opportunities perceived to be available at the higher-ranked law school?  Is some subset of transfers primarily motivated by issues regarding family or geography (with rankings and employment outcomes as secondary concerns)?

Third, how do the employment outcomes of transfer students compare with the employment outcomes of students who started at a given law school?  Does the data support the perception that those who transfer, in fact, have better employment outcomes by virtue of transferring?

Fourth, what are the social/educational experiences of transfers in their new schools and what is the learning community impact on those schools losing a significant number of students to the transfer market?

For those interested in these issues, it might make sense to design some longitudinal research projects that could help find answers to some of these questions.

December 20, 2014 in Current events, Data on legal education | Permalink | Comments (0)

Wednesday, December 10, 2014

BETTER UNDERSTANDING THE TRANSFER MARKET

What do we know about the transfer student market in legal education? 

Not enough.  But that will begin to change in the coming weeks.

NUMBER/PERCENTAGE OF TRANSFER STUDENTS HAS INCREASED MODESTLY

Up until this year, the ABA Section of Legal Education and Admissions to the Bar only asked law schools to report the number of transfer students “in” and the number of transfer students “out.”  This allowed us to understand roughly how many students are transferring and gave us some idea of where they are going, and where they are coming from, but not with any direct “matching” of exit and entrance.

Has the number and percentage of transfer students changed in recent years?

In 2010, Jeff Rensberger published an article in the Journal of Legal Education in which he analyzed much of the then-available data regarding the transfer market and evaluated some of the issues associated with transfer students.  He noted that from 2006 to 2009 the number of transfer students had remained within a range that represented roughly 5% of the rising second-year class (after accounting for other attrition) – 2,265 in summer 2006, 2,324 in summer 2007, 2,400 in summer 2008, and 2,333 in summer 2009.  

Using data published in the law school Standard 509 reports, the number of transfers in 2011, 2012 and 2013 has increased only marginally, from 2427 to 2438 to 2501, but, given the declining number of law students, it has increased as a percentage of the preceding year’s first-year “class,” from 4.6% to 5.6%.  Thus, there is a sense in which the transfer market is growing, even if not growing dramatically.

Numbers of Transfer Students 2006-2008 and 2011-2013

 

                                       2006     2007     2008     2011     2012     2013
Number of Transfers                    2265     2324     2400     2427     2438     2501
Previous Year First-Year Enrollment  48,100   48,900   49,100   52,500   48,700   44,500
% of Previous First-Year Total         4.7%     4.8%     4.9%     4.6%     5.0%     5.6%

 

SOME SCHOOLS DOMINATE THE TRANSFER MARKET

In 2008, Bill Henderson and Brian Leiter highlighted issues associated with transfer students.   Henderson and Leiter were discussing the data from the summer of 2006.  Brian Leiter posted a list of the top ten law schools for net transfer students as a percentage of the first year class.  Bill Henderson noted the distribution of transfer students across tiers of law schools (with the law schools in the top two tiers generally having positive net transfers and the law schools in the bottom two tiers generally having negative net transfers), something Jeff Rensberger also noted in his 2010 article.   

Things haven’t changed too much since 2006.  In 2012, there were 118 law schools with fewer than 10 “transfers in” representing a total of 485 transfers – slightly less than 20% of all transfers.  On the other end, there were 21 schools with 30 or more “transfers in” totaling 996 transfers -- nearly 41% of all transfers. Thus, roughly 10% of the law schools occupied 40% of the market (increasing to nearly 44% of the market in 2013).

We also know who the leading transfer schools have been over the last three years.  The following two charts list the top 20 transfer schools in Summer 2011 (fall 2010 entering class), Summer 2012 (fall 2011 entering class) and Summer 2013 (fall 2012 entering class) – with one chart based on “numbers” of transfers and the other chart based on the number of transfer students as a percentage of the prior year’s first year class.

Largest Law Schools by Number of Transfers 2011-2013

(BOLD indicates presence on list all three years)

 

School               Number in 2011    School              Number in 2012    School              Number in 2013
George Wash.                    104    Florida State                   89    Georgetown                     122
Georgetown                       71    Georgetown                      85    George Wash.                    93
Florida St.                      57    George Wash.                    63    Florida St.                     90
New York Univ.                   56    Columbia                        58    Emory                           75
American                         53    Michigan State                  54    Arizona State                   73
Michigan State                   52    New York Univ.                  53    American                        68
Columbia                         46    American                        49    Texas                           59
Cardozo                          45    Cardozo                         48    Columbia                        52
Loyola Marymount                 44    Loyola Marymount                46    New York Univ.                  47
Washington Univ.                 42    Rutgers - Camden                42    Minnesota                       45
Cal. Los Angeles                 40    Minnesota                       42    Arizona                         44
Michigan                         39    Arizona State                   42    Northwestern                    44
Northwestern                     39    Cal. Berkeley                   41    Cal. Los Angeles                41
Rutgers - Camden                 36    Emory                           41    Cardozo                         38
San Diego                        35    Cal. Los Angeles                39    Southern Cal.                   37
Arizona State                    34    Northwestern                    38    Utah                            34
Brooklyn                         33    Florida                         37    Harvard                         34
Cal. Hastings                    32    Maryland                        34    Florida                         33
Minnesota                        31    Michigan                        33    Cal. Berkeley                   32
Lewis & Clark                    30    SMU                             31    Washington Univ.                31
Harvard                          30    Harvard                         31

   

 

Largest Law Schools by Transfers as a Percentage of Previous First Year Class

(BOLD indicates presence on list all three years; each percentage is relative to the previous fall's entering class)

School               % 2011    School              % 2012    School              % 2013
Florida St.            28.6    Florida St.           44.5    Florida State         48.1
George Wash.           19.9    Arizona State         24.6    Arizona State         48.0
Utah                   19.7    Michigan State        17.5    Utah                  34.7
Arizona State          17.8    Utah                  17.5    Emory                 29.6
Michigan State         17.4    Minnesota             17.1    Arizona               28.9
Washington and Lee     15.3    Emory                 16.5    Minnesota             22.0
Washington Univ.       15.2    Cal. Berkeley         16.2    George Wash.          21.8
Loyola Marymount       15.1    Rutgers - Camden      14.9    Georgetown            21.2
Northwestern           14.2    Georgetown            14.7    Rutgers - Camden      20.7
Richmond               13.7    Southern Cal.         14.7    Southern Cal.         19.7
Rutgers - Camden       13.4    Northwestern          14.4    Texas                 19.1
Cal. Los Angeles       13.0    Cincinnati            14.3    Cincinnati            17.5
Cal. Davis             12.8    Columbia              14.3    Northwestern          17.1
Lewis & Clark          12.1    Buffalo               14.2    Washington Univ.      15.4
Georgetown             12.0    Arizona               14.0    Univ. Washington      15.3
Minnesota              11.9    Cardozo               13.8    Columbia              14.2
New York Univ.         11.8    SMU                   13.4    American              13.8
Cardozo                11.8    Florida               12.7    SMU                   13.3
Columbia               11.4    Chicago               12.6    Cal. Los Angeles      13.3
Buffalo                11.0    George Wash.          12.5    Chicago               13.0

 

Note that in these two charts, the “repeat players” are bolded – those schools in the top 20 for all three years – 2011, 2012 and 2013.  (Four of the top ten schools Leiter highlighted from the summer of 2006 remain in the top ten as of the summer of 2013, with four others still in the top 20.)  In addition, it is worth noting some significant changes between 2011 and 2013.  For example, the number of schools with 50 or more transfers increased from six to eight with only two schools with more than 70 transfers in 2011 and 2012, but with five schools with more than 70 transfers in 2013. 

Leiter’s top ten law schools took in a total of 482 transfers, representing 21.3% of the 2,265 transfers that summer.  The top ten law schools in 2011 totaled 570 transfers, representing 23.5% of the 2427 transfer students that summer.  The top ten law schools in 2012 totaled 587 transfers, representing 24.1% of the 2438 transfers that summer.  The top ten law schools in 2013, however, totaled 724 students, representing 28.9% of the 2501 transfers in 2013, demonstrating an increasing concentration in the transfer market between 2006 and 2013 and even more so between 2012 and 2013. 

In addition, three of the top four schools with the highest number of transfers were the same all three years, with Georgetown welcoming 71 in the summer of 2011, 85 in the summer of 2012, and 122 in the summer of 2013, George Washington, welcoming 104 in the summer of 2011, 63 in the summer of 2012, and 93 in the summer of 2013, and Florida State welcoming 57 in the summer of 2011, 89 in the summer of 2012 and 90 in the summer of 2013.  (Notably, Georgetown and Florida State were the two top schools for transfers in 2006, with 100 and 59 transfers in respectively.)

Similarly, three of the top four schools with the highest “percentage of transfers” were the same all three years, with Utah at 19.7% in 2011, 17.5% in 2012 and 34.7% in 2013, Arizona State at 17.8% in 2011, 24.6% in 2012 and 48% in 2013, and Florida State at 28.6% in 2011, 44.5% in 2012 and 48.1% in 2013.  The top five schools on the “percentage of transfers” chart all increased the “percentage” of transfer students they welcomed between 2011 and 2013, some significantly, which also suggests greater concentration in the transfer market between 2011 and 2013.

More specifically, there are several schools that have really “played” the transfer game in the last two years – increasing their engagement by a significant percentage.  These eight schools had 10.2% of the transfer market in 2011, but garnered 22.2% of the transfer market in 2013.
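Those market-share figures follow directly from the eight schools’ combined transfer totals (248 in 2011; 555 in 2013) and the national transfer pools reported above. A quick check:

```python
# Combined transfers at the eight schools vs. the national pool (from the text).
eight_schools = {2011: 248, 2013: 555}
pool = {2011: 2427, 2013: 2501}

for year in (2011, 2013):
    share = 100 * eight_schools[year] / pool[year]
    print(f"{year}: {share:.1f}% of the transfer market")
```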

Schools with Significant Increases in Transfers 2011-2013

School           2011   2012   2013   Percentage Increase
Texas               6      9     59                  883%
Arizona             6     24     44                  633%
Emory              19     41     75                  295%
Arizona State      34     42     73                  115%
Georgetown         71     85    122                   70%
Florida State      57     89     90                   58%
Southern Cal       24     29     37                   54%
Minnesota          31     42     45                   45%
Totals            248    371    555                  124%

REGIONAL MARKETS

There appear to be “regional” transfer markets.  In the Southeast in 2013, for example, three schools -- Florida State, Florida and Emory -- had a combined net inflow of 180 transfer students, while Stetson and Miami were flat (43 transfers in and 42 transfers in, combined) and eight other schools from the region -- Florida Coastal, Charlotte, Charleston, Atlanta’s John Marshall, St. Thomas University, Ave Maria, Florida A&M, Nova Southeastern – had a combined net outflow of 303.  It seems reasonable to assume that many of the transfers out of these schools found their way to Emory, Florida and Florida State (and perhaps to Miami and Stetson to the extent that Miami and Stetson lost students to Emory, Florida and Florida State).

NEW DATA – NEW INSIGHTS

Starting this fall, the ABA Section of Legal Education and Admissions to the Bar is requiring schools to report not only the number of students who have transferred in, but also the schools from which they came (indicating the number from each school), along with the 75%, 50% and 25% first-year law school GPAs of the pool of students who transferred in to a given school (provided that at least five students transferred in).  As a result, we will be able to delineate the regional transfer markets (as well as those schools with more of a national transfer market).

Notably, even though the Section of Legal Education and Admissions to the Bar is not requiring schools to gather and publish the 75%, 50%, and 25% LSAT and UGPA figures for transfer students, one thing we are very likely to learn is that for many schools, the LSAT/UGPA profile of transfers in is almost certainly lower than the LSAT/UGPA profile of the first-year matriculants in the prior year, a point that both Henderson and Rensberger highlight in their analyses.

Just look at the schools in the Southeast as an example.  Assume Emory, Florida State and Florida (large “transfer in” schools) are, in fact, admitting a significant number of transfer students from other schools in the Southeast region, such as Miami and Stetson, and schools like Florida Coastal, St. Thomas University, Charlotte, Atlanta’s John Marshall and Ave Maria (large “transfer out” schools in the Southeast).  Even if they are taking only students who came from the top quarter of the entering classes at those schools, the incoming transfers would have a significantly less robust LSAT/UGPA profile when compared with the entering class profile at Emory, Florida State or Florida in the prior year.  Virtually every student transferring in to Emory, Florida or Florida State from one of these transfer out schools (other than Miami and perhaps Stetson) is likely to fall in the bottom quarter of the receiving school’s entering class LSAT profile.

Comparison of Relative Profiles of Southeast Region Transfer In/Out Schools

TRANSFER IN SCHOOLS   2012 LSAT     2012 UGPA        TRANSFER OUT SCHOOLS     2012 LSAT     2012 UGPA
Emory                 166/165/161   3.82/3.70/3.35   Miami                    159/156/155   3.57/3.36/3.14
Florida               164/161/160   3.73/3.59/3.33   Stetson                  157/157/152   3.52/3.28/3.02
Florida State         162/160/157   3.72/3.54/3.29   St. Thomas (FL)          150/148/146   3.33/3.10/2.83
                                                     Florida Coastal          151/146/143   3.26/3.01/2.71
                                                     Charlotte                150/146/142   3.32/2.97/2.65
                                                     Atlanta’s John Marshall  153/150/148   3.26/2.99/2.60
                                                     Ave Maria                153/148/144   3.48/3.10/2.81

(LSAT and UGPA figures are 75th/50th/25th percentiles.)

This raises an interesting question about LSAT and UGPA profile data.  LSAT and UGPA profiles are used not only by law schools as predictors of performance; third parties also use them as evidence of the “strength” of the student body, and ultimately the graduates, of a given law school – for example, USNEWS in its rankings and employers in assessing the quality of schools at which to interview.  What, then, can we surmise about the impact of significant numbers of transfers?  For law schools with a significant number or percentage of “transfers in” from schools whose entering class profiles are much weaker, the entering class profile presently published in the Standard 509 disclosure report arguably fails to accurately reflect the LSAT and UGPA quality of the graduating class.  Similarly, if the “transfers out” from a given school come disproportionately from the top half of its entering class, the published entering class profile for that school also arguably overstates the LSAT and UGPA quality of the graduating class.

Using the chart above, if Emory, Florida and Florida State are drawing a significant number of transfers from the regional transfer out schools, and if they had to report the LSAT and UGPA profile of their second-year class rather than their first-year class, their LSAT and UGPA profiles almost certainly would decline.   (The same likely would be true for other law schools with large numbers of transfers.)

STILL MANY UNKNOWNS

Even with more granular data available in the near future to delineate more clearly the transfer pathways between transfer out schools and transfer in schools, a significant number of unknowns will remain: the employment outcomes of transfers, their demographics, their experience, and their motivations.

First, with respect to the employment outcomes of transfer students, how do they compare with the employment outcomes for students who started at a law school as first-years? Do the employment outcomes for transfer students track that of students who started at a law school as first-years, or is the employment market for transfer students less robust than it is for students who started at a law school as first-years?  Are the employment outcomes nonetheless better than they might have been at the school from which they transferred? These are important questions given the perception that many students transfer “up” in the rankings to improve their employment opportunities. 

Second, with respect to demographics, do students of color and women participate proportionately in the transfer market or is the market disproportionately occupied by white males?

Third, with respect to the experience of transfers, the Law School Survey of Student Engagement gathered some data from participating law schools in 2005 regarding the experience of transfers, but more could be done to better understand how well transfer students are integrated into the life of the learning community into which they transfer.

Fourth, with respect to the motivations of transfers, it is generally assumed that transfers are “climbing” the rankings, and Henderson’s data broadly suggests movement from lower-ranked schools to higher-ranked schools, but what percentage of transfers are doing so partly or primarily for geographic reasons – to be near family or a future career location?  How many are transferring for financial reasons because they lost a conditional scholarship after their first year of law school?  How many truly are transferring to get a JD from a higher ranked law school?  How many of those believe their job opportunities will be better at the school to which they are transferring?

We will have answers to some questions soon, but will still have many questions that remain unanswered.

December 10, 2014 in Data on legal education | Permalink | Comments (10)

Tuesday, December 2, 2014

The Market for Law School Applicants -- A Milestone to Remember

In early 2013, Michael Moffitt, the dean of Oregon Law, was interviewed by the New York Times about the tumult affecting law schools. Moffitt, who is a very thoughtful guy, responded, "I feel like I am living a business school case study.”  

I think the analogy to the business school case study is a good one.  In the nearly two years since that story was published, the market for law school applicants has actually gotten worse.

Yesterday's Dealbook column in the New York Times featured Northwestern Law Dean Dan Rodriguez (who also serves as President of the AALS) speaking candidly about the meltdown dynamics that have taken hold.  See Elizabeth Olson, "Law School is Buyer's Market, with Top Students in Demand," New York Times, Dec. 1, 2014. 

"It's insane," said Rodriguez. "We’re in hand-to-hand combat with other schools."  The trendlines are indeed terrible.  Year-over-year, LSAT test-taker volume is down another 8.7%.  See Organ, LWB, Nov 11, 2014.  So we can expect the situation to get worse, at least in the near term.

I applaud Dan Rodriguez for his leadership instincts.  He is being transparent and honest.  Several years ago the leadership of the AALS went to great lengths to avoid engagement with the media. Dan has gone the opposite direction, inviting the press into our living room and kitchen.  

Want to know what leadership and judgment look like?  It looks like Dan's interview with Elizabeth Olson.  Dan's words did not solve anyone's problem, but his honesty and candor made it more likely that we help ourselves.  Because it's Northwestern, and Dan is president of the AALS (something the story did not mention but most of us know), and this was reported by Elizabeth Olson in the New York Times, the substance and tenor of discussions within law school faculties are bound to shift, at least slightly, and in the direction favoring change.   

What is the de facto plan at most law schools these days?  Universities are not going to backstop law schools indefinitely. I think the sign below is not far off the mark.  

[Image: “outrun the bear” sign]

We are indeed living through a business school case study, which is both bad and good.   At many schools -- likely well more than half --  hard choices need to be made to ensure survival.  (And for the record, virtually all schools, regardless of rank, are feeling uncomfortable levels of heat.)   A law school needs cash to pay its expenses.  But it also needs faculty and curricula to attract students. The deeper a law school cuts, the less attractive it becomes to students.  Likewise, pervasive steep discounts on tuition reflect a classic collective action problem. Some schools may eventually close, but a huge proportion of survivors are burning through their financial reserves.  

Open admissions, which might pay the bills today, will eventually force the ABA and DOE to do something neither really wants to do -- aggressively regulate legal education.  This is not a game that is likely to produce many winners.  Rather than letting this play out, individual law schools would be much better off pursuing a realistic strategic plan that can actually move the market. 

The positive side of the business school case study is that a few legal academics are finding their voice and learning -- for the first time in several generations -- how to lead.  Necessity is a wonderful tutor.  Law is not an industry on the decline -- far from it.  The only thing on the decline is the archetypal artisan lawyer that law schools are geared to churn out.  Indeed, back in 2013 when Dean Moffitt commented about living through a business school case study, he was not referencing imminent failure.   Sure, Moffitt did not like the hand he was being dealt, but as the 2013 article showed, his school was proving to be remarkably resourceful in adapting.

The good news resides on the other side of a successful change effort.  The process of change is painful, yet the effects of change can be transformative and make people truly grateful for the pain that made it all possible.  In our case, for the first time in nearly a century, what we teach, and how we teach it, is actually going to matter.  If we believe serious publications like The Economist, employers in law, business, and government need creative problem solvers who are excellent communicators, adept at learning new skills, and comfortable collaborating across multiple disciplines -- this is, in fact, a meaningful subset of the growing JD-Advantage job market.

In the years to come, employers will become more aggressive in looking for the most reliable sources of talent, in part because law schools are going to seek out preferred-provider relationships with high-quality employers.  Hiring based on school prestige is a remarkably ineffective way to build a world-class workforce -- Google discovered this empirically.  

From an employer perspective, the best bet is likely to be three years of specialized training, ideally where applicants are admitted based on motivation, aptitude, and past accomplishments. The LSAT/UGPA grid method misses this by a wide margin. After that, the design and content of curricula are going to matter.  It is amazing how much motivated students can learn and grow in three years. And remarkably, legal educators control the quality of the soil.  It brings to mind that seemingly trite Spiderman cliche about great power.

For those of us working in legal education, the next several years could be the best of times or the worst of times.  We get to decide.  Yesterday's article in the Times made it a little more likely that we actually have the difficult conversations needed to get to the other side. 

December 2, 2014 in Current events, Data on legal education, Innovations in legal education, New and Noteworthy, Structural change | Permalink | Comments (4)

Tuesday, November 11, 2014

What Might Have Contributed to an Historic Year-Over-Year Decline In the MBE Mean Scaled Score?

The National Conference of Bar Examiners (NCBE) has taken the position that the historic drop in the MBE Mean Scaled Score of 2.8 points between the July 2013 administration of the bar exam (144.3) and the July 2014 administration of the bar exam (141.5) is solely attributable to a decline in the quality of those taking a bar exam this July.  Specifically, in a letter to law school deans, the NCBE stated that:  “Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results.  All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013.”

Notably, the NCBE does not indicate what other “indicators” it looked at “to challenge the results.”  Rather, the NCBE boldly asserts that the only fact that explains an historic 2.8 point drop in the MBE Mean Scaled Score is “that the group that sat in July 2014 was less able than the group that sat in July 2013."

I am not persuaded.   

(Neither is Brooklyn Law School Dean Nicholas Allard, who has responded by calling the letter “offensive” and by asking for a “thorough investigation of the administration and scoring of the July 2014 exam.”  Nor is Derek Muller, who earlier today posted a blog suggesting that the LSAT profile of the class of 2014 did not portend the sharp drop in MBE scores.)

I can’t claim to know how the NCBE does its scaled scoring, so for purposes of this analysis, I will take the NCBE at its word that it has “double-checked” all of its calculations and found that there are no errors in its scoring.

If we accept the premise that there are no scoring issues, then the historic decline in the MBE Mean Scaled Score is attributable either to a “less able” group taking the MBE in July 2014 or to issues associated with the administration of the exam or to some combination of the two.

The NCBE essentially has ignored the possibility that issues associated with the administration of the exam might have contributed to the historic decline in the MBE Mean Scaled Score and gone “all in” on the “less able” group explanation for the historic decline in the MBE Mean Scaled Score.  The problem for the NCBE is that it will be hard-pressed to demonstrate that the group that sat in July 2014 was sufficiently “less able” to explain the historic decline in the MBE Mean Scaled Score.

If one looks at the LSAT distribution of the matriculants in 2011 (who became the graduating class of 2014) and compares it with the LSAT distribution of the matriculants in 2010 (who became the graduating class of 2013), the NCBE probably is correct in noting that the group that sat in July 2014 is slightly “less able” than the group that sat in July 2013.  But for the reasons set forth below, I think the NCBE is wrong to suggest that this alone accounts for the historic drop in the MBE Mean Scaled Score.

Rather, a comparison of the LSAT profile of the Class of 2014 with the LSAT profile of the Class of 2013 would suggest that one could have anticipated a modest drop in the MBE Mean Scaled Score of perhaps .5 to 1.0.  The modest decrease in the LSAT profile of the Class of 2014 when compared with the Class of 2013, by itself, does not explain the historic drop of 2.8 reported in the MBE Mean Scaled Score between July 2013 and July 2014.

THINKING ABOUT GROUPS

The “group” that sat in July 2014 is comprised of two subgroups of takers – first-time takers and those who failed a bar exam and are retaking the bar exam.  I am not sure the NCBE has any basis to suggest that those who failed a bar exam and are “retaking” the bar exam in 2014 were a less capable bunch than a comparable group that was “retaking” the bar exam in 2013 (or in some other year).

What about “first-time takers”?  That group actually consists of two subgroups as well – those literally taking the exam for the first time and those who passed an exam in one jurisdiction and are taking the exam for the “first-time” in another jurisdiction.  Again, I am not sure the NCBE has any basis to suggest that those who passed a bar exam and are taking a bar exam in another jurisdiction in 2014 were a less capable bunch than a comparable group that was taking a second bar exam in 2013.

So who’s left?  Those who actually were taking a bar exam for the very first time in July 2014 – the graduates of the class of 2014.  If we accept the premise that the “retakers” in 2014 were not demonstrably different than the “retakers” in 2013, then the group that was “less capable” in 2014 has to be the graduates of 2014, who the NCBE asserts are “less capable” than the graduates of 2013.

COMPARING LSAT PROFILES

The objective profile of the class that entered law school in the fall of 2011 (class of 2014) is slightly less robust than that of the class that entered law school in the fall of 2010 (class of 2013).  The question, however, is whether the drop in quality between the class of 2013 and the class of 2014 is large enough that we could have anticipated an historic drop in the MBE Mean Scaled Score of 2.8 points.

The answer to that is no.

The difference in profile between the class of 2014 and the class of 2013 does not reflect an “historic” drop in quality and would seem to explain only some of the drop in MBE Mean Scaled Score, not a 2.8 point drop in MBE Mean Scaled Score.

To understand this better, let’s look at how the trends in student quality have related to changes in the MBE Mean Scaled Score over the last decade. 

Defining “student quality” can be a challenge.  A year ago, I noted changes over time in three “groups” of matriculants – those with LSATs at or above 165, those with LSATs of 150-164, and those with LSATs below 150, noting that between 2010 and 2013, the number at or above 165 has declined significantly while the number below 150 has actually grown, resulting in a smaller percentage of the entering class with LSATs at or above 165 and a larger percentage of the entering class with LSATs below 150. 

While the relatively simplistic calculations described above would provide some basis for anticipating declines in bar passage rates by 2016, they would not explain what is going on this year without more refinement.

In his blog posting earlier today, Derek Muller attempts to look at the strength of each class by calculating "projected MBE" scores drawing on an article from Susan Case and then comparing those to the actual MBE scores, showing some close relationship over time (until this year). I come to a similar conclusion using a different set of calculations of the "strength" of the graduating classes over the last several years based on the LSAT distribution profile of the matriculating classes three years earlier.

To develop this more refined analysis of the strength of the graduating classes over the last nine years, I used the LSAC’s National Decisions Profiles to identify the distribution of matriculants in ten five-point LSAT ranges – descending from 175-180 down to 130-134.  To estimate the “strength” of the respective entering classes, I applied a prediction of bar passage rates by LSAT scores to each five point grouping and came up with a “weighted average” bar passage prediction for each class. 

(In his article, Unpacking the Bar: Of Cut Scores, Competence and Crucibles, Professor Gary Rosin of the South Texas College of Law developed a statistical model for predicting bar passage rates for different LSAT scores.  I used his bar passage prediction chart to assess the “relative strength” of each entering class from 2001 through 2013. 

LSAT RANGE   Prediction of Success on the Bar Exam Based on Lowest LSAT in Range
175-180      .98
170-174      .97
165-169      .95
160-164      .91
155-159      .85
150-154      .76
145-149      .65
140-144      .50
135-139      .36
130-134      .25

Please note that for the purposes of classifying the relative strength of each class of matriculants, the precise accuracy of the bar passage predictions is less important than the fact of differential anticipated performance across groupings which allows for comparisons of relative strength over time.)

One problem with this approach is that the LSAC (and law schools) changed how they reported the LSAT profile of matriculants beginning with the entering class in the fall of 2010.  Up until 2009, the LSAT profile data reflected the average LSAT score of those who took the LSAT more than once.  Beginning with matriculants in fall 2010, the LSAT profile data reflects the highest LSAT score of those who took the LSAT more than once.  This makes direct comparisons between fall 2009 (class of 2012) and years prior and fall 2010 (class of 2013) and years subsequent difficult without some type of “adjustment” of profile in 2010 and beyond.

Nonetheless, the year over year change in the 2013-2014 time frame can be compared with year over year changes in the 2005-2012 time frame.

Thus, having generated these “weighted average” bar passage projections for each entering class starting with the class that began legal education in the fall of 2002 (class of 2005), we can compare these with the MBE Mean Scaled Score for each July in which a class graduated, particularly looking at the relationship between the change in relative strength and the change in the corresponding MBE Mean Scaled Score.  Those two lines are plotted below for the period from 2005-2012.  (To approximate the MBE Mean Scaled Score for graphing purposes, the strength of each graduating class is calculated by multiplying the weighted average predicted bar passage percentage, which has ranged from .801 to .826, times 175.)

Comparison of Class Strength Based on Weighted Average Class Strength (Weighted Average Bar Passage Prediction x 175) with the MBE Mean Scaled Score for 2005-2012

[Chart: weighted-average class strength vs. MBE Mean Scaled Score, 2005-2012]
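As a concrete sketch of the methodology: distribute an entering class across the ten five-point LSAT ranges, weight each range by Rosin’s predicted passage rate, and scale the result by 175 for graphing. The class distribution below is hypothetical, invented purely for illustration; only the per-range predictions come from Rosin’s chart.

```python
# Rosin's predicted bar-passage rate for the lowest LSAT in each five-point range.
PASS_PREDICTION = {
    (175, 180): 0.98, (170, 174): 0.97, (165, 169): 0.95,
    (160, 164): 0.91, (155, 159): 0.85, (150, 154): 0.76,
    (145, 149): 0.65, (140, 144): 0.50, (135, 139): 0.36,
    (130, 134): 0.25,
}

def weighted_strength(matriculants):
    """Weighted-average bar-passage prediction for an entering class,
    given a mapping of LSAT range -> number of matriculants."""
    total = sum(matriculants.values())
    return sum(PASS_PREDICTION[r] * n for r, n in matriculants.items()) / total

def mbe_proxy(strength):
    """Scale the weighted average onto the MBE axis for graphing (x 175)."""
    return strength * 175

# Hypothetical entering-class distribution, for illustration only.
example_class = {(165, 169): 200, (155, 159): 400, (145, 149): 200}
s = weighted_strength(example_class)
print(round(s, 3), round(mbe_proxy(s), 1))  # 0.825 144.4
```

Applied to the actual LSAC distributions, this calculation yields the weighted averages in the .801 to .826 range noted above.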

What this graph highlights is that between 2005 and 2012, year to year changes in the MBE Mean Scaled Score largely “tracked” year to year changes in the “quality” of the graduating classes.  But perhaps most significantly, the degree of change year over year in “quality” generally is reflected in the “degree” of change year over year in MBE Mean Scaled Scores.  From 2008 to 2009, the drop in “quality” of 1.5 from 144.6 to 143.1 actually was reflected in a drop in MBE Mean Scaled Scores from 145.6 to 144.7, a drop of 0.9 points.  Similarly, from 2009 to 2010, the drop in “quality” of 1.1 from 143.1 to 142 actually was reflected in a drop in the MBE Mean Scaled Scores from 144.7 to 143.6, a drop of 1.1 points.  This two-year drop in quality of 2.6 points from 144.6 to 142 corresponded to a two-year drop in MBE Mean Scaled Scores of 2.0 points from 145.6 to 143.6.

How does this help us understand what has happened in 2014 relative to 2013?  The decrease in quality of the class of 2014 relative to the class of 2013 using the “Weighted Average Bar Passage Projection” methodology above reflects a change from 145.1 to 144.2 – a drop of 0.9 (less than the year over year changes in 2009 and 2010).  Accordingly, one might anticipate a decline in MBE Mean Scaled Scores, but probably a decline slightly smaller than the declines experienced in 2009 and 2010 – declines of .9 and 1.1 point, respectively. 

Does the decline in quality between the Class of 2013 and the Class of 2014 explain some of the decline in MBE Mean Scaled Scores?  Certainly.  This analysis suggests a decline comparable to or slightly less than the declines in 2009 and 2010 should have been expected.

But that is not what we have experienced.  We have experienced an historic decline of 2.8 points.  Yet, the NCBE tells us that in looking at other indicators “all point to the fact that the group that sat in July 2014 is less able than the group that sat in July 2013.” 

THE EXAMSOFT DEBACLE

What the NCBE fails to discuss, or even mention, is one other “indicator” that was a distinctive part of the bar exam experience for the group that sat in July 2014 but not for the group that sat in July 2013 – the ExamSoft debacle.

For many of those in one of the many jurisdictions that used ExamSoft in July 2014, the evening between the essay portion of the bar exam and the MBE portion of the bar exam was spent in needless anxiety and stress associated with not being able to upload the essay portion of the exam.  This stress and anxiety were compounded by messaging that suggested the failure to upload in a timely manner would mean failing the bar exam (which messaging was only corrected late in the evening in some jurisdictions). 

In these ExamSoft jurisdictions, I can only imagine that some number of those taking the MBE on the second day of the exam were doing so with much less sleep and much less focus than might have been the case if there had not been issues with uploading the essay portion of the exam the night before.  If this resulted in “underperformance” on the MBE of just 1%-2% (perhaps missing two to four additional questions out of 200), this might have been enough to trigger a larger than expected decline in the MBE Mean Scaled Score.

ONE STATE’S EXPERIENCE BELIES THE NCBE STORY

It will be hard to assess the full reality of the July 2014 bar exam experience in historical context until 2015, when the NCBE releases its annual statistical analysis with state-by-state breakdowns of first-time bar passage rates.  It is very difficult to make comparisons across jurisdictions regarding the July 2014 bar exam at present because there is no standardized format among states for reporting results – some states report overall bar passage rates, some disaggregate first-time bar passage rates, and some report school-specific bar passage rates.  To make meaningful year-over-year comparisons focused on the experience of each year’s graduates, the focus should be on first-time bar passage (even though, as noted above, that also is a little over-inclusive).

Nonetheless, the experience of one state, Iowa, casts significant doubt on the NCBE “story.”

The historical first-time bar passage rates in Iowa from 2004 to 2013 ranged from a low of 86% in 2005 to a high of 93% in 2009 and again in 2013.  In the nine-year period between 2005 and 2013, the year-to-year change in first-time bar passage rates never exceeded three percentage points and was plus or minus one or two points in eight of the nine years.  In 2014, however, the bar passage rate fell to a new low of 84%, a decline of nine percentage points -- more than four times the largest previous year-over-year decline in bar passage rates since 2004-2005.

YEAR                          2004  2005  2006  2007  2008  2009  2010  2011  2012  2013  2014
First-Time Bar Passage Rate    87%   86%   88%   89%   90%   93%   91%   90%   92%   93%   84%
Change from Prior Year          --    -1    +2    +1    +1    +3    -2    -1    +2    +1    -9

The NCBE says that all indicators point to the fact that the group that sat in 2014 was “less able” than the group that sat in 2013.  But here is the problem for the NCBE.

Iowa is one of the states that used ExamSoft in which test-takers experienced problems uploading the exam.  The two schools that comprise the largest share of bar exam takers in Iowa are Drake and Iowa.  In July 2013, those two schools had 181 first-time takers (out of 282 total takers) and 173 passed the Iowa bar exam (a 95.6% bar passage rate).  In 2014, those two schools had 158 first-time takers (out of 253 total) and 135 passed (an 85.4% bar passage rate), a drop of 10.2 percentage points year over year. 

Unfortunately for the NCBE, there is no basis to claim that the Drake and Iowa graduates were “less able” in 2014 than in 2013 as there was no statistical difference in the LSAT profile of their entering classes in 2010 and in 2011 (the classes of 2013 and 2014, respectively).  In both years, Iowa had a profile of 164/161/158.  In both years, Drake had a profile of 158/156/153.  This would seem to make it harder to argue that those in Iowa who sat in July 2014 were “less able” than those who sat in 2013, yet their performance was significantly poorer, contributing to the largest decline in bar passage rate in Iowa in over a decade.  The only difference between 2013 and 2014 for graduates of Drake and Iowa taking the bar exam for the first time in Iowa is that the group that sat in July 2014 had to deal with the ExamSoft debacle while the group that sat in July 2013 did not.
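The pass-rate arithmetic behind that comparison is simple, using the counts reported above:

```python
# Drake + Iowa first-time takers who passed the Iowa bar exam, from the text above.
passed = {2013: 173, 2014: 135}
first_time_takers = {2013: 181, 2014: 158}

rates = {yr: round(100 * passed[yr] / first_time_takers[yr], 1) for yr in passed}
drop = round(rates[2013] - rates[2014], 1)  # difference of the rounded rates
print(rates, drop)  # {2013: 95.6, 2014: 85.4} 10.2
```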

TIME WILL TELL

This analysis does not “prove” that the ExamSoft debacle was partly responsible for the historic decline in the MBE Mean Scaled Score between 2013 and 2014.  What I hope it does do is raise a serious question about the NCBE’s assertion that the “whole story” of the historic decline in the MBE Mean Scaled Score is captured by the assertion that the class of 2014 is simply “less able” than the class of 2013.

When the NCBE issues its annual report on 2014 sometime next year, we will be able to do a longitudinal analysis on a jurisdiction by jurisdiction basis to see whether jurisdictions which used ExamSoft had higher rates of anomalous results regarding year-over-year changes in bar passage rates for first-time takers.  When the NCBE announces next fall the MBE Mean Scaled Score for July 2015, we will be able to assess whether the group that sits for the bar exam in July 2015 (which is even more demonstrably “less able” than the class of 2014 using the weighted average bar passage prediction outlined above), generates another historic decline or whether it “outperforms” its indicators by perhaps performing in a manner comparable to the class of 2014 (suggesting that something odd happened with the class of 2014).

It remains to be seen whether law school deans and others will have the patience to wait until 2015 to analyze all of the compiled data regarding bar passage in July 2014 across all jurisdictions.  In the meantime, there is likely to be a significant disagreement over bar pass data and how it should be interpreted.

November 11, 2014 in Data on legal education, Data on the profession, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (4)

Monday, October 20, 2014

What Law Schools Can Learn from Dental Schools in the 1980s Regarding the Consequences of a Decline in Applicants

For four consecutive years we have seen a decline in the number of applicants to law school and a corresponding decline in the number of matriculating first-year students.  Over the last year or two, some have suggested that as a result of this “market adjustment” some law schools would end up closing.  Most recently, the former AALS President, Michael Olivas, in response to the financial challenges facing the Thomas Jefferson Law School, was quoted as stating that he expects several law schools to close. 

To date, however, no law schools have closed (although the Western Michigan University Thomas M. Cooley Law School recently announced the closure of its Ann Arbor branch).  

Have law schools found ways to cut costs and manage expenses in the face of declining revenues such that all will remain financially viable and remain in operation?  Is it realistic to think that no law schools will close?

Although there may be a number of people in the legal academy who continue to believe that somehow legal education is “exceptional” -- that market forces may impose financial challenges for law schools in the near term, but will not result in the closing of any law schools -- this strikes me as an unduly optimistic assessment of the situation.

To understand why, I think those in legal education can learn from the experience of those in dental education in the 1980s.

The Dental School Experience from 1975-1990

In the 1980s, dental school deans, along with provosts and presidents at their host universities, had to deal with the challenge of a significant decline in applicants to dental school. 

At least partially in response to federal funding to support dental education, first-year enrollment at the country’s dental schools grew throughout the 1970s to a peak in 1979 of roughly 6,300 across roughly 60 dental schools.  Even at that point, however, for a number of reasons -- improved dental health from fluoridation, reductions in federal funding, high tuition costs and debt loads -- the number of applicants had already started to decline from the mid-1970s peak of over 15,000. 

By the mid-1980s, applicants had fallen to 6,300 and matriculants had fallen to 5,000.  As of 1985, no dental schools had closed.  But by the late 1980s and early 1990s there were fewer than 5,000 applicants and barely 4,000 first-year students -- applicants had declined by more than two-thirds and first-year enrollment had declined by more than one-third from their earlier peaks. (Source – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author).)

How did dental schools and their associated universities respond to this changing market?  Between 1986 and 1993, six private universities closed their dental schools: Oral Roberts University, Tulsa, Oklahoma (1986); Emory University, Atlanta, Georgia (1988); Georgetown University, Washington, D.C. (1990); Fairleigh Dickinson University, Rutherford, New Jersey (1990); Washington University, St. Louis, Missouri (1991); and Loyola University, Chicago, Illinois (1993). (Source: Dental Education at the Crossroads:  Challenges and Change, Table 1.1 (Institute of Medicine 1995)).  According to a New York Times article from October 29, 1987, “Georgetown, formerly the nation's largest private dental school, decided to close after a Price Waterhouse study found that the school would have a $3.6 million deficit by 1992.” (Source: Tamar Lewin, Plagued by Falling Enrollment, Dental Schools Close or Cut Back, New York Times, Oct. 29, 1987).

Some of the primary factors contributing to the closing of dental schools were described as follows:

Financial issues were repeatedly described as critical. Dental education was cited as an expensive enterprise that is or may become a drain on university resources. On average, current-year expenditures for the average dental school are about $1 million more than current revenues. … The declining size and quality of the applicant pool during the 1980s played a role in some closures by threatening the tuition base and prestige on which private schools rely. Faculty and alumni resistance to change may feed impatience among university administrators. In some institutions, the comparative isolation of dental schools within the university has provided them with few allies or at least informed colleagues and has left them ill-prepared to counter proposals for "downsizing." (Source: Dental Education at the Crossroads:  Challenges and Change, at 202-203 (Institute of Medicine 1995)). 

The Law School Experience from 2004-2014

In terms of applicants and enrollment over the last decade, the trends law schools have experienced look remarkably comparable to the experience of dental schools in the 1970s and 1980s.  According to the LSAC Volume Summary, applicants to law schools peaked in 2004 with 100,600 applicants (and roughly 48,200 first-year students).  By 2010, applicants had fallen to roughly 87,600, but first-year enrollment peaked at 52,500.  Over the last four years, applicants have fallen steadily to roughly 54,700 for fall 2014, with a projected 37,000 first-years matriculating this fall, the smallest number since 1973-74, when there were 40 fewer law schools and over one thousand fewer law professors.  (Source – ABA Statistics.)  (For the analysis supporting this projection of 37,000 first-years, see my blog post on The Legal Whiteboard from March 18, 2014.)

The two charts below compare the dental school experience from 1975 to 1990 with the law school experience in the last decade.  One chart compares dental school applicants with law school applicants and one chart compares dental school first-years with law school first-years.  (Note that for purposes of easy comparison, the law school numbers are presented as one-tenth of the actual numbers.)

Applicants

First years

(Sources – American Dental Association – Trends in Dental Education – U.S. Dental School Applicant and First-Year Enrollment Trends 1955-2009 (copy on file with author) and the LSAC’s Volume Summary  (with my own estimates for 2014 based on the LSAC’s Current Volume Summary)).

The Law School Experience 2014-2019

These charts do not bode well for law schools.  The law school experience closely tracks the dental school experience over the first ten years reflected in the charts.  For law schools, 2014 looks a lot like 1985 did for dental schools.

There might be any number of reasons why the law school experience over the next several years could differ from the dental school experience of the late 1980s and early 1990s, such that the downward trend in applicants and matriculants does not continue.  The market forces associated with changes in the dental profession and dental education in the 1980s are not the same as the market forces associated with changes in the legal profession and legal education in the 2010s, and the cost structures for dental education and legal education are not identical.

The problem for law schools, however, is that without an upward trend law schools will continue to face significant financial pressures for the next few years just as dental schools did in the late 1980s.  There might be some encouraging news on the employment front over the next few years as the decreasing number of matriculants will mean a decreasing number of graduates in 2015, 2016 and 2017.  Even without any meaningful growth in the employment market for law graduates, this decline in the number of graduates should mean significant increases in the percentage of graduates finding full-time, long-term employment in bar passage required jobs.  Over time, this market signal may begin to gain traction among those considering law school such that the number of applicants to law school stops declining and perhaps starts increasing modestly. 

But the near term remains discouraging.  The number of people taking the June 2014 LSAT was down roughly 9% compared to June 2013 and the anticipation is that the number of test-takers in the most recent administration in late September was down as well compared to October 2013.  Thus, applicants well might be down another 5-8% in the 2014-15 admissions cycle, resulting in perhaps as few as 51,000 applicants and perhaps as few as 35,000 matriculants in fall 2015.  Even if things flatten out and begin to rebound modestly in the next few years, it would appear to be unlikely that the number of matriculants will climb back near or above 40,000 before the fall of 2017 or 2018.

Moreover, if current trends continue, the matriculants in 2015 also are going to have a significantly less robust LSAT/GPA profile than the matriculants in fall 2010.   As I noted in a blog posting on March 2, 2014, between 2010 and 2013, the number of law schools with a median LSAT less than 150 grew from 9 to 32, and the number with a median LSAT of 145 or below grew from 1 to 9.

What Does this Mean for the Average Law School?

Assume you are the Dean at a hypothetical private law school that had 600 students (200 in each class) and a budget based on $18 million in JD tuition revenue in 2010-11.  (This reflects a net tuition of $30,000 from each student – with nominal tuition set at $40,000 but with a discount rate of 25%.)  Further assume that with this budget, your law school was providing $2.0 million annually to the university with which it is affiliated.  As of 2010-11, your entering class profile reflected a median LSAT of 155 and a median GPA of 3.4.

Assume first-year enrollment declined to 170 in 2011, to 145 in 2012, and to 125 in 2013, a cumulative decrease in first-year enrollment since 2010 of 37%.  As you tried to balance enrollment and profile, the law school managed to maintain its median LSAT and GPA in 2011, but saw its LSAT and GPA medians decline to 153 and 3.35 in 2012 and to 152 and 3.30 in 2013.

This means that for the 2013-14 academic year, the law school had only 440 students, a decrease of roughly 27% from its total enrollment of 600 in 2010, with a much less robust entering class profile in comparison with the entering class profile in 2010. (Note that this assumes no attrition and no transfers in or out, so if anything, it likely overstates total enrollment).  (For comparison purposes, the National Jurist recently listed 25 law schools with enrollment declines of 28% or more between 2010-11 and 2013-14.)

Assume further that the law school had to increase its scholarships to attract even this smaller pool of students with less robust LSAT/GPA profiles, such that the net tuition from each first-year student beginning in fall 2012 has been only $25,500 (with nominal tuition now set at $42,500, but with a discount rate of 40%). 

For the 2013-14 academic year, therefore, you were operating with a budget based on $12,411,000 in JD tuition revenue -- a decrease in JD tuition revenue of over $5.5 million, or more than 30%, since the 2010-11 academic year.  (170 x $32,500 for third-years ($5.525 million), 145 x $25,500 for second-years ($3.698 million), and 125 x $25,500 for first-years ($3.188 million).)

What does this mean?  This means you have been in budget-cutting mode for over three years.  Of course, this has been a challenge for the law school, given that a significant percentage of its costs are for faculty and staff salaries and associated fringe benefits.  Through the 2013-14 academic year, however, assume you cut costs by paring the library budget, eliminating summer research stipends for faculty, finding several other places to cut expenditures, cutting six staff positions and using the retirement or early retirement of ten of your 38 faculty members as a de facto “reduction in force,” resulting in net savings of $3.59 million.  In addition, assume you have gotten the university to agree to waive any “draw” saving another $2 million (based on the “draw” in 2010-2011).  Thus, albeit in a significantly leaner state, you managed to generate a “balanced” budget for the 2013-14 year while generating no revenue for your host university.    

The problem is that the worst is yet to come, as the law school welcomes a class of first-year students much smaller than the class of third-years that graduated in May.  With the continued decline in the number of applicants, the law school has lower first-year enrollment again for 2014-15, with only 120 first-year students with a median LSAT and GPA that has declined again to 151 and 3.2.  Projections for 2015-16 (based on the decline in June and October 2014 LSAT takers) suggest that the school should expect no more than 115 matriculants and may see a further decline in profile.  That means that the law school has only 390 students in 2014-15 and may have only 360 students in 2015-16 (an enrollment decline of 40% since 2010-11). Assuming net tuition for first-year students also remains at $25,500 due to the competition on scholarships to attract students (and this may be a generous assumption) – the JD tuition revenue for 2014-15 and 2015-16 is estimated to be $9,945,000, and $9,180,000, respectively (a decline in revenue of nearly 50% from the 2010-11 academic year). 

In reality, then, the “balanced” budget for the 2013-14 academic year, based on revenues of $12,411,000, now looks like a $2,500,000 budget shortfall in 2014-15 and a $3,200,000 budget shortfall for the 2015-16 academic year, absent significant additional budget cuts or new revenue streams (with most of the “low-hanging fruit” in terms of budget cuts already “picked”).
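As an arithmetic check (my own, not part of the original post), the revenue figures in this hypothetical can be reproduced in a few lines of Python; the `jd_revenue` helper is purely illustrative:

```python
def jd_revenue(cohorts):
    """Sum JD tuition revenue over (class size, net tuition) cohorts."""
    return sum(size * net for size, net in cohorts)

# 2013-14: 170 3Ls at $32,500 net; 145 2Ls and 125 1Ls at $25,500 net.
rev_2013 = jd_revenue([(170, 32_500), (145, 25_500), (125, 25_500)])

# 2014-15: 145 3Ls, 125 2Ls, and 120 1Ls, all at $25,500 net.
rev_2014 = jd_revenue([(145, 25_500), (125, 25_500), (120, 25_500)])

# 2015-16: 125 3Ls, 120 2Ls, and 115 1Ls, all at $25,500 net.
rev_2015 = jd_revenue([(125, 25_500), (120, 25_500), (115, 25_500)])

print(rev_2013)  # 12410000 (the post's $12,411,000 reflects rounding of the cohort subtotals)
print(rev_2014)  # 9945000
print(rev_2015)  # 9180000
```

Against the roughly $12.4 million 2013-14 budget, these revenues imply shortfalls of roughly $2.5 million for 2014-15 and $3.2 million for 2015-16, consistent with the figures above.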

While you may be able to make some extraordinary draws on unrestricted endowment reserves to cover some of the shortfall (assuming the law school has some endowment of its own), and may be creative in pursuing new sources of revenue (a certificate program or a Master of Laws), even if you come up with an extra $400,000 annually in extraordinary draws on endowment and an extra $400,000 annually in terms of non-JD revenue you still are looking at losses of at least $1,700,000 in 2014-15 and at least $2,400,000 in 2015-16 absent further budget cuts.  Even with another round of early retirement offers to some tenured faculty and/or to staff (assuming there are still some that might qualify for early retirement), or the termination of untenured faculty and/or of staff, the budget shortfall well might remain in the $1,000,000 to $1,700,000 range for this year and next year (with similar projections for the ensuing years).  This means the law school may need subsidies from the university with which it is affiliated, or may need to make even more draconian cuts than it has contemplated to date.  (For indications that these estimates have some relation to reality, please see the recent stories about budget issues at Albany, Minnesota and UNLV.)

Difficult Conversations -- Difficult Decisions

This situation will make for some interesting conversations between you as the Dean of the law school and the Provost and President of the university.  As noted above in the discussion of dental schools, the provost and president of a university with a law school likely will be asking:  How “mission critical” is the law school to the university when the law school has transformed from a “cash cow” into a “money pit” and when reasonable projections suggest it may continue to be a money pit for the next few years?  How "mission critical" is the law school when its entering class profile is significantly weaker than it was just a few years ago, particularly if that weaker profile begins to translate into lower bar passage rates and even less robust employment outcomes?   How “mission critical” is the law school to the university if its faculty and alumni seem resistant to change and if the law school faculty and administration are somewhat disconnected from their colleagues in other schools and departments on campus?

Some universities are going to have difficult decisions to make (as may the Boards of Trustees of some of the independent law schools).  As of 1985, no dental schools had closed, but by the late 1980s and early 1990s roughly ten percent of dental schools had closed in response to significant declines in the number and quality of applicants and the corresponding financial pressures.  When faced with having to invest significantly to keep dental schools open, several universities decided that dental schools were no longer “mission critical” aspects of the university.

I do not believe law schools should view themselves as so exceptional that they will have more immunity to these market forces than dental schools did in the 1980s.  I do not know whether ten percent of law schools will close, but just as some universities decided dental schools were no longer “mission critical” to the university, it is not only very possible, but perhaps even likely, that some universities now will decide that law schools that may require subsidies of $1 million or $2 million or more for a number of years are no longer “mission critical” to the university. 

(I am grateful to Bernie Burk and Derek Muller for their helpful comments on earlier drafts of this blog posting.)

 

October 20, 2014 in Cross industry comparisons, Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (5)

Tuesday, October 7, 2014

Does Cooperative Placement Accelerate Law Student Professional Development?

The title of an earlier essay posed a threshold question for legal ed reform: "If We Make Legal Education More Experiential, Would it Really Matter?" (Legal Whiteboard, Feb 2014) (PDF). I answered "yes" but admitted it was only my best guess.  Thus, to be more rigorous, I outlined the conditions necessary to prove the concept.

The essay below is a companion to the first essay.  It is a case study of how one type and brand of experiential education -- cooperative placements at Northeastern Law -- appears to accelerate the professional development of its law students.  The outcome criteria are the three apprenticeships of Educating Lawyers (2007) (aka The Carnegie Report): cognitive skills, practice skills, and professional identity.

The better outcomes flow from Northeastern's immersive, iterative, and integrative approach. First, students are immersed in full-time co-ops that last a standard 11 weeks. Second, students move through four iterations of co-ops interspersed with four quarters of upper-level classes. Third, this experiential approach is integrated into the Law School's value system -- i.e., the experiential component is perceived as central rather than marginal to the School's educational mission.

Northeastern's co-op model asks more of faculty and students, so it may be hard to replicate. Yet there is evidence that such an approach does in fact accelerate professional development in ways that ought to please law school critics and reformers. The benefits may well be worth the costs.

[PDF version at JD Supra]

[The text below was originally published as the Northeastern Law Outcomes Assessment Project (OAP) Research Bulletin No. 3]

Immersive, Iterative and Integrative:
Does Cooperative Placement Accelerate Law Student  Professional Development?

A steep decline in the job prospects for entry-level lawyers has been followed by a sharp drop in law school applications. Media stories criticize traditional legal education for being too expensive while producing graduates unprepared for practice. Throughout the country, legal educators and administrators at law schools are trying to formulate an effective response.

A common thread running through many new law school initiatives is greater emphasis on experiential education. Fundamentally, experiential education is learning by doing, typically by assuming the role of the lawyer in an in-class simulation, law school clinic, externship or cooperative placement. As law schools seek to add hands-on opportunities to their curricular offerings, empirical evidence on experiential education’s impact on law student professional development becomes invaluable.

Northeastern University School of Law’s Outcomes Assessment Project (OAP) is an evidence-based approach to understanding experiential learning in the law school curriculum. A focal point of the OAP is Northeastern’s Cooperative Legal Education Program, an integral part of the school’s curriculum since the late 1960s. After completing a mostly traditional first year of law school, Northeastern students enter a quarter system in which 11-week cooperative placements alternate with 11-week upper-level courses. Through the four co-op placements during the 2L and 3L years, every Northeastern student gains the functional equivalent of nearly one year of full-time legal experience, typically across a diverse array of practice areas.

The Learning Theory of Cooperative Placement

Northeastern’s Cooperative Legal Education Program is based on a learning theory with three interconnected elements: immersion, iteration and integration.

  • Immersion: Immersion in active legal work in a real-world setting enables students to feel the weight and responsibility of representing real-world clients and exercising professional judgment.
  • Iteration: Iterative movement between the classroom and co-op placements provides students with concrete opportunities to connect theory with practice and understand the role of reflection and adjustment in order to improve one’s skill and judgment as a lawyer.
  • Integration: Integrating experiential learning into the law school curriculum signals its high value to the law school mission — when 50 percent of the upper-level activities involve learning by doing, practice skills are on par with doctrinal learning.

The purpose of the OAP Research Bulletin No. 3 is to use preliminary project data to explore whether the immersion-iteration-integration approach to legal education has the effect of accelerating the professional development of law students.

Three Effects of Co-op Placements

The findings in Research Bulletin No. 3 are based on surveys and focus groups conducted with 2L and 3L Northeastern law students and a small number of Northeastern law graduates, who served as facilitators.  In our conversations with these students and alumni, we identified three ways that co-op placements affect the professional development of students.

Continue reading

October 7, 2014 in Data on legal education, Important research, Scholarship on legal education | Permalink | Comments (0)

Thursday, September 4, 2014

Artificial Intelligence and the Law

Plexus, a NewLaw law firm based in Australia, has just released a new legal product that purports to apply artificial intelligence to a relatively common, discrete legal issue -- determining whether a proposed trade promotion (advertisement, in US parlance) complies with applicable law.

In the video below, Plexus Managing Partner Andrew Mellett (who holds an MBA and is not a lawyer) observes that this type of legal work would ordinarily take four to six weeks to complete and cost several thousand dollars.  Mellett claims that the Plexus product can provide "a legal solution in 10 minutes" at 20% to 30% of the cost of the traditional consultative method -- no lawyer required, albeit Plexus lawyers were the indispensable architects of the underlying code.

From the video, it is unclear whether the innovation is an expert system -- akin to what Neota Logic or KM Standards are creating -- or artificial intelligence (AI) in the spirit of the machine learning used in some of the best predictive coding algorithms or IBM's Watson applied to legal problems.  Back when Richard Susskind published his PhD dissertation in 1987, Expert Systems In Law, an expert system was viewed as artificial intelligence -- there was little terminology to speak of because the application of technology to law was embryonic.  Now we are well past birth, as dozens of companies in the legal industry are in the toolmaking business, some living on venture or angel funding and others turning a handsome profit.

My best guess is that Plexus's new product is an expert system.  But frankly, the distinction does not matter very much, because both expert systems and AI as applied to law are entering the early toddler stage.  Of course, that suggests that those of us now working in the legal field will soon be grappling with the growth spurt of legal tech adolescence.  For law and technology, it's Detroit circa 1905.

September 4, 2014 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, New and Noteworthy, Structural change, Video interviews | Permalink | Comments (2)

Monday, July 28, 2014

Conditional Scholarship Retention Update for the 2012-2013 Academic Year

In comparing the conditional scholarship universe between the 2011-12 academic year and the 2012-13 academic year (with a brief look at 2013-14), there are a handful of things worth noting.

First, as shown in Table 1, the number of law schools with conditional scholarships declined between 2011-12 and 2012-13 from 144 law schools to 136 law schools, and declined again for the 2013-14 academic year to 128 law schools.  The number of law schools that do not have conditional scholarships grew from 49 in 2011-12 to 58 in 2012-13 to 66 in 2013-14.  In addition, the number of schools with just one-year scholarships declined from five in 2011-12 to four in 2012-13, where it remained for 2013-14.

 Table 1:  Changes in Number of Law Schools with Conditional Scholarship Programs

Category                                                              2011-12   2012-13   2013-14 (indications)
Law Schools with Conditional Scholarship Programs                         144       136       128
Law Schools with One-Year Scholarships                                      5         4         4
Law Schools with Scholarships that are not Conditional Scholarships        49        58        66

Second, as shown in Table 2, the number of students receiving conditional scholarships in 2012-13 declined slightly from 2011-12, from 12,786 to 12,470, but the percentage of first-years with conditional scholarships actually increased from 27.3% to 29.2% (given the smaller number of first-years in 2012-13 compared to 2011-12).  That said, the number of students whose scholarships were reduced or eliminated declined from 4,359 to 3,712, meaning that the percentage of first-years whose scholarships were reduced or eliminated dropped from 9.3% to 8.7%.

Table 2: Overall Comparisons Between 2011-12 and 2012-13

Category                                                                  2011-12                        2012-13
First-years*                                                              46,778                         42,769
First-years with Conditional Scholarships**                               12,786 (27.3% of first-years)  12,470 (29.2% of first-years)
First-years whose conditional scholarships were reduced or eliminated**    4,359 (9.3% of first-years)    3,712 (8.7% of first-years)
Average Renewal Rate (across law schools)                                 69%                            71%
Overall Renewal Rate Among Scholarship Recipients                         65.9%                          70.2%

*Drawn from first-year enrollment at the 198 law schools included in this analysis (excluding the law schools in Puerto Rico and treating Widener as one law school for these purposes) based on information published in the Standard 509 reports.
** Based on information published in the mandated Conditional Scholarship Retention charts by each law school with a conditional scholarship program.
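As a quick consistency check (my own, not part of the original post), the percentages and renewal rates in Table 2 follow directly from the raw counts:

```python
# Raw counts from Table 2; percentages recomputed for verification.
first_years = {"2011-12": 46_778, "2012-13": 42_769}
conditional = {"2011-12": 12_786, "2012-13": 12_470}
reduced = {"2011-12": 4_359, "2012-13": 3_712}

for year in first_years:
    pct_conditional = 100 * conditional[year] / first_years[year]
    pct_reduced = 100 * reduced[year] / first_years[year]
    # Overall renewal rate among recipients: share who kept their award.
    renewal = 100 * (conditional[year] - reduced[year]) / conditional[year]
    print(f"{year}: {pct_conditional:.1f}% conditional, "
          f"{pct_reduced:.1f}% reduced/eliminated, {renewal:.1f}% renewed")

# 2011-12: 27.3% conditional, 9.3% reduced/eliminated, 65.9% renewed
# 2012-13: 29.2% conditional, 8.7% reduced/eliminated, 70.2% renewed
```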

Third, the distribution of conditional scholarship programs across tiers of law schools is even more pronounced in 2012-13 than it was in 2011-12.  Using the USNews rankings from March 2014, only 16 law schools ranked in the top 50 had conditional scholarship programs in 2012-13, and eight of those 16 had a renewal rate of 97% or higher.  Three of these law schools also eliminated their conditional scholarship programs as of the fall 2013 entering class.  (Moreover, only six in the top 25 had conditional scholarship programs, five of which had a renewal rate of 97% or higher.)

As you move further down the rankings, conditional scholarship programs become more common and manifest lower scholarship retention rates on average.

Of the 53 law schools ranked between 51 and 100 (with three tied at 100), 37 law schools (nearly 70%) had conditional scholarship programs, of which two eliminated their conditional scholarship programs as of fall 2013.  Notably, of the 37 law schools with conditional scholarship programs, eight had a renewal rate of 91% or better (nearly 22%), while seven had a renewal rate of 65% or less (nearly 19%) (with the other 22 (nearly 60%) having renewal rates between 67% and 88%).

For law schools ranked between 104 and 146 (44 law schools in total), 35 law schools (nearly 80%) had conditional scholarship programs, of which three eliminated their conditional scholarship programs as of fall 2013.   Notably, of the 35 law schools with conditional scholarship programs, six of the 35 had a renewal rate of 93% or better (roughly 17%) while 16 had a renewal rate of 65% or less (nearly 46%) (with the other 13 (roughly 37%) with renewal rates between 67% and 88%).

Finally, among the unranked schools, 47 of 51 (over 92%) had conditional scholarship programs.  Only five of those had a renewal rate of 91% or better (nearly 11%), while 23 had a renewal rate of 65% or less (nearly 49%) (with the other 19 (roughly 40%) having renewal rates between 66% and 88%).

Tables 3 and 4 present comparative data across law schools in different USNews rankings categories.  Table 3 describes the number of law schools with conditional scholarship programs and the distribution of scholarship retention rates among law schools.  Table 4 describes the total number of students within each USNews rankings category along with the number of students on conditional scholarships and the number of students who had their conditional scholarship reduced or eliminated.

 Table 3: Scholarship Retention Rates by USNews Ranking Categories

Category                                         Top 50   51-100 (n=53)   104-146 (n=44)   Unranked (n=51)
Schools with Conditional Scholarship Programs        16              37               35                47
Retention Rates of 90% or More                        8               8                6                 5
Retention Rates of 66%-88%                            4              22               13                19
Retention Rates of 65% or Less                        4               7               16                23

 Table 4: Number and Percentage of First-Year Students in 2012 by USNews Rankings Categories Having Conditional Scholarships and Having Conditional Scholarships Reduced or Eliminated

 

Category                                                     Top 50 (n=50)   51-100 (n=53)   104-146 (n=44)   Alphabetical (n=51)
Law Schools with Conditional Scholarship Programs            16 (32%)        37 (70%)        35 (79.5%)       47 (92%)
Total First-Years at These Law Schools                       11,862          10,937          7,611            12,180
First-Years with Conditional Scholarships                    1,587 (13.4%)   3,192 (29.2%)   3,247 (42.7%)    4,444 (36.5%)
Conditional Scholarships Reduced or Eliminated               154             734             1,124            1,700
  -- as % of conditional scholarship recipients              9.7%            23%             34.6%            38.3%
  -- as % of first-years                                     1.3%            6.7%            14.8%            14%

Overall, as shown in Table 5, the distribution of retention rates across law schools was as follows for the 2012-13 academic year:  18 law schools had retention rates less than 50%, 20 law schools had retention rates between 50% and 59.99%, 25 law schools had retention rates between 60% and 69.99%, 21 law schools had retention rates between 70% and 79.99%, 25 law schools had retention rates between 80% and 89.99%, and 27 law schools had retention rates of 90% or better. 

Table 5 – Number of Law Schools with Conditional Scholarship Renewal Rates in Different Deciles

| Renewal Category | Number of Schools |
|---|---|
| 90% or More | 27 (16 ranked in top 100) |
| 80%-89.9% | 25 (12 ranked in top 100) |
| 70%-79.9% | 21 (10 ranked in top 100) |
| 60%-69.9% | 25 (8 ranked in top 100) |
| 50%-59.9% | 20 (5 ranked in top 100) |
| Less than 50% | 18 (2 ranked in top 100) |

Notably, of the 52 law schools ranked in the top 100 with conditional scholarship programs, only two (four percent) had retention rates of less than 50%, while 16 (nearly 31%) had retention rates of 90% or better.  By contrast, of the 82 (of 95) law schools ranked 104 or lower with conditional scholarship programs, 16 (nearly 20%) had retention rates of less than 50%, while only 11 (roughly 13%) had retention rates of 90% or better.

In sum then, with several schools eliminating their conditional scholarship programs as of fall 2013, less than 50% of the law schools ranked in the top 100 (47 of 103 – nearly 46%) still had conditional scholarship programs, and of those, more than 27% (13 of 47) had retention rates for the 2012-13 academic year of 90% or better while less than 22% (10 of 47) had retention rates of 65% or less.

By contrast, as of fall 2013, more than 80% of the schools ranked below 100 (79 of 95 – roughly 83%) still had conditional scholarship programs, and of those, less than 12% (9 of 79) had retention rates for the 2012-13 academic year of 90% or better and nearly half (39 of 79 – roughly 49%) had retention rates of 65% or less.
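For readers who want to check the arithmetic in the two paragraphs above, here is a short Python sketch.  All counts come from the post itself; the `share()` helper is my own.

```python
# Quick check of the fall 2013 conditional-scholarship percentages.
# Counts are taken from the post; share() is just a helper of mine.

def share(part, whole):
    """Return part/whole as a percentage, rounded to one decimal place."""
    return round(100 * part / whole, 1)

# Top 100 schools: 47 of 103 still had programs as of fall 2013
print(share(47, 103))  # 45.6 -> "nearly 46%"
print(share(13, 47))   # 27.7 -> "more than 27%" with 90%+ retention
print(share(10, 47))   # 21.3 -> "less than 22%" with 65% or less

# Schools ranked below 100: 79 of 95 still had programs
print(share(79, 95))   # 83.2 -> "roughly 83%"
print(share(9, 79))    # 11.4 -> "less than 12%"
print(share(39, 79))   # 49.4 -> "roughly 49%"
```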

July 28, 2014 in Data on legal education, Scholarship on legal education | Permalink | Comments (0)

Tuesday, May 27, 2014

Another Datapoint for the Laptops Debate

In my inbox this morning was the HBS Daily Stat with the title, "You'll Absorb More if You Take Notes Longhand."  Here is the accompanying explanation:

College students who take notes on laptop computers are more likely to record lecturers’ words verbatim and are thus less likely to mentally absorb what’s being said, according to a series of experiments by Pam A. Mueller of Princeton and Daniel M. Oppenheimer of UCLA. In one study, laptop-using students recorded 65% more of lectures verbatim than did those who used longhand; a half-hour later, the laptop users performed significantly worse on conceptual questions such as “How do Japan and Sweden differ in their approaches to equality within their societies?” Longhand note takers learn by reframing lecturers’ ideas in their own words, the researchers say.

SOURCE: The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking (emphasis in the original)

Wouldn't the same analysis almost surely apply to law students?  Experience tells me that many law students would argue that they are among the minority who learn better through computer transcription.  But what if, given a choice, more than half decide to use laptops?  Then many, if not most, are likely making the wrong tradeoff.

Data rarely changes hearts and minds.  As a result, there is likely a gap between maximum learning and knowledge-worker productivity and what we are able to accomplish in an education or workplace setting.  Why?  People like what they are used to and rationalize why the data does not apply to them.  There is a solution to this dilemma, I suspect.  We just have not found it yet. 

May 27, 2014 in Blog posts worth reading, Cross industry comparisons, Data on legal education, Fun and Learning in the classroom, New and Noteworthy | Permalink | Comments (2)

Monday, March 17, 2014

A Counterpoint to "The most robust legal market that ever existed in this country"

There is a line in Professor Reich-Graefe's recent essay, Keep Calm and Carry On, 27 Geo. J. Legal Ethics 55 (2014), that is attracting a lot of interest among lawyers, law students, and legal academics: 

[R]ecent law school graduates and current and future law students are standing at the threshold of the most robust legal market that ever existed in this country—a legal market which will grow, exist for, and coincide with, their entire professional career.

This hopeful prediction is based on various trendlines, such as impending lawyer retirements, a massive intergenerational transfer of wealth that will take place over the coming decades, continued population growth, and the growing complexity of law and legal regulation.

Although I am bullish on future growth and dynamism in the legal industry, and I don't dispute the accuracy or relevance of any of the trendlines cited by Reich-Graefe, I think his primary prescriptive advice -- in essence, our problems will be cured with the passage of time -- is naive and potentially dangerous to those who follow it.

The Artisan Lawyer Cannot Keep Up

The primary defect in Reich-Graefe's analysis is that it is a one-sided argument that stacks up all impending positive trendlines without taking into account the substantial evidence that the artisan model of lawyering -- one-to-one consultative legal services that are tailored to the needs of individual clients -- is breaking down as a viable service delivery model.  

Lawyers serve two principal constituencies--individuals and organizations.  This is the Heinz-Laumann "Two-Hemisphere" theory that emerged from the Chicago Lawyers I and II studies.  See Heinz et al, Urban Lawyers (2005). The breakdown in the artisan model can be observed in both hemispheres.

  1. People.  Public defenders are understaffed, legal aid is overwhelmed, and courts are glutted with pro se litigants.  Remarkably, at the same time, record numbers of law school graduates are either unemployed or underemployed.  Why?  Because most poor and middle-class Americans cannot afford to buy several hours of a lawyer's time to solve their legal problems.  
  2. Organizations.  The most affluent organizations, multinational corporations, are also balking at the price of legal services.  As a result, foreign labor, technology, process, or some combination thereof has become a replacement for relatively expensive and unskilled junior lawyers.

The primary driver of this structural shift is the relentless growth in legal complexity.  This increase in complexity arises from many sources, including globalization, technology, digitally stored information, and the sheer size and scope of multinational companies. 

But here is a crucial point:  the complexity itself is not new, only its relative magnitude.  A century ago, as the modern industrial and administrative state was beginning to take shape, lawyers responded by organizing themselves into law firms.  The advent of law firms enabled lawyers to specialize and thus more cost-effectively tackle the more complex legal problems. Further, the diffusion of the partner-associate training model (sometimes referred to as the Cravath system) enabled firms to create more specialized human capital, which put them in an ideal position to benefit from the massive surge in demand for legal services that occurred throughout the 20th century.  See Henderson, Three Generations of Lawyers: Generalists, Specialists, Project Managers, 70 Maryland L Rev 373 (2011). 

The legal industry is at the point where it is no longer cost effective to deal with this growing complexity with ever larger armies of artisan-trained lawyers.  The key phrase here is cost effective.  Law firms are ready and willing to do the work.  But increasingly, clients are looking for credible substitutes on both the cost and quality fronts. Think car versus carriage, furnace versus chimney sweep, municipal water system versus a well.  A similar paradigm shift is now gaining momentum in law.

The New Legal Economy

I have generated the graph below as a way to show the relationship between economic growth, which is the engine of U.S. and world economies, and the legal complexity that accompanies it.

[Chart: economic growth vs. the growth of legal complexity]
This chart can be broken down into three phases.

1. Rise of the law firm. From the early twentieth century to the early 1980s, the increasing complexity of law could be capably handled by additional law firm growth and specialization. Hire more junior lawyers, promote the best ones to partner, lease more office space, repeat.  The complexity line has a clear bend in it.  But for most lawyers the change was very gradual and felt like a simple linear progression.  Hence, there was little urgency about the need for new methods of production.

2. Higher law firm profits. Over the last few decades, the complexity of law outpaced overall economic growth.  However, because the change was gradual, law firms, particularly those with brand names, enjoyed enough market power to perennially increase billing rates without significantly improving their service offerings.  Corporate clients paid because the economic benefits of the legal work outweighed the higher costs.  Lower- and middle-class individuals, in contrast, bought fewer legal services because they could not afford them. But as a profession, we barely noticed, primarily because the corporate market was booming. See Henderson, Letting Go of Old Ideas, 114 Mich L Rev 101 (2014).

3. Search for substitutes.  Law firms are feeling discomfort these days because the old formula -- hire, promote, lease more space, increase rates, repeat -- is no longer working.  This is because clients are increasingly open to alternative methods of solving legal problems, and the higher profits of the last few decades have attracted new entrants.  These alternatives are some combination of better, faster, and cheaper.  But what they all share in common is a greater reliance on technology, process, and data -- modes of problem solving that are not part of the training or tradition of lawyers or legal educators.  So the way forward is profoundly interdisciplinary, requiring collaboration with information technologists, systems engineers, project managers, data analysts, and experts in marketing and finance.

Why is this framework potentially difficult for many lawyers, law firms, and legal educators to accept?  Probably because it requires us to cope with uncertainties related to income and status.  This reluctance to accept an unpleasant message creates an appetite for analyses that say "keep calm and carry on."  This is arguably good advice to the British citizenry headed into war (the origin of the saying) but bad advice to members of a legal guild who need to adapt to changing economic conditions.

There is a tremendous silver lining in this analysis.  Law is a profoundly critical component of the globalized, interconnected, and highly regulated world we are entering.  Lawyers, law firms, and legal educators who adapt to these changing conditions are going to be in high demand and will likely prosper economically.  Further, at an institutional level, there is also the potential for new hierarchies to emerge that will rival and eventually supplant the old guard.

Examples

One of the virtues of lawyers is that we demand examples before we believe something to be true.  This skepticism has benefited many a client.  A good example of the emerging legal economy is the Available Positions webpage for kCura, a software company that focuses exclusively on the legal industry. 

The current legal job market is terrible, right?  Perhaps for entry-level artisan-trained lawyers.  But at kCura, business is booming. Founded in 2001, the company now employs more than 370 workers and has openings for over 40 full-time professional positions, the majority of which are in Chicago at the company's LaSalle Street headquarters.  Very few of these jobs require a law degree -- yet the output of the company enables lawyers to do their work faster and more accurately.  

What are the jobs?

  • API Technical Writer [API = Application Programming Interface]
  • Big Data Architect - Software Engineering
  • Business Analyst
  • Enterprise Account Manager
  • Group Product Manager
  • Litigation Support Advice Analyst
  • Manager - Software Engineering
  • Marketing Associate
  • Marketing Specialist -- Communications
  • Marketing Specialist -- Corporate Communications and Social Media
  • Product Manager -- Software and Applications Development
  • QA Software Engineer -- Performance [QA = Quality Assurance]
  • Scrum Team Coordinator [Scrum is a team-based software development methodology]
  • Senior SalesForce Administrator 
  • Software Engineer (one in Chicago, another in Portland)
  • Software Engineer (Front-End Developer) [Front-End = what the client sees]
  • Software Engineer in Test [Test = finds and fixes software bugs]
  • Technical Architect
  • Technical Architect - Security
  • VP of Product Development and Engineering

kCura operates exclusively within the legal industry, yet it has all the hallmarks of a great technology company. In the last few years it has racked up numerous awards based on the quality of its products, its stellar growth rate, and the workplace quality of life enjoyed by its employees.


That is just what is happening at kCura.  There are many other companies positioning themselves to take advantage of the growth opportunities in legal, though none of them bears any resemblance to traditional law firms or legal employers.

In early February, I attended a meeting in New York City of LexRedux, which is comprised of entrepreneurs working in the legal start-up space.  In a 2008 essay entitled "Legal Barriers to Innovation," Professor Gillian Hadfield queried, "Where are the 'garage guys' in law?"  Well, we now know they exist.  At LexRedux, roughly 100 people working in the legal tech start-up space were jammed into a large open room in SoHo as a small group of angel investors and venture capitalists fielded questions on a wide range of topics related to operations, sales, and venture funding.

According to AngelList, there are as of this writing 434 companies identified as legal start-ups that have received outside capital.  According to LexRedux founder Josh Kubicki, the legal sector took in $458M in start-up funding in 2013, up from essentially zero in 2008.  See Kubicki, 2013 was a Big Year for Legal Startups; 2014 Could Be Bigger, Tech Cocktail, Feb 14, 2014.

The legal tech sector is starting to take shape.  Why?  Because the imperfections and inefficiencies inherent in the artisan model create a tremendous economic opportunity for new entrants.  For a long period of time, many commentators believed that this type of entrepreneurial ferment would be impossible so long as Rule 5.4 was in place.  But in recent years, it has become crystal clear that, when it comes to organizational clients where the decisionmaker for the buyer is a licensed lawyer (likely accounting for over half of the U.S. legal economy), everything up until the courthouse door or the client counseling moment can be disaggregated into legal inputs or legal products that can be provided by entities owned and controlled by nonlawyers. See Henderson, Is Axiom the Bellwether of Legal Disruption in the Legal Industry? Legal Whiteboard, Nov 13, 2013.

The Legal Ecosystem of the Future

In his most recent book, Tomorrow's Lawyers, Richard Susskind describes a dynamic legal economy that bears little resemblance to the legal economy of the past 200 years.  In years past, it was easier to be skeptical of Susskind because his predictions seemed so, well, futuristic and abstract.  But anyone paying close attention can see evidence of a new legal ecosystem beginning to take shape that very much fits the Susskind model.

Susskind's core framework is the movement of legal work along a five-part continuum, from bespoke to standardized to systematized to productized to commoditized.  Lawyers are most comfortable in the bespoke realm because it reflects our training and makes us indispensable to a resolution.  Yet the basic forces of capitalism pull the legal industry toward the commoditized end of the spectrum, because the bespoke method of production is incapable of keeping up with the needs of a complex, interconnected, and highly regulated global economy. 

According to Susskind, the sweet spot on the continuum is between systematized and productized, as this enables the legal solution provider to "make money while you sleep."  The cost of remaining in this position (that is, to avoid commoditization) is continuous innovation.  Suffice it to say, lawyers are unlikely to make the cut if they choose to hunker down in the artisan guild and eschew collaboration with other disciplines.

Below is a chart I have generated that attempts to summarize and describe the new legal ecosystem that is now taking shape.  The y-axis is the Heinz-Laumann two-hemisphere framework.  The x-axis is Susskind's five-part change continuum. 

[Chart: the emerging legal ecosystem]
Those of us who are trained as lawyers and have worked in law firms will have mental frames of reference that are on the left side of the green zone.  We tend to see things from the perspective of the artisan lawyer.  That is our training and socialization, and many of us have prospered as members of the artisan guild.

Conversely, at the commoditized end of the continuum, businesses organized and financed by nonlawyers have entered the legal industry in order to tap into a portion of the market that can no longer be cost-effectively serviced by licensed U.S. lawyers.  Yet, like most businesses, they are seeking ways to climb the value chain and grow into higher-margin work.  For example, United Lex is one of the leading legal process outsourcers (LPOs).  Although United Lex maintains a substantial workforce in India, the company is investing heavily in process, data analytics, and U.S. onshore facilities.  Why?  Because it wants to differentiate itself based on quality and overall value-add to clients, thus staving off competition from law firms and other LPOs.

In the green zone are several new clusters of companies:

  • NewLaw.  These are non-law firm legal service organizations that provide high-end services to highly sophisticated corporations.  They also rely heavily on process, technology, and data.  Their offerings are sometimes called "managed services." Novus Law, Axiom, Elevate, and Radiant Law are some of the leading companies in this space. 
  • TechLaw.  These companies would not be confused with law firms. They are primarily tool makers.  Their tools facilitate better, faster, or cheaper legal output.  kCura, mentioned above, works primarily in the e-discovery space.  Lex Machina provides analytic tools that inform the strategy and valuation of IP litigation cases.  KM Standards, Neota Logic, and Exemplify provide tools and platforms that facilitate transactional practice.  In the future, these companies may open the door to the standardization of a wide array of commercial transactions.  And standardization drives down transaction costs and increases legal certainty -- all good from the client's perspective.
  • PeopleLaw.  These companies are using innovative business models to tap into the latent people hemisphere.  Modria is a venture capital-financed online dispute resolution company with DNA that traces back to PayPal and the Harvard Negotiations Workshop.  See Would You Bet on the Future of Online Dispute Resolution (ODR)?  Legal Whiteboard, Oct 20, 2013.  LegalForce is already an online tour de force in trademarks -- a service virtually every small business needs.  The company is attempting to translate its brand loyalty in trademarks into a new consumer-friendly storefront experience.  Its first store is on University Avenue in the heart of Palo Alto.  LegalForce wants to be the virtual and physical portal that start-up entrepreneurs turn to when looking for legal advice.

Conclusion

When I write about the changes occurring in the legal marketplace, I worry that the substance and methodology of U.S. legal education provide an excellent education for a legal world that is gradually fading away, and very little preparation for the highly interdisciplinary legal world that is coming into being. 

Legal educators are fiduciaries to our students and institutions. It is our job to worry about them and for them and act accordingly.  Surely, the minimum acceptable response to the facts at hand is unease and a willingness to engage in deliberation and planning.  Although I agree we need to stay calm, I disagree that we need to carry on.  The great law schools of the 21st century will be those that adapt and change to keep pace with the legal needs of the citizenry and broader society.  And that task has barely begun.

[PDF version]

March 17, 2014 in Blog posts worth reading, Current events, Data on legal education, Data on the profession, Innovations in law, Innovations in legal education, New and Noteworthy, Scholarship on legal education, Scholarship on the legal profession, Structural change | Permalink | Comments (16)

Sunday, March 2, 2014

THOUGHTS ON FALL 2013 ENROLLMENT AND PROFILE DATA AMONG LAW SCHOOLS

DECLINING ENROLLMENT – Between fall 2012 and fall 2013, the 199 law schools in the 48 contiguous states and Hawaii (excluding the Puerto Rican schools) accredited by the ABA’s Section of Legal Education and Admissions to the Bar experienced the following first-year enrollment changes:

25 schools had a decline in first-year enrollment of 25% or more,

34 schools had a decline in first-year enrollment of 15%-24.99%,

44 schools had a decline in first-year enrollment of 5% to 14.99%,

62 schools had “flat” first-year enrollment of -4.99% to 4.99%,

19 schools had an increase in first-year enrollment of 5% to 14.99%, and

15 schools had an increase in first-year enrollment of 15% or more.

Overall, more than half (103) had a decrease in first-year enrollment of at least 5%, while roughly 17% (34) had an increase in first-year enrollment of at least 5%.

Across these 199 schools, first-year enrollment declined from 42,590 to 39,109, a decrease of 8.2%.  The average decline in first-year enrollment across U.S. News “tiers” of law schools was 2.6% among top 50 schools, 8.2% among schools ranked 51-99, 7.7% among schools ranked 100-144 and 7.9% among schools ranked alphabetically.

Between fall 2010 and fall 2013, the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (excluding Belmont, LaVerne, California-Irvine, and Massachusetts-Dartmouth) experienced the following first-year enrollment changes:

28 schools had a decline in first-year enrollment of 40% or more,

29 schools had a decline in first-year enrollment of 30% to 39.99%,

43 schools had a decline in first-year enrollment of 20% to 29.99%,

43 schools had a decline in first-year enrollment of 10% to 19.99%,

36 schools had a decline in first-year enrollment of 0% to 9.99%,

10 schools had an increase in first-year enrollment of 0.01% to 9.99%, and

6 schools had an increase in first-year enrollment of 10% or more.

Overall, more than half (100) had a decrease in first-year enrollment of at least 20%, while only roughly 8% (16) had any increase in first-year enrollment.

Across these 195 schools, first-year enrollment declined from 50,408 to 38,773, a drop of 23.1%.  The average decline in first-year enrollment across U.S. News “tiers” of law schools was 14.7% among top 50 schools, 22.5% among schools ranked 51-99, 22.8% among schools ranked 100-144, and 26.8% among schools ranked alphabetically. 
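The two headline decline figures can be reproduced with a few lines of Python.  The enrollment totals are from the post; the `pct_decline()` helper is my own.

```python
# Check the one-year and three-year first-year enrollment declines.
# Totals come from the post; pct_decline() is just a helper of mine.

def pct_decline(before, after):
    """Percentage decline from before to after, one decimal place."""
    return round(100 * (before - after) / before, 1)

print(pct_decline(42590, 39109))  # 8.2  (199 schools, fall 2012 -> fall 2013)
print(pct_decline(50408, 38773))  # 23.1 (195 schools, fall 2010 -> fall 2013)
```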

 

DECLINING PROFILES -- Across the 195 law schools in the 48 contiguous states and Hawaii fully accredited by the ABA’s Section of Legal Education and Admissions to the Bar as of 2010 (thus excluding Belmont, LaVerne, California-Irvine, and Massachusetts-Dartmouth), the entering first-year class average LSAT profile fell one point at all three measures between 2012 and 2013, from 159.6/157/153.5 to 158.6/156/152.5.  The entering first-year class average LSAT profile fell roughly two points at all three measures between 2010 and 2013, from 160.5/158.1/155.2 to 158.6/156/152.5. 

The average decline in median LSAT scores between 2012 and 2013 across U.S. News “tiers” of law schools was .98 among top 50 schools, 1.18 among schools ranked 51-99, .72 among schools ranked 100-144, and 1.13 among schools ranked alphabetically. 

Notably, 133 law schools saw a decline in their median LSAT between 2012 and 2013, with 80 down one point, 38 down two points, 12 down three points, one down four points, one down five points and one down six points, while 54 law schools were flat and 7 saw an increase in their median LSAT. 

In terms of schools experiencing “larger” declines in median LSAT scores between 2012 and 2013, five schools in the top 50 saw a three point decline in their median LSAT, five schools ranked 51-99 saw at least a three point decline (of which one was down four points), three schools ranked 100-144 saw a three point decline, and two schools ranked alphabetically saw large declines – one of five points and one of six points.

The average decline in median LSAT scores between 2010 and 2013 across U.S. News “tiers” of law schools was 1.54 among top 50 schools, 2.27 among schools ranked 51-99, 2.11 among schools ranked 100-144, and 2.79 among schools ranked alphabetically.  If one were to unpack the top 50 schools a little more, however, one would discover that the top 20 schools saw an average decline in their median LSAT of 1.05 between 2010 and 2013, while the bottom 15 schools in the top 50 saw an average decline in their median LSAT of 2.53.

In terms of schools experiencing “larger” declines in median LSAT scores between 2010 and 2013, three schools in the top 50 have seen declines of four or more points, nine schools ranked 51-99 have seen declines of four or more points, 11 schools ranked 100-144 have seen declines of four or more points and 17 schools ranked alphabetically have seen declines of four or more points. 

Comparing the 2012-13 data with the 2010-2013 data, one sees that lower-ranked schools have faced a more sustained challenge in managing profile over the last few years, while schools ranked in the top 50 or top 100 managed profile fairly well until fall 2013, when the shrinking pool of high-LSAT applicants began to erode the profiles of highly ranked schools as well.

The overall decline in the LSAT profile of first-year students can also be demonstrated with two other reference points.  In 2010, there were 74 law schools with a median LSAT of 160; in 2013, that number had fallen to 56.  At the other end of the spectrum, in 2010 there were only nine schools with a median LSAT of less than 150 and only one with a median LSAT of 145.  In 2013, the number of law schools with a median LSAT of less than 150 had more than tripled, to 32, while the number of law schools with a median LSAT of 145 or less stood at nine (with the low being 143).

 

CONCLUDING THOUGHTS – Over the last three years, few schools have had the luxury of being able to hold enrollment (or come close to holding enrollment) and being able to hold profile (or come close to holding profile).  Many schools have found themselves in a “pick your poison” scenario.  A number of schools have picked profile and made an effort to hold profile or come close to holding profile by absorbing significant declines in first-year enrollment (and the corresponding loss of revenue).  By contrast, a number of schools have picked enrollment and made an effort to hold enrollment or come close to holding enrollment (and maintaining revenue) but at the expense of absorbing a significant decline in LSAT profile.  Some schools, however, haven’t even been able to pick their poison.  For these schools, the last three years have presented something of a double whammy, as the schools have experienced both significant declines in first-year enrollment (and the corresponding loss of revenue) and significant declines in profile. 

March 2, 2014 in Data on legal education, Scholarship on legal education, Structural change | Permalink | Comments (0)

Saturday, March 1, 2014

Is the Employment Market for Law Graduates Going to be Improving?

Last fall, while making a presentation at the Midwest Association of Pre-Law Advisors Conference in St. Louis, I had the opportunity to respond to the question that is the title of this blog posting. 

Is the employment market for law graduates going to be improving?  My answer was, and is, almost certainly yes, although perhaps not immediately.

I write this to offer my perspective on the employment market for law graduates in the coming years.  A number of people have written on this topic in recent weeks and months.  Bernie Burk has a very thoughtful piece analyzing the changing job market over the last three decades.  In his concluding thoughts he suggests that the decline in the number of law students will mean that the job market will be improving.  Paula Young, Debby Merritt, Matt Leichter, and The National Jurist also have weighed in on this issue, with some disagreement about how to understand the “market” for law graduates in the coming years.  Whether and how to include JD Advantage jobs in the analysis is something that is frequently contested.  Bernie Burk does a thorough job analyzing the challenges of assessing whether JD Advantage jobs should be included within his definition of “law jobs” – “placements for which a law degree is typically a necessary or extremely valuable substantive preparation; or put slightly differently, jobs that a law degree typically makes a truly substantial and significant difference in obtaining or performing.”

To avoid some of these definitional challenges, this post will focus solely on the market for full-time, long-term Bar Passage Required jobs. Initially, it will analyze those jobs in relation to all graduates; then it will look more specifically at the percentage of graduates who are likely to be eligible for Bar Passage Required jobs for whom full-time, long-term Bar Passage Required jobs likely will be available, a point on which few others appear to have focused up until now.

Class of 2013 – Little if Any Good News is Likely

In the short term, for the Class of 2013, for which job results will be reported in the coming weeks, it would not be at all surprising to see little, if any, improvement in the employment results in terms of the percentage of graduates finding jobs classified as full-time, long-term Bar Passage Required jobs.

According to NALP’s data, there were 29,978 full-time, long-term Bar Passage Required jobs for 2007 graduates, a number which fell to 24,902 for 2011 graduates, and then rebounded to 26,876 for 2012 graduates, an increase of 1,974.   According to the ABA’s Employment Outcomes data, between 2011 and 2012, the number of full-time, long-term Bar Passage Required jobs grew from 24,149 to 26,066, an increase of 1,917.  (For this blog posting, I am not going to try to reconcile the slight differences in data between NALP and the ABA’s Employment Outcomes data.)

Unfortunately, however, according to the ABA's Employment Outcomes data, this growth in full-time, long-term Bar Passage Required jobs between 2011 and 2012 corresponded with a growth in the number of law graduates, from 43,979 to 46,364, an increase of 2,385.  Thus, even though the number of full-time, long-term Bar Passage Required jobs grew by 7.9%, the percentage of graduates in full-time, long-term Bar Passage Required jobs grew only slightly, from 54.9% to 56.2%.

Between 2012 and 2013 the number of full-time, long-term Bar Passage Required jobs may increase again, but the number of graduates also will be increasing, likely from 46,364 to roughly 47,250.  (For the last few years, the number of law school graduates has averaged roughly 90% of the number of first-year students who started law school three years previously.  With 52,500 first-year students in Fall 2010, there likely were roughly 47,250 May 2013 graduates on whom employment will be reported in the coming weeks.) 

If the number of full-time, long-term Bar Passage Required jobs for the 2013 graduates reported in the ABA Employment Outcomes data grows by roughly 1,000 to 27,000, an increase of nearly 4%, the percentage of graduates with such jobs would increase only slightly to 57.1%.  If the number of full-time, long-term Bar Passage Required jobs for the 2013 graduates grows only slightly, by roughly 500, to 26,500 (an increase of less than 2%), the percentage of graduates with such jobs will drop slightly, to 56.1%.  If the number of Bar Passage Required jobs is flat, at 26,000, the percentage of graduates with such jobs will drop a little more to 55%.  Between 2011 and 2013, the market might see graduates finding roughly 2,000 to 2,500 new full-time, long-term Bar Passage Required jobs, and yet still see only 55% to 57% of graduates in such jobs because of the growth in the number of graduates between 2011 and 2013.
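The three scenarios above reduce to simple division. A quick sketch (using the post's estimated graduate count of 47,250, not an official figure):

```python
# Class of 2013 scenarios: jobs divided by estimated graduates.
# 47,250 is the post's estimate (~90% of the 52,500 Fall 2010 first-years).
grads_2013 = 47250

scenarios = {
    "grows by ~1,000": 27000,
    "grows by ~500": 26500,
    "flat": 26000,
}

for label, jobs in scenarios.items():
    share = jobs / grads_2013
    print(f"Jobs {label}: {share:.1%} of graduates in FT/LT BPR jobs")
```

This reproduces the 57.1%, 56.1%, and 55% figures in the paragraph above.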

Classes of 2014, 2015, 2016, 2017 – An Improving Dynamic

What are the employment prospects for those currently in law school or considering starting law school in the fall of 2014?  They almost certainly will be getting better – not necessarily because there will be more jobs, but because there will be fewer graduates.

Indeed, to make this point, let’s assume that there is actually no further growth in full-time, long-term Bar Passage Required jobs between 2012 and 2017.  Assume the number of such jobs plateaus at 26,000 for graduates of the Class of 2013 and then stays at that level each year through 2017.  What percentage of law graduates over the next four years will have such jobs? 

According to the LSAC, "ABA First-Year Enrollment" has declined steadily from 2010 to the present, from 52,500 in 2010, to 48,700 in Fall 2011, to 44,500 in Fall 2012.  The ABA recently released the Fall 2013 enrollment summary noting that it had fallen to 39,675.   The LSAC's most recent Current Volume Summary, from February 21, 2014,  indicates that applicants to law school are down roughly 11% compared to last year.  Thus, it seems reasonable to project that first-year matriculants will decline again in Fall 2014.  If first-year enrollment falls by 5%, that would give us roughly 37,700 first-years.  If it falls by 10% once again, that would give us roughly 35,700 first-years.

With these estimates for the number of first-years, we can estimate the number of graduates (which, as noted above, has averaged roughly 90% of first-years for the last few years).  Even if the number of full-time, long-term Bar Passage Required jobs does not continue to rebound, but plateaus at 26,000, as the number of graduates declines over the next few years, the percentage of law graduates obtaining a full-time, long-term Bar Passage Required job, as shown in Table 1, will grow to between 77% and 81% by 2017 (depending upon first-year enrollment in fall 2014).

TABLE 1

Analysis of the Estimated Number of Full-Time, Long-Term Bar Passage Required Jobs as a Percentage of the Estimated Number of Law Graduates from 2012-2017

| Grad. Year | 2012 | 2013 | 2014 | 2015 | 2016 | 2017 (5% Dec.) | 2017 (10% Dec.) |
|---|---|---|---|---|---|---|---|
| 1st Yrs (3 Yrs Prior) | 51600 | 52500 | 48700 | 44500 | 39675 | 37700* | 35700* |
| Grads (90% of 1st Yrs) | 46364 | 47250* | 43830* | 40050* | 35708* | 33930* | 32130* |
| FT/LT BPR Jobs | 26066 | 26000* | 26000* | 26000* | 26000* | 26000* | 26000* |
| % of Grads in FT/LT BPR Jobs | 56% | 55%* | 59%* | 65%* | 73%* | 77%* | 81%* |

*Denotes estimated value.

An improvement in the percentage of law school graduates obtaining full-time, long-term Bar Passage Required jobs, from roughly 55% to between 77% and 81%, is indicative of an improving employment market for law school graduates.  Indeed, according to Bernie Burk’s analysis of the employment market over the last few decades, this rate of employment in full-time, long-term Bar Passage Required jobs would rival or exceed the high water mark for “Law Jobs” of roughly 77% that he identified for the graduates of 2005 to 2007.  (And for his purposes, “Law Jobs” included some JD Advantage jobs.)  Moreover, this projection assumes no growth in the number of full-time, long-term Bar Passage Required jobs; if there is even modest growth over the next few years, the percentages of graduates in these jobs would be even higher than reflected in Table 1.
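Under its stated assumptions (graduates at roughly 90% of first-years three years earlier, jobs held flat at 26,000, with 2012 using the actual ABA figures), Table 1's percentages can be reproduced with a few lines of arithmetic; all inputs are the post's estimates, and the final 10%-decline column works out to roughly 81%:

```python
# First-year enrollment by graduating class (2013-2017), per the post's
# figures; the two 2017 entries are the 5% and 10% decline scenarios.
first_years = {
    2013: 52500,
    2014: 48700,
    2015: 44500,
    2016: 39675,
    "2017 (5% dec.)": 37700,
    "2017 (10% dec.)": 35700,
}

jobs = 26000  # assumed flat number of FT/LT Bar Passage Required jobs

for year, fy in first_years.items():
    grads = round(fy * 0.90)  # graduates ~ 90% of first-years
    print(f"{year}: {grads} grads, {jobs / grads:.0%} in FT/LT BPR jobs")
```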

 Full-Time, Long-Term Bar Passage Required Jobs as a Percentage of Those Eligible for Such Positions by Virtue of Having Passed a Bar Exam

Even so, many may look at this and suggest the market remains less than robust, given that perhaps 19% to 23% of graduates in this “improved” market in 2017 still will not obtain full-time, long-term Bar Passage Required jobs.  Some compare the number of full-time, long-term Bar Passage Required jobs to the number of law school graduates to demonstrate that the employment market remains unsatisfactory.  But this may not be the most accurate way of thinking about the market, because not all graduates will be eligible for Bar Passage Required jobs.

Among those graduating from law schools accredited by the Section of Legal Education and Admissions to the Bar and taking a bar exam upon graduation, the National Conference of Bar Examiners indicates that over the last several years, on average, roughly 83% of graduates of ABA-accredited law schools pass the bar exam on their first attempt. 

To calculate the employment market for law graduates in the coming years who are eligible for full-time, long-term Bar Passage Required jobs, let’s assume that all law graduates actually want a full-time, long-term Bar Passage Required job and therefore take a July bar exam, and let’s assume that 83% of them pass the bar exam on their first attempt.  This should give us the maximum number of graduates eligible for full-time, long-term Bar Passage Required jobs 10 months after graduation (which will be the measuring point starting with the Class of 2014). 

Even if we assume no growth in the number of full-time, long-term Bar Passage Required jobs in the coming years and simply hold the number of such jobs at a constant 26,000, the decreasing number of law graduates will mean an even stronger market for those who will be eligible for such jobs by virtue of having passed the bar exam on their first attempt.  As Table 2 shows, the percentage of these eligible graduates for whom jobs should be available increases from nearly 70% in 2012 and 2013 to nearly 90% by 2016 and over 90% by 2017.

 TABLE 2

Analysis of the Estimated Number of Full-Time, Long-Term Bar Passage Required Jobs as a Percentage of the Estimated Number of Law Graduates Eligible for Bar Passage Required Jobs from 2012-2017 

| Graduating Year | 2012 | 2013 | 2014 | 2015 | 2016 | 2017 (5% Dec.) | 2017 (10% Dec.) |
|---|---|---|---|---|---|---|---|
| First-Year Enrollment | 51600 | 52500 | 48700 | 44500 | 39675 | 37700* | 35700* |
| Graduates (90% of First-Year Enrollment) | 46364 | 47250* | 43830* | 40050* | 35708* | 33930* | 32130* |
| 83% of Graduates (NCBE Avg. for First-Time Takers) | 38482* | 39218* | 36379* | 33242* | 29638* | 28162* | 26668* |
| FT/LT Bar Passage Jobs | 26066 | 26000* | 26000* | 26000* | 26000* | 26000* | 26000* |
| % of Graduates Who Might Pass the Bar for Whom FT/LT Bar Passage Jobs Likely Would Be Available | 68%* | 66%* | 71%* | 78%* | 88%* | 92%* | 97%* |

*Denotes estimated value.

Notably, these estimates probably overstate the number of graduates who will be eligible for Bar Passage Required jobs.  First, not all law school graduates want to take a bar exam as some conclude that they are not interested in practicing law as a licensed attorney.  Second, given the increasing number of law school matriculants with LSATs less than 150, one could anticipate a slightly higher rate of attrition such that fewer than 90% of matriculants graduate after three years.  Third, given the increasing number of law school matriculants with LSATs less than 150, one also could anticipate that the historical average bar passage rate of 83% might be too generous.  All of these points suggest that the number of graduates eligible for full-time, long-term Bar Passage Required jobs may decline between now and 2017 even more than is indicated in Table 2.    

Between 2012-2013 and 2016-2017, we will have gone from nearly seven full-time, long-term Bar Passage Required jobs for every ten graduates eligible for such positions (by virtue of having passed a bar exam) to nine or more such jobs for every ten eligible graduates. That strikes me as an improving employment market.
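The "seven per ten" and "nine or more per ten" figures follow directly from the assumptions in Table 2. A minimal sketch (the function name is mine; the 90% graduation and 83% first-time pass rates are the post's assumptions):

```python
def jobs_per_ten_eligible(first_years, pass_rate=0.83, jobs=26000):
    """FT/LT Bar Passage Required jobs per ten bar-eligible graduates."""
    grads = first_years * 0.90      # graduates ~ 90% of first-years
    eligible = grads * pass_rate    # first-time bar passers (NCBE avg.)
    return 10 * jobs / eligible

print(round(jobs_per_ten_eligible(52500), 1))  # Class of 2013 -> 6.6
print(round(jobs_per_ten_eligible(35700), 1))  # Class of 2017 (10% decline) -> 9.7
```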

Of course, this may not be good news for those who graduated in the last few years into one of the toughest markets in history.  It is not clear that this improving market will be improving for them.  But it also is not clear that this "excess capacity" will unduly constrain the opportunities available to law school graduates in the coming years.  This excess capacity already has been impacting the market, yet the number of full-time, long-term Bar Passage Required jobs obtained within nine months of graduation grew by nearly 2,000 between 2011 and 2012.  That is one reason I think the assumption of no further growth in full-time, long-term Bar Passage Required jobs is probably fairly conservative. 

In addition, this may not be good news for those who fail to pass the bar exam on their first try and may have to look for jobs that do not require bar passage.  While a significant percentage of these graduates will pass the bar exam on their second attempt and may eventually find employment in full-time, long-term Bar Passage Required positions, it may take several months longer than they had desired and may require that they pursue other employment, perhaps JD Advantage employment, during the intervening months.  

Even assuming a flat market for full-time, long-term Bar Passage Required jobs, the significant declines in first-year enrollment will mean a significant decline in the number of law school graduates in 2016 and 2017.  As a result, we should move from a market in 2012 in which slightly more than three in ten bar-eligible graduates could not find Bar Passage Required jobs to one in 2017 in which fewer than one in ten likely eligible graduates cannot find them.  While individual schools and local or regional markets may see more varied results on a "micro level," on a "macro level" this should be good news for current first-year students and students considering starting law school in the fall of 2014.

Whether this improving employment situation will be enough to reverse the decline in the number of applicants to law school remains to be seen.  While the future may be brightening, the "news" in the coming weeks will be the report on employment outcomes for 2013 graduates nine months after graduation.  As noted above, that report may be somewhat uninspiring, because any increase in the number of full-time, long-term Bar Passage Required jobs may be masked by the larger number of graduates in 2013 compared to 2012.  As a result, the "good news" message regarding future employment prospects may fail to gain traction, and potential applicants may remain reluctant to make the commitment of time and money that law school requires.

March 1, 2014 in Data on legal education, Data on the profession, Structural change

Tuesday, February 4, 2014

If We Make Legal Education More Experiential, Would it Really Matter?

I think the answer is yes.  But, unfortunately, in virtually all of the debate surrounding legal education, there is a tremendous lack of clarity and precision about how we assess improvements in quality.  And equally relevant, if a gain is real, was it worth the cost?

The purpose of this essay is to chip away at this serious conceptual gap.  Until this gap is filled, experiential education will fall significantly short of its potential. 

Is Experiential Legal Education Better?  And if so, at What Cost?

Many legal educators believe that if we had more clinics, externships, and skills courses in law school, legal education would be better.  Why?  Because this more diversified curriculum would become more "experiential."  

Inside the legal education echo chamber, we often accept this claim as self-evident. The logic runs something like this.  A competent lawyer needs domain knowledge + practical skills + a fiduciary disposition (i.e., the lawyer’s needs are subservient to the needs of clients and the rule of law).  Since practical skills—and some would argue, a fiduciary disposition—cannot be effectively acquired through traditional Socratic or lecture teaching methods, the ostensible logic is that schools become better by embracing the "learning-by-doing" experiential approach.

That may be true.  I would bet on it.  But the per-unit cost of legal education is probably going up as well.  So, have we really created a viable and sustainable long-term improvement to legal education?  

In my mind, the questions we should be asking instead are the following:  (1) Among experiential teaching methods, which ones are the most effective at accelerating professional development?  And (2) among these options, how much does each cost to operate?  Quality and cost must be assessed simultaneously.  After they are evaluated, then we will be able to make choices and tradeoffs. 

Let's start with quality, which I define as moving lawyers toward their peak effectiveness potential as rapidly and cost-effectively as possible. This is an education design problem, as we are trying to find the right combination of education (building domain knowledge) and experience (acquiring and honing skills through practice).  There is also likely to be an optimal way to sequence the various educational and experiential steps. 

Creating Compelling Evidence of Educational Quality

We legal educators have many ideas on how to improve educational quality, but we make no real progress if employers and students remain unconvinced.  Can it be shown that because of a specific type of experiential curriculum at School X, its graduates are, during the first few years of practice, more capable lawyers than graduates of School Y?  

[Side bar:  If you are skeptical of this market test, it is worth noting that it was the preferences of law firm employers that gave rise to the existing national law school hierarchy.  It happened about 100 years ago, when a handful of law schools adopted the case method, required undergraduate education as a prerequisite to admission, and hired scholars as teachers.  As a general matter, this was a far better education than a practitioner reading lecture notes at the local YMCA.  See William Henderson, "Successful Lawyer Skills and Behaviors," in Essential Qualities of the Professional Lawyer ch. 5 (P. Haskins ed., 2013).]

If a law school can produce, on balance, a better caliber of graduates than its competitors, then we are getting somewhere.  As this information diffuses, employers (who want lawyers who make their lives easier) will prefer the law schools with the better graduates, and law students (who want more and better career options) will follow suit.  Until we have this level of conceptual and empirical clarity, we might as well be debating art or literature.

If students and employers are responding to particular curricula, it is reasonable to assume they are responding to perceived value (i.e., quality as a function of price).   I believe there are three steps needed to create a legal education curriculum that truly moves the market.

1. Clarity on Goals.  We need to understand the knowledge, skills, and behaviors that are highly prized by legal and non-legal employers.  Truth be told, this is tacit knowledge in most workplaces. It is hard intellectual work to translate tacit knowledge into something explicit that can be communicated and taught. But we are educators -- that is our job!  If we think employers are missing something essential, we can add in additional factors. That's our job, too.

2. Designing and Building the Program. Working backwards from our goals, let's design and build curricula that will, overall, accelerate development toward those goals.  This is harder and more rigorous than lesson planning from a casebook.

3. Communicating Value to the Market.  If our program is indeed better, employers and students need to know it.  This requires a crisp, accurate message, a receptive audience, and sustained planning and effort.  That said, if our program truly is producing more effective lawyers, it logically follows that our graduates (i.e., the more effective lawyers) will be the most effective way to communicate that message. 

Regarding point #3, in simple, practical terms, how would this work?  

During the 1L year, we show our law students the roadmap we have developed (step #2) and spend the next two years filling in the knowledge, skills, and behaviors needed to achieve their career goals.  This professional development process would be documented through a portfolio of work.  This would enable students to communicate specific examples of initiative, collaborative learning, problem-solving, or a fiduciary disposition, etc., developed during law school.  Students would also know their weaknesses, and have a clear plan for their future professional development. In a word, they'd stand out from other law graduates because, as a group, they would be much more intentional and self-directed (i.e., they'd know where they are going and how to get there). 

With such a curriculum in place, our law school would collaborate with employers to assess the performance of our graduates.  By implication, the reference point for assessing quality would be graduates of other law schools.  When our graduates fare better, future graduates will be more heavily recruited.  Why?  Because an employer hiring from our school would be more likely to get a lawyer who helps peers and clients while adding immediate enterprise value.

I suspect that many of my legal academic colleagues would argue the best law schools are not trade schools -- I 100% agree.  But I am not talking about a trade school model.  Rather, a world-class law school creates skilled problem-solvers who combine theory with practice and a fiduciary disposition. Graduates of a world-class law school would be reliably smart, competent, and trustworthy.  This is a very difficult endeavor. It takes time, planning, collaboration, creativity and hard work.  But the benefits are personal, organizational, and societal.  

At a practical level, I think few law schools have targeted this goal with a full, unbridled institutional commitment.  But the opportunity exists.

Applied Research 

When I got tenure in 2009, I decided that I was going to spend the next several years doing applied research. I am a fact guy.  Rather than argue that something is, or is not, better, I prefer to spend my time and effort gathering evidence and following the data.  I am also a practical guy.  The world is headed in this direction, thanks to the ubiquity of data in the digital age.  And, on balance, that is a good thing because it has the potential to reduce conflict. 

I have pursued applied work in two ways:  (1) building stuff (curricula, selection systems, lawyer development tools, datasets for making strategic decisions, etc.) and assessing how well it works, and (2) observing and measuring the work of others.

A Law School Curriculum Worth Measuring

A couple of years ago, a unique applied research opportunity fell into my lap.  I had a series of lengthy discussions on the future of legal education with Emily Spieler, who was then serving as dean of Northeastern University School of Law in Boston, a position she held for over a decade.  One of the raps on legal education is that it is more alike than it is different.  In fact, this very point was recently made by the ABA Task Force on Legal Education.  See ABA Task Force on the Future of Legal Education, Report and Recommendations (Jan. 2014) at 2.

Emily, in contrast, said her school was unique -- that the curriculum better prepared students for practice and enabled them to make better career planning decisions.  Also, Emily stated that Northeastern students were more sensitized to the needs of clients and the privilege and burden of being a lawyer--specifically, that Northeastern grads become aware, before graduation, that their own lack of competency and diligence has real-world consequences for real-world people. And that reality weighed on students' minds.  

Tall claims.  But if Northeastern could deliver those outcomes more effectively than the traditional unstructured law school curriculum, I wanted to know about it.  

On a purely structural level, Northeastern Law is definitely unique.  Most law schools are organized around either quarters (the University of Chicago, my alma mater) or semesters (Indiana University, where I teach).  Northeastern, however, has both.  The 1L curriculum at Northeastern follows the traditional two-semester model.  But after that, the school flips to quarters -- one quarter in law school, then one quarter in a cooperative placement with a legal employer, such as a judge, a prosecutor’s office, a law firm, a corporate legal department, or a public interest organization.  

This classroom/coop sequence occurs four times over eight quarters.  Because the cooperative placement is not viewed as part of Northeastern's ABA-required course work -- all the contact hours are packed into two 1L semesters and four 2L/3L quarters -- students can be paid during cooperative placements.  And in any given semester, roughly 30 to 40% are getting paid. 

This system has been up and running for 45 years--over 5,000 students have become lawyers through this program.  What an amazing research opportunity! 

Now imagine the faculty meeting where the law professors get together to deliberate over whether to adopt the Northeastern model.  At Northeastern, "summer" means summer quarter, not summer vacation.  

How did this unique curricular structure come into being?  It is quite an interesting story.  During the 1950s, the law school at Northeastern was shuttered.  Later, reflecting the spirit of the times, a group of Northeastern law alumni and young lawyers who were skeptical of their own legal education (at elite national law schools) petitioned Northeastern to reopen the law school and feature a more progressive, forward-looking curriculum.  The university administration agreed to reopen the law school on the condition that it adopt the signature cooperative education model.  So this crucial decision was essentially made at the rebirth of the law school over four decades ago.  Once up and running, Northeastern Law implemented other innovations, such as its narrative grading policy (no letter grades and no GPA), adopted in order to mitigate competition and encourage a focus on collaboration and skills development. 

The Outcomes Assessment Project

Back in 2011, my conversations with Emily Spieler eventually led me to make a two-day pilgrimage to Boston to talk with Northeastern Law faculty, students, administrators, and coop employers.  Suffice it to say, I was surprised by what I witnessed -- a truly differentiated legal education with a substantial alumni/ae base spanning 45 years.  

That pilgrimage eventually led to my involvement in Northeastern Law's Outcomes Assessment Project (OAP), which is something akin to the After the JD Project, but limited in scope to Northeastern -- although Northeastern will provide all of the project tools and templates to other law schools interested in studying their own alumni.  From the outset, the OAP has been set up to scale to other law schools. 

There are lots of tricky methodological issues with Northeastern.  For example,

  • It has a longstanding public interest tradition; Northeastern Law is overrepresented in government service, public interest, and non-profit sectors (including a sizeable contingent of law professors and legal clinicians). See Research Bulletin No 1.
  • Its student body was over 50% female almost from the outset, nearly 20 years before legal education as a whole. 
  • Because of its progressive roots, GLBT law students have long been drawn to Northeastern Law -- again, nearly two decades before it was deemed safe to be out.

Because of this distinctive profile, we have to worry that any differences in graduates are primarily due to a selection effect (who applied and enrolled) versus a treatment effect (they got a different type of education).  That said, the admissions data show that Northeastern Law students are, like other law students, strongly influenced by the US News rankings.   If a student gets admitted to Northeastern Law and BC, BU, or Harvard Law, Northeastern seldom wins.  

Over the coming months, I am going to use OAP data to attempt to bring some analytical and empirical clarity to the questions surrounding experiential education.   Preliminary data from our Research Bulletin No 3 suggest that the coop program does remarkably well in developing the three apprenticeships identified by the Carnegie Report.  More on that later. 

Print version of this essay at JD Supra.

February 4, 2014 in Data on legal education, Important research, Innovations in legal education, Scholarship on legal education