Thursday, October 17, 2013
Trends in LSAT Profiles of Applicants and Matriculants
In looking at trends over the last 12 years, there are two relevant time frames due to changes in how LSAC reported data. Between 2002 and 2009, the LSAC’s annual National Decision Profiles were based on the average LSAT scores of applicants and matriculants. From 2010 to the present, the National Decision Profiles were based on the highest LSAT scores of applicants and matriculants. This post compares trends in LSAT profiles between 2002 and 2009 with trends between 2010 and 2013, noting that the latter period not only has seen a decline in enrollment but also has seen a significant weakening of the overall LSAT profile of first-years.
Changes in LSAT Profiles from 2002-2009 Using Average LSAT
The following chart shows the difference in LSAT composition of first-years in three cycles between 2001-02 and 2008-09.
Matriculants by LSAT Category (Reflecting Average LSAT) 2002-2009
165+ 150-164 <150 Total
2001-02 5,889 30,100 9,097 45,086
2004-05 7,447 32,007 6,036 45,490
2008-09 7,652 31,991 8,943 48,586
In the three years between 2002 and 2005, applications grew by roughly 5,000, to roughly 95,000, with growth among those with an average LSAT of 165+ and an average LSAT of 150-164, and a modest decline among those with an average LSAT of <150. Law schools matriculated only 400 more first-years in 2005 than in 2002, but there were roughly 3,050 fewer first-year students with average LSATs <150, with 1,900 more first-years with average LSATs of 150-164 and roughly 1,550 more with average LSATs of 165+. This three-year period saw a strengthening of the LSAT profile of first-year students.
Four years later, with an applicant pool that had declined to nearly 87,000, however, law schools enrolled over 3,000 additional first-year students, 2,900 of whom had average LSATs of <150. Virtually all of the growth in first-years between 2005 and 2009, therefore, was comprised of students at the lower end of the LSAT profile.
Nonetheless, in comparison with the 2002 first-years, the 2009 first-years included slightly fewer students with an average LSAT of <150 (down 154 – 1.7%) and larger populations of students with average LSATs of 165+ (up 1,763 – nearly 30% more) and with average LSATs of 150-164 (up 1,891 – or roughly 6.3% more). In 2009, therefore, the average LSAT profile of all first-years, while less robust than in 2005, was still more robust than in 2002.
Between 2004 and 2008, the ABA approved nine new law schools (with fall 2009 first-year enrollment in parentheses) – Atlanta’s John Marshall (211) and Western State (188) in 2005, Liberty (119), Faulkner (150) and Charleston (241) in 2006, Phoenix (272) in 2007, and Elon (121), Drexel (156) and Charlotte (276) in 2008. The first-year enrollment of these nine schools in Fall 2009 totaled 1,734, roughly 60% of the growth in matriculants with average LSATs of < 150 between 2005 and 2009. While many of the first-year students at these schools had LSATs of greater than 150, these schools took students who might have gone to other schools and increased the overall demand for applicants with average LSATs of <150.
Changes in LSAT Profiles from 2010-2013
The following chart focuses on the last three admissions cycles and the current admission cycle, covering the period in which the LSAC National Decision Profiles were based on each applicant’s highest LSAT score.
Applicants and Matriculants Across Three LSAT Categories Based on Highest LSAT from 2010 to 2013
Adm. Cycle    Total Apps.   Total Mat.*   165+ Apps.   165+ Mat.   150-164 Apps.   150-164 Mat.   <150 Apps.   <150 Mat.
Fall 2010     87,912        49,719        12,177       9,477       47,722          32,862         26,548       7,013
Fall 2011     78,474        45,616        11,190       8,952       41,435          29,220         24,396       7,101
Fall 2012     67,925        41,422        9,196        7,571       34,653          25,425         22,089       7,906
Fall 2013**   59,426        38,900        7,496        6,300       30,263          24,000         20,569       8,200
*Note that the total matriculants number is greater than the sum of the matriculants across the three categories in any given year because the total matriculants number includes non-standard test-takers and those without an LSAT.
**The Fall 2013 numbers represent estimates based on the number of applicants in each category and an assumption that 2013 saw another slight increase in the percentage of applicants from each LSAT category who matriculated (consistent with increases in the two previous years in response to the decreasing applicant pool).
During this period, the number of applicants declined by 28,000, or over 32%, but the number of applicants with a highest LSAT of 165+ declined by 38%, and the number with a highest LSAT of 150-164 declined by 36.5%, while the number with a highest LSAT of <150 declined by only 22.5%. Thus, the pool of applicants is not only smaller in the 2012-13 admissions cycle as compared to 2009-10, but it is “weaker” in terms of the LSAT profile.
The number of matriculants in the top two LSAT categories also declined significantly between Fall 2010 and Fall 2012, while the number of matriculants in the bottom LSAT category actually grew.
The number of matriculants whose highest LSAT score was 165+ fell from 9,477 in 2010 to 7,571 in 2012, a decline of over 20%, while the percentage of applicants in this category who became matriculants increased from 78% to 80% to 82% over that period. If we estimate that 84% of the 2013 applicants with a highest LSAT of 165+ matriculate, then we can anticipate roughly 6,300 matriculants for Fall 2013 with a highest LSAT of 165+, a drop of nearly 33% since 2010.
The number of matriculants whose highest LSAT score was 150-164 fell from 32,862 in 2010 to 25,425 in 2012, a decline of nearly 23%, while the percentage of applicants in this category who became matriculants increased from 69% to 70.5% to 73% over that period. If we estimate that roughly 79% of the applicants with a highest LSAT of 150-164 matriculate, then we can anticipate roughly 24,000 matriculants for Fall 2013 with an LSAT of 150-164, a decline of roughly 27% since Fall 2010.
Meanwhile, the number of matriculants whose highest LSAT score was <150 grew from roughly 7,000 to over 7,900, an increase of roughly 13%, while the percentage of applicants in this category who became matriculants increased from 26% to 29% to 36% over that period. If we estimate that roughly 40% of the applicants with a highest LSAT of <150 matriculate, then we can anticipate roughly 8,200 matriculants with an LSAT of <150 for Fall 2013, an increase of roughly 17% since Fall 2010.
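The Fall 2013 estimates above are simple arithmetic: the applicant count in each LSAT band times an assumed matriculation rate for that band. A minimal sketch (the rate values are the assumptions stated in the text, not LSAC data):

```python
# Estimated Fall 2013 matriculants: applicants per LSAT band times the
# assumed matriculation ("yield") rate for that band, rounded to the
# nearest hundred.
applicants_2013 = {"165+": 7496, "150-164": 30263, "<150": 20569}
assumed_yield = {"165+": 0.84, "150-164": 0.79, "<150": 0.40}  # assumptions from the text

estimated_matriculants = {
    band: int(round(applicants_2013[band] * assumed_yield[band], -2))
    for band in applicants_2013
}
print(estimated_matriculants)
# → {'165+': 6300, '150-164': 23900, '<150': 8200}, close to the
#   ~6,300 / ~24,000 / ~8,200 figures used above
```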
Percentage of First-Years from Each LSAT Category Using Highest LSAT -- 2010-2013*
        165+     150-164   <150
2010    19.1%    66.1%     14.1%
2011    19.6%    64.1%     15.6%
2012    18.3%    61.4%     19.1%
2013    16.2%    61.7%     21.1%
*The sum of the percentages in any given year will be slightly less than 100% because the denominator -- total matriculants -- includes matriculants with non-standard LSAT and those with no LSAT.
This table shows that, if my estimates for 2013 are roughly accurate, the percentage of matriculants whose highest LSAT score was 165+ declined between Fall 2010 and Fall 2013 by roughly 16% (from 19% to 16%), the percentage whose highest LSAT was 150-164 declined by roughly 6% (from 66% to 62%), and the percentage whose highest LSAT was <150 increased by roughly 50% (from 14% to 21%).
Adjusting from Highest LSAT to Average LSAT to Compare 2002 and 2013
The change in the 2009-10 admissions cycle from average LSAT to highest LSAT resulted in an increase of roughly 1,800 in matriculants with scores of 165+ between Fall 2009 and Fall 2010. Given the modest increase in the number of matriculants with an average LSAT of 165+ between 2008 and 2009 (roughly 600, from 7,023 to 7,652), the challenging economic environment at the time, and the continued growth in applications between 2009 and 2010, it might be fair to assume another modest increase in matriculants with an average LSAT of 165+ between 2009 and 2010. Assume, then, that of the 1,800 additional matriculants with scores of 165+, 400 would have fallen in that category even under the average-LSAT measure. To estimate the number of matriculants with an average LSAT of 165+ in 2010, it might make sense to subtract 1,400 from the number of matriculants with a highest LSAT of 165+ in 2010, and then for each of the next three years apply the same percentage decline reflected in the highest-LSAT 165+ numbers over those three years.
The change to highest LSAT rather than average LSAT also resulted in a drop of roughly 1,900 in the number of matriculants with an LSAT <150 between 2009 and 2010. Notably, the number of applicants and matriculants with an average LSAT <150 had grown slightly between 2007 and 2009 (applicants from 29,123 to 29,926, matriculants from 7,013 to 7,906). Nonetheless, to err on the conservative side, assume that the number of matriculants with an average LSAT <150 actually declined modestly from Fall 2009 to Fall 2010 rather than continuing to increase, falling by roughly 5%, or 400 (rather than 1,900). To estimate the number of matriculants with an average LSAT of <150 in Fall 2010, then, we would add roughly 1,500 matriculants to the number with a highest LSAT of <150, and then for each of the next three years apply the same percentage increase reflected in the highest-LSAT <150 numbers over those three years.
Using these assumptions, the estimated number of first-years with an average LSAT of 165+ would fall to roughly 5,400 as of Fall 2013, while the estimated number of first-years with an average LSAT of <150 would rise to over 9,800 in Fall 2013.
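The two adjustments just described can be sketched as a small calculation. The 1,400 and 1,500 offsets and the carry-forward of the highest-LSAT trend are the assumptions laid out above; the 2013 highest-LSAT figures are the estimates from the earlier table:

```python
# Estimate average-LSAT counts for 2013 from highest-LSAT counts:
# apply the assumed 2010 offset, then scale by the 2010-to-2013
# percentage change observed in the highest-LSAT numbers.
highest_2010 = {"165+": 9477, "<150": 7013}
highest_2013 = {"165+": 6300, "<150": 8200}   # estimated above
offset_2010 = {"165+": -1400, "<150": 1500}   # assumed highest-vs-average gap

avg_2013_estimate = {}
for band in highest_2010:
    avg_2010 = highest_2010[band] + offset_2010[band]
    trend = highest_2013[band] / highest_2010[band]  # 2010→2013 change
    avg_2013_estimate[band] = int(avg_2010 * trend)

print(avg_2013_estimate)
# → roughly 5,400 with an average LSAT of 165+ and roughly 9,950 with <150
```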
If the estimates above are close to accurate, then the number of Fall 2013 matriculants with an average LSAT score of 165+ represents roughly 14% of Fall 2013 matriculants (a slightly higher percentage than in Fall 2002), while the number of Fall 2013 matriculants with an average LSAT of <150 represents over 25% of Fall 2013 matriculants (a much higher percentage than in Fall 2002). The following chart shows the percentage of matriculants for the period from 2002-2013 taking into account the estimates set forth in the preceding paragraph regarding the number of matriculants with an average LSAT in each range over the period from 2010-2013.
This graph shows that the percentage of matriculants with an average LSAT of 165+ has varied between roughly 13% and roughly 17% over the period from 2002-2013, and appears to have returned in Fall 2013 to a percentage only slightly higher than where it was in Fall 2002. By contrast, this chart also shows that the percentage of matriculants with an average LSAT of <150 had varied between roughly 13% and roughly 19% until the Fall 2012 and Fall 2013 groups of matriculants, when the percentages increased to roughly 22% (in 2012) and over 25% (in 2013). While this graph does not include the percentage of matriculants with average LSATs of 150-164, one can infer that percentage as the difference between 100% and the sum of the 165+ percentage and the <150 percentage. For the period between 2002 and 2011, this generally hovered between 65% and 70%, but in the last two years it has fallen closer to 60%.
This shift in LSAT profile is further evidenced by changes in LSAT profiles among first-year entering classes between 2010 and 2013. For Fall 2010, there were only nine law schools with a median LSAT of 149 or lower (using highest LSAT for reporting purposes). For Fall 2011, there were 14 law schools with a median LSAT of 149 or lower. For Fall 2012, there were 21 law schools with a median LSAT of 149 or lower. That number may grow to nearly 30 when official data is published next spring on the Fall 2013 entering class.
If one uses the LSAT profile as an indicator of the “strength” of a given class of first-year students, and uses the framework set forth above for looking at the LSAT profile, then in the last three years we not only have seen first-year enrollment shrink by roughly 10,000 students, but also have seen a significant “weakening” of the LSAT profile. In terms of LSAT profile, the Fall 2013 entering class is almost certainly the weakest of any class going back to Fall 2002. This may impact the classroom experience at some law schools and may impact bar passage results when the Fall 2013 entering class graduates in 2016.
Why the Differential Response to Market Signals by Different Populations of Prospective Law Students?
What might explain the extent to which different populations of prospective law students have responded to market signals in such different ways, with those from elite colleges and universities and those with higher LSATs turning away from law school more than those from less elite colleges and universities and those with lower LSATs? In Part Three I will explore some possible explanations.
Tuesday, October 15, 2013
Below is a job posting for a new type of job called a "legal solutions architect."
The job post just appeared on the website of Seyfarth Shaw, a large law firm based in Chicago. Seyfarth was one of the first to embrace the movement toward technology and process. See Six Sigma at Seyfarth Shaw, Legal Professions Blog, April 14, 2010.
Before getting to the text of the ad, a few observations about what this posting tells us about legal education and the emerging legal job market:
- This is a pure JD-advantaged job. "Juris Doctor or MBA with legal industry experience strongly preferred" (emphasis in original). It is a full-time, long-term job in downtown Chicago. It is not reviewing documents. This is a good professional job doing very sophisticated and challenging work.
- The job is not partner-track. But in terms of economic potential and job security, does that matter? In the years to come, folks who understand the overlay between law, technology, and process are going to be in great demand and have a lot of options.
- Undergraduate education matters, but the majors are far from typical among traditional law students: finance, business administration, computer science, or "other technical discipline."
- It is easier to get this job if an applicant has familiarity with "extranets, intranets, document assembly, enterprise search, relational databases and workflow." Also, it is "a plus" to have "familiarity with Agile and Scrum" [two software development methodologies]. We don't teach any of this stuff in law school. Perhaps we should.
- The required skills are a blend of technical skills and knowledge plus higher-order professional abilities that, frankly, are not explicitly taught in law school. Law schools need to take notice, as this is an order any decent professional school should be able to fill.
Now the actual job posting:
Legal Solutions Architect
Seyfarth Shaw is one of the most progressive, forward-thinking law firms in the world. Seyfarth’s commitment to delivering legal services in a new way through its SeyfarthLean program - with an emphasis on value and continuous improvement - has been praised by the Association of Corporate Counsel (ACC) as being “five years ahead of every other AmLaw 200 firm.”
Legal Solutions Architects anticipate, identify, sell and drive innovative business solutions. Through an understanding of technology, knowledge management, business analysis, process improvement and project management, this role provides solutions that enhance the client experience. These multidisciplinary resources are aligned with Firm strategy and play an important role in driving the Firm’s innovative approach to the practice of law and the delivery of legal services.
This position will report to the Director of the Legal Technology Innovations Office. Seyfarth Shaw recently received awards for 2013 Innovative Law Firm of the Year and Innovative Project of the Year, and the efforts of the Legal Technology Innovations Office played a significant role in earning those recognitions.
- Partner with clients, Seyfarth legal teams and legal project managers to enhance the delivery and effectiveness of services provided within legal engagements
- Translate stated and inferred needs of clients and attorneys into specific technologies and methods
- Synthesize the needs of multiple engagements and create requirements for systematic solutions that underpin Seyfarth’s varied legal practices
- Team with the Application Development Group to design and plan for custom solutions and oversee the construction and implementation of these systems
- Manage multiple projects concurrently, juggling priorities, deadlines and essential duties for each project
- Collaborate with other Firm departments, including Legal Project Management Office, Practice Management, Finance, Marketing and Professional Development to provide comprehensive solutions
- Act as an effective change manager – keeping client and Firm culture, group behavior and individual habits in mind in order to best circumnavigate roadblocks and pitfalls for solution adoption
- Provide presentations to individuals, small groups and large audiences of clients and Seyfarth attorneys in a persuasive and encouraging manner
- Contribute to continuous improvement, promote the use of technology solutions and help improve the awareness of the impact of the solutions on the business
- Perform vendor due diligence and serve as a point of contact for third-party technologies leveraged by the Firm
- Conduct market, external and internal research and convey results to forward assigned projects and to aid projects led by teammates, other groups and other departments
- Proactively research and maintain knowledge of emerging technologies and service delivery models and possible applications to the business
- Highly motivated self-starter with an entrepreneurial bent
- Uses intelligence, creativity and persistence to solve varied, non-routine problems
- Possesses an understanding of knowledge management, process improvement and legal project management and an appreciation of the benefits to law firms employing these approaches
- Passion for legal technology, including technical platforms, specific technical applications and their impact on the practice of law
- Keen grasp of project management, flexible in project execution and able to meet aggressive deadlines
- Strong business analysis approach
- Visualizes how raw data can be converted into useful information for client and Firm decision-makers
- Pays attention to detail but still maintains focus on the bigger picture
- Comfortable working both independently and in diverse teams
- Excellent written and verbal communicator that is able to distill complex concepts into simple messages
- Familiar with the software development cycle
- Capable of managing and motivating up, down and across the organization
- Appreciation for user interface and user experience design
- Embraces change and seeks to create order from chaos
- Bachelor’s degree, preferably in finance, business administration, computer science or other technical discipline
- Juris Doctor or MBA with legal industry experience strongly preferred
- Experience working within a large law firm preferred but not required
- Familiarity with extranets, intranets, document assembly, enterprise search, relational databases and workflow preferred
- Familiarity with Agile and Scrum a plus
Seyfarth Shaw is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the employment process, please call (312) 460-6545 and let us know the nature of your request and your contact information. We offer an outstanding benefit package which includes: medical/dental, 401k with employer contribution; life insurance; transportation fringe benefit program; generous paid time off policy; and long-term and short-term disability policies. Equal Opportunity Employer M/F/D/V
Friday, October 11, 2013
Analysis of Differential Declines in Law School Applicants Among Top-240 Feeder Schools
Some people recently have noted the decline in applications to law school from graduates of relatively elite colleges and universities - here and here. This suggests that different populations of potential applicants to law school are responding differently to market signals about the cost of legal education and the diminished employment prospects for law school graduates in recent years. In this blog posting, I analyze the changes in applications among the LSAC's Top 240 Feeder Schools between 2010 and 2012, documenting the extent to which the response to market signals about legal education has been different among graduates of elite colleges and universities when compared with graduates of less elite colleges and universities. In Part Two, I will look at a different set of data regarding changes in LSAT profiles of applicants. In Part Three, I will offer some possible explanations for the different responses to market signals among different groups of applicants.
Between 2010 and 2012, the total number of applicants from the Top 240 Feeder Schools fell from 55,818 to 42,825. In both years, the Top 240 Feeder Schools were responsible for roughly 63% of the total pool of applicants (63.5% of 87,900 in 2010 and 63.1% of 67,900 in 2012). But the decline in applications was not uniform across all of the Top 240 Feeder Schools. There are a few different ways one can look at this information to get a sense of the different responses among different populations of potential applicants.
Differential Declines Among Feeder Schools with Law Schools Ranked in Different Tiers
First, one can look at declines across the Top 240 Feeder Schools that have law schools.
One might surmise that potential applicants who are graduates of colleges and universities with a law school might be particularly well aware of the increasing costs of legal education and the challenging employment environment for recent law school graduates and assume that feeder schools with law schools would generally see similar declines in applications. In fact, however, the percentage decline in applications between 2010 and 2012 varied significantly by the ranking of the law school at the feeder school.
Among feeder schools with law schools ranked between 1-50 in the most recent USNews rankings, the average percentage decline in applicants between Fall 2010 and Fall 2012 was 28.08%. Among feeder schools with law schools ranked between 51-100, the average percentage decline in applicants between Fall 2010 and Fall 2012 was 20.27%. Among feeder schools with law schools ranked between 101-146, the average percentage decline in applicants was 18.14%. But among feeder schools with law schools that are ranked alphabetically, the average percentage decline in applicants was only 3.31%.
Given that most of the top-ranked law schools are at colleges and universities that are themselves considered elite, and most of the alphabetically ranked law schools are at colleges and universities that are not, this analysis suggests that graduates of elite colleges and universities are responding to the market signals regarding legal education differently than graduates of less elite colleges and universities. (This may seem particularly paradoxical: the percentage decline in applicants generally is greatest at colleges and universities with more highly ranked law schools, whose graduates generally experience more promising employment outcomes, and lowest at colleges and universities with less highly ranked law schools, whose graduates generally experience less promising employment outcomes.)
Comparisons of Outlier Schools – Those Schools More than One Standard Deviation from the Mean
Second, one can look at “outlier” schools and see how negative outliers compare to positive outliers. The average percentage decline in applicants across the Top 240 Feeder Schools between 2010 and 2012 was 19.76%. The standard deviation was 18.67%. How do those schools more than one standard deviation from the mean compare with each other?
There are a total of 13 schools that saw a decline in applicants between 2010 and 2012 putting them below the mean by more than one standard deviation – schools with a decline in applications greater than 38.44%. There are a total of 26 schools that saw an increase in applications or such a modest decline in applications that their increase/decline was more than one standard deviation above the mean – a decline of less than 1.09% or an increase. How do these schools compare?
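The cutoffs in the preceding paragraph follow directly from the reported mean and standard deviation (the small difference from the 38.44% figure above is rounding):

```python
# One-standard-deviation outlier cutoffs for the 2010-2012 percentage
# decline in applicants across the Top 240 Feeder Schools.
mean_decline = 19.76  # percent, as reported above
std_dev = 18.67       # percent, as reported above

negative_outlier = round(mean_decline + std_dev, 2)  # declines worse than this
positive_outlier = round(mean_decline - std_dev, 2)  # declines smaller than this, or increases

print(negative_outlier, positive_outlier)  # → 38.43 1.09
```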
Eight of the 13 feeder schools that saw the most significant declines in applications had a law school; the average rank of those eight law schools was 69. (These schools include NYU (6), Virginia (7), Cornell (13), George Mason (41), Marquette (94), Akron (119), Loyola (New Orleans) (126), and Univ. of San Fran. (144). Four of the eight were top-50 law schools, while none were alphabetically ranked.)
Thirteen of the 26 feeder schools that saw the least significant declines in applications (or saw increases in applications) had a law school, including four that were ranked alphabetically. Among just the nine law schools in this category that are ranked, the average rank is 104. (These schools include Denver (64), UNLV (68), Loyola (Chicago) (76), Rutgers (91), Florida International (105), Wyoming (113), CUNY (132), Southern Illinois (140), and Suffolk (144), along with Florida A & M, North Carolina Central, Nova Southeastern, and Southern (all alphabetical). Notably, only four of the thirteen were ranked in the top-100 law schools (none in the top-50).)
Again, in this analysis, with a few exceptions, those feeder schools that saw significant declines in applicants generally represent a more elite slice of American colleges and universities, while those with the most nominal declines in applicants (or increases in applicants) generally represent a less elite slice of American colleges and universities.
Outliers More Broadly – Comparing Schools with Declines Greater than 30% and Less than 10%
Third, if one wanted to look at a broader pool of feeder schools at the bottom and the top, one could look at all schools down 30% or more in applicants and all schools that were down 10% or less in applicants between 2010 and 2012 (roughly 10% above and below the mean), two sets that account for nearly half of the Top 240 Feeder Schools.
There were 68 schools down 30% or more in applicants, 46 of which had a law school, of which 29 were ranked in the top-50, with only one school ranked alphabetically. The average rank of the 45 numerically ranked law schools was 48. The other 22 feeder schools in this category include several highly regarded schools – including, for example, Rice, Vassar, Miami University, Brown, Amherst, Johns Hopkins and Princeton.
There were 51 schools with a decrease in applicants of 10% or less, 25 of which had law schools, only two of which were ranked in the top-50, with six schools ranked alphabetically. The average rank of the 19 numerically ranked law schools was 94. The other 26 feeder schools in this category include mostly less elite colleges and universities – including, for example, Kennesaw State University, University of Texas at San Antonio, and Florida Gulf Coast University, along with University of Phoenix and Kaplan University.
All three approaches to analyzing the changes in applicants among the Top-240 Feeder Schools point in the same direction. Graduates of elite colleges and universities are opting not to apply to law school at a greater rate than graduates of less elite colleges and universities. One might suppose that this translates to a greater decline in the number of applicants and matriculants with really high LSATs (165 or above) as compared to those with relatively low LSATs (149 and below). In Part 2, I explore whether this supposition is accurate.
Posted by Jerry Organ
Wednesday, October 2, 2013
Because the U.S. News & World Report ranking era has been associated with so much turmoil and bad behavior, many of us in legal education tend to think of the magazine as the source of our woes. In fact, the evidence compiled in a new paper on SSRN, "Enduring Hierarchies in American Legal Education," suggests that our desire (or propensity) to establish a legal education pecking order predates the U.S. News rankings by a century or so. Vanity of vanities, all is vanity -- at least that is what the data seem to suggest.
My brilliant and industrious colleagues, Funmi Arewa and Andy Morriss, led the charge on this. For many, a major contribution of this research will be the detailed 40+ tables compiled at the end of the article. Now that all that fact-gathering work is done, others can use it. Below is the paper's abstract:
Although much attention has been paid to U.S. News & World Report’s rankings of U.S. law schools, the hierarchy it describes is a long-standing one rather than a recent innovation. In this Article, we show the presence of a consistent hierarchy of U.S. law schools from the 1930s to the present, provide a categorization of law schools for use in research on trends in legal education, and examine the impact of U.S. News’s introduction of a national, ordinal ranking on this established hierarchy. The Article examines the impact of such hierarchies for a range of decision-making in law school contexts, including the role of hierarchies in promotion, tenure, publication, and admissions, for employers in hiring, and for prospective law students in choosing a law school. This Article concludes with suggestions for ways the legal academy can move beyond existing hierarchies and at the same time address issues of pressing concern in the legal education sector. Finally, the Article provides a categorization of law schools across time that can serve as a basis for future empirical work on trends in legal education and scholarship.
Posted by Bill Henderson
Monday, September 16, 2013
The trend toward outsourcing of legal work to India may be giving way to "onshoring." What is the attraction of moving legal jobs back to the US? The wage gap between India and the US is closing, but more importantly, innovation and continuous improvement are significantly aided by proximity.
I heard this perspective from a friend of mine who was part of the management team of a successful LPO that was sold (at a substantial profit) to a much larger legal conglomerate. Indeed, he contemplated getting back into the business, but this time running an onshoring operation.
This identical perspective is on display in a recent Minneapolis StarTribune story on Black Hills IP, a 2.0 legal process outsourcer that provides various types of managed services for all things related to intellectual property. According to its website, Black Hills IP is a "US-based IP paralegal service that is faster, more accurate and more cost-effective than in house departments and off-shore providers." The company appears to be growing, as it did a PR blitz to commemorate its 100th client. The company was originally started in Rapid City, South Dakota but has since expanded to Minneapolis.
What makes this story especially interesting is that many of the folks who started Black Hills IP were sophisticated Minneapolis corporate lawyers who created a company in the early 2000s called Intellevate, a 1.0 LPO that was sending legal work to India. In 2006, Intellevate became part of CPA Global, a much larger LPO. In other words, the folks at Black Hills IP are industry players with much better information than the rest of us, and they are making bets with their own money.
Unlike traditional law firms, these types of legal vendors are growing rapidly. Their secret sauce appears to be combining high-quality processes with capable, motivated paraprofessional talent.
The challenge for law schools and many practicing lawyers is getting our heads around the fact that, from a pure market perspective, bright legal minds may be less valuable than well-designed and well-executed legal processes and systems. This state of affairs is just as much an opportunity as it is a threat.
One last interesting note suggesting that companies like Black Hills IP are part of the same ecosystem as traditional law firms and law schools: The CEO of Black Hills IP is Ann McCrackin, a former professor of law at Franklin Pierce (now University of New Hampshire School of Law), where she was director of the Patent Prosecution and Procedure Program. Prior to that, McCrackin was a shareholder in Schwegman, Lundberg & Woessner, a large patent law firm based in Minneapolis that specializes in high technology.
posted by Bill Henderson
Wednesday, July 3, 2013
As a result of the ABA’s revisions to Standard 509, Consumer Information, there is now a much greater universe of publicly available information about law school scholarship programs, specifically conditional scholarship programs and scholarship retention. Based on a review of law school websites conducted between March 19 and May 29, 2013, I have compiled a complete list of schools with conditional scholarship programs, with only one-year scholarships, with good standing (or guaranteed) scholarships and with only need-based scholarships.
The availability of this data now gives each admitted scholarship recipient some meaningful basis for assessing the likelihood that any given scholarship will be renewed. (That said, within a given cohort of conditional scholarship recipients at a given school, those at the top end of the entering class profile likely retain their scholarships at a higher percentage than reflected in the law school's overall data while those further down the class profile likely retain their scholarships at a lower percentage than reflected in the law school's overall data.)
What do we know about the conditional scholarship programs in place for students entering law school in 2011-12? There were 140 schools with conditional scholarship programs. The average retention rate across all law schools was 69%. In total, 12,735 students who entered law school in the fall of 2011 and continued into their second year of law school at the same school entered with conditional scholarships and 4,387 students lost those scholarships, a retention rate across individual students of 66%. Across the 194 law schools on which I compiled data, the Fall 2011 entering first-year class totaled 46,233, so roughly 27.5% of the students in the Fall 2011 entering first-year class were on conditional scholarships and roughly 9.5% of the students in the Fall 2011 entering first-year class failed to retain their conditional scholarship as they moved into the second year of law school.
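The percentages in this paragraph follow directly from the underlying counts; a quick sketch (using only the figures quoted above) reproduces the arithmetic:

```python
# Figures quoted above for the Fall 2011 entering class.
conditional = 12_735        # students entering with conditional scholarships
lost = 4_387                # students who lost those scholarships
total_first_years = 46_233  # Fall 2011 entering class across 194 schools

retention_rate = (conditional - lost) / conditional
share_on_conditional = conditional / total_first_years
share_losing = lost / total_first_years

print(f"retention across students: {retention_rate:.0%}")          # ~66%
print(f"on conditional scholarships: {share_on_conditional:.1%}")  # ~27.5%
print(f"failed to retain: {share_losing:.1%}")                     # ~9.5%
```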
The distribution of scholarship retention rates by decile across all 140 schools reporting conditional scholarship programs is set forth in Table 1. The largest numbers of law schools are grouped around the overall average retention rate, with 30 law schools in the 60-69% range and 24 in the 70-79% range; nearly 40 percent of law schools with conditional scholarships fall in these two ranges. Interestingly, the 90%-or-better range is the second largest, with 26 law schools (nearly half of which are ranked 50 or better in the U.S. News ranking). Notably, 23 law schools had scholarship retention rates of less than 50%.
Table 1: Number of Law Schools Reporting Retention Rates by Decile Range
Less than 40%: 8 schools (four were ranked alphabetically)
40-49%: 15 schools (eight were ranked between 50 and 99)
50-59%: 20 schools (16 were ranked 100 or lower, while only two were in the top 50)
60-69%: 30 schools (23 were ranked 100 or lower, while only one was in the top 50)
70-79%: 24 schools (13 were ranked in the top 100, but only three of those were in the top 50)
80-89%: 17 schools (12 were ranked between 50 and 145)
90% or better: 26 schools (12 were ranked in the top 50)
As shown in Table 2, law schools ranked in the top 50 in the U.S. News 2012 rankings had the smallest percentage of law schools with conditional scholarship programs: only 20 law schools (40%) had conditional scholarship programs, directly impacting only 1,674 students who had conditional scholarships (12.8% of the 13,109 first-year students at these law schools) and only 192 who failed to retain their scholarships (11.5% of the 1,674 conditional scholarship recipients and only 1.5% of the 13,109 first-year students). By contrast, across the balance of law schools, over 80% had conditional scholarship programs, with 11,061 of the 33,124 first-year students (33.4%) having conditional scholarships and 4,195 (37.9% of those on scholarship and 12.7% of first-years at the balance of law schools) losing their scholarships after their first year of law school.
Table 2: Number and Percentage of First-Year Students in 2011 Having Conditional Scholarships and Losing Conditional Scholarships by U.S. News Rankings Categories

Number (%) of First-Years with Conditional Scholarships
Top 50 Law Schools: 1,674 (12.8% of all first-year students in top-50 schools)
Law Schools Ranked 51-100: 4,176 (36% of all first-year students in schools 51-100)
Law Schools Ranked 101-145: 2,754 (29.6% of all first-year students in schools 101-145)
Law Schools Ranked Alphabetically: 4,131 (33.6% of all first-year students at alphabetically-ranked schools)

Number (%) of Conditional Scholarship Recipients NOT Retaining Scholarships
Top 50 Law Schools: 192 (11.5% of conditional scholarship recipients and 1.5% of first-years)
Law Schools Ranked 51-100: 1,454 (34.8% of conditional scholarship recipients and 12.5% of first-years)
Law Schools Ranked 101-145: 1,044 (37.9% of conditional scholarship recipients and 11.2% of first-years)
Law Schools Ranked Alphabetically: 1,697 (41% of conditional scholarship recipients and 13.7% of first-years)
A number of law schools switched to non-conditional scholarship programs for 2012-13 or will be switching to non-conditional scholarship programs for the 2013-14 academic year. As a result, for the 2013-14 academic year, there will be 131 law schools with conditional scholarship programs, five law schools with non-renewable one-year scholarships, four that only offer need-based scholarships, and 54 law schools with good standing (or guaranteed) scholarships. Of the 194 schools on which I was gathering information, therefore, as of the 2013-14 academic year, 70% will have conditional or one-year scholarship programs (136/194), while nearly 28% will have good standing (or guaranteed) scholarships (54/194), with 2% (4/194) having only need-based scholarship assistance. (Note that some law schools with conditional scholarship programs also offer some scholarships on a non-conditional basis and/or offer some need-based assistance.)
Those who might be interested in a more detailed analysis of conditional scholarship programs may want to look at the draft article I have posted on SSRN -- Better Understanding the Scope of Conditional Scholarship Programs in American Law Schools.
[posted by Jerry Organ]
Sunday, June 30, 2013
As noted in Part I of this post, the competitive dynamics among law schools are about to change due to a combination of two factors: (1) the ABA's collection and publication of more granular data on school-level employment outcomes, and (2) the decision by U.S. News to make JD Bar Passage Required and JD Advantage jobs the primary measures for the employed-at-9-months input to its rankings formula.
The histogram below reveals a near-perfect bell curve for this revamped U.S. News input [click on to enlarge]. This is a huge change from prior years, when schools were all bunched at the 95% level because employment of any kind was all that mattered. Under the old methodology, any law school that limited itself to full-time, professional law-related jobs would have plummeted in the rankings 10 to 50 spots.
Because spring 2013 was the first year with the new methodology, the impact of the change is not well understood. The starkest fact of the new environment is that full-time, professional law-related jobs are in short supply. Among the class of 2011 (the stats used for the 2013 rankings), this desirable outcome was achieved by only 63.0% of graduates. When we subtract out full-time, long-term law-related professional jobs funded by law schools -- a luxury that only a small number of mostly first-tier law schools can afford -- the total drops to 61.9%.
Digging deeper, some other significant patterns emerge.
The vast majority of law schools feed into the regional labor markets where they are located. In places like California, those markets are saturated.
Among the ABA-accredited law schools in California, 46.5% of the class of 2011 obtained full-time JD Bar Passage Required jobs. The comparable figure for the remaining ABA-accredited law schools was 56.0%. Likewise, there is also a disparity for JD Advantage jobs: 6.2% in California versus 8.3% for schools in all other states. In fact, among the 19 ranked California law schools, only four -- Stanford, UC Berkeley, USC, UCLA -- are above the 63.0% average for full-time, professional law-related jobs.
Based on these data, it should come as no surprise that no law school located in California went up in the 2013 U.S. News rankings. Stanford, USC, and Santa Clara hung onto their rankings, but 11 California law schools dropped, with an average decline of 11 spots. Five other California schools remained in the unranked fourth-tier category.
In contrast, some of the biggest winners under the methodology change were flagship public law schools that are relatively big fish in smaller regional markets. Students at these schools tend to stay in-state and get JD Bar Passage Required jobs at rates far higher than the class of 2011 average of 54.9%.
Below are the top 15 non-national public law schools based on the proportion of FT Bar Passage Required jobs.
Between 2012 and 2013, the average rankings gain for the above schools was +9 spots. Among this group, the only school to go down in the rankings was ASU Law (-3). And that decline was largely due to the fact that ASU reported a 98% employed-at-nine-months figure for the class of 2010--a figure that drew suggestions of aggressive gaming. See Brian Tamanaha, When True Numbers Mislead, Balkanization, April 2, 2012.
The heavier weighting for JD Bar Passage Required jobs also benefits a handful of lower-ranked private law schools that are practice-oriented and tend to feed smaller firms within their regional areas.
- Campbell (71.4% FT bar passage jobs) went from unranked to #126.
- South Texas (64.4% FT bar passage jobs) went from unranked to #144.
- St. Mary's (78.3% FT bar passage jobs) went from unranked to #140.
Part-Time Law Schools Dominate JD Advantaged Jobs
JD Advantaged Jobs count the same as JD Bar Passage Required Jobs. But what, exactly, is included in this category? According to the ABA,
A position in this category is one for which the employer sought an individual with a J.D., and perhaps even required a J.D., or for which the J.D. provided a demonstrable advantage in obtaining or performing the job, but which does not itself require bar passage or an active law license or involve practicing law.
See ABA Class of 2012 (definitions). Many professionals enroll in law school on a part-time basis to improve their career prospects. It should be no surprise, then, that schools with part-time programs tend to be the largest producers of graduates with full-time JD Advantage jobs. In many cases, it is the full-time job that the student held during law school -- and presumably retains upon graduation -- that confers the advantage.
Of the top 10 schools based on the percentage of full-time JD Advantage jobs, eight had part-time programs and the other two were located in a state capital, which tends to increase the number of opportunities related to government and public policy.
The schools listed above gained an average of 3.5 spots in the rankings, although the average is pulled down by the inclusion of Southwestern, which had to weather the brutal California legal market.
It is worth noting that the percentage of JD Advantage jobs is negatively correlated with the percentage of JD Bar Passage Required jobs (-0.33). The table below summarizes the differences between schools with part-time programs versus full-time-only programs.
The higher percentage of JD Advantage jobs (10.1% versus 6.9%) for schools with part-time programs is unlikely the result of chance, as the difference in means is statistically significant at p < .001. But what does this inverse relationship mean?
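For readers who want to replicate this kind of comparison, below is a minimal pure-Python sketch of the two statistics the surrounding paragraphs rely on: a Pearson correlation and a Welch two-sample t statistic. The data here are invented placeholder percentages, not the actual school-level figures, and obtaining a p-value would still require a t-distribution table or a stats library.

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Sample Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

def welch_t(a, b):
    """Welch's t statistic for a difference in means between two samples."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / (stdev(a) ** 2 / na + stdev(b) ** 2 / nb) ** 0.5

# Synthetic, illustrative school-level percentages (NOT the real data):
# JD Advantage share vs. JD Bar Passage Required share at hypothetical schools.
adv = [12.0, 10.5, 9.0, 7.5, 6.0, 5.0]
bpr = [45.0, 50.0, 55.0, 60.0, 62.0, 70.0]
print(round(pearson_r(adv, bpr), 2))  # strongly negative for this toy data
```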
Part-time programs tend to be affiliated with lower-ranked law schools, which in turn would produce a lower average percentage of JD Bar Passage Required jobs. Yet part-time programs are also in larger, urban locations. Thus, in addition to the continued employment of part-time students with their current employers, the sheer proximity to large, specialized regional economies probably increases the proportion of JD Advantage jobs. Indeed, any school in a large metro area would be foolish to ignore the human capital needs of non-legal employers, as knowledge of the law is very helpful in navigating an ever more complex, regulated, and interconnected world.
What is the Best Strategy for Maximizing Full-Time, Professional Law-Related Jobs?
Largely through happenstance, the ABA and U.S. News have created an environment where law schools have to ask this basic but very important question. Part-time jobs will no longer cut it. And few law schools have the cash to hire their own grads full-time for a year past graduation -- and if they do, there are probably better uses for the millions of dollars needed annually to prop up a school's ranking.
The new gold standard employment outcome is full-time, long-term professional law-related jobs. The issue of how to maximize this outcome is so pressing and intricate that it may warrant trade-offs in the admissions process, favoring students with lower credentials but more rock-solid employment prospects at graduation. This is the topic I will take up in Part III.
[posted by Bill Henderson]
Friday, June 28, 2013
NALP recently released the employment outcome data for the class of 2012. The good news is that the absolute number of JD Bar Passage Required jobs went up from the prior year. The bad news is that a significantly larger class of entry-level lawyers was competing for those jobs. The class of 2011 totaled 41,623, versus 44,339 in 2012 (+2,716, or +6.5%). And note, the class of 2013 is likely to be even bigger -- roughly +1.6% based on the size of the entering 1L class in the fall of 2010 (see ABA enrollment data).
Setting aside the year-over-year fluctuations, the trendlines suggest a relatively large and persistent shortfall in the number of full-time, professional law-related jobs. I assembled the graph below from NALP data [click on to enlarge].
[Methodological notes: NALP used the JD-Preferred category until the class of 2011, when NALP and the ABA collaborated on the creation of the JD Advantage category. According to NALP, the jobs in the two categories are "largely the same." See NALP, Detailed Analysis of JD Advantage Jobs (April 2013). The figures for 2012 are estimates of full-time employment calculated from (a) NALP's just released figures for 2012 class size and the percentage breakdowns by job category, and (b) the percentage breakdowns of full-time versus part-time from the prior year, which also relied on the new JD Advantage definition. In short, basic algebra.]
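For concreteness, the "basic algebra" described in the note can be sketched as follows. The class size is NALP's figure quoted above; the two rates passed in are placeholders standing in for NALP's published category and full-time percentage breakdowns, which are not reproduced here.

```python
# Sketch of the estimate described in the methodological note. The rates
# below are PLACEHOLDERS standing in for (a) NALP's job-category percentage
# breakdown for 2012 and (b) the prior year's full-time share of that category.
class_size_2012 = 44_339  # NALP figure for the class of 2012

def estimate_full_time(category_share, full_time_share, class_size=class_size_2012):
    """Estimate the count of full-time jobs in a category:
    class size x share of grads in the category x full-time share of the category."""
    return round(class_size * category_share * full_time_share)

# e.g. a hypothetical category covering 60% of grads, 90% of them full-time:
print(estimate_full_time(0.60, 0.90))  # -> 23943
```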
A reasonable expectation of a 3-year, $100,000+ financial commitment is that nine months after graduation, the entry-level lawyer has secured a full-time professional job. See Legal Whiteboard, June 26, 2007. Those outcomes are reflected in the blue-red-green bars above. Since 2007 (the first year that NALP collected data on full-time versus part-time employment), the percentage of jobs fitting these criteria has fallen from 85.0% to 73.9%. So the overall size of the purple bar -- part-time jobs, nonprofessional, unemployment, etc. -- has grown from 15% to 26.1%.
Unfortunately, the pain does not end there. With a limited pool of full-time professional jobs and the number of graduates trending upward, the law of supply and demand kicks in. Consider this arc of median entry-level salaries of employed graduates: $65,748 for the class of 2007, $72,000 for 2008, $72,000 for 2009, $63,000 for 2010, $60,000 for 2011, and $61,245 for 2012. So, in short, the odds of landing a full-time professional job have gone down, and so has the starting pay. Yet tuition and student debt continue to edge up. These unsustainable trends have made law schools fair game for criticism by the media and law student bloggers.
That said, a market correction is clearly underway. A considerable number of prospective law students are deciding (rationally) not to apply to law school -- from 98,700 when the class of 2007 enrolled in the fall of 2004 to an estimated 58,424 for the fall of 2013. Likewise, law schools, to the extent they can afford it, are enrolling fewer students. From the high water mark in the fall of 2010 (49,700), law schools only enrolled 41,400 1Ls in the fall of 2012, and the numbers are sure to be even lower this fall. See Jerry Organ's estimates, Legal Whiteboard, May 20, 2013. To weather this storm, law schools are running significant deficits or drawing down their endowments.
So, can we conclude that the market correction will be complete when the relatively small class of 2017 enters the job market four years from now? I certainly think the smaller number of graduates will help. But I would argue that two things have fundamentally changed:
1. Revenues versus credentials. Law schools are struggling with the need to balance their desire to hang onto respectable LSAT/UGPA medians with a need to generate sufficient revenue to cover their operating costs. If a law school favors revenues this year, its US News rankings could drop, affecting its applicant pool in future years. On the other hand, the combination of shrinking 1L classes and lavish scholarships -- a strategy being pursued by dozens of law schools -- is unsustainable over the medium to long term. A decision to enroll fewer students this year is a three-year commitment to lower revenue. If the smaller entering class is repeated next fall, the budget pain doubles. Do it three years running, and the revenue shortfall triples. Many law schools are not trying to outrun the bear; they are trying to outrun other law schools in their regional market. Some law schools may not make it out of this trough.
2. Competition over full-time, professional law-related jobs. If there is one silver lining that has emerged from this troubled period in U.S. legal education, it is the willingness of the ABA to collect and publish more granular employment outcome data at the law school level. In turn, U.S. News has incorporated these data into its rankings formula. Instead of propping up our rankings by hiring our own students or benefiting when they got jobs nine months out working as a retail manager or a cab driver, under the new 2013 U.S. News rankings formula only full-time, long-term jobs that are JD Bar Passage Required or JD Advantage are given "full weight."
It is this second point that is going to push change in how law schools do business: we now have an employment outcome in which the ranking payoff is fully in alignment with what law students want -- full-time, professional law-related jobs.
Specifically, the employed-at-nine-months input to the U.S. News rankings formula is currently given 14% weight. According to the U.S. News law school rankings methodology, the magazine weights 22 of the 35 employment outcomes collected and published by the ABA. Among these 22 factors, we don't know the internal weighting. What we do know, based on the "full weight" given to JD Bar Passage Required and JD Advantage jobs, is that the highest employed-at-nine-months scores will go to law schools with the highest percentages in these two categories. This is a completely new world for law schools -- one that incentivizes what law students care about when they make the decision to enroll.
Part II to follow ...
[Posted by Bill Henderson]
Wednesday, June 5, 2013
For those trying to better understand how legal education can better prepare law students for the world that awaits them, I would encourage you to take a look at the draft article my colleague, Neil Hamilton, Director of the Holloran Center for Ethical Leadership in the Professions at the University of St. Thomas School of Law, recently posted on SSRN. The article is entitled Law-Firm Competency Models and Student Professional Success: Building on a Foundation of Professional Formation/Professionalism. Here is some of the description from the abstract:
A law student who understands legal employer competency models can differentiate him or herself from other graduates by using the three years of law school to develop (and to create supporting evidence to demonstrate) specific competencies beyond just knowledge of doctrinal law, legal analysis, and some written and oral communication skills. . . .
In Part I below, this essay analyzes all available empirical research on the values, virtues, capacities and skills in law firm competency models that define the competencies of the most effective and successful lawyers. Part II examines empirical evidence on the competencies that clients evaluate. Part III evaluates the competencies that make the most difference in fast-track associate and partnership promotions. These data and analyses lead to several bold propositions developed in Part IV:
1. Law students and legal educators should identify and understand the values, virtues, capacities and skills (the competencies) of highly effective and successful lawyers in different types of practice (one major example is law firm competency models analyzed below in Part I);
2. Each student should use all three years of experiences both inside and outside of law school (including the required and elective curriculum, extracurricular activities, and paid or pro bono work experiences) to develop and be able to demonstrate evidence of the competencies that legal employers and clients want in the student’s area of employment interest;
3. Law schools should develop a competency-based curriculum that helps each student develop and be able to demonstrate the competencies that legal employers and clients want; and
4. Both law students and law schools should understand that the values, virtues, capacities and skills of professional formation (professionalism) are the foundation for excellence at all of the competencies of an effective and successful lawyer.
The article presents far more useful information than can be summarized here, and different readers may be struck by different things discussed in the article. One of the most significant takeaways for me, however, is the convergence around an array of competencies frequently not taught in law school. The article analyzes competency models used to assess associate development at 14 medium to large law firms in the Twin Cities and compares that with some other literature on competencies clients look for in attorneys. The analysis demonstrates that in addition to traditionally understood technical skills – legal analysis, oral and written communication, and knowledge of the law – there is significant convergence around several competencies frequently not taught in law school – 1) Ability to initiate and maintain strong work and team relationships; 2) Good judgment/common sense/problem-solving; 3) Business development/marketing/client retention; 4) Project management including high quality, efficiency, and timeliness; 5) Dedication to client service/responsive to client; and 6) Initiative/ambition/drive/strong work ethic.
Whether or not law schools are able to find efficient ways to offer students opportunities to develop these competencies, it is imperative that we make our students aware that they need to be developing these competencies to give themselves the greatest likelihood of professional success.
[posted by Jerry Organ]
Wednesday, May 29, 2013
There has been a bit of a flutter recently regarding law school admissions in light of data from the LSAC Current Volume Summary for May 17, 2013, suggesting that the size of the applicant pool will be larger than earlier projections had suggested. It appears that a larger number of applicants are showing up later in the application cycle than last year. This has generated blog postings on TaxProf Blog, The Faculty Lounge and Lawyers, Guns & Money. While I will be posting my projections for the fall 2013 entering class on this blog in the next couple of days, I first wanted to recap (to the extent available data allows) the situation in which law schools have found themselves as of the fall 2012 entering class.
In November, I posted a preliminary, unofficial comparison of enrollment data for 140 law schools and profile data for 128 law schools that had such information posted on their websites as of November 15, 2012. Now, several months later, I have an updated analysis based on enrollment data from 188 law schools and profile data from 173 law schools that had published on their websites sufficient profile data on which to make meaningful year-to-year comparisons as of May 28, 2013. Please note that this data remains unofficial, having been taken from law school websites, not from any ABA publication. When the ABA posts the digital version of the Official Guide in the coming weeks, I will be able to run an official comparison across all schools.
DECLINING ENROLLMENT – Between 2010 and 2012, 147 of the 188 law schools with available enrollment information (roughly 78%) had a decline in enrollment of at least 5%. Of these 147 law schools down at least 5% in enrollment, nearly half – 73 --- were down 20% or more:
-52 of the 188 law schools with available enrollment information (nearly 28%) had a decline in enrollment of between 20% and 30%.
-21 of the 188 law schools with available enrollment information (roughly 11%) had a decline in enrollment of 30% or more, with 11 seeing a decline in enrollment between 30% and 40% and 10 seeing a decline in enrollment of more than 40%.
Notably, only 16 schools declined between 2% and 5%, only 16 schools were flat (a change between -2% and +2%) and only 9 schools had an increase in enrollment of at least 2%. Across these 188 schools, first-year enrollment declined from 47,854 in 2010, to 44,141 in 2011, to 40,297 in 2012, an overall decline of 7,557 or 15.8% between 2010 and 2012.
DECLINING PROFILES -- Among the 173 law schools with complete profile information available for their fall 2012 entering first-year class, the average LSAT profile (75th/50th/25th percentile) has declined over the last two years, from 160.6/158.3/155.4 to 159.8/157.2/153.8. The average GPA profile also has declined, from 3.64/3.43/3.15 to 3.62/3.40/3.13. In addition, the number of law schools with a median LSAT in the 140s has more than doubled, from 9 to 19, between 2010 and 2012.
DECLINING ENROLLMENT WITH DECLINING PROFILES -- Perhaps most significantly, of the 73 law schools with declines in enrollment of 20% or more, 52 also saw a decline in their LSAT/GPA profiles between 2010 and 2012. That means roughly 30% of law schools with available enrollment and profile information for 2012 (52/173) had declines in enrollment of 20% or more and saw their LSAT/GPA profile decline. Notably, seven of these 52 law schools were in the 2012 U.S. News top 50, 13 were ranked between 51-100, 13 were ranked between 101-145, and 19 were in the alphabetical listing of schools. The declining interest in law school, therefore, is impacting law schools across the rankings, but is more dramatically impacting alphabetical schools than top-ranked schools.
As noted above, I am planning on posting a projection on fall 2013 first-year enrollment in the coming days. I also am planning on posting an analysis of scholarship retention information across all law schools sometime in the coming days.
Monday, May 20, 2013
This week's National Law Journal has a Special Report section on the challenges facing law schools. Karen Sloan has several stories on how law schools are finding alternative sources of revenues beyond tuition dollars for JD degrees (masters's degrees for nonlawyers, online LLMs, and lawyer executive education).
I contributed an essay entitled "The Calculus of University Presidents." Although the essay is posed as the letter I would write to a university president seeking advice on how to handle a significant, unexpected shortfall in law school revenues, the intended audience is lawyers and legal educators seeking to get a handle on the brutal economics that are now threatening the survival of a large swath of law schools.
From the perspective of many, it would be nice if things would go back to the way they used to be. But that is not going to happen. Good lawyers understand that we gain no long-term advantage from hiding from these facts. Instead, we need to confront them honestly and proactively.
[posted by Bill Henderson]
Sunday, March 24, 2013
Each year, the instructors in Indiana Law's 1L Legal Professions class coordinate with Indiana Law's Office on Career and Professional Development (OCPD) to run the Career Choices Speakers Series -- 16 lunchtime forums on Thursdays and Fridays throughout the second semester. It has been an enormous hit with students. Although our 1Ls are required to attend at least three, a huge proportion of the 1Ls attend over ten.
Below is a photo of this Thursday's pizza run for the session on Direct Service Public Interest Lawyers -- 22 pizzas and the laptop/scanner used for attendance. Over the course of the semester, we will purchase well over 300 pizzas. Who pays for all of this food and equipment (plus about a dozen dinners for students and alums that occur before and after these events)? An Indiana Law alumnus who profoundly believes in the role of ethics and integrity in achieving personal and professional success in life. And he has done so quietly, behind the scenes, every year for the last five.
I thought our alum would enjoy seeing the pizza gurney. Thank you! You are opening students' eyes and helping them make better decisions, all through relationships with other lawyers.
[photo credit, 1L Dakota Scheu, via iPhone]. For additional information on this highly effective program, see my prior post, A New Tool for Lawyer Professional Development.
[posted by Bill Henderson]
Wednesday, March 13, 2013
I was at the ReInvent Law Silicon Valley event last week. Following up on Jerry's thorough remarks, I can honestly say it was unlike any legal education and lawyer conference I have ever attended (the only thing close is Law Without Walls). There is a new guard in the legal academy taking shape, and it is led -- truly led -- by Dan Katz and Renee Knake at Michigan State.
Admittedly, Dan and Renee lean heavily toward my bias. Most of us law professors talk. Dan and Renee, in contrast, are doers. Shortly after becoming assistant professors, they each moved quickly from ideas to action to actually having the audacity to attempt to build new and relevant institutions. Moreover, they both did it untenured--Dan is only in his second year of teaching and Renee just cleared the tenure hurdle earlier this year. They did all of this without a net. To my mind, they are winning the "Game of Life." If other junior faculty follow their example, the legal academy is going to truly change. And right now, that is what we need.
One of my favorite Paul Lippe quotes is this, "In hindsight, the new solutions are all going to look obvious." ReInvent Law was 40 speakers tied together by a common interest in experimentation. Were all the ideas good? If history is any guide, and the criterion is moving from concept to implementation to financial and institutional sustainability, the answer is surely no. But it was invigorating to be in a room of doers who are all willing to risk failure. That is the courage and leadership we need right now. To me, it looked obvious that we need a place like ReInvent Law where insurgent ideas can be expressed with enthusiasm, even if only a handful or fewer will transform the legal landscape.
I was fortunate to be one of the presenters. Dan Katz was kind enough to take my picture when I gave my TED-style talk (all the talks were TED-style or "Ignite"). If you zoom in on me, I look ridiculous. I am no showman. But you have to admit that the lighting is pretty spectacular. The green screen, by the way, is the running Twitter feed, an idea that I can assure you was not stolen from the ABA or the AALS.
Amidst all these "revolutionary" ideas, I think my presentation was probably the most conservative. My central claim is that 100 years ago, as the nation struggled to find enough specialized lawyers to deal with the rise of the industrial and administrative state, some brilliant lawyers in cities throughout the U.S. created a "clockworks" approach to lawyer development. These clockworks filled the enormous skills and knowledge gap. Firms like Cravath, Swaine & Moore, through their "Cravath System," finished what legal educators started. (I use the Cravath System as my exemplar because its elegant business logic was written out so meticulously in the firm's 3-volume history.)
The whole purpose of the clockworks was to create a "better lawyer faster." This is a quote from volume II. The company I co-founded, Lawyer Metrics, incorporated it into our trademark -- the value promise is that compelling. See the slides below.
Here is the Slideshare description:
The original Cravath System circa 1920 demonstrated the power of a "clockworks" approach to lawyer development. The system was a meticulously designed and mechanized way to create specialized lawyers who could service the needs of America's rapidly growing industrial and financial enterprises -- lawyers who were in perennial short supply because the requisite skill set could only be learned by doing. The System endured for a century because it solved the specialized lawyer shortage by making every stakeholder better off -- junior lawyers (received training), partner-owners (large, stable profits), and clients (world class service and value).
Today's legal employers and legal educators would benefit by revisiting this system's powerful business logic. The clockworks approach to lawyer development still works. The only difference is that the specifications for a great lawyer have changed. Like the original Cravath System, a new clockworks would create a "better lawyer faster."
[posted by Bill Henderson]
March 13, 2013 in Current events, Data on legal education, Data on the profession, Fun and Learning in the classroom, Innovations in law, Law Firms, Legal Departments, New and Noteworthy, Structural change | Permalink | Comments (0)
Wednesday, February 13, 2013
My previous post on Washington & Lee's 3L Program stirred a lot of interest and commentary, including some disbelieving critics. Fortunately, Professor Jim Moliterno agreed to write a reply essay, below, that completes the cycle. [Bill Henderson]
Jim Moliterno Replies [This is a long reply, so a PDF version online here]
A number of comments to Bill’s January 28 post and posts regarding it on other blogs cause me to enter this conversation.
Are students really coming to W&L because of the new curriculum? Yes, to a significant extent. How do we know? Because the entering students say so. As do many law schools, we administer a questionnaire to our enrolling students. Among the questions asked is the obvious one: why are you here?
In the most recent such survey the students were asked to rank the strengths of the law school. Here are the top ten, in order, according to the entering students:
- Third Year Curriculum
- Ranking / Prestige
- Quality of Life
- National Reputation
- Job Placement
- General Curriculum
- Clinical Program
- Financial Aid Award
- Size of Lexington
The curriculum reform was first. Financial aid awards were 9th, just ahead of the “size of Lexington.” The data does not support the unsubstantiated claims of some bloggers that students are choosing W&L because of the generosity of financial aid awards.
The curriculum reform has steadily moved higher on the “strength” rankings given by enrolled students since 2009. The 2011 and 2012 surveys are nearly identical, and the written comments of students about their reasons for coming to W&L (none reprinted here), are more striking than the numbers themselves.
I don’t know of any better data on this proposition but the statements of those whose reasons are under study. If that data is unsatisfying to some, then they will continue to be unsatisfied.
Are there other reasons students come to W&L? Of course. W&L has a highly productive, highly visible faculty engaged in scholarship and projects at the highest levels. Some students undoubtedly value W&L’s faculty prowess. W&L is highly ranked. Some students undoubtedly are affected by a top 25 ranking. It has an excellent reputation as a small, closely-knit academic community. Some students select for the sense of community and size. No reason will ever be the only reason for prospective students to choose a law school. Changes made by law schools will affect student choices for or against a particular law school. The W&L curriculum reform is positively affecting a significant number of students’ calculus about choosing W&L.
And some do come because of the financial aid package they were offered. But the financial aid reason is unlikely to explain the increase in applications since 2008. Some students, the recipients of aid, undoubtedly come in part because of the aid. That is no different than the students who choose [insert name of any school] because of the financial aid they were awarded. In 2012, about the same number of offers of admission were made as in previous years, but instead of the usual 130 or 135 admittees choosing to attend, more than 260 made deposits. Some were asked to defer their attendance until 2013 and once the dust settled we had a class of 187 instead of the usual 130 to 135. This same class entering in 2012 listed the curriculum reform first and financial aid ninth as strengths of the law school.
What else was happening in 2008 and 09 when the applications increased by nearly 33% per year?
In 2009 and 2010, while W&L applications were on the rise, the US News ranking fell from 25 to 34 (while its reputation rank among academics stayed steady). It has now recovered to 24. If anything, that should have led to a drop in applications during 2008-2011 rather than the sharp increases that actually occurred.
Can we exclude all other possible explanations than those previously mentioned? Of course not. It could be that being in a small, beautiful mountain town is all the rage among young adults and 33% more students want that now than wanted it in 2007. I know of no data to prove or disprove that proposition, so it remains one that could be true. The reality is that the students who have come in recent years rate the curriculum reform among the top reasons (often the most important reason) for their attendance at W&L. That matters.
There is empirical evidence that the W&L curriculum reform is engaging students more than in the traditional “no plan” third year curriculum. Is it perfect evidence? Of course not. Is it definitive evidence that has no flaw? Of course not. Is anything ever supported by perfect, definite evidence that has no flaw? Not to my knowledge. We make all of our most important decisions in life based on the best available evidence. As long as the evidence is empirically sound and statistically significant, it is worthy of respect. The evidence of W&L 3L engagement increases is sound and statistically significant and marks a path toward further research and verification.
One commenter suggested that the data is suspect because the peer schools have not been identified. Their data belongs to them, not W&L. LSSSE does not make specific school data available to other schools. So W&L has only a composite score for those peer schools. And it would be unseemly for W&L to reveal the specific schools. I will not do so here. But to be sure, W&L asked LSSSE to calculate the data from a list of schools because they are the schools with whom W&L competes for students and competes in the rankings. It would not have served W&L’s research interests to learn how it compares with a list of schools that it does not compete with in the marketplace. No one at W&L has the data for any specific school.
Nonetheless, do not be mistaken, the schools with whom W&L is compared in LSSSE data are the schools anyone would expect them to be: schools that by their geography, rank and quality compete with W&L in the relevant markets for students and placement.
One observation: in the legal profession and legal education in particular, the status quo never seems to need empirical justification. Only change is suspect and wrong until proven definitively to be otherwise. Is there any empirical evidence that the status quo third year is the best possible third year except that it has been done that way for a long time? None that I know of. The old adage, “if it ain’t broke don’t fix it” does not apply here. The third year of legal education is “broke”.
Amid calls for its abandonment by some, dating back at least to the early 1970s report by Paul Carrington, the third year is widely acknowledged to be of the least value among the three years. (See below on W&L’s largely unchanged approach to years 1 and 2.) The Roman Legions (and more than a few other military powers) have found out that the mere fact that something has been successfully done before is not sufficient evidence that it will prevail in the present or future. Arguing in favor of the status quo based on no empirical evidence, . . . based only on instinct and the argument that it is the way things are currently done, is an approach doomed to failure. Just ask Kodak. (And see my forthcoming book: “The American Legal Profession In Crisis,” Oxford, March 2013.)
How about the claim that “[W&L’s LSAT has] gone down every year since [the new curriculum was announced], while its GPA rank has, after a plunge, more or less returned to where it was.” The blogger made that claim, once again without any data, let alone empirically credible data. Actually the W&L median LSAT was steady at 166 from 2005-2010, dropped 2 points to 164 in 2011 and stayed at 164 for 2012. It has not “gone down every year since [the new curriculum was announced in 2008].” Meanwhile, the GPA of entering classes, which was in the 3.5 and 3.4 range in 2008-2010, has gone up to the 3.6 range (3.65 and 3.62) in 2011 and 2012. The two modest changes in LSAT and GPA have essentially off-set one another in US News points. Hardly the reason for pause suggested by the blogger.
It seems that as long as someone is arguing against change, no rules apply to the arguments’ underpinnings.
Here is what the empirical evidence from the LSSSE surveys shows and what it does not show: students are more engaged in their work and their work includes more writing, more collaboration and more problem solving. Here are a few charts even more striking than those Bill used in his post. Together they say that significantly more than their peers or their predecessors at W&L, current third year students are working more, writing more, collaborating more, applying law to real world problems more, and preparing for class more often. Overall, they describe a harder-working, more engaged student body. And they are working harder to acquire the skills that matter to success as a lawyer.
February 13, 2013 in Blog posts worth reading, Current events, Data on legal education, Innovations in law, Innovations in legal education, New and Noteworthy, Scholarship on legal education, Structural change | Permalink | Comments (6)
Tuesday, January 29, 2013
Here it is in a nutshell. There is empirical evidence that Washington & Lee’s experiential 3L curriculum is delivering a significantly better education to 3L students—significantly better than prior graduating classes at W&L, and significantly better than W&L’s primary competitors. Moreover, at a time when total law school applicants are on the decline, W&L’s getting more than its historical share of applicants and getting a much higher yield. When many schools are worried about revenues to survive next year and the year after, W&L is worried about creating the bandwidth needed to educate the surplus of students who enrolled in the fall of 2012, and the backlog of applicants that the school deferred to the fall of 2013.
[This is a long essay. If you want it in PDF format, click here.]
Alas, now we know: There is a market for high quality legal education. It consists of college graduates who don’t want to cast their lot with law schools that cannot guarantee students entree to meaningful practical training. Some might argue that W&L is not objectively better -- that the 3L curriculum is a marketing ploy where the reality falls well short of promotional materials and that, regardless, prospective students can't judge quality.
Well, in fact there is substantial evidence that the W&L 3L program delivers comparative value. The evidence is based on several years' worth of data from the Law School Survey of Student Engagement (LSSSE). I received permission from Professor James Moliterno, someone who took a leadership role in building W&L’s third year program, to share some of the key results (each school controls access to its LSSSE data.) They are below.
But before getting into empirical evidence, I want to put squarely on the table the most sobering finding that likely applies to virtually all of legal education. It is this: On several key LSSSE metrics, W&L has made impressive gains vis-à-vis its own historical benchmarks and its primary rival schools. But even for this leader, there remains enormous room for improvement. More on that below.
Here is the bottom line: Traditional legal education, when it is measured, does not fare very well. Yet, as W&L shows, substantial improvement is clearly possible. We law professors can respond to this information in one of two ways:
- Don’t measure, as it may disconfirm our belief that we are delivering a great education.
- Measure—even when it hurts—and improve.
I am in the second camp. Indeed, I don’t know if improvement is possible without measurement. Are we judging art work or the acquisition of key professional skills needed for the benefit of clients and the advancement of the public good?
Moving the Market
I doubt I will ever forget Jim Moliterno’s September 2012 presentation at the Educating Tomorrow’s Lawyers (ETL) conference at the University of Denver. He presented a single graph (chart below) showing W&L actual applicant volumes since 2008 versus what would have happened at W&L if its applicant volume had followed national trends.
While law school applicants crested a few years ago, W&L enjoyed a large run-up in volume of applicants, presumably due to the launching of their new 3L program. This larger applicant pool effectively served as a buffer when applicant declines began in 2011 and 2012. Since 2008, overall law school applicants are down -19%, yet W&L is up overall +33%.
But much more significantly, after their experiential 3L year was up and running and the overall legal job market continued to stagnate, W&L yields spiked. Ordinarily they would enroll 135 students. But for the fall of 2012, they received enrollment commitments from well over 260 students. Indeed, at the ETL conference Jim Moliterno said the school had to offer financially attractive deferments to get the class to approximately 185 incoming students -- a 50 student bulge.
When Jim Moliterno showed the above graph and explained the corresponding changes in yield, my good friend Gillian Hadfield, a skeptical, tough-minded, evidence-demanding economist who teaches at USC Law, leaned over and said to me, “That is the single most important takeaway from this entire conference.” I agreed. The market for a legal education with practical training is, apparently, much more inelastic than the market for traditional JD programs.
Yet, what is perhaps most remarkable is that a large proportion of incoming students at W&L were enrolling based on little more than faith. Nobody knew for sure if W&L had the ability to pull off their ambitious 3L curriculum. The program relies on a large cadre of adjunct professors, after all, and W&L is located in remote Lexington, Virginia. Many law faculty outside of W&L, and perhaps some inside, thought (or perhaps think) that the program could not live up to the hype. Well, as shown below, the program appears to have produced meaningful gains.
The only data-driven critique anyone can muster is that the gains remain significantly short of perfection. But that critique bites harder on the rest of us. To use a simple metaphor, W&L is tooling around in a Model-T while the rest of us rely on horse and buggy. What ought to be plain to all of us, however, is that, just like the automobile industry circa 1910, we are entering a period of staggering transformation that will last decades. And transformation will be roughly equal parts creation and destruction. See Schumpeter.
W&L Data, Internal Historical Benchmark
LSSSE is a phenomenally rich dataset – nearly 100 questions per year on a wide variety of topics related to student classroom experience, faculty interaction, type and quantity of assessments, time allocation, and perceived gains on a variety of dimensions related to personal and professional development. The survey instrument is online here.
Aside from a host of questions related to demographics, career goals, and debt, major sections in the LSSSE include:
- Section 1, Intellectual Experience (20 questions)
- Section 2, Examinations (1 question)
- Section 3, Mental Activities (5 questions)
- Section 4, Writing (3 questions)
- Section 5, Enriching Educational Experiences (9 questions)
- Section 6, Student Satisfaction (7 questions)
- Section 7, Time Usage (11 questions)
- Section 8, Law School Environment (10 questions)
- Section 9, Quality of Relationships (3 questions)
- Section 10, Educational and Personal Growth (16 questions)
W&L deserves to be a detailed case study. But frankly, legal education can’t wait. So I will do the best I can to cover the landscape in a blog post. I hope every law faculty member who reads this post makes a strong plea to their dean to enroll in LSSSE. Why? So your school can benchmark itself against the detailed LSSSE case studies that are bound to flow out of W&L and other innovative law schools. Though they don’t get much press, there are, in fact, other innovative law schools.
Friday, January 18, 2013
Brian discusses the bleak employment prospects of law schools, but (through no fault of his own) understates the nature of the structural change that is occurring in the U.S. and global market for legal services. In Part II, I will write about some logical next steps for law schools looking to get ahead of the coming tsunami.
I tried to write Part II, but a blog post just was not up to the task. Further, I sensed that my colleagues were in no mood for half-baked solutions. There has been enormous criticism of legal education on the blogs and in the media, but very little in the way of detailed prescriptions to improve the situation. I felt an obligation to back off on the criticism and focus on solutions. So, in essence, Part II of my Tamanaha review became an article.
I just posted to SSRN an article entitled "A Blueprint for Change" forthcoming in the Pepperdine Law Review. It is both a diagnosis and a proposed solution -- a solution I am actively pursuing. Here is the abstract:
This Article discusses the financial viability of law schools in the face of massive structural changes now occurring within the legal industry. It then offers a blueprint for change – a realistic way for law schools to retool themselves in an attempt to provide our students with high quality professional employment in a rapidly changing world. Because no institution can instantaneously reinvent itself, a key element of my proposal is the “12% solution.” Approximately 12% of faculty members take the lead on building a competency-based curriculum that is designed to accelerate the development of valuable skills and behaviors prized by both legal and nonlegal employers. For a variety of practical reasons, successful implementation of the blueprint requires law schools to band together in consortia. The goal of these initiatives needs to be the creation and implementation of a world-class professional education in which our graduates consistently and measurably outperform graduates from traditional J.D. programs.
I have a large backlog of shorter articles and analyses that I have not posted because I wanted my own detailed solution in the public domain. I hope to tie all of these ideas together over the coming weeks.
Thank you, Brian Tamanaha, for writing a book that required me to think in terms of solutions.
[posted by Bill Henderson]
January 18, 2013 in Current events, Data on legal education, Data on the profession, Innovations in legal education, Scholarship on legal education, Scholarship on the legal profession, Structural change | Permalink | Comments (2)
Wednesday, November 28, 2012
In August, I posted to this blog a narrative analysis comparing the 2010 and 2011 enrollment and profile data among law schools based on the data published in the 2012 ABA-LSAC Guide and the 2013 ABA-LSAC Guide. In response to recent comments on the 2012 enrollment situation, see ABA Journal Weekly Newsletter and the discussion at The Faculty Lounge, and the further drop in LSAT test-takers in June/October 2012 recently discussed at Tax Prof Blog, I thought it might make sense to update the enrollment and profile analysis to account for 2012 enrollment and profile data, to the extent that it is available, and to offer some thoughts on 2013.
As of November 15, only 140 law schools had published enrollment data on their webpages and only 128 had published sufficient profile data on which to make meaningful year-to-year comparisons. Please note that this analysis is based on "unofficial data," having been taken from law school webpages, not from any ABA publication, and having been taken from law school webpages prior to the LSAC certification of enrollment and profile data which the LSAC is undertaking this year for the first time.
ENROLLMENT IN DECLINE – Between 2010 and 2012, only 12 schools were flat (a change between -1% and +1%) or had an increase in enrollment; 128 of the 140 law schools had a decline in enrollment (a decrease greater than 1%), of which
- 89 had a decline of 10% or more, of which
- 59 had a decline in enrollment of 20% or more, and of which
- 15 had a decline in enrollment of 30% or more.
This means over 90% of law schools for which 2012 enrollment information is available had a decline in enrollment and that more than 40% had a decline in enrollment of 20% or more.
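Those percentages follow directly from the counts reported above (a quick sketch of the arithmetic, using only the figures quoted in this post):

```python
schools = 140        # schools with published 2012 enrollment data
declined = 128       # schools with a decline greater than 1%
declined_20 = 59     # schools with a decline of 20% or more

pct_declined = declined / schools        # ~0.914 -> "over 90%"
pct_declined_20 = declined_20 / schools  # ~0.421 -> "more than 40%"
```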
Based on the data published in the 2012 ABA-LSAC Guide, in 2010, these 140 law schools had 33,952 first-years (68.3% of the 49,700 total 1L enrollment (LSAC matriculants)). Based on the data published in the 2013 ABA-LSAC Guide, in 2011, these 140 law schools had 31,082 first-years (68.2% of the 45,600 total 1L enrollment (LSAC matriculants)). In 2012, based on data from law school webpages, these 140 law schools had 28,380 first-years.
The decline in first-year enrollment was roughly 8.45% across these 140 schools between 2010 and 2011 (slightly more than the national decline of 8.25%), while the decline in first-year enrollment was roughly 8.69% across these 140 schools between 2011 and 2012.
If enrollment at these 140 schools represents 68.25% of total first-year enrollment for 2012 (the average of 2010 and 2011), that would suggest that total first-year enrollment (LSAC matriculants) for fall 2012 may be as low as 41,500-41,600, a decline of roughly 8.8% from 2011 and a decline of roughly 16% since 2010. (The LSAC certification of enrollment and profile information may come in even slightly lower than this estimate as it is going to be based on snapshots of enrollment on October 5, 2012, which would exclude students who began classes but withdrew prior to October 5, 2012. This group of students might number a few hundred if there were one to three such students at each law school.)
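The share-based estimate in the paragraph above can be sketched in a few lines (a back-of-the-envelope check, not official data; all inputs are the figures quoted in this post):

```python
# 1L counts for the 140-school sample vs. national totals (per the ABA-LSAC Guides)
sample_2010, national_2010 = 33952, 49700
sample_2011, national_2011 = 31082, 45600

share_2010 = sample_2010 / national_2010   # ~0.683 of national enrollment
share_2011 = sample_2011 / national_2011   # ~0.682
avg_share = (share_2010 + share_2011) / 2  # ~0.6825, the share assumed for 2012

# Fall 2012 sample count taken from law school webpages
sample_2012 = 28380
est_national_2012 = sample_2012 / avg_share  # ~41,500-41,600 estimated 1Ls
```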
PROFILES IN DECLINE – Between 2010 and 2012, 93 of the 128 law schools with available profile information had a decline in their LSAT/GPA profile (more indicators down than up), 23 had an increase in profile (more indicators up than down), and 12 had a mixed profile (same number of indicators up and down).
ENROLLMENT AND PROFILES IN DECLINE – Most significantly, of the 128 law schools with both enrollment and profile information available for fall 2012, 85 law schools (nearly two-thirds) saw declines in enrollment and in their LSAT/GPA profiles between 2010 and 2012.
Of these 85 law schools, 38 law schools saw declines in enrollment of greater than 20% and saw declines in their LSAT/GPA profiles. That means nearly 30% of law schools with available enrollment and profile information for 2012 had declines in enrollment of 20% or more and saw their LSAT/GPA profile decline. It also means that over 75% of the 50 law schools with declines in enrollment greater than 20% and for which 2012 profile information is available had declines in profile for 2012.
Notably, five of these 38 law schools were in the USNews top-50, 10 were ranked between 51-100, 10 were ranked between 101-145 and 13 were in the alphabetical listing of schools. The declining interest in law school, therefore, is impacting law schools across the rankings, but is more dramatically impacting alphabetical schools than top-ranked schools.
FURTHER THOUGHTS ON 2012 – According to the LSAC Volume Summary, applications to law school slid from 87,900 in 2010 to 78,500 in 2011 to approximately 68,000 for 2012 (although the 2012 numbers have not been finalized). Over the last nine years, law schools, on average, have admitted roughly 56,800 students per year, with a low of 55,500 in 2007 and in 2008. The “admit” rate – which was only 56% for fall 2004 – had climbed to 71% for fall 2011. For the last several years, however, matriculants have averaged roughly 82% of admitted students. So if we did have 41,600 matriculants this fall (as estimated above), and if matriculants represented roughly 82% of admitted students, that would mean we had roughly 50,700 admitted students, the lowest number this millennium, with an admit rate of nearly 75%, the highest this millennium. (Alternatively, if matriculants declined as a percentage of admitted students, it is possible that a larger number of applicants were admitted.)
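The admit-rate arithmetic in that paragraph works out as follows (a sketch assuming the estimated matriculant count and the 82% yield, neither of which is an official figure):

```python
applicants_2012 = 68000     # approximate, per the LSAC Volume Summary
matriculants_2012 = 41600   # estimate from the enrollment analysis above
yield_rate = 0.82           # matriculants as a share of admitted students

est_admits = matriculants_2012 / yield_rate    # ~50,700 admitted students
est_admit_rate = est_admits / applicants_2012  # ~0.75, i.e., nearly 75%
```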
PROJECTIONS FOR 2013 -- June and October LSAT administrations suggest that there may be fewer than 60,000 applicants for fall 2013. There were 93,341 June/October test-takers in 2009 (for the 2010 admissions cycle) (resulting in 87,900 applicants – 94.2% of tests administered in June/October). There were 87,318 June/October test-takers in 2010 (for the 2011 admissions cycle) (resulting in 78,500 applicants – 89.9% of tests administered in June/October). There were 71,981 June/October test-takers in 2011 (for the 2012 admissions cycle) (resulting in roughly 68,000 applicants – 94.5% of tests administered in June/October).
That is a three-year average in which the number of applicants in a cycle represented roughly 92.9% of the tests administered in June/October. There were 63,003 June/October test-takers in 2012 (for the 2013 admissions cycle). If the 2013 cycle results in a number of applicants representing 92.9% of June/October test-takers, law schools can anticipate there being only roughly 58,530 applicants to law schools for fall 2013. (Notably, in the admissions cycles from 2007-2009, the number of applicants in a cycle represented, on average, roughly 111% of the June/October test-takers, so the estimate of 58,530 may understate the number of possible applicants.)
If there are only 58,530 applicants for fall 2013 (which would represent nearly a 14% decline from fall 2012 -- the third consecutive double-digit decline in applications), and if law schools admit only 50,700 of these applicants, the same as the estimate above for fall 2012, across all law schools over 86% of all applicants to law school would receive offers of admission. If 82% of these admitted students were to matriculate, that would mean a first-year enrollment for fall 2013 that once again would be around 41,500-41,600. Alternatively, if law schools remain somewhat selective and were to admit only 48,000 of the 58,530 estimated applicants, that still would be an admit rate of 82%. If 82% of those 48,000 matriculated, the first-year enrollment would decline to roughly 39,400, a decline of about 5.3% from the fall 2012 estimate set forth above.
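Both 2013 scenarios reduce to the same short chain of multiplications (a sketch assuming the 92.9% applicant ratio and 82% yield used above; these are the post's estimates, not LSAC figures):

```python
testtakers_2012 = 63003   # June/October 2012 LSAT administrations
applicant_ratio = 0.929   # three-year average: applicants / June-Oct tests
yield_rate = 0.82         # matriculants as a share of admitted students

est_applicants_2013 = testtakers_2012 * applicant_ratio  # ~58,530 applicants

# Scenario A: admit the same ~50,700 students as the fall 2012 estimate
enroll_a = 50700 * yield_rate   # ~41,500-41,600 first-years
# Scenario B: hold selectivity and admit only 48,000 (an 82% admit rate)
enroll_b = 48000 * yield_rate   # ~39,400 first-years
```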
There are two competing tensions law schools must weigh in making admissions decisions in a declining market – revenue and LSAT/GPA profile. Do you take the number of students you need to meet revenue projections (even if that means profile slides) or do you take a smaller number of students (and take a revenue hit) in an effort to maintain LSAT/GPA profile?
What the 2011 and 2012 classes demonstrate is that in the current market, for a large number of schools, even taking significantly fewer students did not allow them to maintain their profiles. Given that many schools already have lost significant revenue due to shrinking enrollments in 2011 and/or 2012 (for just one example see the recent discussion of Vermont Law School in the National Law Journal) they will be hard-pressed to shrink enrollment further to maintain profiles. As a result, I think when enrollment and profile data is evaluated in fall 2013, we will see even more widespread declines in profile than was manifested in 2011 and 2012, possibly along with some ongoing declines in enrollment. It seems likely that several more schools may experience both significant declines in enrollment and in profile.
[posted by Jerry Organ]
Monday, November 19, 2012
Law schools care deeply about their academic reputation. If this were not true, my Indiana Law mailbox would not be stuffed full with glossy brochures sharing the news of faculty publications, impressive new hires, areas of concentration, and sundry distinguished speaker series, etc.
Because of the timing of these mailings – I got nearly 100 in September and October – I am guessing that the senders hoped to influence the annual U.S. News & World Report Academic Reputation survey. Cf. Michael Sauder & Wendy Espeland, Fear of Falling: The Effects of U.S. News & World Report Rankings on U.S. Law Schools 1 (Oct 2007) (reporting "increases in marketing expenditures aimed toward raising reputation scores in the USN survey"). But does it work? A recent study by Larry Cunningham (St. John's Law) suggests that the effect is, at best, decimal dust.
Glossy brochures may not reliably affect Academic Reputation, but I have uncovered four factors that are associated with statistically significant increases and decreases of USN Academic Reputation. To illustrate, consider the scatterplot below, which plots the 1993 ordinal rank of USN Academic Reputation against the 2012 ordinal rank [click on to enlarge].
Four sets of dots (Red, Blue, Orange, and Green), each representing a distinctive shared feature of law schools, tend to fall above or below the regression line. These patterns suggest that changes in USN Academic Reputation over time are probably not the result of random chance. But we will get to the significance of the Red, Blue, Orange, and Green dots soon enough.
The primary takeaway from the above scatterplot is that 2012 USN Academic Reputation is overwhelmingly a function of 1993 USN Academic Reputation. Over 88% of the variation is explained by a school's starting point 20 years earlier. Part of this lock-in effect may be lateral mobility. That is, there are perks at higher ranked schools: they tend to pay more; the teaching loads are lighter; and the prestige is greater, etc. So school-level reputations rarely change, just the work addresses of the most productive scholars. This is, perhaps, the most charitable way to explain the enormous stickiness of USN Academic Reputation.
That said, the scatterplot does not show a perfect correlation; slightly less than 12% of the variation is still in play to be explained by influences other than starting position. A small handful of schools have made progress over these 20 years (these are the schools above the regression line), and a handful have fallen backwards (those below the line).
The Red circles, Blue rectangles, Orange diamonds, and Green circles represent four law school-level attributes. The Reds have been big gainers in reputation, and so have the Blues. In contrast, the Oranges have all experienced big declines, and, as a group, so have the Greens. When the attributes of the Red, Blue, Orange, and Green schools are factored into the regression, all four are statistically significant (Red, p = .000; Blue, p = .001; Orange, p = .012; Green, p = .000) and the explained variation increases by four percentage points, to 92.3%. As far as linear models go, this is quite an impressive result.
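For readers curious how such a two-step regression works mechanically, here is a minimal sketch in Python using synthetic data. The school counts, group assignments, and effect sizes below are invented for illustration; this is not the author's dataset, only the technique of comparing R-squared before and after adding group dummy variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 170  # a made-up number of ranked schools

# Synthetic data: 2012 rank is mostly a function of 1993 rank, plus a
# shared bump or penalty for four hypothetical school groups (0 = none).
rank_1993 = rng.uniform(1, 170, n)
group = rng.integers(0, 5, n)
effect = np.array([0, -15, -10, 12, 14])  # negative = rank improves
rank_2012 = rank_1993 + effect[group] + rng.normal(0, 8, n)

def r_squared(X, y):
    """Fit OLS via least squares and return the R^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

ones = np.ones(n)
base = np.column_stack([ones, rank_1993])            # rank_2012 ~ rank_1993
dummies = (group[:, None] == np.arange(1, 5)).astype(float)
full = np.column_stack([ones, rank_1993, dummies])   # + four group dummies

print(r_squared(base, rank_2012))  # high, like the ~88% in the post
print(r_squared(full, rank_2012))  # higher once group membership is added
```

With the intercept and 1993 rank alone, the R-squared is already high; adding the four group indicators raises it further, mirroring the 88%-to-92.3% jump reported above.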
Before you look below the fold for answers, any guesses on what is driving the Red and Blue successes and Orange and Green setbacks?
Thursday, September 20, 2012
NALP just announced that the median salary for first-year associates in Big Law has dropped from $160K to $145K. I think that is very significant. We are now back to the entry-level price point of 2007.
But to my mind, there is a much bigger story here. In 2011, firms of 500+ attorneys hired 2,856 entry-level lawyers. In 2007, that figure was 4,745. So, after five years, Big Law is paying the same wage but hiring 40% fewer lawyers. Compare 2007 NALP Nat'l Summary with 2011 NALP Nat'l Summary.
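The 40% figure follows directly from the two NALP hiring counts quoted above; a quick sketch of the arithmetic (purely illustrative):

```python
# NALP counts of entry-level hires at firms of 500+ attorneys
hired_2007 = 4745
hired_2011 = 2856

# Relative decline over the five-year span
decline = (hired_2007 - hired_2011) / hired_2007
print(f"{decline:.1%}")  # 39.8%, i.e. roughly 40% fewer entry-level hires
```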
Here is another important piece of NALP data, generated from the print versions of the July 2012 NALP Bulletin. It shows the percentage of entry level law jobs that are private practice.
Two takeaways here: (1) there is a long-term trend line showing a declining share of private practice jobs, and private practice is the economic engine that enables law schools to exist at current tuition levels, and (2) the cliff-like drop-off in 2010 and 2011 is likely Big Law, and that hurts.
[posted by Bill Henderson]
Monday, September 3, 2012
NALP notes that for the Class of 2010 -- and the Class of 2011 -- two-thirds of all employed graduates were employed in the state in which their law school was located. This suggests location matters.
Is location important to employment results at a large number of schools? Are some law schools more national than others? Are some states more “local” in hiring than other states? The answers are yes and yes and yes.
ANALYZING SCHOOL SPECIFIC DATA -- This analysis is based on the Class of 2010 and Class of 2011 employment outcome data reported on the ABA Section of Legal Education website, excluding the law schools in Puerto Rico. This means there are 195 law schools in this analysis (if the two Widener campuses are combined).
The law schools were asked to report the three states with the most employed graduates and the number of employed graduates in each of those three states. Taking those totals as a percentage of employed graduates, and paying attention to the states identified, one can get some idea of which schools are “regional” and which schools might actually have a more “national” footprint. The simple result of the analysis is that the vast majority of schools are “regional” rather than “national.”
- For both the Class of 2010 and the Class of 2011, there were 117 law schools for which more than 67 percent of their employed graduates were employed in the state in which the law school is located.
- For the Classes of 2010 and 2011, there were 144 and 145 law schools, respectively, for which more than 67 percent of their employed graduates are located in the state in which the law school is located or an adjacent state, and 104 law schools for which more than 80 percent of their employed graduates are located in the state in which the law school is located or an adjacent state.
- There were only 46 law schools for which less than 67 percent of their employed graduates were employed in the state in which the law school is located or an adjacent state for both the Classes of 2010 and 2011.
Notably, 28 of these 46 law schools are in the USNews top-50, for which it is easily imaginable that the employment geography is much more national than regional. For many of these 46 law schools, two of the three states with the most employed graduates generally are not adjacent to the state in which the law school is located, suggesting some national reach. The three non-adjacent jurisdictions reflected most frequently should not be surprising -- California, the District of Columbia, and New York. Of the 18 other law schools, nine are listed in USNews's unranked alphabetical list of schools -- schools one generally would consider regional -- while nine are ranked between 51 and 145 in USNews.
Perhaps most significantly, due to the incomplete nature of some of the data sets, this summary probably understates the number of law schools for which the employment outcome data suggests the law school is more regional than national. Several of these 46 law schools -- Boston College, Minnesota, NYU, Ohio State, and Penn State -- come in with 60% or more of their employed graduates employed in the state of the law school or an adjacent state for both years; if the data were to include graduates employed in all adjacent states, the totals for these schools well might exceed 67 percent.
In sum, then, more than 76% of all law schools and more than 87% of law schools outside the USNews top-50 had more than 67% of their employed graduates in the state in which the law school is located or an adjacent state for either the Class of 2010 or the Class of 2011.
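As a sanity check, the summary percentages can be reproduced from the counts given above. This small sketch assumes the USNews top-50 contains exactly 50 of the 195 schools analyzed:

```python
total_schools = 195   # ABA schools analyzed (Puerto Rico excluded, Widener combined)
national = 46         # schools with < 67% of employed grads in-state or adjacent
top50_national = 28   # of those 46, the number in the USNews top-50

# Share of all schools that are "regional" under this definition
regional = total_schools - national
print(round(regional / total_schools, 3))  # 0.764 -> "more than 76%"

# Share of schools outside the top-50 that are "regional"
outside_top50 = total_schools - 50  # assumption: exactly 50 top-50 schools
regional_outside = outside_top50 - (national - top50_national)
print(round(regional_outside / outside_top50, 3))  # 0.876 -> "more than 87%"
```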
LOOKING AT STATE SPECIFIC DATA -- NALP also notes that for the Class of 2010, there are 30 states in which two-thirds or more of the jobs were taken by graduates from law schools in those states. (Jobs & JDs, Class of 2010, p. 69) Taking NALP’s state-specific data for the Class of 2010 in conjunction with the ABA’s data for the Class of 2010, there actually are 35 states in which two thirds or more of the jobs were taken by graduates of law schools in those states or an adjacent state and 30 states in which three-quarters or more of the jobs within the state were taken by graduates of the law schools in the state or in an adjacent state.
Again, this data likely understates the results. For example, in Arizona, Colorado, Connecticut, Maryland, Tennessee, and Virginia, roughly 65-75 percent of jobs within the state were taken by graduates from law schools within the state or an adjacent state. But several schools in adjacent states were not counted in these tallies because those states were not among the top three states for their employed graduates; were graduates from all adjacent-state schools included, the percentage might exceed 75 percent. (Notably, 13 of the 15 states with less than 67 percent of jobs taken by graduates of law schools in the state or in adjacent states are states with modest populations and only one law school or none -- Alaska, Delaware, Hawai'i, Idaho, Maine, Montana, Nevada, New Hampshire, New Mexico, Rhode Island, South Dakota, Vermont, and West Virginia. The other two states are Utah and Virginia. The District of Columbia also falls into this category.)
LOCATION MATTERS -- In sum, then, location matters. For the vast majority of law students at the vast majority of law schools, the vast majority of reasonable employment prospects associated with attending a given law school will be in the state in which the law school is located or an adjacent state. Absent a unique or specific aspect of a law school's program that makes it especially appealing, this suggests that location should matter when choosing a law school, perhaps more than ranking.
For example, suppose a prospective student must choose between a higher-ranked regional law school in a state in which the student does not anticipate practicing or living (perhaps at higher tuition) and a lower-ranked regional law school in the location in which he or she hopes to live and work professionally (perhaps at lower tuition). The student should give serious consideration to the lower-ranked regional school. Attending school in the target region makes it easier to begin networking while in law school and facilitates employment opportunities there. (It also may save the student money if the lower-ranked regional school happens to cost less -- if it is a public school, for example -- or if the student has a more competitive LSAT/GPA profile at that school and thus may be eligible for a scholarship.)
[Posted by Jerry Organ]