October 13, 2011
Moneyball-ing Legal Services (and Law Schools and Law Libraries)
We shouldn't be surprised that the new film, Moneyball, would lead to a discussion of Moneyball-ing law firm hiring and retention, and the value of legal services provided to clients. Heck, seven years before the movie and within a year after the publication of Moneyball: The Art of Winning an Unfair Game (W.W. Norton, 2003) by Michael Lewis, law profs Paul Caron (Cincinnati) and Rafael Gely (then Cincinnati, now Missouri) sparked a flurry of discussion in the legal academy with their landmark article, What Law Schools Can Learn from Billy Beane and the Oakland Athletics [SSRN], 82 Texas Law Review 1483 (2004). Quoting from the abstract:
In Moneyball, Michael Lewis takes an inside look at how in recent years the Oakland A's have achieved one of the best records in baseball despite having one of the lowest player payrolls. Cass Sunstein and Richard Thaler have argued that the book has large and profound implications for other professions. This review essay by a tax law professor and a labor law professor explores the book's large and profound implications for law schools.
In many ways, legal education is teeming with more inefficiencies than Beane uncovered in baseball. We argue that changes in the economic conditions of higher education and the legal profession, combined with increasing demands for accountability and transparency, created the market demand for measuring organizational success which U.S. News & World Report met with its annual law school rankings. We explore the implications of Moneyball for legal education in three areas.
First, we argue that law school rankings are here to stay and that the academy should work to devise ways to more accurately measure law school success. We advocate the comprehensive collection of data that users and organizations can weigh differently in arriving at competing rankings systems.
Second, we applaud efforts begun in the past decade to quantify individual faculty contributions to law school success. We support measures that take into account both quantitative and qualitative measurements of faculty performance. We provide data that confirm the relationship of productivity and impact measures of scholarship and provide support for isolating background and performance characteristics in predicting future faculty scholarly work.
Third, we use Billy Beane as a prototype and identify the qualities that enabled him to revolutionize baseball. We shift the focus here to deans and present data measuring decanal scholarly productivity and impact. We contrast these figures with the corresponding faculty data and distinguish deans' scholarly performance both in the period prior to becoming dean and while serving as dean. We also offer some surprising predictions, based on the data, of the qualities that a future dean will need to assume the mantle of the Billy Beane of legal education.
I'm thinking Paul, a diehard Boston Red Sox fan, and Rafael, hopefully a Cubs fan (if a baseball fan at all) during his long stay teaching at Chicago-Kent Law, should have been offered cameo appearances in the Moneyball movie. But I digress.
Moneyball-ing in the Private Sector. So now come two New Normal articles from the ABAJ. In If Legal Services Value Stats Were Created, Standardized, Law Clients Could Play ‘Moneyball’, Patrick J. Lamb writes:
In sports, statistical analysis is a means to an end: securing the best win-loss record and winning championships. In law, the challenge is whether statistics can be a useful means of determining value, which is, like beauty, frequently in the eye of the beholder. But the efforts to at least circle around some common understandings of value are nothing but positive developments, and defining the kinds of analyses and statistics that are pertinent to hiring and retention of lawyers will assist law firms and clients in focusing on the same indices of value.
Paul Lippe asks and answers in Can ‘Moneyball’ Principles Be Applied to the Valuation of Legal Services?
Can we apply Moneyball-style analysis to law? The answer is a qualified “yes,” informed by three considerations:
First, value of services is inherently more nuanced than value in goods, and law is toward the more nuanced end of the spectrum of services (say more nuanced than a baseball player, a real estate agent or a travel agent, but probably less than a psychotherapist).
Second, value discussions have to be specific—value in sell-side mergers and acquisitions is different from buy-side, and altogether different from counseling to avoid employment discrimination claims.
Third, discussing value is always going to be useful, even though it doesn’t lead to one absolute standard.
Unfortunately, what many Moneyball commentators fail to emphasize is that once the statistical analysis used by the Oakland A's became institutionalized in professional baseball, the initial competitive advantage the A's had was lost. The playing field was leveled. Statistical analysis has become just another tool.
Quantifying value in legal services and in hiring and retention of lawyers is problematic at best as noted by Lamb and Lippe. Will Moneyball-ing in this context also level out the playing field? Without absolute standards, can the metrics be agreed upon? Will the stats used be accurate or gamed?
Moneyball-ing in the Legal Academy. While Caron and Gely's call for more comprehensive law school data, so users and organizations can create alternative methodologies to the US News Law School Rankings, should be applauded, that hasn't happened, and it wasn't what caused the flurry of debate in the legal academy. What did was the second and third issues presented in their article, namely Moneyball-ing the legal academy in terms of law prof and law dean contributions to law school success by narrowly focusing on the relationship between productivity and scholarly impact, as if the game of the legal academy were the careerism of law profs and their deans.
Isn't the game about producing practice-ready graduates, measured by an institutional scorecard of the outcomes of a professional education? One outside the legal academy can't start Moneyball-ing that because even the current self-reported and unaudited data provided by the "team" is unreliable. One might say the legal academy's credibility issues, compounded by the ABA's failure to police law schools as their accrediting body, have led many to conclude that law schools have been Moneyball-ing gamed data to move up in the standings of the major league known as the US News Law School Rankings. And just as in baseball, the more teams that do this, the more level the fudged playing field becomes.
Moneyball-ing in Law Libraries. Well, actually first in academic law libraries. At least our data isn't known to be fudged when it comes to the size of an academic library collection, but I am reminded of the objections raised about the prospects of devaluing the volume count stat at a NOLA session. In our New Normal of the Shed West Era, digital first and digital only, I think the issue, the value of that metric, has been settled. It's not important.
Law librarian contributions to the success of their employers are not measured by the size of their collections but by the services they provide and are responsible for: saving costs through negotiations with vendors, executing efficiencies by implementing sensible information technologies and e-communications, and providing expert services to their user populations.
Our duty is to our institutions, but there is little to no chance that some new metrics can be created. Providing a metric on expert services is too subjective unless opinion polls are based on scientific principles of opinion research. Just slapping together a bunch of questions for SurveyMonkey doesn't cut it.
There are metrics that can be used to evaluate library websites: traffic, which library website pages are visited the most, and so on, as long as one carefully understands the limitations of web server logs. One could also track the frequency of updating library site pages, the addition of new ones, and the use of web links to contact a librarian in real time. Law library (not law librarian) blogs, Facebook pages, and Twitter accounts could be added as well. In addition to traffic, the frequency of posting institutionally relevant messages and the life-span of these alternative forms of web communications could be measured, if anyone really cared to do so.
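As a sketch of the page-visit tallying described above, here is a minimal pass over web server access logs in Common Log Format; the log lines, paths, and counts are invented for illustration:

```python
# Minimal sketch: counting page visits from access-log lines in
# Common Log Format. The sample lines and paths are hypothetical.
from collections import Counter

def top_pages(log_lines, n=5):
    """Tally requested paths from Common Log Format lines."""
    counts = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1].split()  # e.g. ['GET', '/research-guides', 'HTTP/1.1']
        if len(request) >= 2:
            counts[request[1]] += 1
    return counts.most_common(n)

sample = [
    '127.0.0.1 - - [13/Oct/2011:10:00:00 -0500] "GET /research-guides HTTP/1.1" 200 1024',
    '127.0.0.1 - - [13/Oct/2011:10:01:00 -0500] "GET /ask-a-librarian HTTP/1.1" 200 512',
    '127.0.0.1 - - [13/Oct/2011:10:02:00 -0500] "GET /research-guides HTTP/1.1" 200 1024',
]
print(top_pages(sample))  # [('/research-guides', 2), ('/ask-a-librarian', 1)]
```

The usual caveats about server logs apply: caching, bots, and proxies all distort raw counts, which is why the limitations mentioned above matter.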
Then there is cost saving by way of law library-vendor negotiations. Thanks to NDAs we can't share that data in any way, even if specific institutional identities were replaced by alternative identifiers. Hell, I don't think we can even share database-by-usage stats by type and subtype of law library market sectors. At least we can't do that via AALL mediums because that would be "anti-competitive." [JH]
September 14, 2011
PACER Fees Going Up
The fee for public access to federal court docket materials is going up from 8 cents a page to 10 cents a page. As the Judicial Conference puts it:
In separate action, the Conference responded to inflationary pressures by increasing, effective November 1, certain miscellaneous fees for federal courts. The newly approved court fee schedule, the first inflationary increase in eight years, is expected to result in an estimated $10.5 million in additional fee revenue for fiscal year 2012. Fees in appeals, district, and bankruptcy courts are affected. The income the Judiciary receives through miscellaneous fees allows it to reduce its annual appropriations request to Congress.
The Conference also authorized an increase in the Judiciary's electronic public access fee in response to increasing costs for maintaining and enhancing the electronic public access system. The increase in the electronic public access (EPA) fee, from $.08 to $.10 per page, is needed to continue to support and improve the Public Access to Court Electronic Records (PACER) system, and to develop and implement the next generation of the Judiciary's Case Management/Electronic Case Filing system.
The EPA fee has not been increased since 2005. As mandated by Congress, the EPA program is funded entirely through user fees set by the Conference. Implementation of the two-cent per page increase will take a minimum of six months.
The Conference was mindful of the impact such an increase could have on other public entities and on public users accessing the system to obtain information on a particular case. For this reason, local, state, and federal government agencies will be exempted from the increase for three years. Moreover, PACER users who do not accrue charges of more than $15 in a quarterly billing cycle would not be charged a fee. (The current exemption is $10 per quarter.) The expanded exemption means that 75 to 80 percent of all users will still pay no fees.
Weren't the courts making excess money from PACER? What gives? [MG]
August 12, 2011
Location, Location, Location: Feeder Law Schools for BigLaw Associate Hiring and Promotion to Partnership
Two B-school profs, Paul Oyer and Scott Schaefer, have published the results of their study in American BigLaw Lawyers and the Schools that Produce Them: A Profile and Rankings:
We profile the lawyers that work at the largest 300 American law firms as of the Summer of 2008. We show how gender, years of experience, prestige of law school, and other qualities vary across lawyers of different rank and firms of different prestige. Geography is an important determinant of where lawyers work, with many going to undergraduate school and law school near where they ultimately practice. Geography is less important, however, at more prestigious firms and for graduates of higher ranked schools. We then go on to rank law firms based on the prestige of the law schools their lawyers attended, and we rank law schools based on their success at placing lawyers at BigLaw firms. Chicago, Harvard, and Yale law schools are the clear leaders in placing graduates at BigLaw firms. We provide important caveats about these rankings.
First, our rankings are sufficiently close to other rankings that it is clear, as one might expect, that whatever leads a school to be successful in other rankings also leads them to be successful in placing lawyers at BigLaw firms. Second, we find that the University of Chicago, Yale, and Harvard law schools are clearly the most successful at placing lawyers at BigLaw firms. Finally, we show that BigLaw firms have a bias towards East Coast schools -- West Coast schools rank lower by our measure than by previous rankings.
(Emphasis added.) Hat tip to Leiter's Law School Reports.
Seto's study of feeder schools to the NLJ 100 takes a different approach. Quoting from its abstract:

Which law schools produce the largest numbers of partners at national law firms? This article reports the results of a nationwide study of junior and mid-level partners at the 100 largest U.S. law firms. It identifies both the top 50 feeder schools to the NLJ 100 nationwide and the top 10 feeder schools to those same firms in each of the country’s ten largest legal markets. U.S. News rank turns out to be an unreliable predictor of feeder school status. Hiring and partnering by the NLJ 100 are remarkably local; law school rank is much less important than location. Perhaps surprisingly, Georgetown emerges as Harvard’s closest competitor for truly national status.
Comparing Seto's study with Oyer and Schaefer's makes for interesting reading. [JH]
July 22, 2011
Google Rolls Out Limited Launch of Google Scholar Citations
On July 20th, Google rolled out a limited launch of its new citation tracker because "[t]his is a new direction for us and we plan to use the experience and feedback from the limited launch to improve the service." From the official Google Scholar Blog post:
We use a statistical model based on author names, bibliographic data, and article content to group articles likely written by the same author. You can quickly identify your articles using these groups. After you identify your articles, we collect citations to them, graph these citations over time, and compute your citation metrics. Three metrics are available: the widely used h-index, the i-10 index, which is the number of articles with at least ten citations, and the total number of citations to your articles. We compute each metric over all citations as well as over citations in articles published in the last five years. These metrics are automatically updated as we find new citations to your articles on the web.
You can enable automatic addition of your newly published articles to your profile. This would instruct the Google Scholar indexing system to update your profile as it discovers new articles that are likely yours. And you can, of course, manually update your profile by adding missing articles, fixing bibliographic errors, and merging duplicate entries.
You can also create a public profile with your articles and citation metrics (e.g., Alex Verstak, Anurag Acharya). If you make your profile public, it can appear in Google Scholar search results when someone searches for your name (e.g., Richard Feynman, Paul Dirac). This will make it easier for your colleagues worldwide to follow your work.
And, I guess, another way to sell ads.
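The two metrics Google describes are straightforward to compute from a list of per-article citation counts. Here is a minimal sketch with made-up numbers (Google's actual grouping and indexing pipeline is, of course, far more involved):

```python
# Sketch of the two indices Google Scholar Citations reports.
# The citation counts below are invented illustration data.
def h_index(citations):
    """Largest h such that h articles each have at least h citations."""
    cited = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cited, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Number of articles with at least ten citations."""
    return sum(1 for c in citations if c >= 10)

papers = [42, 18, 12, 9, 7, 3, 0]
print(h_index(papers), i10_index(papers))  # prints: 5 3
```

The total-citations metric Google mentions is simply `sum(papers)`; the five-year variants just restrict the counts to citations from recent articles.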
Hat tip to Deborah Hackerson's Legal Skills Prof Blog post. [JH]
July 15, 2011
A Three Tiered World of Employed Law School Grads: Understanding The National Jurist's Ranking of Best Law Schools for Standard of Living
The September issue of The National Jurist will publish its ranking of 135 law schools by a standard of living metric that uses median starting salaries, average debt payments, estimated federal and state taxes and cost of living adjustments for the regions where graduates were employed. 63 schools were excluded from the ranking because the percent of graduates with a known salary was below 40%; seven schools were omitted due to lack of data. The top 50 law schools can be viewed at Best law schools for standard of living. (Hat tip to TaxProf Blog). From Best law schools for standard of living:
The National Jurist first did the standard of living study in 1999 and reported that graduates who entered private practice at six law schools at that time had a lower standard of living than they did as students. Since then, salaries have increased dramatically, improving the standard of living at almost every law school in the nation. Debt repayment options also improved in 2009 with a new federal law.
However, there are big differences between schools. For example, graduates at the University of Texas take home a net of $101,308 after debt and taxes, and modifying for cost of living adjustments. More than half of the schools in the study netted less than half of that amount, with six lower than $25,000.
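A rough, hypothetical version of the arithmetic behind such a metric might look like the following. The function, tax rate, and all figures are invented for illustration; this is not The National Jurist's actual methodology:

```python
# Back-of-the-envelope standard-of-living sketch:
# (after-tax salary - annual debt payments), adjusted by a regional
# cost-of-living index (1.0 = national average). All inputs hypothetical.
def net_standard_of_living(median_salary, effective_tax_rate,
                           annual_debt_payment, cost_of_living_index):
    after_tax = median_salary * (1 - effective_tax_rate)
    return (after_tax - annual_debt_payment) / cost_of_living_index

# e.g. a $130,000 salary, 30% combined taxes, $12,000/yr loan payments,
# in a region 10% cheaper than the national average
print(round(net_standard_of_living(130_000, 0.30, 12_000, 0.90)))  # 87778
```

Even this toy version shows why the rankings diverge so sharply from raw salary lists: debt load and regional costs can cut a headline salary in half.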
Crunching the Numbers for Employed Law School Grads Practicing Law. In Why some grads are worse off, even while most are better off, Jack Crittenden, Editor in Chief of The National Jurist, writes:
[O]ne has to take a close look at the data to understand what really happened to the legal profession over the past ten years.
Large law firms increased the number of hires, and they significantly increased their salaries – from $70,000 in 1998 to $160,000. That means that there is a segment of the population — 22.3 percent — that is far better off than 10 years ago. But there is a segment — the 18.7 percent who landed jobs with firms of two to 10 attorneys and those unemployed — that would be worse off — except for the fact that loan repayment plans are much more flexible. The other 50 percent of graduates saw modest improvements in standard of living.
That has created a world of tiers. For the top tier, the 22.3 percent who get jobs at firms with 101 or more attorneys or land prestigious clerkships, law school is a very good financial decision. Even if their debt is high, their salary makes up for it. For the next tier, the 9.3 percent who get jobs at law firms with 11 to 100 attorneys, the 13.5 percent who go into business and the students who land other clerkships, law school is most likely a wise decision, so long as they watch their debt. For the third tier, the 18.7 percent who work for a small law firm, the 5.7 percent who enter public interest, and the 11.4 percent who work in government, law school is a poor choice, unless they planned for the lower salary.
March 28, 2011
No Surprise: Online Research Beats Manual Research
There is a study called A Day Without A Search Engine: An Experimental Study Of Online and Offline Search, and it's getting some press. It basically compares online and manual research as conducted in a library and highlights the differences in the time involved to arrive at results and the quality of the results. The study, from the University of Michigan, is focused on general research and not legal research.
The experiment was designed around random web queries within the United States on a single day. They were filtered so that they could be answered manually or online. One criticism of taking questions from an online source is that they may be biased towards the web as a source. There is no suggestion in the report that the designers looked at the results before selecting the questions for the study. As I'm fond of telling students, and as much as I love using the web myself, not everything is online. More on that in a bit.
The experiment's results showed four things. Using 305 questions, 99.7% were answered via the web and 90.2% were answered in the non-web treatment. Online searches took one-third the time of offline searches. Web searchers used more sources than those who used the library, and 70% of the non-web users consulted a reference librarian. Finally, the quality of web and non-web sources was not significantly different.
Google, one can imagine, is just ecstatic with the results. Hal Varian, Google's Chief Economist, was quoted in a short interview in The Economist about the productivity gains from web research. His numbers were about 22 minutes using the library compared to 7 minutes using a Google search. He sees this as the democratization of data, enabling users with an Internet connection in remote locations.
This study focused on general research, with the answers limited to facts in some cases, and others with conceptual responses that are fleshed out. I'm intrigued as to whether similar results would be obtained with legal research. I think a knowledgeable researcher can pull a cited case online, whether from a commercial service or a free resource such as the Google Scholar case database. It's the conceptual searching I'm wondering about. Lexis and Westlaw certainly have their share of topical commentary. Natural language searching is pretty sophisticated, especially in its WestlawNext incarnation.
Still, with all of this, I'd be curious about the time and efficiency savings compared to the cost of commercial services, the free web, and print. Price wasn't much of a factor in the Michigan study, though participants weren't limited to Google. They were allowed to use pretty much any electronic resource to which they had access. I have to assume that some of Michigan's commercial subscriptions were part of the mix. The free web has delivered more or less on the promise of unedited primary law, though real library-like organization is lacking. Searching legal concepts is one of those areas where it falls completely short, in my opinion. Given all of this, I wonder if there are recent numbers out there that really measure the efficiencies in online and manual legal research. [MG]
March 25, 2011
Crumbling Relic or Vibrant Source of Legal Illumination: New Citation Study Offers Ray of Hope That Scholarly Legal Articles Are Used by Judges (or at least Their Law Clerks)
The conventional wisdom is that judges are frustrated with legal scholarship's conceptual nonsense and law profs are upset that all the toil and trouble they put into their scholarly works is viewed with disdain by the bench. Whit D. Pierce & Anne E. Reuben's Empirical Study: The Law Review Is Dead; Long Live The Law Review: A Closer Look at the Declining Judicial Citation of Legal Scholarship, 45 Wake Forest L. Rev. 1185 (2010) offers some evidence that citation of legal scholarly writings is not disappearing from judicial opinions and in some cases is actually increasing. "The outcome of this Study," write the authors, "is not a reason for the institution of law reviews to rest on its laurels. Lest “the dialogue between practitioners, judges, and academics, which began in 1875 in the first student-edited journal . . . , come to an end,” law reviews should account for the fact that the law and the world in which it operates are dynamic."
Two snips from the study's conclusion:
The point is this: legal scholarship as a means for shaping the law is not a thing of the past but that reality is not a reason for law reviews to rigidly maintain the status quo. The simple fact that so much attention has been devoted by both judges and scholars to ensuring that the legal scholarship supply is meeting the judicial demand demonstrates a continuing need for academic commentary on particularly relevant topics. This need can be filled by a combination of traditional law reviews and their online companions.
In the world of law reviews, it appears that, while the traditional print “body” of the institution is weakened, the dignitas of law review—scholarly thought and analysis—remains vibrant and useful.
Hat tip to Legal Skills Prof Blog. [JH]
February 20, 2011
Data Sets for Everyone
Data sets seem quite popular in the legal academy these days. Seems we are just catching up to the other disciplines, which have been digesting data while we were all digesting cases. (A data set is merely a collection of data variables derived from a single data source. It is usually presented in tabular form and lets the user extract variables to create new data.) Some data sets are held by private concerns (think pharmaceuticals, for example) and are sometimes available to academic researchers for a stiff fee. Other data sets are open source and often available for free. (Rob Richards maintains an archive of his posts on available data sets. Also, Data.gov keeps track of data sets released by the U.S. government.)
For example, during the past two years, Google has been creating data sets and making them available on its Public Data Explorer web site. This is because they have nothing else to do.
There are 27 data sets to play with on the web site, ranging from mortality rates in the United States to government debt in the EU. And you are permitted to link to them or embed the visualizations in your own web pages. Google based its visualization technology on Hans Rosling's work, which it purchased. (You might have seen this YouTube video on Hans Rosling's visualization project: http://www.youtube.com/watch?v=jbkSRLYSojo)
Over the weekend, Google opened up the Dataset Publishing Language that it uses to produce the resources on the Public Data Explorer web site. This means that for all those empirical undertakings at law schools all over the world, we now have an open source product that can visualize data beyond the common tabbed columns. This could also help visualize collection statistics in ways not yet viable in many ILSs. For example, if you wanted to compare prices of EU law treatises by publisher, it would be simple to extract this data from your ILS, dump it into this product, and obtain a nice bubble chart or other visual aid to review with your acquisitions staff.
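Even without the visualization layer, the price-comparison idea reduces to a small aggregation pass over an ILS export. A minimal Python sketch, with entirely hypothetical titles, publishers, and prices:

```python
# Sketch of the ILS price-comparison idea: average treatise prices
# by publisher from a CSV export. All rows are invented examples.
import csv
import io
from collections import defaultdict

# Stand-in for a real ILS acquisitions export
export = io.StringIO(
    "title,publisher,price\n"
    "EU Competition Law,Oxford,245.00\n"
    "EU Treaties Commentary,Hart,198.50\n"
    "EU Internal Market,Oxford,179.00\n"
)

totals, counts = defaultdict(float), defaultdict(int)
for row in csv.DictReader(export):
    totals[row["publisher"]] += float(row["price"])
    counts[row["publisher"]] += 1

averages = {pub: round(totals[pub] / counts[pub], 2) for pub in totals}
print(averages)  # {'Oxford': 212.0, 'Hart': 198.5}
```

A table like `averages` is exactly the kind of flat input that a DSPL-described data set or a bubble chart would be built from.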
Of course, your ILS vendor might just take this open source product, tag it onto their product, and then charge you mega $'s to use it. Sort of like RSS feeds or filtered searching.
Good job, Google. [VS]
September 07, 2010
The Saga of Garbage In, Garbage Out Continues: Employment Data Reported by Law Schools is "Junk"
Although US News has changed its ranking method to try to overcome law school gaming of employment data, critics argue the unaudited reported data is just plain "junk." See National Jurist's Employment Data Under Fire. Hat tip to TaxProf Blog. [JH]
August 20, 2010
A Word of Advice for Prospective Law School Students and 1Ls on Acceptance or Transfer Decisions: Ask the Admissions Dean "how much are you paying me to 'gain on-the-ground experience, and strengthen [my] lawyering skills' when I graduate?"
There was a time when law school students were more worried about the terms of an employment offer than about whether any employment offer would be forthcoming. But those days are over. The economic downturn has produced a glut in the labor supply, and today's grads will be competing for job opportunities with laid-off attorneys with real work experience. In In defense of young lawyers (NLJ), Duke Law School Dean David Levi writes:
The downturn in the legal economy has been hard on many new and young lawyers. They have faced lengthy deferrals and withdrawals of job offers, layoffs, shrinking job prospects and lower salaries. While unwelcome, these new burdens are at least understandable; they reflect the laws of supply and demand at a time when there is simply less legal work to go around. What is not understandable is the surprising amount of criticism heaped upon younger lawyers, offered as if to justify placing a disproportionate share of the economic downturn on their shoulders
The criticism comes from law firm managers, in-house counsel and former lawyers who now comment on the legal profession. They most likely represent a minority view, but they are vocal. They say that clients are no longer willing to pay for the work of young associates because their work is "worthless." We might expect clients to make any argument that could lead to a lower bill, particularly during an economic downturn. But it is wrong and surprising for experienced lawyers inside and outside of firms to acquiesce in, even reinforce, this line of argument.
As the title of his NLJ article states, Dean Levi makes a pitch for the value of young lawyers. "Like any good CEO should, Duke Law School dean David Levi has written an editorial defending his product," writes ATL's Kashmir Hill. Hill observes that Duke is willing to prove its point by way of the school's Bridge to Practice program. Launched in 2008, Duke pays law school grads a stipend for eight- to 12-week fellowships in nonprofit and advocacy organizations, district attorneys' offices, law firms, and courts to "gain on-the-ground experience, and strengthen their lawyering skills." See Duke Law News, 100% employment: Meeting a lofty goal. The program had nine participants in 2008 and 15 in 2009. The number likely doubled this year, reports Hill in her June 10, 2010 ATL post entitled The Secret to ‘100% Employed at Graduation’: Duke’s Bridge to Practice. With an average class enrollment of about 210 students, that means a whopping 14% of the Class of 2010 participated in Duke's program.
No way this can hurt Duke's ability to maintain its amazingly high, as in 100%, employment rates at graduation and nine months after, right? Just the "facts" from Duke Law's website here. Like a good CEO, Dean Levi knows how to sell his product, law school grads, at a discount to employers no less, while also enhancing the school's prospects for scoring high marks in the US News Law School Rankings (currently 11; 10-year average also 11, according to LLB's The Long, Hard, Nearly Impossible Climb to Reach Elite Status: Top 30 Law Schools, 2002-2011).
It's common knowledge that unaudited employment rates for law school grads reported to US News are gamed because law school administrators are not held accountable by any ethical standards. In this instance, perhaps a big asterisk needs to be displayed in next year's US News Law School Rankings under the Placement Success metric for Duke. Duke, however, is not alone. See, for example, ATL's SMU Will Pay You To Hire Their Graduates.
Law schools that are doing this actually get a 2-fer because, underneath the mask of US News metrics, what drives rankings improvements is substantially increasing the school's expenditures-per-student. Increasing that by any means other than scholarship awards is the quickest, most cost-efficient way to improve a law school's ranking. See What Is It Going to Cost Stanford Law to Become the No. 1 Law School in the Country? Another way to increase expenditures-per-student is to hire more full-time faculty to reach the promised land of less than a 10:1 student-to-faculty ratio, half the 20:1 ratio under ABA Accreditation Standards. Of course, that reported ratio isn't audited either to take into account law profs on sabbatical for the reported year.
Now, I for one like the idea that law schools are financing the employment prospects of their students, and lower student-to-faculty ratios are beneficial to the educational experience (assuming the profs counted each year aren't on sabbatical). Certainly prospective law school students want to know what their employment prospects may be if they attend a given law school. However, these metrics used by US News are so seriously flawed that they reveal what most folks, except perhaps prospective students and employers, already know, namely that the fundamental reason US News produces its series of annual rankings is that it sells magazine issues. As Brian Leiter recently stated, "If there's money to be made in ranking Rabbis ... then why isn't Newsweek also ranking law schools? Seriously, I'd tell them how to do it."
My unsolicited advice to prospective law school students before deciding on a school to attend and 1Ls before deciding to stay put after the first year is to ask the Admissions Dean "how much are you paying me to 'gain on-the-ground experience, and strengthen [my] lawyering skills' when I graduate?" [JH]
August 11, 2010
What Is It Going to Cost Stanford Law to Become the No. 1 Law School in the Country?
Recently the dean of Stanford Law School, Larry Kramer, announced he intends to make Stanford Law the No. 1 law school in the country. While Dean Kramer did not explicitly refer to the US News Law School Rankings, that's the scorecard most folks will point to as "proof" should Stanford Law achieve the status of so-called "best law school" in the country. As the ABA Section of Legal Education and Admissions to the Bar's Report of the Special Committee on the U.S. News and World Report Rankings (July 15, 2010) puts it: "U.S. News and World Report’s annual ranking of law schools overwhelmingly dominates the public discourse on how law schools compare to one another. ... We believe that, for better or worse, U.S. News rankings will continue for the foreseeable future to dominate public perceptions of how law schools compare, and that there is relatively little that leaders in legal education can do to change that in the short term."
It's common knowledge that the way to improve a law school's ranking under the US News ranking methodology is to increase expenditures per student. In Can Stanford Be #1 in the US News Rankings? The Data, Bill Henderson reports the following:
My back of the envelope calculations suggest that a check for $350 million ought to be enough to produce enough endowment income [for Stanford] to eclipse Yale in the US News rankings. This assumes that the money is used for things like books, more faculty, and higher salaries for everyone. If the money is spent on student scholarships, however, Stanford would need a check for roughly $1.8 billion to be #1. Again, these are the idiosyncrasies of the dominant method of law school rankings.
Providing a high quality education at an affordable cost is not the way to improve a law school's ranking. From the above-linked ABA Special Report:
1. The current methodology tends to increase the costs of legal education for students. As a recent study by the United States Government Accountability Office has suggested, the U.S. News methodology arguably punishes a school that provides a high quality education at an affordable cost. Because low-cost law schools report a lower expenditure per student than higher cost schools, it is difficult for low tuition schools to top the rankings. A school that works hard to hold down costs may indeed find itself falling in the rankings relative to a peer that increases tuition above the rate of inflation each year. U.S. Gov’t Accountability Office, GAO-10-20, Higher Education: Issues Related to Law School Cost and Access (Oct. 2009).
2. The current methodology tends to discourage the award of financial aid based upon need. Because median LSAT score and median UGPA are so important to the current rankings, law schools have largely abandoned other measures of merit or need in awarding financial aid. This can have the effect of shifting financial aid to those students with LSAT scores that will assist a school in achieving its target median for rankings purposes. The result is that students with the greatest financial need often are relegated to heavy borrowing to attend law school.
Henderson ends his post with the following comment:
The legal profession, especially our students, have some big problems at the moment. And society's are even larger. The best law school is one that prepares its students to solve these problems. This requires a careful balance of innovative teaching and scholarship. The U.S. News rankings don't capture these metrics. In fact, they obscure them and create incentives for truly destructive behavior. By and large the deans are trapped. From my own perspective, I don't think even one law school in the US News Tier 1 has reached even 10% of its potential to educate and solve problems. Too many one-professor silos. Too much ego.
That's the real lost opportunity cost. [JH]
June 04, 2010
There is no reliable method to measure the "scholarly" quality of law faculty: Results of Brian Leiter's Poll.
So what is the best way to evaluate the "scholarly" quality of a law faculty? Chicago law prof Brian Leiter launched a poll to find out, asking participants to rank-order, from best to worst, the different ways of assessing the scholarly quality of a law faculty, broadly defined. 257 survey takers responded to the call, and the results are in.
- There is no reliable method (Condorcet winner: wins contests with all other choices)
- Impact/citation studies loses to There is no reliable method by 125–115
- Reputational surveys loses to There is no reliable method by 130–110, loses to Impact/citation studies by 118–107
- SSRN Downloads loses to There is no reliable method by 163–65, loses to Reputational surveys by 173–51
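For readers unfamiliar with the voting method: a Condorcet winner is the option that beats every other option in head-to-head contests. A minimal Python sketch, using only the pairwise tallies reported above (the option names are shortened; this is an illustration of the tallying logic, not Leiter's actual poll software):

```python
# Pairwise tallies reported in the poll results: (votes for A, votes for B).
pairwise = {
    ("no reliable method", "impact/citation studies"): (125, 115),
    ("no reliable method", "reputational surveys"): (130, 110),
    ("no reliable method", "ssrn downloads"): (163, 65),
    ("impact/citation studies", "reputational surveys"): (118, 107),
    ("reputational surveys", "ssrn downloads"): (173, 51),
}

def is_condorcet_winner(candidate, tallies):
    """True if candidate wins every reported head-to-head contest it appears in."""
    contests = [((a, b), v) for (a, b), v in tallies.items()
                if candidate in (a, b)]
    for (a, b), (votes_a, votes_b) in contests:
        winner = a if votes_a > votes_b else b
        if winner != candidate:
            return False
    return bool(contests)

print(is_condorcet_winner("no reliable method", pairwise))   # True
print(is_condorcet_winner("reputational surveys", pairwise)) # False
```

"There is no reliable method" wins all three of its reported contests, which is exactly what makes it the Condorcet winner in the results above.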
About the results, Leiter writes:
I'm a bit puzzled by the victory of "there is no reliable method," though at least some readers told me they chose it as a proxy for "none of the above." That would make more sense, since I assume all those who voted for "no reliable method" are in habit of adjudging some faculties better than others, so they must actually believe there is some rational basis for those judgments. Alternatively, perhaps some readers took "reliable" to mean wholly accurate or infallible, and then, of course, one would have to agree.
Those desperately seeking some sort of "scholarly" recognition by the asinine metric of mouse clicks (SSRN download counts) ought to know better by now. [JH]
June 01, 2010
Forbes Worst Master's Degrees: MLIS ranks 25th out of 30
Forbes took a look at master's degree programs offering the best and worst salaries and employment prospects over the next decade. Click on the article's slideshow links for the findings. Forbes' method to determine the worst master's degrees: "we looked at 30 popular degrees and mid-career salaries, as provided by Payscale.com, which lets users compare their pay with that of other people in similar jobs. Next we looked at Bureau of Labor Statistics data, to see how fast employment was expected to increase over the next decade for popular jobs held by people with these degrees. Finally we considered the number of job openings based on expected replacement needs during the next 10 years."
In the worst master's degree rankings, MLIS ranks 25th out of 30.
No. 25: Library and Information Science
Mid-career median pay: $57,200
Projected employment increase: 16%
Job growth, including replacement needs: 41%
Common jobs: Reference librarian, library director, law librarian
In the best master's degree ranking, Physician Assistant takes first place honors.
No. 1: Physician Assistant Studies
Mid-career median pay: $98,900
Projected employment increase: 39%
Job growth, including replacement needs: 57%
Common jobs: Nurse anesthetist, clinical nurse specialist, director of nursing
I distinctly remember one of my library school profs comparing the nursing profession to librarianship: the former helps heal and preserve the health of the body, while we do the same for the information needs of the mind. Well, we don't all do this just for the money, and I, for one, prefer to visit my physician assistant instead of my doctor because she spends more time with her patients.
Hat tip to LISNews. [JH]
May 22, 2010
Justia's Most Popular Law Blogs, All Time, This Month and This Week
Oh boy, looks like Slaw and LLB are running neck and neck in the legal information professional blogosphere, ranking 51st and 52nd respectively in Justia's all-time most popular law blogs (I refuse to use "blawgs") out of over 3,000 listed blogs ("based on the number of visits to the blawg (oops) from Justia's BlawgSearch (oops again) search engine and directory listing pages"). At 18th place, LLB is the highest ranked blog in this self-created category "this month"; ditto "this week" (15th place).
Can you tell I'm just killing time waiting for my wife? [JH]
May 11, 2010
Party Hard at an Elite Law School
The top ranked party law school, according to this very important law school ranking study, is the University of Arizona. TaxProf Blog has done a great service to law school applicants by cross-referencing the party law school ranking with the U.S. News Law School ranking here, providing very useful information for applicants who want a school that ranks highly on both. By the party-hard-at-a-top-ranked-US-News metric, the best schools appear to be UC-Berkeley, Michigan and Harvard. Sorry, Yale. Looks like the best thing to do is acquire residency status in California or Michigan for a 2-fer.
Of course, once the party is over, the job search begins. See TaxProf Blog's Did 16 Law Schools Commit Rankings Malpractice? about employment data supplied by law schools and the employment data assumptions made by US News for the 74 schools that did not supply U.S. News with the percentage of the class employed at graduation. [JH]
April 20, 2010
The Long, Hard, Nearly Impossible Climb to Reach Elite Status: Top 30 Law Schools, 2002-2011
According to my back-of-the-envelope calculations, only 37 law schools have reached the Top 30 as ranked by US News during the last 10 years. 13 law schools have been ranked in the Top 10, and 24 schools in the Top 20, at least once since 2002. It is a long, hard, nearly impossible climb for lower ranked law schools to reach this elite status. Based on the data, one might say only Indiana-Bloomington appears to be making this ascent up the mountain.
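The back-of-the-envelope tally amounts to counting distinct schools that ever appear in the Top 30 and averaging each school's yearly rank. A minimal sketch with a hypothetical mini-dataset (the school names and ranks below are invented for illustration, not the actual US News figures):

```python
# Hypothetical rank-by-year data; None means the school fell outside the Top 30.
ranks = {
    "School A": [3, 2, 4, 3],
    "School B": [28, None, 29, 25],
    "School C": [None, None, 30, 27],
}

# Distinct schools that reached the Top 30 at least once.
ever_top30 = [school for school, series in ranks.items()
              if any(r is not None and r <= 30 for r in series)]

def average_rank(series):
    """Average rank over the years the school was actually ranked."""
    ranked = [r for r in series if r is not None]
    return sum(ranked) / len(ranked)

print(len(ever_top30))                  # 3
print(average_rank(ranks["School A"]))  # 3.0
```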
The table below (click to enlarge) reports US News Law School rankings for the Top 30 by year, color-coded to show movement between ranks 1-10, 11-20, 21-30, and below 30. The schools are sorted by this year's ranking. The table also provides the 10-year average rank for each of the 37 schools ranked in the Top 30 at least once between 2002 and 2011. Feel free to report any errors or omissions in comments to this post. [JH]
April 15, 2010
2011 US News Law School Rankings Officially Published
Meanwhile, Northwestern Dean David Van Zandt, one of the best-known proponents of law school rankings, offered the following about the US News rankings in an Above the Law post, Rankings Are Valuable (And Here to Stay); So Let’s Focus on Making Them Better:
We need to keep in mind that our applicants are sophisticated and have the ability to give the rankings the appropriate weight in their decisions. For many prospective students, the rankings offer a starting point to begin the extensive process of researching undergraduate or graduate/professional programs. But serious applicants are not going to rely solely on rankings as a basis for their decision. They likely will use the rankings information in tandem with information gained through campus visits and individual research.
But is the US News ranking methodology too simple? (See the WSJ Law Blog post on Dan Solove questioning the peer assessment ranking method, How to Fill Out the US News Law School Rankings Form. See also Solove's follow-up post, Robert Morse’s Response on the US News Law School Rankings.)
Dean Van Zandt, by the way, was ranked one of the three most influential legal educators of the decade by the National Law Journal recently. [JH]
March 29, 2010
National Jurist's Annual Waste of Time: Academic Law Library Ranking
Once again National Jurist has published its asinine annual ranking of law school libraries. 50 percent of the score is based on the number of volumes and unique titles. The ratio of library study seating to enrollment counts for 20 percent. The ratio of full-time professional librarians to enrollment and the number of hours that the library is open each week account for 15 percent each. At least the magazine stopped counting the number of computer workstations because "wireless technology has replaced old school-provided hard-wired stations."
National Jurist might want to rethink some of the other old-school metrics it has been using to perform this annual waste of time. No, I am not going to report the top X-number of academic law libraries from this ranking. It's simply too ridiculous. If you can't resist, click on the above link.
Beware the status-obsessed law school dean or prof who wants you to produce a press release about how great "our law library is" based on this ranking. Years back, at the University of Cincinnati College of Law, we just said "no." You see, our library that year was ranked higher than Harvard's law library. Why? Seating-to-student ratio. Oh yeah, that made our library better than Harvard's. [JH]
March 17, 2010
Dear God, Not Another Law School Ranking Metric: Top 10 Law School Home Pages of 2009
The ranking is based on "a tabulation of fourteen objective design criteria," including some truly significant ones, like:
Favicon – 7 points: A favorites icon (or “favicon”) is a small graphic associated with a website, which appears in places such as the browser location bar or in your bookmarks or favorites file. The favicon is probably the most important tiny graphic any site can have, and it is a simple way to help identify a law school brand or image.
Smiles – 5 points: Somebody is smiling in at least one picture on the site.
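The scoring is evidently a simple weighted checklist: each criterion carries a point value, and a site earns the points if it meets the criterion. A hypothetical sketch using only the two weights quoted above (the other twelve criteria and the actual feature-detection step are omitted, and the function name is my own invention):

```python
# Point values for the two criteria quoted above; the study uses fourteen in all.
CRITERIA_POINTS = {"favicon": 7, "smiles": 5}

def score_homepage(features, criteria=CRITERIA_POINTS):
    """Sum the point values of every criterion the home page satisfies."""
    return sum(points for name, points in criteria.items()
               if features.get(name, False))

print(score_homepage({"favicon": True, "smiles": True}))   # 12
print(score_homepage({"favicon": True, "smiles": False}))  # 7
```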
God help me -- here's the Top 10 Law School Home Pages of 2009.
1. George Mason University School of Law
2. University of Virginia School of Law
3. Wayne State University Law School
4. University of Washington School of Law
5. Harvard Law School and Regent University School of Law
7. Loyola University of Chicago School of Law and University of Notre Dame Law School
9. University of Illinois College of Law and Washburn University School of Law
195 Law Schools Ranked. But wait, the ranking of all 195 ABA-accredited law school home pages starts at 17 here [SSRN]. Because I know someone in the extreme southwestern corner of the Buckeye State is going to want to know how their school fared after spending over $100K to redesign and maintain its website, here's the home page ranking for Ohio law schools.
|Ohio Law Schools Home Page Ranking|
The study was performed by ... ah ... an academic law librarian, Roger Skalbeck, Associate Law Librarian, Georgetown University Law Center. Roger, if you decide to do this again, it might be more interesting if you added another metric to this ranking: the design and maintenance costs for each law school's website. That's a metric that has to be worth at least 10 times the number of points (subjectively) assigned to both favicons and "shiny happy people."
Hat tip to Dan Filler's post on Brian Leiter's Law School Reports for this little tidbit on info antics, not metrics. [JH]