Wednesday, October 10, 2012

The Value of Law School Rankings: A Reply to Robert Steinbuch

A few days ago, one of my co-bloggers posted about an article by Robert Steinbuch concerning the value of law school rankings.  His main argument is that "Law students should be particularly informed consumers; they are learned and investing considerably in further education. Rankings are one sound resource available to assist them in this search for knowledge. These metrics should be evaluated—dare I say, ranked—and used appropriately."

I couldn't disagree more, because I have never seen a law school ranking that was sound for helping potential students choose a law school.  Moreover, I think that anyone who tells students that they should look at law school rankings to help them choose a law school is doing a great disservice to law school applicants.

Let's start with the U.S. News Rankings, which I posted on in detail a few months ago.  (here)  My conclusion then was that U.S. News has absolutely no value in helping law students choose a law school because most of its criteria are useless or questionable.  For example, employment rates, which constitute 18% of the total score, include both full-time and part-time positions and both legal and nonlegal jobs.  Since nonlegal jobs are included in the rates, that 18% of the total score is meaningless.  Similarly, the assessment score by lawyers and judges constitutes 15% of the score.  However, only about 12% of those surveyed responded, and, more importantly, I question how these legal professionals can have knowledge of the approximately 200 law schools in this country.  Likewise, the peer assessment score (by deans, the most recently tenured faculty member, etc.) constitutes 25% of the total, but these assessments are based mainly on scholarship, which tells students little about which law school is best for them.

Finally, selectivity (25%) can be misleading.  G.P.A.s are not uniform because colleges are of different quality and have different grading policies (i.e., grade inflation).  As U.S. News has admitted, "The difficulty level of college courses is much less important than the grades received in those classes, because law school admissions committees do an initial sort of applicants based solely on GPA and LSAT scores."  (here)  As Brian Leiter has noted, student-faculty ratios are manipulable because they depend on how schools "count" their faculty.  Likewise, acceptance rates are also misleading because they often reflect how good a law school is at attracting applications (such as by offering free online applications) rather than true selectivity.  Together, the criteria discussed here—employment, the two assessment scores, and selectivity—account for 83% of the total score.

Professor Steinbuch discusses the National Jurist's Best Value Law School Rankings in some detail.  This ranking supposedly combines a law school's quality with its affordability.  "The National Jurist ranking is based on average student debt, tuition, cost of living, two-year bar passage average (comparing that to the state's average) and weighted employment rate."  Sounds good.  However, even Professor Steinbuch questions some of the ranking criteria.  For example, he states, "a high average debt that is a function of limited student aid offered by a school is very different from a high average debt caused by a student base that is disproportionately lower income. The latter might be the case at, say, a historically black college, and should not be considered a negative, while the former should be of concern, particularly if the student is interested in (and likely to get) a scholarship—be it merit-, need- or diversity-based."  Likewise, he writes, "Overall class bar-passage rates, however, are driven far more by the quality of incoming students—as measured by LSAT scores and undergraduate grade-point averages—than by what any particular law school provides."

He adds, "For example, a recent study at my school demonstrated significant differences in bar-passage rates across certain demographic groups—driven at least in part by admissions programs that seek to have classes mirroring the general population—as opposed to those bachelor-degrees students with both higher LSAT scores and GPAs. The cause at one level is simple math. Think of it this way: If Columbia decided now to double the size of its incoming law school class, the average LSAT scores and undergraduate GPAs of the additional students would fall below the first half of admittees, because schools don't generally skip better-qualified candidates willing to attend for less-well credentialed applicants. So, the remaining contenders should have a lower average objective academic profile. The same holds true when increasing constituent class components."

Above the Law has questioned the National Jurist study, declaring "But what’s more interesting is the fact that this entire list could be flawed because the National Jurist calculated the rankings based on incorrect debt data, which accounted for 15% of each school’s overall score."  They ask, "Can we believe ANYTHING that is being fed to us these days? How are prospective law students supposed to become “sophisticated consumers” if virtually all rankings are based on false statistics?"

Based on the above, I stick with my original conclusion.  There is no existing law school ranking that is of any use to students in choosing a law school.

(Scott Fruehwald)
