Friday, April 14, 2006

The Relationship Between Law Review Rankings and US News Law School Rankings

The release of the 2007 US News rankings of law schools has set off another round of speculation about what the rankings mean and what, if anything, schools can do to improve the quality of the education they provide, as well as their rankings.  David Hoffman over at Concurring Opinions has been asking where law schools should direct money.  Bill Henderson over at the Conglomerate has presented some rather sobering data about how static law school peer assessments are.  If you haven’t seen Dan Filler and Dan Solove’s chart over at Concurring Opinions, I think you’ll enjoy it.  And our leader here at the Law Professor Blog Network, Paul Caron, has done his usual excellent job of sorting out all the changes.  Of course, there's a lot of talk about how to improve the rankings.  There's the recent discussion of the Hylton rankings over at elsblog (and here at propertyprof).  Andrew Morriss at elsblog has suggested some alternative measures (employment data and SSRN downloads, but not law review citations).

Drawing upon earlier evidence (summarized here) that there is a close connection between the citation rankings of law reviews and the rankings of their parent institutions, I have a new paper that looks at changes in both the US News rankings and law journal rankings over the past few years.

I compare changes from the "2003" to the "2007" US News rankings with changes in citations from 2002 to 2005, as reported on John Doyle's website at the Washington and Lee Law Library.  Because the 2003 US News study was released in April 2002 and the 2007 study was released in April 2006, the periods under study for peer assessments and for law journal citations were essentially the same.  First, there continues to be a high correlation (.87) between a law review's citation rank and its parent institution's US News peer assessment score.
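For readers curious about the mechanics, here is a rough sketch of how one might compute that sort of correlation.  The numbers below are made-up placeholders rather than the data from the paper, and the scipy functions are simply one convenient way to run the calculation; the paper itself may use a different method.

# A minimal sketch, with hypothetical numbers, of correlating law review
# citation ranks with US News peer assessment scores.
from scipy.stats import pearsonr, spearmanr

# Hypothetical citation ranks for a handful of main law reviews (1 = most cited)
citation_rank = [1, 3, 8, 15, 30, 55, 80, 120]
# Hypothetical US News peer assessment scores for the parent schools (1.0-5.0 scale)
peer_score = [4.8, 4.5, 4.1, 3.8, 3.3, 2.9, 2.5, 2.1]

# A lower rank number means a better-cited review, while a higher peer score
# means a better-regarded school, so the raw correlation comes out negative;
# its magnitude is what corresponds to the kind of figure reported above.
r, _ = pearsonr(citation_rank, peer_score)
rho, _ = spearmanr(citation_rank, peer_score)
print(f"Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")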

But what about changes over time?  The paper tests and finds some (well, a small amount of) support for the hypothesis that as law schools improve (or decline), there is a corresponding change in the quality of their main law journals (as measured by citations in other journals).  There is a connection--albeit a weak one (r=.21)--between changes in law review rankings and changes in US News peer assessment rank over the past four years.  For purposes of my hypothesis, I wish, of course, that the relationship were stronger than it is.  The paper explores some reasons why the relationship might appear relatively weak--including that there's more of a lag between improvements in school quality (which might be reflected in places like the quality of the articles a law review publishes) and peer assessments of that quality.  There's obviously a lot left to explore (like the phenomenon of US News peer assessments converging with citation rankings).
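Again for the curious, here is a rough sketch of the change-over-time comparison.  The schools and numbers are hypothetical placeholders, not the paper's data, and this is just one way such a calculation could be set up.

# A minimal sketch, with hypothetical numbers, of correlating changes in
# citation rank (2002 to 2005) with changes in US News peer assessment rank.
from scipy.stats import pearsonr

# Hypothetical citation ranks for a handful of reviews in 2002 and 2005
cite_rank_2002 = [30, 80, 15, 60, 100, 45]
cite_rank_2005 = [25, 55, 18, 62, 70, 40]

# Hypothetical US News peer assessment ranks in the "2003" and "2007" editions
usn_rank_2003 = [40, 75, 20, 55, 95, 50]
usn_rank_2007 = [38, 68, 22, 57, 85, 48]

# A positive change means the review or school moved up in the rankings
cite_change = [old - new for old, new in zip(cite_rank_2002, cite_rank_2005)]
usn_change = [old - new for old, new in zip(usn_rank_2003, usn_rank_2007)]

r, p = pearsonr(cite_change, usn_change)
print(f"Correlation between rank changes: r = {r:.2f} (p = {p:.3f})")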

I suggest that if you want to know where a law school is heading, then in addition to the glossy material the school sends out (announcing new hires, student successes, faculty publications, and talks sponsored by the school), you should spend some time studying the scholarship its law review publishes.  The idea here is that what students (and perhaps, increasingly, faculty) are able to recruit (as well as what they select) for publication tells us something about the intellectual orientation and reputation of the school.  I have previously suggested that perhaps US News should begin to take law journal rankings into account in its rankings.  Citation rankings offer, perhaps, a more objective measure of what's happening intellectually at a law school than do peer assessments by people who have relatively little knowledge of an institution.  (On the difficulties of ranking schools, see Joseph Slater's recent post.)

One table ranks the changes in journals' citation ranks over the past four years.  Some have really come on strong.  Here are the dozen schools whose reviews have improved the most in rank:

School | Change in Citation Rank | Current Citation Rank
Michigan State U. | 54 | 109
Lewis and Clark | 49 | 100
William Mitchell (MN) | 40.5 | 65
George Mason U. (VA) | 35.5 | 70
University of Alabama | 33.5 | 54
University of Florida | 29 | 52
University of Akron | 28 | 88
Indiana U.-Indianapolis | 27 | 50.5
U. Arkansas-Little Rock | 25 | 101
U. Louisville | 24.5 | 119
Boston College | 24 | 36
Drake | 22.5 | 95.5

I also detail which reviews are "undervalued"--that is, which reviews are ranked significantly better in citations than their parent institutions are ranked by US News--and which reviews are "overvalued"--that is, which reviews are ranked significantly worse in citations than their parent institutions.  Last fall, when I compiled a similar table, I observed that the DePaul Law Review's excellent ranking (it was in the top 50 main law reviews in terms of citations) suggested that DePaul would likely be in the US News top 100 soon.  I was delighted to see that DePaul is, in fact, ranked 80 this year.
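As a rough illustration of what "undervalued" and "overvalued" mean here, one can simply compare a review's citation rank with its school's US News rank.  The schools, numbers, and the 20-place cut-off below are hypothetical choices for illustration only, not the paper's actual criteria.

# A minimal sketch of the "undervalued"/"overvalued" comparison, using
# hypothetical schools, ranks, and an arbitrary 20-place cut-off.
reviews = {
    "School A": {"citation_rank": 45, "usnews_rank": 95},
    "School B": {"citation_rank": 110, "usnews_rank": 60},
    "School C": {"citation_rank": 70, "usnews_rank": 72},
}

for school, ranks in reviews.items():
    gap = ranks["usnews_rank"] - ranks["citation_rank"]
    if gap > 20:
        label = "undervalued: review ranked well ahead of its school"
    elif gap < -20:
        label = "overvalued: review ranked well behind its school"
    else:
        label = "roughly in line with its school"
    print(f"{school}: gap = {gap:+d} -> {label}")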

Other schools that you might expect to see rise from the third tier include Albany, whose journal is ranked 49.  Both Catholic’s and Marquette’s peer assessments already place them in the Top 100, and their reviews are ranked 66 and 73, respectively.  Other schools already in the Top 100 that are ripe for an improvement in the rankings are DePaul, the University of South Carolina, Chicago-Kent, Cardozo, and Fordham.  Michigan State, whose law review is the biggest improver, certainly ought to rise from the fourth tier; its peer assessment rating already places it at 112, and its law review is ranked 109.  One might also look for William Mitchell and South Texas to rise from the fourth tier as well.  William Mitchell’s law review is ranked 65; South Texas’s is ranked 81.

And for those of you who want some more rankings, a final table ranks the main law journals of 178 ABA-accredited law schools according to journal citations.

Let me return, then, to David Hoffman's questions:

what should the smart money be spending cash on?  Employment? Marketing? Facilities? Remember: the goal of this spending is to get as much relative peer-to-peer growth for your buck as possible. So, pretend you are a law school dean. What is in your next budget?

In part, I think law schools ought to spend money (and attention) on their law journals.  An increase in law review quality will not necessarily lead to an increase in peer assessment, but I think people are increasingly going to focus on law reviews as an indicator of law school quality.  In part they will do that because we're all looking around for indicators of law school quality, and we know that there's a high correlation between perceived quality--as measured by US News peer assessment--and citations to a school's law review by other journals.  (Paul Caron and Bernard Black, whose important paper urges attention to SSRN downloads, are two among many people seeking better indicators.)  And when that turn to scrutinizing law journals happens, I think deans will want to be able to point to a high-quality law review.  And even if law journals aren't used as measures of the quality of their parent institutions, spending time and money improving reviews will help improve legal education--which, I thought, is one of the points of this whole rankings business anyway.

Alfred L. Brophy


https://lawprofessors.typepad.com/property/2006/04/the_relationshi.html
