Thursday, April 13, 2006
Tomorrow I’ll be back to posting on property stuff.
Over at Empirical Legal Studies, the Hylton rankings continue to gather press, though Andrew Morriss isn’t nearly as positive about them as either Jason Czarnezki or I am. Morriss raises the important point that Hylton relies on US News’ peer assessment scores, which Morriss finds problematic. I don’t think they’re nearly as bad as Morriss does. For one, there’s a high correlation (.86) between a school’s US News peer assessment score and citations to that school’s main law review (for schools in the US News top 50). And there’s a high correlation (.91) between a school’s US News peer assessment score and the midpoint of the 75th and 25th LSAT percentiles of the school’s entering class, as I said in my first post on the Hylton rankings. So, checked against readily available (and, I think, decent) data, the peer assessment scores seem to have some validity as a measure of school quality.
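(For readers curious about the mechanics: the correlations above are ordinary Pearson coefficients between two columns of numbers, one per school. A minimal sketch of that computation, where the peer scores and LSAT figures below are made up purely for illustration and are not the actual US News data:)

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical inputs (NOT real data): peer assessment scores for five
# schools, and the midpoint of each school's 25th/75th LSAT percentiles.
peer_scores = [4.8, 4.6, 4.2, 3.9, 3.5]
lsat_mids = [(166 + 172) / 2, (165 + 171) / 2, (163 + 169) / 2,
             (161 + 167) / 2, (158 + 165) / 2]

r = pearson(peer_scores, lsat_mids)  # close to 1 for these illustrative numbers
```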
I agree, of course, with Morriss’ critique that there are some strange things going on with peer assessment scores. How could anyone say that UT-Austin or Vanderbilt or UCLA is anything other than outstanding, to say nothing of Columbia or the University of Chicago? When I filled out the US News evaluation last fall, I gave about twenty schools a five. How could I do otherwise?
So should we use SSRN instead?
I agree the US News peer assessment scores could be improved. However (with all due respect to Bernard Black and my employer here at the Law Professor Blogs Network, Paul Caron), I can’t yet agree with Morriss’ statement that “SSRN stats are far from perfect, but they are a heck of a lot better already than the US News peer ranking.” Caron and Black’s paper in the most recent issue of the Indiana Law Journal is required reading in the rankings genre. It makes a strong case for using SSRN data to rank faculties. But “a heck of a lot better already” than the US News peer ranking? As Caron and Black report, there’s a high correlation (.72) between SSRN ranks and US News ranks (81 Ind. L.J. 83, 108).
Anything as manipulable as downloads has problems as a measure of quality, IMHO. (Before putting too much weight on SSRN downloads, one might refer to David Bernstein’s joking about them over at the Volokh Conspiracy, as well as Brian Leiter’s well-considered thoughts.) My metric of choice is citations to a law school’s main law journal. (It, too, is problematic in many ways and probably ought to be used as one of several factors.) I’ll have some thoughts on that in the next few days here, and in more detailed form on SSRN.
Side notes: some other preliminary thoughts, including a table that reranks US News’ third and fourth tiers based on citations to their journals, are available in a paper I posted last December, which is summarized here.
Paul Caron summarizes which schools benefit the most from (and are hurt the most by) the Hylton rankings.
One final note, which I just realized: the Hylton rankings have been picked up by a student discussion board, a sure sign they’re important. One student, “Miketyson,” uses the .91 correlation between US News ranks and LSAT midpoints to defend the US News peer assessments. And another student, asking about the University of San Diego, says “Its Hylton is way above its USNWR.” Ah, Gordon, your ranking system has now arrived!
UPDATE: And now I see that Wikipedia is discussing the Hylton rankings. Further evidence of their importance.
Alfred L. Brophy
Comments are held for approval, so they will not appear immediately.