Thursday, April 13, 2006
The Hylton Rankings II (at PropertyProf)
Tomorrow I’ll be back to posting on property stuff.
Over at Empirical Legal Studies, the Hylton rankings are continuing to gather press, though Andrew Morriss isn’t nearly as positive on them as either Jason Czarnezki or I am. Morriss raises the important point that Hylton relies on US News’ peer assessment scores, which he finds problematic. I don’t think they’re nearly as bad as Morriss does. For one, there’s a high correlation (.86) between a school’s US News peer assessment score and citations to that school’s main law review (for schools in the US News top 50). And there’s a high correlation (.91) between a school’s US News peer assessment score and the midpoint of the 25th and 75th LSAT percentiles of the school’s entering class, as I said in my first post on the Hylton rankings. So, checked against readily available (and, I think, decent) data, the peer assessment scores seem to have some validity as a measure of school quality.
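(For the curious, the kind of check I have in mind is simple to run. Here is a minimal sketch in Python; the file and column names are hypothetical, and the inputs would be whatever peer scores, citation counts, and LSAT percentiles one has collected. It illustrates the method, not the actual calculation behind the figures above.)

    # Minimal sketch of the correlation checks described above. Assumes a
    # hypothetical CSV ("top50.csv") with one row per US News top-50 school;
    # the column names here are illustrative, not from any real dataset.
    import csv
    from scipy.stats import pearsonr

    peer, cites, lsat_mid = [], [], []
    with open("top50.csv") as f:
        for row in csv.DictReader(f):
            peer.append(float(row["peer_score"]))
            cites.append(float(row["law_review_citations"]))
            # Midpoint of the 25th and 75th LSAT percentiles of the entering class
            lsat_mid.append((float(row["lsat_25"]) + float(row["lsat_75"])) / 2)

    r_cites, _ = pearsonr(peer, cites)    # reported above as about .86
    r_lsat, _ = pearsonr(peer, lsat_mid)  # reported above as about .91
    print("peer score vs. main law review citations: r = %.2f" % r_cites)
    print("peer score vs. LSAT midpoint: r = %.2f" % r_lsat)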
I, of course, agree with Morriss’ critique that there are some strange things going on with the peer assessment scores. How could anyone say that UT-Austin or Vanderbilt or UCLA is anything other than outstanding, to say nothing of Columbia or the University of Chicago? When I filled out the US News evaluation last fall, I ranked about twenty schools as a five. How could I do otherwise?
So should we use SSRN instead?
I agree the US News peer assessment scores could be improved. However (with respect to Bernard Black and to my employer here at the Law Professor Blogs Network, Paul Caron), I can’t yet agree with Morriss’ statement that “SSRN stats are far from perfect, but they are a heck of a lot better already than the US News peer ranking.” Caron and Black’s paper in the most recent issue of the Indiana Law Journal is required reading in the rankings genre, and it makes a strong case for using SSRN data to rank faculties. But “a heck of a lot better already” than the US News peer assessment? As Caron and Black report, there’s a high correlation (.72) between SSRN ranks and US News ranks (81 Indiana L.J. 83, 108).
Anything as manipulable as downloads has problems as a measure of quality, IMHO. (Before putting too much weight on SSRN downloads, one might refer to David Bernstein’s joking about them over at the Volokh Conspiracy, as well as Brian Leiter’s well-considered thoughts.) My metric of choice is citations to a law school’s main law journal. (It, too, is problematic in many ways and probably ought to be used as one of several factors.) I’ll have some thoughts on that in the next few days here, and in more detailed form on SSRN.
Side notes: some other preliminary thoughts, including a table that reranks US News’ third and fourth tiers based on citations to their journals, are available in a paper I posted last December, which is summarized here.
Paul Caron summarizes which schools benefit the most from (and are hurt the most by) the Hylton rankings.
One final note, which I just realized: the Hylton rankings have been picked up by a student discussion board, a sure sign they’re important. One student, “Miketyson,” uses the .91 correlation between US News peer assessment scores and LSAT midpoints to defend the US News peer assessments. And another student, asking about the University of San Diego, says “Its Hylton is way above its USNWR.” Ah, Gordon, your ranking system has now arrived!
UPDATE: And now I see that Wikipedia is discussing the Hylton rankings. Further evidence of their importance.
Alfred L. Brophy
Comments
Jason,
My apologies for mischaracterizing your posts. I still think you’re more positive on the Hylton rankings than Andrew Morriss is, but less so than I am, obviously.
And I still think the Hylton rankings are better than US News.
Posted by: Alfred L. Brophy | Apr 13, 2006 11:16:41 AM
SSRN data? That leaves out those of us who publish in non-student journals (usually with much wider circulation). It’s the self-appointed “publish in student journals” elites arguing about their relative position in a world dwarfed by the world of professional publications, blogs, and other things that are at least as indicative, if not more indicative, of good teaching (which is what law schools... SCHOOLS ... should be about). Want to rank law think tanks? Sure, use SSRN. But don’t claim that a SCHOOL should be measured by its think tank function.
The true measure, though tough to get, is what people think of a school’s graduates during the five-year window of law practice following graduation (before a lawyer’s ability has been tempered and refined by living in the practice world). Bar pass rates matter, too, though law schools usually claim they aren’t “teaching for the bar.” If bar exams are decent gateways for determining who can practice law, then what are law schools “teaching for”?
There is so much subjectivity that ranking law schools is like those lists of “most beautiful women” or “most handsome men” that some magazines and newspapers publish. We could do metrics on distance between eyes, degree of eye roundness, and other things that some researchers claim predict “attractiveness,” but ultimately I may think beautiful someone you think plain, and vice versa. Though we might agree generally on categorization, trying to pick between two people who are similarly attractive is like trying to decide if Harvard or Chicago is the better school. And, uh, who cares?
Posted by: Jim Maule | Apr 14, 2006 6:56:57 AM
While I posted the Hylton rankings on the Empirical Legal Studies Blog (www.elsblog.org), I can’t say that I’m “positive on them.” That doesn’t mean I’m negative on them either, but they do exclude very important information, as Andrew Morriss points out. To me, the two biggest variables excluded from the Hylton rankings are job placement and faculty scholarship. Still, like Andrew, I am happy to see the Hylton rankings, because “the law school world would be better if it looked more like the business school world on rankings, where competing sources of rankings exist.” Perhaps someone should create a BCS-type ranking system that incorporates a number of rankings together.
Posted by: Jason Czarnezki | Apr 13, 2006 6:47:12 AM