May 17, 2013
Could ATL's Law School Ranking Unseat US News?
Not gonna happen. IMHO, the only way to reduce the impact of the annual ritual known as the US News Law School Rankings is for the entire legal academy to start ignoring US News in law school marketing fodder. You think that is ever going to happen?
Here's an excerpt from the YouTube description of a BLaw interview titled "Could This Law School Ranking Unseat US News?":
Elie Mystal, editor at Above the Law, tells Bloomberg Law's Lee Pacchia that his blog's new law school rankings sought to list the top 50 American law schools by relying on an "outcome based" methodology. Mystal says that focusing on the costs and rewards of a legal education allows this ranking to determine which law schools yield "the most bang for this extreme buck."
May 09, 2013
The Most Common Criticisms of the ATL Top 50 Law School Rankings
As reported by ATL's Brian Dalton at The ATL Top 50 Law Schools: A Roundup of Criticism. [JH]
March 12, 2013
New Data "Unmasks" Schools, Says US News Law School Rankings Czar
From the Bloomberg Law YouTube description:
Law schools ranked by US News & World Report magazine in the 50 to 150 range were the ones most affected by the availability this year, for the first time, of more detailed graduate employment data from the ABA, according to US News rankings czar Bob Morse.
The new data "unmasked" that some of those schools had a relatively small number of their students taking full-time long-term jobs that require a JD, Morse tells Bloomberg Law's Lee Pacchia.
Many schools on the east and west coasts also saw drops in their rankings as a result of the new data, Morse said. University of San Francisco School of Law had this year's biggest drop, declining 38 places on the rankings, coming in dead last at 144 on the list.
As for the power of US News' rankings, which some deans blame for legal education's woes, Morse says, "US News isn't the ABA . . . We're not responsible for the cost of law school, the state of legal employment, the impact that the recession has had on hiring, or the fact that there are 10 or 20 new law schools that have opened over the past couple of decades. And we're not responsible for the imbalance of jobs to graduates."
Let the Number Crunching Begin: US News 2014 Law School Rankings
Law school deans, are you ready for your report card? -- Elie Mystal, The 2014 U.S. News Law School Rankings (ATL).
It's time for the annual ritual. Comparing annual law school rankings by various ranking factors this year as well as with prior years might make for some interesting Excel spreadsheets.
US News Law School Rankings for 2014:
February 21, 2013
The Saga Continues: National Jurist Admits the Obvious -- The Rate My Professors Data Used in its Law School Rankings is Pretty Damn Inaccurate
But the editorial staff of National Jurist fails to also admit the even more significant and obvious point: the data sources in Rate My Professors cannot be verified. Citing the oops from the rag known as National Jurist about its goofy-as-hell law school rankings, University of Chicago Law prof Brian Leiter writes in Two-thirds of "Rate My Professors" data that National Jurist used was inaccurate, magazine now admits:
And by "inaccurate," they mean only that the lists included non-law faculty or faculty who didn't teach at the school in question--they do not mean that the data itself actually reflects the opinion of law students about professors whose classes they really took. No one has any way of confirming that.
Let's add that there is no way to determine whether law profs gamed their own Rate My Professors rankings. Call me cynical, but that possibility certainly is not beyond the realm of law prof ego-driven imagination. I'm thinking James Boyd White should have included a chapter in The Legal Imagination applying his theory of law as constitutive rhetoric to the culture of the legal academy, with a study of published law prof-law school "Yippee!"
Leiter adds, "[National Jurist] still should withdraw the entire ranking, and hire some educational and statistical consultants to come up with a worthwhile metric." Perhaps National Jurist should just get out of the law school rankings game completely. Will National Jurist publish another ranking next year? Wait 'n see.
Right now, however, it is time to gear up for this year's annual ritual known as US News Law School Rankings -- forthcoming in law prof blog posts near you, no doubt. Let's ditto that for law school PR fodder. [JH]
February 19, 2013
National Jurist's Law School Rankings Info Antics: On Verifying the "Accuracy" of Data When Data Sources Cannot Be Verified
We are now reviewing the RateMyProfessor data for all law schools with significant variances from the Princeton Review data, being careful to exclude non-law professors and former law professors. We expect to have this review done by Feb. 19. -- Jack Crittenden, Editor In Chief, National Jurist, Editor's comment on Best Law School ranking.
Although National Jurist is now actually reviewing the "accuracy" of the Rate My Professors data, the question is why didn't they do that before publishing the ranking? NJ says "we believe that the voice of students is essential." But that's not in dispute. The point is that Rate My Professors is not "the voice of students," and a magazine with any integrity, as opposed to an interest in generating hype, never would have utilized such an absurd source. (I'm not even sure what it means to check the accuracy of Rate My Professors data: anyone with Internet access on the planet earth can fill out a Rate My Professors survey, how could that be meaningful?) -- Chicago Law prof Brian Leiter, Brian Leiter's Law School Reports, National Jurist Now Back-Pedalling on its Thomas Cooleyesque Law School Rankings.
Stay tuned... Today is Feb. 19th, isn't it?
The pot calling the kettle black. One purveyor of absurd rankings reports about the National Jurist kerfuffle. But you would be hard pressed to find him reporting on, let alone responding to, law prof criticisms of his own fatally flawed info antics, the methodologically challenged output of an info science amateur. See Info Antics, Not Metrics; When Counting Mickey Mouse Clicks Trump Content Analysis. [JH]
February 13, 2013
National Jurist's Law School Rankings Fatally Flawed...
... by failing to include the "Hotness" score component from the Rate My Professors online rating site. Yes, that's right, the Rate My Professors metric was used by National Jurist to rank law schools.
In National Jurist in Competition to Displace Thomas Cooley Rankings as Biggest Joke in Legal Academia, Chicago Law prof Brian Leiter makes the following request: "If readers catch any law schools publicizing their National Jurist ranking, please let me know." (The list is growing at Leiter's post.) If readers spot any law school or tax prof re-crunching the numbers to include the "Hotness" score, please let me know. [JH]
January 11, 2013
Friday Fun: Info Antics, Not Metrics; Say It Ain't So, Seto
Quoting from Theodore Seto's (Loyola Law School, Los Angeles) Where Do Partners Come From?, 62 Journal of Legal Education 242 (Nov. 2012):
You are a hiring partner. You need to spend your recruiting dollars as efficiently as possible. Which law schools offer the largest pools of potential future partners for you and your firm to explore?
You are applying to law school. Your long-term ambition is to become a partner in a national law firm in a certain city. Which schools may increase your chances of realizing that ambition?
To date, no published study has attempted to answer the question: Which law schools produce the largest numbers of partners at national law firms? This article is intended to fill that gap.
To fill that gap, how? According to Pepperdine Law Prof Robert Anderson's Witnesseth: Law, Deals & Data blog post, Bloated Is Better for Law School Rankings, here's how:
The new Theodore P. Seto ranking of law schools is in large measure a reincarnation of the notorious Thomas M. Cooley Law School's ranking of law schools.
Ouch! See also Anderson's Where Partners Really Come From... and A Last Word on the Seto Rankings. In some respects, Anderson's take on Seto's ranking is mild compared to what Paul Campos has to say in his deconstruction of Seto's article at Partnership or death? and A few more points about the Seto partnership study.
Instead of a reference to Shoeless Joe Jackson, here's Weezer. [JH]
January 04, 2013
Friday Fun: Info Antics, Not Metrics; When Counting Mickey Mouse Clicks Trump Content Analysis
Recently, fatally flawed number-crunching antics that appear to be unique to tax profs when they venture outside their realm of expertise have been in the law prof blogosphere spotlight. For example, Chicago Law prof Brian Leiter comments on the latest release in Paul Caron's ritual of publishing law prof blog traffic rankings:
Breaking Development: Actual Law Blog Makes the "Top 5" in Traffic Rankings... ...of blogs by law professors. The honor goes to a blog on patent law, no less! Meanwhile, the key to having a popular blog remains simple: be a right-wing crazy or blog about philosophy.
In other words, only one of the so-called "Top 5" ranked blogs based on Caron's info antics has anything to do with publishing legal content and analysis by law profs. Just being a "law prof" is good enough. Let's add that studies have shown that about 50% of web traffic, including blog traffic, can be attributed to robots and other automated agents, meaning only about one-half of logged traffic can be attributed to humans.
Then there is Caron's latest mouse click counts for his SSRN download ranking for tax profs. Paul Campos spots the problem when raw data trumps content analysis:
[Being a tax prof] Seto's high ranking is solely a product of the fact that three quarters of his SSRN downloads come from three papers that have nothing to do with tax law (this fact is significant in this context because tax papers are written for a highly specialized audience).
Law professorial ego being what it is, Mickey Mouse info antics such as the above examples do drive traffic to TaxProf Blog for hot diggity dog faculty lounge fodder. No doubt there will be more in 2013.
I didn’t make this list of the 25 most influential people in legal education. That pisses me off. I’m going to start writing about how people shouldn’t trust legal educators because law schools are only interested in profits and not the employment outcomes of their students. That’ll show ‘em!
Yup, one has to take the risk of ego-bruising by stating an original opinion about something to be influential in any way, shape or form. Good luck trying to find that in TaxProf Blog posts or SSRN "scholarship" in either tax or legal education authored by the "Blog Emperor". [JH]
November 19, 2012
Professor Big Brother Is Watching
There is a story in the Chronicle of Higher Education highlighting the metrics available to a faculty member who assigns an e-textbook for the course:
When students use print textbooks, professors can’t track their reading. But as learning shifts online, everything students do in digital spaces can be monitored, including the intimate details of their reading habits.
Those details are what will make the new CourseSmart service tick. Say a student uses an introductory psychology e-textbook. The book will be integrated into the college’s course-management system. It will track students’ behavior: how much time they spend reading, how many pages they view, and how many notes and highlights they make. That data will get crunched into an engagement score for each student.
The feature is ostensibly marketed as something good. Faculty can reach out to students who show low engagement and counsel them for success. How about some of the other possible uses for the capability? The same information may be useful to authors and publishers in analyzing how their text is used by students. Some of the data may also be used by school administrators to evaluate faculty performance through that same level of student engagement. My point is really that when a pool of information is collected, there can be many uses beyond those intended. In any event, it’s another example of what was previously not measurable becoming extremely measurable. I can see it now: Hey, let's do a study comparing student performance across racial and ethnic groups. I know there are laws that protect student information and also regulate studies using human subjects. I have a feeling that helping students to succeed will be the least of the interesting uses for this capability. [MG]
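For the curious, the "engagement score" the Chronicle excerpt describes might be computed something like the sketch below. The weights, caps, and field names are illustrative assumptions, not CourseSmart's actual formula, which has not been published here:

```python
# Hypothetical sketch of an e-textbook "engagement score": reading time,
# pages viewed, and annotations are normalized to 0-1 and combined.
# The weights and caps below are assumptions for illustration only.

def engagement_score(minutes_read, pages_viewed, notes, highlights,
                     total_pages=300):
    """Return a 0-100 engagement score from raw e-textbook usage data."""
    # Normalize each signal to a 0-1 range, capping outliers.
    time_part = min(minutes_read / 600.0, 1.0)        # ~10 hours caps out
    pages_part = min(pages_viewed / total_pages, 1.0)
    notes_part = min((notes + highlights) / 50.0, 1.0)
    # Weighted sum; the 40/40/20 split is an assumed weighting.
    return round(100 * (0.4 * time_part + 0.4 * pages_part + 0.2 * notes_part), 1)

# A student who read 5 hours, viewed 150 of 300 pages, and made
# 10 notes and 15 highlights:
print(engagement_score(300, 150, 10, 15))  # prints 50.0
```

Whatever the real formula, the point stands: once every click is logged, reducing "engagement" to a single comparable number is trivial, and that number can travel well beyond its intended use.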
September 21, 2012
A Glutton for Punishment: (Phillips and) Yoo's Info-Antics
Soon after the SSRN release of The Cite Stuff: Inventing a Better Law Faculty Relevance Measure by James Cleith Phillips and John Yoo, ATL's David Lat wrote:
Some liberals view Professor John Yoo as a sadist. They cite Professor Yoo’s involvement in the so-called “torture memos” during his time as a lawyer in the Justice Department’s Office of Legal Counsel.
But I think Professor Yoo is a masochist. Only a masochist would try to develop a citation-based system for ranking the relevance of law professors.
For much more, see Lat's The 50 Most Relevant Law Professors ("Relevant law professors? Yes, they exist! ... Yes, law professors are efficient too!") Note well Lat's update by way of quoting from ATL law prof readers:
[T]his study is limited to full-time tenure-track faculty who are not clinical faculty, and is also limited to those professors on the faculty for the 2011-2012 school year. It only looks at the top 16 law schools according to the U.S. News and World Report’s academic peer rankings as databases are constantly updated and a citation study that stretches out over too much time will be biased in favor of the faculties done later in the study as the databases will have been updated and have more citations near the end of the period of gathering data.
Let the punishment begin, I mean, continue. See, for example, Phillips & Yoo Citation Study Has Some Serious Problems ("Yoo and Phillips aren’t even measuring citations correctly, let alone quality.") Ouch. Leap to the Kevin Bacon frat house initiation scene in National Lampoon's Animal House.
Frankly, the only value I see in the vast majority of these "studies" resides in their contributions to the arcane BDSM discipline known as the cultural anthropology of the frat house that is the legal academy. But see Beyond Cite Stuff by Law Profs for one research report that deserves closer examination and discussion than it is receiving.
Why isn't Triangulating Judicial Responsiveness: Automated Content Analysis, Judicial Opinions, and the Methodology of Legal Scholarship [SSRN] receiving much buzz? Is it because the offered methodology is too complex for quick and easy critiques by members of the legal academy? Is it because the data set for the content analysis has nothing to do with law prof or law faculty publications?
Like OMG dude, why are the authors of that paper writing about a method to analyze court opinions and related documents based on information science! [JH]
September 14, 2012
Beyond Cite Stuff by Law Profs
Should cluster analysis supplement law journal ordinal ranking to improve citation metrics? According to Theodore Eisenberg and Martin T. Wells (both Cornell Law), the answer is "yes" in their Ranking Law Journals and the Limits of Journal Citation Reports [SSRN]. And then there is The Cite Stuff: Inventing a Better Law Faculty Relevance Measure [SSRN] by James Cleith Phillips and John Yoo (both Berkeley Law). You remember Yoo, right? -- the author of the Torture Memos before becoming an amateur information scientist. Apparently both articles have "discovered" what has been commonly accepted informetric knowledge for decades.
At least there is some hope for raising the scholarship bar produced by members of the legal academy. See Triangulating Judicial Responsiveness: Automated Content Analysis, Judicial Opinions, and the Methodology of Legal Scholarship [SSRN] by Chad M. Oldfather (Marquette Law), Joseph P. Bockhorst (Wisconsin-Milwaukee Department of Electrical Engineering and Computer Science) and Brian P. Dimmer (Petit & Dommershausen):
The increasing availability of digital versions of court documents, coupled with increases in the power and sophistication of computational methods of textual analysis, promises to enable both the creation of new avenues of scholarly inquiry and the refinement of old ones. This Article advances that project in three respects. First, it examines the potential for automated content analysis to mitigate one of the methodological problems that afflicts both content analysis and traditional legal scholarship — their acceptance on faith of the proposition that judicial opinions accurately report information about the cases they resolve and courts‘ decisional processes. Because automated methods can quickly process large amounts of text, they allow for assessment of the correspondence between opinions and other documents in the case, thereby providing a window into how closely opinions track the information provided by the litigants. Second, it explores one such novel measure — the responsiveness of opinions to briefs — in terms of its connection to both adjudicative theory and existing scholarship on the behavior of courts and judges. Finally, it reports our efforts to test the viability of automated methods for assessing responsiveness on a sample of briefs and opinions from the United States Court of Appeals for the First Circuit. Though we are focused primarily on validating our methodology, rather than on the results it generates, our initial investigation confirms that even basic approaches to automated content analysis provide useful information about responsiveness, and generates intriguing results that suggest avenues for further study.
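A toy version of the responsiveness measure the abstract describes might look like the sketch below. This is a drastic simplification for illustration: it scores opinion-brief correspondence with plain word-count cosine similarity, which is not the authors' actual methodology:

```python
# Minimal sketch of automated content analysis for "responsiveness":
# how closely an opinion's language tracks a brief's. Cosine similarity
# of word-frequency vectors is a stand-in for the article's real methods.

import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between two texts' word-count vectors (0 to 1)."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical brief and opinion snippets:
brief = "the contract was breached when delivery failed"
opinion = "we hold the contract was breached because delivery failed"
print(round(cosine_similarity(brief, opinion), 2))
```

Because such measures can be computed automatically over thousands of brief-opinion pairs, they make it feasible to test, rather than assume, how faithfully opinions report what the litigants actually argued.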
Remember, the science of citation metrics grew out of the content analysis of foreign newspapers performed by WWII military intelligence staff. Citation analysis remains an accepted but evolving screening tool for collecting data sets for content analysis. But content analysis takes much more work than merely spitting out absurd rankings based on raw numbers by amateur law prof "info scientists". [JH]
August 30, 2012
Has the Law Prof Blogosphere Established Itself as a Disruptive Publishing Medium?
Law prof blogging is no longer the latest "hot" thing to do. Hasn't been for years. After some ten years or so the law prof blog is a mature publishing medium. Denver Law prof J. Robert Brown Jr. has been publishing a series of posts entitled "Law Faculty Blogs and Disruptive Innovation" on the Race to the Bottom blog. [First post here]. The posts are based on and highlight themes from his Essay: Law Faculty Blogs and Disruptive Innovation [SSRN].
While Brown relies on data [SSRN download link], I wouldn't characterize his work as empirical informetric research; I hope that wasn't the author's purpose, because he overreaches at times. It is best to view Brown's essay as placing this mature and accepted publishing medium in the context of academic legal serial publishing generally.
Brown discusses relative merits and postulates selected outcomes. Here's an excerpt from the essay's abstract:
Law faculty blogs arose in a state of nature and were often perceived as inferior technology used by faculty to convey random, often personal, views. Over time, however, a recognized class of law faculty blogs emerged, with at least one having been cited 45 times in court opinions and another having been cited over 700 times in assorted legal publications. Widely read and regularly cited, they offer a superior method for the rapid dissemination of some types of legal analysis and facilitate the introduction of ideas into an ongoing debate. They also provide a form of intermediation that discourages low quality posts.
Law faculty blogs provide a form of scholarship that fills a gap left by traditional law reviews. Law faculty blogs overcome the slow publication process and dense analysis that often prevents traditional law review articles from playing a role in an ongoing debate. Said another way, law faculty blogs have altered the continuum of legal scholarship and reduced the role of traditional law reviews. Efforts by law reviews to fight back through the implementation of online supplements have so far failed.
Law faculty blogs have also had a disruptive impact on the determination of faculty reputation. Blogging allows law professors to route around the traditional indicia of reputation such as the frequency of publication in elite law journals. Providing a “prominence” dividend, faculty who blog are able to advertise their expertise through substantive posts and become better known to practitioners, academics and decision makers.
Content is king if content is original. Any given law prof blog can have a relatively long or short publishing lifespan in "blogosphere years." In the now decade-long publishing history of the law prof blogosphere, launch dates oftentimes are key to garnering readership because earlier blogs arrived on the scene before the law prof blogosphere became a fairly crowded space. Newer blogs can acquire sustainable audiences and can even become more widely read than established blogs if their blogging law profs take the long view in "blogosphere years" to establish their web destination as a place to visit because their blogs consistently publish interesting and stimulating content.
As a practical matter, new law prof blogs must face the reality that this is a fairly crowded web space. Prof blogs must also recognize that commercial law media outlets are now well entrenched and that, unlike the good old days when Google searches would expose new readers to blogs regardless of published content, modifications of Google's search engine algorithm now tend to filter out regurgitated content.
Content is king if content is original. That can be a real plus when law profs blog commentary and analysis based on expert assessments of legal news and developments. That "scholarship in action" blog writing takes more time and thought, but it gets captured by Google's search engine, and at least today's web browsers provide an easy means to take an RSS feed. Not so back in the early days.
Establishing a Merits-Based Reputation. Having been involved in publishing law prof blogs since co-founding the Law Professor Blogs Network in 2004, back in the relatively early days when blogging was the latest "hot" thing to do if tenured, I can attest that the legal academy's acceptance of the blogging platform as a legitimate publishing medium can yield what Brown calls a "'prominence' dividend". The real plus here is the exposure junior faculty can acquire by "thinking out loud" to display their interests and expertise. I won't detail how often junior faculty wanted to blog on a Network blog or wanted to launch a Network blog between 2004 and 2006 but were seriously concerned about the negative consequences doing so might have for earning tenure. At least the legal academy has now accepted blogging as a legitimate publishing medium.
The benefits for junior law profs now being able to blog because blogging is an acceptable publishing medium are at least three-fold:
- A higher profile for junior faculty may mean law reviews will take their article submissions more seriously;
- Demonstrated expertise may result in calls for comments and/or citations to timely posts by major general and legal commercial media outlets, and/or an invitation for the law prof to write a think piece; and
- A solicitation for a public service contribution, an amicus brief, or work for hire from a firm or advocacy group on a specific topic or more generally.
August 01, 2012
On Being Seen: Measuring Judicial Impact of Legal Scholarship
Law profs citing other law profs, how relevant is that? See How Not To Be Seen: "And Now for Something Completely Different" (and Irrelevant in the Real World): Top 70 law faculties in scholarly impact, 2007-2011. I think Mitchell Rubinstein is onto something when he writes:
A much more relevant measurement would be to see which schools and scholars are most cited by courts. It should not be about being cited by other professors. That is exactly what this ranking system measures and that is exactly what is wrong with law schools today.
For much more, see Brian Leiter's Law School Scholarly Impact Rankings- Why??? [JH]
July 25, 2012
How Not To Be Seen: "And Now for Something Completely Different" (and Irrelevant in the Real World): Top 70 law faculties in scholarly impact, 2007-2011
The ranking is published on Brian Leiter's Law School Rankings and the analysis by Gregory C. Sisk and his colleagues in the law library at the University of St. Thomas (Minnesota) at Scholarly Impact of Law School Faculties in 2012: Applying Leiter Scores to Rank the Top Third [SSRN].
Univ. of Chicago law prof Brian Leiter writes:
Professor Sisk and colleagues include all the appropriate caveats in their write-up. Mean scholarly impact is one kind of measure of academic distinction of a faculty; to the extent that school reputations depend more on the very best faculty, rather than the mean impact, then schools like Virginia, Georgetown, Texas, and Southern Cal are underranked, as they probably would be deemed to be in a survey of scholarly experts. Still, mean impact does also provide a check on casual assumptions about faculty quality, and constitutes a useful data point for schools trying to assess the performance of their faculty and for students particularly interested in the scholarly visibility of the law schools they are considering.
Time for the instructional video on how not to be seen. [JH]
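Leiter's point about mean impact versus star power is easy to see in miniature. The sketch below uses made-up illustrative scores, not real Leiter scores: one faculty has a few heavily cited stars and a thin bench, the other is uniformly strong, and the two measures rank them differently:

```python
# Sketch of the distinction between ranking faculties by mean citation
# impact vs. by the impact of their very best members. All numbers are
# invented for illustration; they are not actual Leiter/Sisk scores.

def mean_impact(scores):
    """Mean citation impact across a whole faculty."""
    return sum(scores) / len(scores)

def top_k_impact(scores, k=3):
    """Mean impact of a faculty's k most-cited members."""
    return mean_impact(sorted(scores, reverse=True)[:k])

faculty_a = [900, 850, 800, 100, 100, 50]    # a few stars, thin bench
faculty_b = [500, 490, 480, 470, 460, 450]   # uniformly strong

# Faculty B wins on mean impact; Faculty A wins on star power.
print(mean_impact(faculty_a), top_k_impact(faculty_a))
print(mean_impact(faculty_b), top_k_impact(faculty_b))
```

Which measure better tracks "reputation" is exactly the question Leiter raises: schools whose standing rests on a handful of stars will look underranked on a mean-impact table.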
June 08, 2012
The Most-Cited Law Review Articles of All Time
Fred Shapiro with the assistance of Michelle Pearse has added a third installment to his citation analysis of the most-cited law review articles. The first two studies were published at 73 Calif. L. Rev. 1540 (1985) and 71 Chi.-Kent L. Rev. 751 (1996). The latest study, The Most-Cited Law Review Articles of All Time, is published in the June 2012 issue of the Michigan Law Review.
From the abstract:
New research tools from the HeinOnline and Web of Science databases now allow lists to be compiled that are more thorough and more accurate than anything previously possible. Tables printed here present the 100 most-cited legal articles of all time, the 100 most-cited articles of the last twenty years, and some additional rankings. Characteristics of the top-ranked publications, authors, and law schools are analyzed as are trends in schools of legal thought. Data from the all-time rankings shed light on contributions to legal scholarship made over a long historical span; the recent-article rankings speak more to the impact of scholarship produced in the current era. The authors discuss alternative tools and metrics for measuring the impact of legal scholarship, running selected articles from the rankings through these tools to serve as points of illustration. The authors then contemplate how these alternative tools and metrics intersect with traditional citation studies and how they might impact legal scholarship in the future.
April 14, 2012
Best and Worst Jobs of 2012
CareerCast.com ranked 200 jobs from best to worst based on five criteria: physical demands, work environment, income, stress and hiring outlook. Ranked in first place is Software Engineer; in last place, Lumberjack. Librarian is ranked 61st, just ahead of Judge. Attorney is ranked 87th, far below Paralegal Assistant (49th place). For the complete list see the Wall Street Journal's chart. [JH]
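A ranking built on five criteria like CareerCast's reduces, at bottom, to scoring each job on each criterion and sorting on a composite. The sketch below is a guess at the general shape; the per-job numbers and the equal weighting are assumptions, not CareerCast's actual data or methodology:

```python
# Hypothetical sketch of a CareerCast-style multi-criteria job ranking.
# Each job gets a score per criterion (lower = better here); jobs are
# ranked by the sum. All numbers below are invented for illustration.

jobs = {
    # (physical demands, work environment, income, stress, hiring outlook)
    "Software Engineer": (10, 15, 20, 30, 10),
    "Attorney":          (15, 40, 25, 70, 60),
    "Lumberjack":        (90, 85, 70, 60, 80),
}

def rank_jobs(jobs):
    """Sort job titles by composite score, best (lowest total) first."""
    return sorted(jobs, key=lambda title: sum(jobs[title]))

print(rank_jobs(jobs))
# -> ['Software Engineer', 'Attorney', 'Lumberjack']
```

As with law school rankings, the interesting arguments are all hidden inside the scoring and weighting choices, not the final sort.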
March 16, 2012
Top Law Review's Circulation Drops Below 2,000 Paid Subscribers
Ross E. Davies, George Mason law prof and editor-in-chief of the Green Bag, reports in Law Review Circulation 2011: More Change, More Same [SSRN]:
In 2011, for the first time since the U.S. Postal Service began requiring law reviews to track and report their circulation numbers, no major law review had more than 2,000 paying subscribers. The Harvard Law Review remains the top journal, but its paid circulation has declined from more than 10,000 during much of the 1960s and ’70s to about 5,000 in the 1990s to 1,896 last year.
Isn't it about time to go electronic only for student-edited law journals? How about just distributing them via HeinOnline without a time embargo? Doesn't look to me like that would hurt print subscription sales. [JH]
March 13, 2012
Why So Much Movement in This Year's US News Law School Rankings?
Yes, the US News Law School Rankings for this year are now online. I'm thinking online sales for the full report will be hotter than ever. Rankings are good for US News revenue. This year's edition might even be better than previous years.
Why? First because US News reporter Katy Hopkins spotlights In 2013 Best Law School Rankings, Top Schools Switch Spots:
For the first time in three years, there was some movement at the very top of the U.S. News Best Law Schools rankings.
Yale Law School took the top spot in the 2013 edition of Best Law Schools, a place it's held since 1990. But contenders for the next two spots switched, with Stanford Law School outseating Harvard Law School for the first time since 2007.
Along with Yale, Stanford, and Harvard, 11 institutions that historically claim the top spots in the U.S. News rankings are known as the top 14 (T14), which isn't a designation by U.S. News but is widely known in the legal community. Some T14 schools shuffled: The University of California—Berkeley School of Law and the University of Virginia School of Law both moved up two spots to tie with the University of Pennsylvania Law School for 7th, while the University of Michigan—Ann Arbor Law School slid from 7th to 10th; Cornell Law School, now ranked 14th, and the Georgetown Law Center, at 13th, switched spots from last year's list.
Well, shuffling up or down a notch in the top 15, I mean, top tied-14, isn't all that important as long as a law school stays in that de facto top tier. It is below that elite tier where things get interesting.
Pull out those old rankings and spreadsheets. The second reason this year's US News Law School Rankings may be a historic "best seller" is the amount of movement in the Top 50 rankings this year. Now that the legal and general media have caught on to the games that have been played, interest in all of the legal academy's shenanigans has increased.
There is no doubt in my mind that journalists will join the annual rite that members of the legal academy have participated in for years, namely number crunching. Inside the legal academy, this year will be different because legal administrators and law profs will be crunching numbers every which way to figure out why there has been so much movement in the Top 50.
Since the US News ranking methodology hasn't changed, what's up? Are law deans worried about getting caught with their pants down now? Has more accurate data been provided to US News? If so, in what categories? If some data elements report substantially different stats compared to prior years, is that an admission against interest? Or will a chorus of law schools sing "the recession has finally hit us"? You can bet articles, blog posts and SSRN uploads will be filled with commentary and analysis.
[J]ust to save you some time and trouble, here are the top fifty of this year and last, along with how much each school moved. Commentary after the chart.
The biggest drop in the top 50, 12 spots, came from Illinois, which was scandalized this year by publishing fraudulent admissions data. ... But, another school further down dropped more. Villanova, which also published false employment statistics, plummeted from 84 to 101.
Beyond, I mean, Below the Top 50, does anyone really care? You bet! Many law schools look for marketing fodder to pitch "we are the top ranked" (or one of the top ranked) schools in the state or a conveniently defined region. Hell, marketing fodder based on US News rankings has even been crafted to sell a law school as one of the highest ranked small public law schools in the nation.
March Madness, ATL-Style. In addition to David Lat's The U.S. News Law School Rankings Are Out! on Above the Law, see ATL March Madness (2012): The Most Honest Law School:
Every year, we here at Above the Law like to put together a little bracket of our own. In the past, we’ve asked you to vote for such things as the coolest law firm or the douchiest law school.
This year, we’ve come up with a question that you don’t hear a lot of people asking when they’re talking about pursuing a career in law: Which law school is the most honest?
We expect law schools to shape our next generation of lawyers. We expect law schools to teach their students to think like lawyers. But do we expect law schools to teach people to be honest lawyers? Are some law schools better at emphasizing the moral and ethical standards of the law, while others teach a more, well, ethically aggressive style?
(Emphasis in the original.)
US News is in the business of making money. In a recent post, US News rankings guru Bob Morse wrote:
Well-known writers have made the case recently that the U.S. News Best Law Schools rankings are among the most powerful forces driving behavior at law schools.
Our take: It's important to remember that the U.S. News rankings are done to provide one tool to help prospective law school students choose the best law school for them. The Best Law Schools rankings are not done to provide law school academics a benchmark to measure their school's progress or to influence or be an instrument to direct educational policy decisions.
The bottom line: U.S. News is not running the law schools, does not play any role in making decisions at any law school, and does not believe there are any credible justifications for falsifying law school data.
Providing a "tool" that generates a helluva lot of revenue, Bob. [JH]
March 04, 2012
Browsing On A Sunday: Do Courts Cite Internet Legal Resources?
I’m doing a lecture on Internet legal research soon, and I was wondering how valid some of the sites I teach are in the eyes of the courts. I find Google Scholar very useful in my day job. Has anyone ever mentioned it in an appellate opinion? Yes, it seems, once, and only very recently. The California Supreme Court issued an opinion on January 27th called Vandermost v. Bowen, --- P.3d ----, 53 Cal.4th 421, 2012 WL 246627 (Cal.), 12 Cal. Daily Op. Serv. 1119, 2012 Daily Journal D.A.R. 1110 (citations and search results generated from Westlaw) with this reference:
By contrast, academic observers have concluded that the Commission's maps, including the certified state Senate map, “represent[ ] an important improvement on the legislature-led redistricting of 2001. The new district boundaries kept more communities together and created more compact districts while at the same time increasing opportunities for minority representation.... [T]hese maps ... have the potential to modestly increase competition in California elections and the responsiveness of the legislative branch to changing voter preferences.” (Kogan & McGhee, Redistricting California: An Evaluation of the Citizens Commission Final Plans, supra, 4 Cal. Journal of Politics and Policy ____ (forthcoming Jan. 2012; available via Google Scholar at <http:// polisci2.ucsd.edu/vkogan/research/redistricting.pdf>, pp. 32–33 [as of Jan. 27, 2012] ).)
I think the Court should be citing the University of California San Diego, which is the actual source of the document cited at the end of the quote. Interestingly enough, the Court tipped its research strategy in finding the document, a forthcoming publication no less.
The Cornell Legal Information Institute gets one state hit in American Home Assur. Co., Inc. v. Unauthorized Practice of Law Committee, 121 S.W.3d 831, 2 A.L.R.6th 783, Tex.App.-Eastland, November 06, 2003 (NO. 11-02-00212-CV), and seven in the federal courts. The Oyez Project gets a reference in a federal case quoting Chief Justice Roberts’ reaction to a particular statute. The reference is to an oral argument. The citing case is Evans & Green, LLP v. Meadoworks, LLC, Slip Copy, 2012 WL 137885, W.D.Mo., January 17, 2012.
FindLaw gets a whopping 23 references, though some of those stem from litigation involving FindLaw itself, and a few more relate to resources accessible to prisoners as part of access to a prison law library. The news feed at Leagle.com gets at least one mention in a 2010 Michigan case; otherwise, the name “Leagle” shows up either as a personal name or as a misspelling. Justia gets five hits, though a few are in reference to litigation involving the site. As for government sources, GPO Access gets two citations, while the more current FDsys.gov has nothing yet. Regulations.gov gets five hits.
The point for me is that these sites have enough respectability to be cited by the courts, which means they have enough respectability to be used and cited by others. We teach students to use the original source, and court rules tend to enforce that concept. But with courts starting to turn to legal content on the free Internet, we can’t discount some of these Internet legal sites, at least as a matter of reputation. [MG]