February 23, 2010
Are Law Schools Gaming Their Law Review Circulation Data?
Yup, law review paid subscriptions continue their death spiral, but there's no need to update the charts published last year at Twenty-Five Year Decline in Law Review Subscriptions. What's interesting in George Mason law prof Ross E. Davies' second annual study is the possibility that reported law review circulation data is being gamed. He writes, "we noticed something that might be worth thinking about: the possibility that the law school combover culture has infected law reviews."
Case in point: the questions surrounding Harvard Law Review's reported circulation data. Davies writes, "context might help: it suggests that the HLR’s circulation is whatever the HLR can convince you it is." From his Law Review Circulation 2009: The Combover article [SSRN]:
The HLR, like all law reviews, operates within a larger world driven in substantial part by USNews rankings and related creatures. It is a world in which some law school leaders — that is, the people in charge of teaching law review editors and other students about the law, its practice, and its values — are committed to being in the elite, to being highly ranked, even if that means also being not fully forthright about the numbers on which rankings are based. Perhaps law review editors internalize that kind of commitment, if not from their own schools, then perhaps from the law school world at large. Perhaps the propriety of fudging your way toward first place in the law is being simultaneously booted out the front door via lectures in Professional Responsibility classes and welcomed in at the back gate via role-modeling in law school administration and media coverage of it.
Oh great, now those law schools that hadn't thought of this stunt might do so.
Why Subscribe? There's little need to subscribe to or "circulate" law reviews. Who reads what they publish? Law profs? Some, perhaps, but if they do it is more likely by way of SSRN or bePress. Federal court clerks, perhaps, for "see generally" references. Certainly not the typical state or federal practitioner. They turn to ABA Section journals and state bar association titles because that's where they will find doctrinal analysis and practical information. Commenting on NLJ's article, "Study Finds Sharp Decline in Law Review Circulation," Mark Giangrande wrote on LLB yesterday, "[t]he article noted the focus of law reviews has shifted to the academic audience, saying 'it takes longer for ideas they present to find their way into real-world legal practice.' I'll say." I'll add, "if ever."
Certainly there is nothing wrong with speculative thought, but hopefully someday the peer review model will be the norm in the legal academy. Until then, members of every other discipline on university campuses shake their collective heads in disbelief at how law profs "earn" tenure by way of their publishing track records. "Publish or perish," whether right or wrong, is absurdly easy for law profs when they have some 200 student-edited law reviews and journals to submit their work to. And still, they whine about it!
The Greening of Law Reviews. A lot of trees would be saved if the Durham Statement on Open Access to Legal Scholarship were implemented nationwide. In fact, it might stop this latest legal academy combover Ross Davies calls attention to. Of course, Durham Statement proponents will have to persuade law profs that SSRN download mouse clicks mean absolutely nothing. That may be an insurmountable obstacle to instant ego gratification. If not cited, at least downloaded, right? [JH]
October 06, 2009
Journal Cost-Effective Index for Legal Periodicals
Ted Bergstrom and Preston McAfee have created the Journal Cost-Effective Index (2009 Beta). From the description:
This website represents our best attempt to compute the price per article and price per citation. Currently we use the ISI data for citations through 2007 and 2009 prices, which are the most recent data available to us. Not all journals report information the same way, and errors are possible. When reported to us, we correct errors. Moreover, prices per unit for journals that have recently expanded are underestimated. The coloration (red for very low value, yellow for low value, and green for good value) is computed by comparing the composite price index to the median for non-profit journals in the same subject. Be advised that price per citation, price per article and the composite index are not perfect measures of value. Neither of us are experts in most of the fields represented, and others may reasonably, or unreasonably, disagree with the value assessment. We have mapped a large set of journal categories into 17 areas. Here is the full mapping. We have updated the non-profit status of nearly 100 journals by writing to editors.
If you select "law" and click on the search link, the results will display the above information for 94 academic law review, commercial, professional association and peer review titles covered in the database. Of course, academic law review titles end up being ranked as "good values" because they are so cheap, while commercial law journals are ranked as "very low values" because they are much more expensive. Where does LLJ stand? Its price per article is 3.09, its price per citation is 7.09, and it is rated a "good value."
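The per-unit measures the index reports reduce to simple division. Here is a minimal sketch; the journal figures, the composite formula, and the subject median are all hypothetical for illustration (the excerpt does not publish the exact composite formula), not data from the actual database.

```python
# Rough sketch of the Journal Cost-Effective Index's per-unit measures.
# All numbers and the composite formula below are assumptions.

def price_per_article(subscription_price, articles_per_year):
    return subscription_price / articles_per_year

def price_per_citation(subscription_price, citations):
    return subscription_price / citations

# Hypothetical journal: $200/year subscription, 65 articles, 28 ISI citations.
ppa = price_per_article(200.0, 65)
ppc = price_per_citation(200.0, 28)

# The site colors a journal by comparing its composite index to the median
# for non-profit journals in the same subject; here the median and the
# composite (a simple average of the two per-unit prices) are assumed.
median_composite = 10.0
composite = (ppa + ppc) / 2
label = "good value" if composite < median_composite else "low value"
print(f"price/article={ppa:.2f}  price/citation={ppc:.2f}  {label}")
```

This also makes the point below concrete: cheap student-edited titles score well on these ratios almost automatically.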
For legal journals, the database output is a bit silly. Remember, the database uses ISI citation data only, and lumping apples and oranges together doesn't work very well. Authors in other disciplines do not benefit from the availability of so many low-subscription-cost student-edited titles. The database is probably more useful in other fields, e.g., physics, where subscription costs for for-profit and non-profit titles are much more comparable to each other.
A bit of fun but not particularly useful for legal journals. [JH]
August 05, 2009
Top 10 Speedy Law Prof Blog Readership Ranking (Why Does It Take Readers of This Blog So Long to Read a Post?)
We all know that number-crunchers like tax professionals are brainiacs because they work with math-'ma-tiks and complicated operations like addition, subtraction, multiplication, division and the like. Mathematical brainiacs can do magical things with numbers. My tax accountant, for example, reminds me that he can make one plus one equal any number his client wants. (I remind him that I don't want to get audited.) Bean-counters, like some tax professors, can also become so obsessed with "numerical realism" that they fail to see hidden truths within the numbers they hold close to their hearts. Take for example, the absurd metrics of blog visits and page views. Over on TaxProf Blog, you will find from time to time some sort of mouse click ranking of law prof blogs by these metrics compiled by Paul Caron (Cincinnati), including, most recently, his July 15, 2009 Top 35 Law Prof Blogs for the 12-month period ending June 30, 2009. Note what one commentator has to say: "The higher-ranked blogs on this list have almost nothing to do with law. They're merely lumped into the "law blogs" category to boost the supposed visitors' stats to all law blogs. Caron does this regularly, it's quite a fraud." Hey now, numbers don't lie but status-obsessed people can make whatever they want to out of mouse click counts. They can even include in their rankings traffic stats for law prof blogs that have ceased publication.
Caron's latest law prof blog ranking prompted several law prof responses to this transparent obsession with "numerical realism" in the practice of blog metrics. In an aptly tagged "Navel-Gazing" post, Since you really wanted to know which blogs by law professors had the most traffic, Brian Leiter (Chicago) wrote, "[Caron] omits the key statistic: average visit length! Could it be because mine is generally one minute thirty seconds, and his is only thirty seconds? Eat your heart out, Paul!" Following up on Leiter's observation a week later, Roger Alford (Pepperdine) commented on Opinio Juris, "I bet you have no idea which [Caron-ranked] law blogs are the best read, that is, the ones that have 'sticky' readership." Indeed we don't.
"If you take Paul Caron’s Top Law Prof Blog rankings and rank the blogs based on the 'average visit length' rather than based on traffic, you get noticeably different results," writes Alford, and that's what he did in his July 22 post. Alford's sticky-readership ranking was duly noted by Caron because it was based on his scholarly work product, but without comment on TaxProf Blog being ranked 5th in page views and 5th in visits yet 27th out of 35 law prof blogs by average visit duration. Perhaps Caron kept quiet because he knew that Alford's reported visit duration for his blog was twice as long as it usually is, and he did not want to lend credibility to the abnormally high ranking for TaxProf Blog. Alford reported it at 71 seconds, but Leiter had it right; TaxProf Blog's average visit duration is usually in the 30-to-40-second range. But for the vulgarity of Sitemeter's snapshot stats, TaxProf Blog would have dropped down to 30th place in Alford's sticky-readership ranking, one position higher than the Law Professor Blogs Network's CrimProf Blog, a blog that ceased publishing posts on May 6, 2009, ten weeks before Alford compiled his data.
One would expect a short visit duration for a dead blog because no fresh content is being published, but should dead blogs be included in any ranking study? Alford included CrimProf Blog because Caron listed it in his ranking. CrimProf Blog ranked 25th in visitors and 29th in page views for the 12 months ending June 30, 2009. Because Caron is co-founder of the Law Professor Blogs Network, his comments about Network blogs require clarification: "Members of our Law Professor Blogs Network comprise, by visitors, two of the Top 10, four of the Top 20, and ~~ten~~ one dead and nine live blogs of the Top 35 [live and dead] blogs; and by page views, two of the Top 10, four of the Top 20, and ~~ten~~ one dead and nine live blogs of the Top 35 [live and dead] blogs."
Average visit duration, however, doesn't tell the whole story because blog visitors usually view more than one page per visit. Divide average visit duration by average page views per visit and you can rank blogs by duration per viewed page. I don't know how Caron missed this opportunity to craft another blog ranking, but had he done so he would have discovered that readers of his blog are some of the speediest speed readers in the law prof blogosphere.
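The arithmetic behind the metric can be sketched in a few lines. The blog names and Sitemeter-style numbers here are made up for illustration, not the actual August 1, 2009 snapshot data used in this post.

```python
# Minimal sketch of the "duration per viewed page" ranking described above.
# Input numbers are hypothetical, not real Sitemeter data.

def seconds_per_page(avg_visit_seconds, avg_pages_per_visit):
    """Average time a reader spends on each viewed page."""
    return avg_visit_seconds / avg_pages_per_visit

snapshot = {
    # blog name: (average visit duration in seconds, average pages per visit)
    "Blog A": (95.0, 3.0),
    "Blog B": (60.0, 1.5),
    "Blog C": (45.0, 1.2),
}

# Speediest readers (lowest seconds per viewed page) first.
ranking = sorted(
    ((name, seconds_per_page(dur, pages)) for name, (dur, pages) in snapshot.items()),
    key=lambda item: item[1],
)

for rank, (name, spp) in enumerate(ranking, start=1):
    print(f"{rank}. {name}: {spp:.2f} seconds per viewed page")
```

Note how the ordering differs from a raw visit-duration ranking: a blog whose visitors stay longer but click through more pages can still have the fastest readers per page.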
I haven't re-ranked all 35 listed law prof blogs in Caron's ranking by this speedy readership metric to prove this point because, well, I may be bored right now but I'm not that bored. I have ranked some blogs in the Law Professor Blogs Network listed there and added this blog which is not listed because LLB is not a "law prof" blog in Caron's world even though many of LLB's editors and contributing editors have faculty status, graduated from law school, and teach courses in law schools just like, well, "law profs."
This is probably the first ever speedy law prof blog reader ranking so it must be an important "navel-gazing" contribution to blog metrics. Because this is very serious business, a note on data collection. My selection criterion was Network blogs with at least 250,000 total page views written by editors who take blogging seriously enough to post regularly by which I mean daily or damn near close to daily Monday through Friday. There are 10 such live Network blogs, hence we have an all-important "Top 10" ranking. Snapshot data for average visit duration and average page views per visit was taken on August 1. I've included the obligatory but asinine raw mouse click counts for each blog's total number of visits and total page views as of July 31, 2009.
So here's the product of waiting for my long-suffering wife to give me my list of weekend errands, The Top 10 Speedy Law Prof Blog Readership Ranking:
As you can see from the above table, the average duration per viewed page ranges from 88.57 seconds for the 10th-ranked White Collar Crime Prof Blog to 31.67 seconds for the top-ranked TaxProf Blog. Now I do not want to imply that readers of criminal law blogs are slow readers, though the 84.71-second average duration per viewed page for the 9th-ranked Sentencing Law & Policy might support the conclusion that they have to sound out the words they are reading. Note also that readers of these highly visited criminal law prof blogs take between 2.7 and 2.8 times as long to read a post as TaxProf Blog's readers. At just over half a minute, TaxProf Blog's top-ranked duration per viewed page clearly indicates that readers of TaxProf Blog are really, really fast readers and may belong to an elite group of uber speed readers in the law prof blogosphere.
Now one might say that not all pages are created equal because "pages" include blog homepages that display more than one post, but all of the listed blogs are similarly configured. And one might say that not all ranked blogs publish posts of similar length, but the variety of post lengths published in the above blogs is close enough for this blog post about blog info-antics, not metrics. Finally, one might say that while the ranked blogs publish posts daily, they do not all publish a similar number of posts daily, and that might be some sort of intervening variable that makes the use of snapshot averages questionable. Ah ... not being a mathematical brainiac, and because my wife, who is one, happens to be napping right now, I'll move on by recognizing as factually accurate that some of the listed blogs do not publish more than 1 or 2 posts per weekday, except for the editors of TaxProf Blog and Law Librarian Blog, who spit out posts as though the Internet might cease to exist tomorrow. So if it boils down to a speedy readership ranking of TaxProf Blog and Law Librarian Blog, readers of this blog come in last place because they take twice as long as TaxProf Blog readers to read a post. How can we let the tax professionals who read TaxProf Blog beat us this badly; aren't we librarians supposed to be readers by profession?
Oh wait, taking longer to read something might actually be a good thing by saying something positive about content. [JH]
July 07, 2009
How to Fend Off the Summer Doldrums, Count Mouse Clicks
Mouse click rankings for SSRN downloads covering US News law schools ranked from 23 to 100, compiled by Bridget Crawford, can be found on Feminist Law Professors (why start with 23?), and "since it's the boring dog days of summer, and inspired by Professor Crawford," Brian Leiter tabulated the US News top 15 law schools according to gross SSRN downloads. Who's bored enough to fill in the gap between the 15th and 23rd ranked schools?
On the absurdity of all this, an anonymous commenter to one of Morriss's Volokh Conspiracy posts on rankings years ago wrote:
Downloads just aren't the same as actual citations: Setting aside whether even citations mean anything important about scholarship, a citation, at the very least, means that another scholar respected an article enough to reference it in his or her own work. Downloads don't mean the same thing. An abstract could sell a paper as having an entirely new schema for analyzing an area of law, then the paper could turn out to be garbage. But, SSRN would already reflect the paper as influential. Heck, the attached paper could be a blank .PDF with a catchy title, and it would do well.
April 08, 2009
Jonesing for Law School Rankings?
Experiencing law school rankings withdrawal because US News' traditional March publication date has been pushed back to April 23 this year? Well Avvo has added its own law school rankings to the genre. The site ranks law schools that produce the top "Avvo" rated attorneys. Here's the top 5:
- Yale Law School
- Stanford Law School
- Harvard Law School
- Columbia Law School
- New York University
Click here to browse the law school directory. [JH]
March 24, 2009
Profiling Social Networks by Analyzing User Tagging Behavior
Can an analysis of the tagging data of users serve as a useful metric to profile social networks? According to Ying Ding, Elin K. Jacob, James Caverlee, Michael Fried and Zhixiong Zhang, the answer is a qualified "yes." In Profiling Social Networks: A Social Tagging Perspective, D-Lib Magazine (March/April 2009), the authors report on their investigation of social tagging using data gathered from Delicious, Flickr and YouTube for the years 2005, 2006 and 2007. Preliminary findings indicate that it is possible to profile a social network through the analysis of tagging data. Interestingly, the authors find that Delicious is a more representative venue for analyzing the social tagging behavior of users than either Flickr or YouTube. Are Delicious users more tech savvy? [JH]
February 11, 2009
Top 50 Cited Authors in HeinOnline's Law Journal Library
HeinOnline has compiled a list of the 50 most frequently cited authors in its Law Journal Library using Hein's new citation analysis tool, ScholarCheck. Here's the top 10:
- Posner, Richard A. cited 12,586 times in 251 articles.
- Sunstein, Cass R. cited 11,521 times in 267 articles.
- Epstein, Richard A. cited 6,194 times in 272 articles.
- Easterbrook, Frank H. cited 6,018 times in 84 articles.
- Prosser, William L. cited 5,585 times in 55 articles.
- Coffee, John C. Jr. cited 5,196 times in 68 articles.
- Delgado, Richard cited 5,165 times in 145 articles.
- Eskridge, William N. Jr. cited 5,029 times in 69 articles.
- Pound, Roscoe cited 4,869 times in 284 articles.
- Fischel, Daniel R. cited 4,703 times in 43 articles.
January 27, 2009
Avvo Blog's Real-Time Law Blog Ranking by Traffic
The Avvo Blog has created an auto-updating list of the top 300 legal blogs, ordered by their traffic rankings using Alexa data. The creators write "because of the way Alexa works, we could not include every blog we would have liked to (no subdomains or folders on non-blog sites), but we are working to change that." That's why our Law Professor Blogs Network comes in fifth place. It's not a blog. It's the domain for some 50 blogs organized under individual subdomains, including LLB. Maybe the matter should have been worked out before going live with the ranking.
Here's the top 5:
- Above the Law
- The Volokh Conspiracy
- TalkLeft
- Althouse
- Law Professor Blogs Network
January 05, 2009
John Doyle's Law Journal Rankings
John Doyle (Washington and Lee Law School Library) has updated his law journal rankings database through 2008. Here are three of the rankings for US law journals. Methodology and complete listing here. [JH]
1 Harvard Law Review 100
2 The Yale Law Journal 89.6
3 Columbia Law Review 82.4
4 Stanford Law Review 74.7
5 New York University Law Review 68.2
6 California Law Review 66.6
6 University of Pennsylvania Law Review 66.6
8 The Georgetown Law Journal 64.4
9 Virginia Law Review 63.5
10 Cornell Law Review 61.6

1 The Yale Law Journal 2.9
2 Columbia Law Review 2.75
3 New York University Law Review 2.7
4 Supreme Court Review 2.68
5 Harvard Journal of Law & Technology 2.59
6 Stanford Law Review 2.55
7 Cornell Law Review 2.54
8 Virginia Law Review 2.52
9 University of Pennsylvania Law Review 2.5
10 California Law Review 2.49
10 The Georgetown Law Journal 2.49

1 The Georgetown Law Journal 5
2 The Yale Law Journal 4.32
3 Harvard Law Review 4.29
4 Cornell Law Review 4.09
5 New York University Law Review 3.89
6 Columbia Law Review 3.81
7 California Law Review 3.8
8 Virginia Law Review 3.63
9 University of Pennsylvania Law Review 3.58
10 Harvard Civil Rights-Civil Liberties Law Review 3.46
December 01, 2008
Can Your Blog Give Your Boss Clues to What Projects Should Be Assigned to You?
The Myers-Briggs Type Indicator assessment is a personality test employers use as a means to sort out the workforce into productive assignments best suited to individual strengths and weaknesses. It has also been used in modified form to select juries. And now it is the "basis" for analyzing bloggers.
Typealyzer is having its 15 nanoseconds of blogosphere fame. Go to the site, plug in your blog's URL, and out pops one of Myers-Briggs' 16 personality types. This blog and every law-related blog I checked except one are the INTJ (introverted, intuitive, thinking and judging) type. Interestingly, Slaw.ca is not. Slaw.ca is the ISTJ (introverted, sensing, thinking and judging) type.
The INTJ type represents 2% of the US population. Folks of this type "have original minds and great drive for implementing their ideas and achieving their goals. Quickly see patterns in external events and develop long-range explanatory perspectives. When committed, organize a job and carry it through. Skeptical and independent, have high standards of competence and performance – for themselves and others."
The ISTJ type represents 11% of the US population. Folks of this type are "quiet, serious, earn success by thoroughness and dependability. Practical, matter-of-fact, realistic, and responsible. Decide logically what should be done and work toward it steadily, regardless of distractions. Take pleasure in making everything orderly and organized – their work, their home, their life. Value traditions and loyalty."
Typealyzer is all the rage for the moment, but since the protocol for the Myers-Briggs assessment involves answering a series of forced-choice questions, I seriously doubt the results; "what is your blog's URL?" is not one of the Myers-Briggs questions. [JH]
October 10, 2008
Will SSRN Ever Crowdsource Its Digital Depository Like Amazon?
Have you noticed that SSRN has some really exciting new features like simple submissions, a "more intuitive, user-friendly" sign-in "modeled after commonly used websites and forms," and redesigned author and abstract pages? You can also export SSRN paper cites into common bib apps like EndNote and share papers through Digg, Del.icio.us, etc. In other words, no-brainer ~~improvements~~ add-ons, certainly nothing to write home about.
CiteReader, an Infometric Tool. I have been playing around with the Company's CiteReader, a work in progress but useful should you get curious about the download count noise law schools and their profs like to generate. Take a look at the numbers to put some meat on bony claims based on clicking a mouse button: CiteReader tracks (1) the number of abstract views, (2) the number of downloads of SSRN papers, and (3) the number of citations of those papers in the SSRN database, by law school, by prof, and by article. Oops!
Amazon-esque Features. The Company has added one useful feature. On a paper's Abstract Page you will find a "People who downloaded this paper also downloaded" display listing other SSRN papers. Full of hubris, the Company writes:
We love this feature, and many users have actually told us they “found” papers relevant to their research through this list. ... As you can see, it is similar to other features on popular sites like Amazon.
So when will SSRN implement a commenting function to Abstract Pages like Amazon does? It might help abstract viewers make more informed mouse click decisions.
Distributing Works-in-Progress, Why? On a more serious note, online crowdsourcing might even contribute to improving the quality of legal scholarship. Any SSRN user can see that authors upload revised versions of their papers to SSRN. Perhaps the papers were revised because some downloaders actually read them and emailed the authors comments. But wouldn't crowdsourcing these works-in-progress by online commenting be a plus too? Wouldn't it be a more timely and contemporary way to do this? Contemporary as in using social media, as in SSRN not even having to invent the necessary applications to do this ... [sigh of relief] ... (SSRN isn't exactly known for being a design and implementation coding powerhouse, as evidenced once again by the new features listed above).
Remember the bad old days when scholars tried to obtain input from other scholars by circulating drafts in the US mail, hoping recipients would take the time to write back a letter with comments and criticism? In library school we learned about this "informal college" of review and its drawbacks -- unknown writers not getting responses from known experts in the field, and useful suggestions and corrections arriving too late to be incorporated before the article had to be published.
So one has to ask: what is the point of e-distributing works-in-progress on SSRN without crowdsourcing via something called the "World Wide Web"? To date it appears to be the narcissistic self-stroking of law prof egos based on absurd download counts. If their articles aren't being cited, at least they are being ~~read~~ downloaded, right? Or from an infometric perspective, at least SSRN's CiteReader provides a psychiatric fix for the academic delusions produced by drinking that info-antic Kool-Aid. [JH]
September 15, 2008
Most Productive Not-Top 50 Law School Faculties
Roger Williams University School of Law has produced a scholarly productivity study for law schools ranking outside the US News Top 50 using articles published in "top journals" over a 15-year period, 1993-2008.
On a per capita basis, the top five law schools are:
- University of San Diego: US News Rank, 82; Peer Review Score, 2.8
- Yeshiva University (Cardozo): 55; 2.7
- Florida State University: 55; 2.8
- University of Richmond: 68; 2.4
- University of Pittsburgh: 73; 2.8
NB: We're talking about articles published, not articles cited. A citation analysis correlated to US News peer review scores for the "not-Top 50" law school faculties would be much more interesting in my humble opinion. Hat tip to Brian Leiter (Chicago), Law School Reports. [JH]
February 20, 2008
Top 30 Law Prof Blogs: Harmless if not Very Meaningful Competition
Paul Caron (Cincinnati) charts traffic statistics of the "leading blogs" edited by law professors who use SiteMeter to record blog traffic. The selection of blogs is drawn from Dan Solove's "comprehensive" Law Professor Blogger Census, which of course isn't comprehensive (nor claimed by Solove to be) and wouldn't stand up to the population identification rigors needed for a real infometric analysis. But Caron's ranking, harmless if not very meaningful, is good enough for a blog post and a round of back slapping in law school faculty lounges across the country.
Or to put it another way:
Who cares [about law prof blog traffic rankings], you say? Blog Emperor Caron, of course! Curious that four of the top five have almost nothing to do with law; four of the top five are right-wing blogs; and three of the top five have almost no intellectual content. Welcome to the blogosphere! -- Posted by Brian Leiter (Texas)
Indeed, welcome to the academic legal blogosphere law prof style, complete with pathological myopia. Caron's ranking is, shall we say, a tad elitist. It intentionally excludes academic law library/law librarian blogs. He notes, for example, that Law Librarian Blog is not edited by law profs. Apparently my co-editor's credentials as an AAUP-represented tenured faculty member aren't good enough; neither are yours if similarly situated. The academic legal blogosphere must be a very exclusive club; membership, alas, does not require publishing intellectual content.
Of course the information for an inclusive traffic-based "ranking" is readily available thanks to Bonnie Shucha's excellent directory which was updated earlier this month and now lists 140 law library/law librarian blogs, many of which do reside in the academic legal blogosphere.
The giggle factor for info antics about the legal blogosphere is pretty high among law librarians and some, hopefully most, law profs. I wonder if LIS profs use these all too common posts as examples of what infometrics is not. Bottom line: take them for what they are worth -- law blog trivial pursuit.
Kudos to Dennis Kennedy for recognizing the contributions law library/law librarian bloggers make to the legal blogosphere by repeatedly awarding them his annual Blawggie Award for "Best Legal Blog Category".
"I have to be one of the biggest fans of law librarian blogs there is. I learn so much from these blogs and they get named for this award [Best Legal Blog Category] every year. As I said before, 'across the board, these blogs have developed into strong information resources, often with links to primary source information that I'm not sure how I would find otherwise.'" -- Dennis Kennedy
[quoted in our coverage of Kennedy's 2007 Blawggies]
Footnote: Law Librarian Blog would have come in 19th place by visitors and page views for the period covered in Caron's blog ranking. Other academic law library/law librarian blogs may have placed higher if they had been included but I'm sure Caron has other ways to show his appreciation for the contributions academic law librarians make. [JH]
February 13, 2008
Professional Reading: Review of Metric Services for Digital Libraries and Repositories
Chris Armbruster (Research Associate, Max Planck Digital Library, and Executive Director, Research Network 1989) has posted Access, Usage and Citation Metrics: What Function for Digital Libraries and Repositories in Research Evaluation? on SSRN. Here's the abstract for this very interesting and helpful review article:
The growth and increasing complexity of global science poses a grand challenge to scientists: How to organise the worldwide evaluation of research programmes and peers? For the 21st century we need not just information on science, but also meta-level scientific information that is delivered to the digital workbench of every researcher. Access, usage and citation metrics will be one major information service that researchers will need on an everyday basis to handle the complexity of science.
Scientometrics has been built on centralised commercial databases of high functionality but restricted scope, mainly providing information that may be used for research assessment. Enter digital libraries and repositories: Can they collect reliable metadata at source, ensure universal metric coverage and defray costs?
This systematic appraisal of the future role of digital libraries and repositories for metric research evaluation proceeds by investigating the practical inadequacies of current metric evaluation before defining the scope for libraries and repositories as new players. Subsequently the notion of metrics as research information services is developed. Finally, the future relationship between a) libraries and repositories and b) metrics databases, commercial or non-commercial, is addressed.
Services reviewed include: Leiden Ranking, Webometrics Ranking of World Universities, COUNTER, MESUR, Harzing POP, CiteSeer, Citebase, RePEc LogEc and CitEc, Scopus, Web of Science and Google Scholar.
February 05, 2008
Some Evidence for the Assimilation of Blogs into the Structure of Legal Literature
Why Cite to Blogs? There are a number of reasons to cite to blogs. Ones usually identified are factual assertions, crediting/criticizing ideas, and using a blog post as supporting authority, but one largely overlooked reason is the role blogs play as informal repositories of downloadable documents.
Recently Balkinization's Jack Balkin and The Volokh Conspiracy's Orin Kerr observed that the law review citation rate for their blogs has increased significantly on an annual basis since 2004. This shouldn't come as much of a surprise. Blogs, as Jack Balkin writes, "are being assimilated into the larger universe of legal writing and becoming part of the web of [legal] citations."
I took a quick look at annual blog citation rates for 2004-2007 recently and found similar increases. Using LexisNexis, instead of Westlaw (hat tip to PrawfsBlawg's Dan Markel who was wondering why no one seems to use Lexis-Nexis for citation counts anymore), I searched US Law Reviews & Journals Combined and US Federal & State Cases Combined by the domain name of three common blog service providers (blogspot.com, typepad.com, and wordpress.com) to estimate citation rates.
At the outset, I should emphasize that this search method underestimates the number of blog citations, possibly by a wide margin. For example, The Volokh Conspiracy was not captured because its URL, http://volokh.com/, does not identify a blog hosting service, while Balkinization was captured because its URL, http://balkin.blogspot.com/, does include one of the blog hosting services I searched. Additional understatement results from the fact that this is a document count, not a pure citation count. (Any single counted document may cite two or more blogs, or more than one post from the same blog.)
This built-in undercount does not detract from the fact that the statistics below give a sense of the magnitude of the growth rate of blog citations. According to this estimate, blog citations in law reviews and court opinions have grown from about 70 in 2004 to over 500 in 2007 (and still counting, since many law reviews have not completed their 2007 publishing cycle). I believe it is fair to say that for 2005 and 2006 blog citations probably grew exponentially on a document count basis, doubling each year.
It is unlikely, however, that any final count for 2007 will show a similar rate of growth. If so, would this mean that blogs are "on the decline"? Doubtful. It would simply mean that the blogging phenomenon is maturing. As with other forms of publication, with age comes acceptance and recognition of place within the structure of legal literature.
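The growth pattern sketched above can be checked with a quick back-of-the-envelope calculation. Note the assumptions: only the 2004 figure (about 70 documents) and the 2007 figure (over 500) come from the search results; the 2005 and 2006 counts below are hypothetical interpolations consistent with the "doubling each year" observation.

```python
# Approximate annual document counts for blog citations in law reviews
# and court opinions. 2004 and 2007 are from the LexisNexis searches
# described above; 2005 and 2006 are assumed values for illustration.
doc_counts = {2004: 70, 2005: 140, 2006: 280, 2007: 500}

# Year-over-year growth factors: roughly 2x through 2006, then slowing.
years = sorted(doc_counts)
for prev, curr in zip(years, years[1:]):
    growth = doc_counts[curr] / doc_counts[prev]
    print(f"{prev} -> {curr}: {growth:.2f}x")
```

Even with the 2007 count still incomplete, the implied 2006-to-2007 growth factor falls below the doubling seen in earlier years, which is the "maturing" pattern suggested above.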
Click to enlarge. [JH]
January 17, 2008
Be True to Your School
Check out the ten law schools that graduated the most law teachers on Brian Leiter's Law School Reports. The data was obtained from 7,820 tenured and tenure-track law professors listed in the AALS Directory of Law Teachers.
The data was provided to Leiter by a Michigan Assistant Dean in response to Leiter's recent Ludicrous Hyperbole Watch: University of Michigan Law School post. Leiter (JD '87 and Ph.D in philosophy '95 from Michigan) writes, "The data [as supplied] was presented in aggregate form, which works to the advantage of larger schools like Michigan." Ah yes, crunch the numbers to make your school look good. Leiter reworked the data for a more appropriate per capita ranking.
Beware the thin-skinned law school administrator. [JH]
November 30, 2007
The Monopoly Board of Citation Rankings
Check out Roger Alford's The Monopoly Board of Citation Rankings. Using Brian Leiter's Most Cited Law Professors by Specialty, 2000-2007, Alford assigns each of the 18 specialties covered in Leiter’s 7-year citation study a real estate value based on the 10th most-cited person in each specialty. It's a much more clever illustration than my more mundane citation density graph. [JH]
November 15, 2007
Just Released: Leiter's Most Cited Law Professors by Specialty, 2000-2007
Brian Leiter has published his study entitled Most Cited Law Professors by Specialty, 2000-2007 at Brian Leiter's Law School Rankings. Here's the announcement on his blog. The study identifies the 10 most cited faculty (in many instances the 20 most cited faculty) in 18 legal specialties. The image (left, click to enlarge) displays the citation density for the top 10 most cited faculty in each specialty. Note how compact Wills, Trusts and Estates, Tax, Criminal Law and Procedure, and several other specialties are.
Leiter's study also ranks the top 15 law schools by percentage of faculty in the Top 10 or Top 20 of their specialties. Not surprisingly, Yale ranks first.
If one wishes to estimate a law school's scholarly reputation in a specialty by the number of faculty members listed in Leiter's citation analysis of that specialty then Yale stands out with 4 faculty members in the top 10 for Constitutional and Public Law and 3 in the top 8 for Legal History; Columbia with 3 in the top 4 for Business Law; Harvard with 2 in the top 3 and Georgetown with 3 in the top 10 for Critical Theories; UC-Berkeley with 2 in the top 3 for IP/Cyberlaw; and the University of Chicago with 3 in the top 4 for Law & Economics.
Top Guns. Some top ranked law professors stand out by how many more citations their work has garnered over the next most cited scholar in their specialty. For example:
- Constitutional and Public Law: Cass Sunstein's 6,180 citations (first in the field) compared to Laurence Tribe's 3,520 (second);
- Law & Economics: Richard Epstein, first with 3,390 citations, Eric Posner, second with 2,020 (but Epstein has had a 22-year head start);
- Law & Philosophy: Ronald Dworkin, 3,070 followed by Martha Nussbaum, 1,130; and
- Legal Ethics/Legal Profession: Deborah Rhode with 3,180, Geoffrey Hazard, Jr., 1,140.
A Quick Look at Demographics for the Top 10 Most Cited Law Professors by Specialty
|Click on the image for a larger display|
Young Guns. As observed by Leiter, the lists are dominated by faculty in their 50s and 60s but I was surprised to find that the "young guns" are fairly well represented with 16% of the listed Top 10 most cited faculty being under the age of 50, including the following scholars listed in the Top 5 of their specialty:
- Criminal Law and Procedure: Dan Kahan (Yale) age 44 (ranked 1st), William Stuntz (Harvard) 49 (4th)
- Environmental Law: Richard Revesz (New York) 49 (3rd)
- International Law: Jack Goldsmith (Harvard) 45 (1st), Curtis Bradley (Duke) 43 (4th) and Sean Murphy (George Washington) 47 (5th)
- IP/Cyber Law: Mark Lemley (Stanford) 41 (1st), Robert Merges (UC-Berkeley) 48 (2nd), Dan Burk (Minnesota) 45 (tied for 5th)
- Law & Economics: Eric Posner (Chicago) 42 (2nd), Ian Ayres (Yale) 48 (3rd)
- Law & Social Science: Lee Epstein (Northwestern) 49 (2nd), Jeffrey Rachlinski (Cornell) 41 (5th)
- Tax: Edward McCaffery (USC) 49 (3rd)
Two thirty-somethings made the most cited lists: Robert Sitkoff (Harvard) who at the age of 33 is the seventh most cited scholar in Wills, Trusts & Estates, and Orin Kerr (George Washington) age 36, the 20th most cited scholar in Criminal Law & Procedure. Not bad for a citation study that goes back to 2000.
Gender. Sixteen percent of the Top 10 most cited faculty are women, including the following scholars listed in the Top 5 of their specialty:
- Civil Procedure: Judith Resnik (Yale) (ranked 2nd), Deborah Hensler (Stanford) (4th)
- Critical Theories: Martha Minow (Harvard) (1st), Catharine MacKinnon (Michigan) (5th)
- Environmental Law: Carol Rose (Arizona) (2nd)
- Evidence: Margaret Berger (Brooklyn) (2nd)
- IP/Cyber Law: Pamela Samuelson (Berkeley) (3rd), Jessica Litman (Michigan) (4th), Jane Ginsburg (Columbia) (tied for 5th)
- Labor & Employment: Katherine van Wezel Stone (UCLA) (4th), Cynthia Estlund (New York) (5th)
- Law & Philosophy: Martha Nussbaum (Chicago) (2nd)
- Law & Social Science: Deborah Merritt (OSU) (3rd)
- Legal Ethics/Legal Profession: Deborah Rhode (Stanford) (1st)
- Legal History: Reva Siegel (Yale) (3rd)
September 14, 2007
Size Still Matters in Latest Per Capita Faculty Productivity Study
Professor Michael Yelnosky, Associate Dean for Academic Affairs at Roger Williams has released this draft "productivity" study of Tier 3 and Tier 4 law schools (Criteria used in study). The Yelnosky study applies features of the method used by Texas law prof Brian Leiter in an earlier 2000-2002 study (Criteria used in study) with some modifications. For example, unlike the Leiter study, the Yelnosky study examines the Top 50 journals, defined as the general law reviews published by the 54 schools receiving the highest peer assessment scores in the U.S. News Rankings (2.8 or higher).
Readers of this blog will note that I respect Leiter's law school rankings studies for their careful employment of objective metrics and equally careful identification and examination of the shortcomings of ranking studies, including his own. See, for example, Top 35 Law Faculties Based on Scholarly Impact for 2007 (one of "those rare citation studies coming out of the legal academy that exemplifies the precision of infometrics, not the carelessness of 'info antics.'").
One metric in Leiter's 2000-2002 study that is used in Yelnosky's study, however, is not one of Leiter's best efforts. Leiter applied a points system to the length of articles (0 points for articles under 6 pages; 1 point for articles 6-20 pages; 2 points for articles 21-50 pages; 3 points for articles exceeding 50 pages) and addressed the self-publishing aspect of articles published in-house by halving their point values. Objective? Yes, but the "so what" factor looms large because of the arbitrary nature of the page cutoffs for earning points. For an earlier study using the same methodology for per capita faculty productivity, see Brian Leiter, Measuring the Academic Distinction of Law Faculties, 29 Journal of Legal Studies 451 (2000) [Westlaw].
I agree with Leiter on two issues. Short law review articles and in-house publications should be addressed in per capita faculty productivity studies. But I would not use a points system. Instead I would simply identify both facets in a table and use cumulative stats for data analysis: (1) all articles; (2) all articles longer than X pages; (3) all non-in-house articles, etc.
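To make the arbitrariness of the cutoffs concrete, here is a minimal sketch of the points system described above. The page brackets and the in-house halving are as stated in the study; the function name and the example page counts are my own illustrations.

```python
def article_points(pages, in_house=False):
    """Score one article under the Leiter-style points system:
    0 points under 6 pages, 1 for 6-20, 2 for 21-50, 3 over 50;
    in-house publications earn half value."""
    if pages < 6:
        points = 0
    elif pages <= 20:
        points = 1
    elif pages <= 50:
        points = 2
    else:
        points = 3
    return points / 2 if in_house else points

# The cliff effects at the cutoffs: a 50-page article earns 2 points,
# while a 51-page article earns 3 -- a 50% jump for one extra page.
print(article_points(50))        # 2
print(article_points(51))        # 3
print(article_points(60, True))  # 1.5 (in-house halving)
```

Note how a single page at a bracket boundary changes the score by a full point, which is exactly the "so what" problem with arbitrary cutoffs; the cumulative-table approach suggested above avoids drawing those lines at all.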
The Yelnosky study continues the early Leiter "tradition." I'm happy to note that Leiter has not. [JH]
September 10, 2007
Three Thumbs Down on LLJ Published Info Antics Article
When my co-editor Ron Jones called Nova law profs Robert M. Jarvis and Phyllis Coleman's Ranking Law Reviews by Author Prominence - Ten Years Later, 99 Law Library Journal 573 (2007) to my attention, my first thought was to categorize the work as "Professional Reading." After reviewing the methodology, it's more info antics than metrics. Much more. I have to concur with Brian Leiter's assessment of this work. "This, sad to say, has to be the most pointless ranking exercise since the Cooley silliness, one that truly succeeds in conveying no meaningful information." Ranking Law Reviews by "Author Prominence", Brian Leiter's Law School Reports. See also PropertyProf Blog editor Ben Barros' Law Review Rankings of the Worst Sort ("The contributor scale on page three is just absurd.").
It's one thing to point to Nobel laureates as Garfield once did in a "proof of concept" moment but quite another to create a 40-category contributor scale like Jarvis and Coleman do. I'm not listing the "Top X" results from this study for obvious reasons. [JH]