Tuesday, October 20, 2015
This week, I received my annual ballot from U.S. News & World Report to rank law school programs for “clinical training.” Clinical program directors get to vote on peer schools with notoriously slim guidance and standards. Each year, CLEA issues a statement criticizing the process and offering some standards for consideration. CLEA attempts a balance among the idealistic desire to eliminate the entire process, the cynical temptation to utterly game the rankings, and a sincere attempt to guide us all to handle it the best we can. We are all at once frustrated and complicit with the regime.
This year, however, I note a new and strange oversight that calls into question the integrity of the enterprise.
The survey includes instructions and scant standards, but this year’s are extra perplexing. For the “Annual Peer Assessment of Law Schools Clinical Training,” the instructions include this guidance (emphasis added):
2. Identify up to fifteen (15) schools that have the highest quality alternative dispute resolution courses or programs. In making your choices, consider all elements that contribute to a program’s academic excellence, for example, the depth and breadth of the program, faculty research and publication, etc.
Now, my colleagues down the hall in the Straus Institute for Dispute Resolution have been ranked #1 in their category for 11 straight years, and I sometimes teach on the Straus faculty. Maybe this will help our clinical program ranking, but probably that’s not what USN has in mind.
In my pained ambivalence, I will assume that the USN folks mean to inquire about clinical programs, but apparently they are paying very little attention to us. I intend to follow CLEA’s much better advice, but perhaps USN should consider whether proofreading might make its product more reliable.
UPDATE: We are not alone.
UPDATE II: On Oct. 23, 2015, USN sent out revised surveys, correcting the mistake and seeking rankings on the “highest quality clinical training courses and programs,” due on Nov. 16.