Monday, June 11, 2012
A Special Message to Law School Applicants: Why You Should Not Rely on the U.S. News Rankings
Potential law students generally rely on the U.S. News Law School Rankings to help them decide which law school to attend. However, these rankings have no validity, and legal education scholars have been attacking them for years. For example, Brian Leiter pointed out that some of the ranking categories are highly manipulable and that others favor smaller schools while penalizing larger ones. Similarly, Lynda Edwards has written, "Critics of the U.S. News rankings say the magazine exercises too little control over the quality of the information submitted; several of the self-reporting factors utilized in the methodology, they say, actually reward those law schools willing to cheat." (here)
The biggest flaw in the law school rankings is the employment data. U.S. News measures employment at graduation and nine months after graduation, and these figures constitute 18% of the total score. U.S. News states, "employment rates are figured solely based on the number of grads working at that point in time full or part time in a legal or non-legal job divided by the total number of J.D. graduates." In other words, U.S. News includes part-time and non-legal jobs. This means it counts part-time jobs doing document review; it even counts working at Starbucks. For example, law school A might have 100% employment in big law jobs and law school B 100% employment at McDonald's, but both would be scored the same. Recently, law schools have begun to hire their own graduates in temporary jobs. (here) ("The jobs typically end shortly after the U.S. News reporting dates." here) Theoretically, a law school can attain 100% employment by doing this. Does this tell you anything about which law school to attend? In sum, 18% of a law school's score on U.S. News is meaningless because it includes part-time jobs, non-legal jobs, and temporary jobs provided by the law schools themselves. Meaningless, meaningless, meaningless!!!
The assessment score by lawyers and judges constitutes 15% of the total. U.S. News states, "legal professionals, including the hiring partners of law firms, state attorneys general, and selected federal and state judges, were asked to rate programs on a scale from 1 (marginal) to 5 (outstanding). Those individuals who did not know enough about a school to evaluate it fairly were asked to mark 'don't know.' A school's score is the average of all the respondents who rated it." Only about 12% of those surveyed responded. (Also, "The two most recent years lawyers' and judges' surveys were averaged . . .")
This 12% response rate alone calls the category into question. More importantly, I question how these legal professionals can have knowledge of the approximately 200 law schools in this country. A hiring partner or judge in NYC may be able to evaluate NYC law schools because they see graduates of local schools frequently. However, how can they evaluate California law schools? Equally important, geographic location may be a disadvantage in this category. Do law schools in urban areas have advantages over law schools in remote areas, like Montana or New Mexico? (Paul Caron has stated, "There is no way of knowing whether the magazine has geographical diversity among the lawyers and judges it surveys or even how many bother to respond to the surveys." He added, "Yet there is valid data indicating the U.S. News rankings are skewed in favor of Northeastern universities partly because of the way reputation is evaluated." (here))
The peer assessment score constitutes 25% of the total. U.S. News states, "In the fall of 2011, law school deans, deans of academic affairs, chairs of faculty appointments, and the most recently tenured faculty members were asked to rate programs on a scale from marginal (1) to outstanding (5). Those individuals who did not know enough about a school to evaluate it fairly were asked to mark 'don't know.' A school's score is the average of all the respondents who rated it."
At first glance, this would seem to be a good measure of law school quality. Most academics, however, rate law schools based on their scholarship, not their teaching. If you want to become a law professor, this is a good measure for choosing a law school. It is meaningless, however, for picking a law school based on the quality of its instruction. Good scholars are not necessarily good teachers.
Finally, selectivity (25%) can be misleading. GPAs are not uniform because colleges are of different quality and have different grading policies (e.g., grade inflation). As U.S. News has admitted, "The difficulty level of college courses is much less important than the grades received in those classes, because law school admissions committees do an initial sort of applicants based solely on GPA and LSAT scores." (here) As Brian Leiter has noted, student-faculty ratios are manipulable because they depend on how schools "count" their faculty. Acceptance rates are also misleading because they often reflect how good a law school is at attracting applications (for example, by offering free online applications) rather than how selective it is. ("The Association of American Law Schools has reported that some schools increase rejection rates—and boost selectivity scores—by encouraging students with no chance of admission to apply." here)
(Bar passage rate is an important indicator of how law schools are doing. However, it only constitutes 2% of the total in U.S. News.)
I realize that law school applicants want some way to compare law schools, but the U.S. News rankings aren't it. A Ouija board would be as accurate.