Thursday, September 20, 2018
According to the American Bar Association (ABA), citing Law.com and TaxProf Blog editor Dean Paul Caron, the national average score on the multiple-choice MBE portion of the July bar exam dropped to its lowest level in 34 years. http://www.abajournal.com; https://www.law.com; http://taxprof.typepad.com. The National Conference of Bar Examiners (NCBE) reports that the July 2018 MBE average score was just 139.5; for comparison, Law.com reports that the July 1984 MBE average was a similarly low 139.21. http://www.ncbex.org/news; https://www.law.com.
In an article by Law.com, NCBE President Judith Gundersen is quoted as saying that "they [this summer's lower MBE scores] are what would be expected given the number of applicants and LSAT 25th percentile means of the 2015 entering class." https://www.law.com. In other words, according to the NCBE, this summer's low average score is the product of law school admissions decisions, as reflected in the 25th percentile LSAT scores of the 2015 entering class.
Nevertheless, the NCBE's claim, which it first advanced back in 2015 (namely, that bar exam declines track LSAT declines), is not without doubters: previous research found a lack of empirical support for the LSAT claim, albeit limited to one jurisdiction and one law school's population, and admittedly not updated to reflect this summer's bar exam results. Testing the Testers.
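For readers curious how a claim like the NCBE's could be tested empirically, here is a minimal sketch of the basic statistical move: computing a Pearson correlation between entering-class LSAT 25th-percentile scores and MBE averages three years later. Every number below is a hypothetical placeholder for illustration only, not actual NCBE or LSAC data.

```python
# A back-of-the-envelope illustration of testing the NCBE's hypothesis:
# do entering-class LSAT 25th-percentile scores track MBE averages
# three years later? ALL figures below are HYPOTHETICAL placeholders.

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical 25th-percentile LSAT scores for five entering classes
lsat_25th = [152, 151, 150, 149, 148]
# Hypothetical national MBE averages for those cohorts three years later
mbe_avg = [143.0, 142.1, 141.5, 140.2, 139.5]

r = pearson_r(lsat_25th, mbe_avg)
print(f"Pearson r = {r:.3f}")
```

A correlation near +1 would be consistent with the NCBE's account, but, as the research noted above cautions, correlation across a handful of cohorts is a long way from establishing that admissions credentials are the cause.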
As an armchair statistician with a mathematics background, I am leery of one-size-fits-all empirical claims. Life is complex and learning is nuanced. Conceivably, there are many factors at play that might account for bar exam results in particular cases, with many factors not ascribable to pure mathematical calculus, such as the leaking roof in the middle of the first day of the Colorado bar exam. http://www.abajournal.com/news/article/ceiling_leaks_pause_colorado_bar_exam.
Here are just a few possible considerations:
• The increase to 25 experimental questions embedded within the set of 200 MBE multiple-choice questions (compared with previous test versions, which embedded only 10 experimental questions).
• The relatively recent addition of Federal Civil Procedure to the MBE's panoply of tested subjects.
• The apparent rise in the incidence of anxiety, depression, and learning disabilities among law students and graduates.
• The economic barriers to securing bar exam testing accommodations, even for graduates with longitudinal records of receiving testing accommodations in law school.
• The influence of social media, the internet age, and smartphones on the learning environment.
• The difficulty of equating earlier versions of bar exams with current versions, given changes in the exam instrument itself and in the scope of subject matter tested.
• The relationship between experiential, doctrinal, and legal writing courses and bar exam outcomes.
Consequently, in my opinion, there's a great need (and a great opportunity) for law schools to collaborate with bar examiners to hypothesize, research, and evaluate what's really going on with the bar exam. It might be the LSAT, as the NCBE claims. But most problems in life are much more complicated. So, as a visual jumpstart to help law schools and bar examiners brainstorm possible solutions, here's a handy chart depicting the overall downward trend in national MBE average scores over the past ten years. (Scott Johns).