Thursday, November 13, 2014
There have been several blog posts, email exchanges, and listserv threads discussing the decline in the Summer 2014 MBE scores and state pass rates, including a reproachful letter recently sent to the NCBE by a law school dean. The National Conference of Bar Examiners stated in an October letter that it sees the drop in scores as a “matter of concern.” But it also stated that its equating, the adoption of the Uniform Bar Examination, and the scoring of the test were not the cause of the decline. The only explanation mentioned in the letter, albeit a vague and slightly patronizing one, was that the July 2014 test takers were “less able” than the July 2013 test takers.
After recovering from a bit of shock, I began to question whether students really were “less able” to pass the bar exam this summer than they were last summer. I have helped prepare students for the bar exam, in various capacities, for over 14 years. In all that time, I have encountered only a handful of individuals who are not capable of passing the bar exam. I have, however, worked with dozens of capable and competent bar applicants who struggle to pass for various reasons.
Tennessee reported that the national mean scaled MBE score for July 2014 was 141.47, the lowest since the July 2004 MBE. This drop produced lower bar pass rates across almost all jurisdictions this summer. But while the drop itself is well documented, it is not evident why it occurred. In the same letter referenced above, the NCBE noted that the number of test takers fell by 5% between the July 2013 and July 2014 exams. It is unclear, however, how the number of test takers has any bearing on individual test takers’ performance. The mere fact that 5% fewer individuals took the exam should not, by itself, dictate a lower pass rate.
The data, or the lack of it, led me to ask again: were this summer’s bar applicants “less able”? Many commenters have rejected the idea that the LSAT is to blame, since LSAT scores for this group of test takers (those who started law school in 2011) did not take a plunge. Thus, lower LSATs cannot adequately explain this decline.
In addition, according to one of the leading national bar review companies, the mean scores on its simulated exams in Summer 2013 and Summer 2014 were virtually identical. As many of us know, there is a strong correlation between simulated exam performance and actual MBE performance. It bears noting, then, that the readily identifiable performance indicators lead to the conclusion that the test takers in Summer 2014 were just as capable as the test takers in Summer 2013.
If this is the case, why did the decline occur? Was it the ExamSoft fiasco? Could it be the NCBE’s new question format? Is it the result of an error in the scaling process? Or could it possibly be due to the retirement of longtime NCBE Director of Testing Dr. Susan Case? Ultimately, is the decline a consequence of differences in form difficulty rather than differences between the groups? Without knowing the specifics of the anchor test, the equating calculations, and the precise differences between the tested groups, it is virtually impossible to give a definitive answer. That said, we should keep asking these questions. The individuals who fail the bar exam need us to keep asking them.
(Lisa Bove Young)