Wednesday, November 26, 2014
The Michael S. Maurer Crossword Puzzle
Apropos of not much at all, I noticed when printing my New York Times crossword puzzle this morning that its constructor was the same Michael S. Maurer who is the named benefactor of Bill Henderson's school.
I knew Mickey Maurer when I was in Indianapolis and he ran the Indiana Economic Development office. Stand-up guy. He told me that he wasn't very good at doing crosswords, even though he was a regular contributor to the New York Times.
N.B.: Will Shortz is also an Indiana U. grad. I don't know if that's a coincidence.
November 26, 2014 | Permalink | Comments (2)
Tuesday, November 11, 2014
What Might Have Contributed to an Historic Year-Over-Year Decline in the MBE Mean Scaled Score?
The National Conference of Bar Examiners (NCBE) has taken the position that the historic drop in the MBE Mean Scaled Score of 2.8 points between the July 2013 administration of the bar exam (144.3) and the July 2014 administration of the bar exam (141.5) is solely attributable to a decline in the quality of those taking a bar exam this July. Specifically, in a letter to law school deans, the NCBE stated that: “Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013.”
Notably, the NCBE does not indicate what other “indicators” it looked at “to challenge the results.” Rather, the NCBE boldly asserts that the only fact that explains an historic 2.8 point drop in the MBE Mean Scaled Score is “that the group that sat in July 2014 was less able than the group that sat in July 2013.”
I am not persuaded.
(Neither is Brooklyn Law School Dean Nicholas Allard, who has responded by calling the letter “offensive” and by asking for a “thorough investigation of the administration and scoring of the July 2014 exam.” Nor is Derek Muller, who earlier today published a blog post suggesting that the LSAT profile of the class of 2014 did not portend the sharp drop in MBE scores.)
I can’t claim to know how the NCBE does its scaled scoring, so for purposes of this analysis, I will take the NCBE at its word that it has “double-checked” all of its calculations and found that there are no errors in its scoring.
If we accept the premise that there are no scoring issues, then the historic decline in the MBE Mean Scaled Score is attributable either to a “less able” group taking the MBE in July 2014, to issues associated with the administration of the exam, or to some combination of the two.
The NCBE essentially has ignored the possibility that issues associated with the administration of the exam might have contributed to the historic decline in the MBE Mean Scaled Score and has gone “all in” on the “less able” group explanation. The problem for the NCBE is that it will be hard-pressed to demonstrate that the group that sat in July 2014 was sufficiently “less able” to explain a decline of this magnitude.
If one looks at the LSAT distribution of the matriculants in 2011 (who became the graduating class of 2014) and compares it with the LSAT distribution of the matriculants in 2010 (who became the graduating class of 2013), the NCBE probably is correct in noting that the group that sat in July 2014 is slightly “less able” than the group that sat in July 2013. But for the reasons set forth below, I think the NCBE is wrong to suggest that this alone accounts for the historic drop in the MBE Mean Scaled Score.
Rather, a comparison of the LSAT profile of the Class of 2014 with that of the Class of 2013 would suggest that one could have anticipated a modest drop in the MBE Mean Scaled Score of perhaps 0.5 to 1.0 points. The modest decrease in the LSAT profile of the Class of 2014 relative to the Class of 2013, by itself, does not explain the historic drop of 2.8 points in the MBE Mean Scaled Score between July 2013 and July 2014.
THINKING ABOUT GROUPS
The “group” that sat in July 2014 is composed of two subgroups of takers – first-time takers and those who failed a bar exam and are retaking it. I am not sure the NCBE has any basis to suggest that those who failed a bar exam and were “retaking” it in 2014 were a less capable bunch than the comparable group “retaking” the bar exam in 2013 (or in some other year).
What about “first-time takers”? That group actually consists of two subgroups as well – those literally taking the exam for the first time and those who passed an exam in one jurisdiction and are taking the exam for the “first-time” in another jurisdiction. Again, I am not sure the NCBE has any basis to suggest that those who passed a bar exam and are taking a bar exam in another jurisdiction in 2014 were a less capable bunch than a comparable group that was taking a second bar exam in 2013.
So who’s left? Those who actually were taking a bar exam for the very first time in July 2014 – the graduates of the class of 2014. If we accept the premise that the “retakers” in 2014 were not demonstrably different from the “retakers” in 2013, then the group that was “less capable” in 2014 has to be the graduates of 2014, whom the NCBE asserts are “less capable” than the graduates of 2013.
COMPARING LSAT PROFILES
The objective credentials of the class that entered law school in the fall of 2011 (class of 2014) are slightly less robust than those of the class that entered law school in the fall of 2010 (class of 2013). The question, however, is whether the drop in quality between the class of 2013 and the class of 2014 is large enough that we could have anticipated it would yield an historic drop in the MBE Mean Scaled Score of 2.8 points.
The answer to that is no.
The difference in profile between the class of 2014 and the class of 2013 does not reflect an “historic” drop in quality and would seem to explain only some of the drop in the MBE Mean Scaled Score, not the full 2.8 point drop.
To understand this better, let’s look at how the trends in student quality have related to changes in the MBE Mean Scaled Score over the last decade.
Defining “student quality” can be a challenge. A year ago, I noted changes over time in three “groups” of matriculants – those with LSATs at or above 165, those with LSATs of 150-164, and those with LSATs below 150. Between 2010 and 2013, the number at or above 165 declined significantly while the number below 150 actually grew, resulting in a smaller percentage of the entering class with LSATs at or above 165 and a larger percentage with LSATs below 150.
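To illustrate the grouping arithmetic, here is a small Python sketch; the matriculant counts are hypothetical placeholders for illustration only, not actual LSAC figures.

```python
# Shares of an entering class across the three LSAT groupings discussed
# above. The counts are hypothetical placeholders, NOT actual LSAC data.
groups = {"165 and above": 8000, "150-164": 29000, "below 150": 9000}

total = sum(groups.values())
for name, count in groups.items():
    print(f"{name}: {count / total:.1%} of the entering class")
# 165 and above: 17.4%, 150-164: 63.0%, below 150: 19.6%
```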
While the relatively simplistic calculations described above would provide some basis for anticipating declines in bar passage rates by 2016, they would not explain what is going on this year without more refinement.
In his blog post earlier today, Derek Muller looks at the strength of each class by calculating “projected MBE” scores, drawing on an article by Susan Case, and then comparing those to the actual MBE scores, showing a close relationship over time (until this year). I come to a similar conclusion using a different set of calculations of the “strength” of the graduating classes over the last several years, based on the LSAT distribution profile of the matriculating classes three years earlier.
To develop this more refined analysis of the strength of the graduating classes over the last nine years, I used the LSAC’s National Decisions Profiles to identify the distribution of matriculants in ten five-point LSAT ranges – descending from 175-180 down to 130-134. To estimate the “strength” of the respective entering classes, I applied a prediction of bar passage rates by LSAT scores to each five point grouping and came up with a “weighted average” bar passage prediction for each class.
(In his article, Unpacking the Bar: Of Cut Scores, Competence and Crucibles, Professor Gary Rosin of the South Texas College of Law developed a statistical model for predicting bar passage rates for different LSAT scores. I used his bar passage prediction chart to assess the “relative strength” of each entering class from 2001 through 2013.
LSAT RANGE | Prediction of Success on the Bar Exam Based on Lowest LSAT in Range
175-180 | .98
170-174 | .97
165-169 | .95
160-164 | .91
155-159 | .85
150-154 | .76
145-149 | .65
140-144 | .50
135-139 | .36
130-134 | .25
Please note that for the purposes of classifying the relative strength of each class of matriculants, the precise accuracy of the bar passage predictions is less important than the fact of differential anticipated performance across groupings which allows for comparisons of relative strength over time.)
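To make the method concrete, here is a minimal Python sketch of the weighted-average calculation using Rosin’s chart above; the matriculant counts in the example are hypothetical placeholders, not actual LSAC National Decision Profile figures.

```python
# "Weighted average" bar passage prediction for one entering class,
# using Rosin's prediction chart reproduced above. The matriculant
# counts below are hypothetical placeholders, NOT actual LSAC data.
ROSIN_PREDICTIONS = {  # LSAT range -> predicted bar passage rate
    (175, 180): .98, (170, 174): .97, (165, 169): .95, (160, 164): .91,
    (155, 159): .85, (150, 154): .76, (145, 149): .65, (140, 144): .50,
    (135, 139): .36, (130, 134): .25,
}

def weighted_average_prediction(matriculants):
    """Weight each range's predicted pass rate by its share of the class."""
    total = sum(matriculants.values())
    return sum(ROSIN_PREDICTIONS[rng] * n for rng, n in matriculants.items()) / total

# Hypothetical distribution, for illustration only:
example_class = {
    (175, 180): 500, (170, 174): 2000, (165, 169): 5500, (160, 164): 9500,
    (155, 159): 10500, (150, 154): 9000, (145, 149): 5500, (140, 144): 2500,
    (135, 139): 800, (130, 134): 200,
}

print(round(weighted_average_prediction(example_class), 3))  # ~0.809
```

A class whose distribution shifts toward the lower ranges produces a lower weighted average, which is the sense in which one class is “stronger” or “weaker” than another here.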
One problem with this approach is that the LSAC (and law schools) changed how they reported the LSAT profile of matriculants beginning with the entering class in the fall of 2010. Up until 2009, the LSAT profile data reflected the average LSAT score of those who took the LSAT more than once. Beginning with matriculants in fall 2010, the LSAT profile data reflects the highest LSAT score of those who took the LSAT more than once. This makes direct comparisons difficult between fall 2009 (class of 2012) and prior years, on the one hand, and fall 2010 (class of 2013) and subsequent years, on the other, without some type of “adjustment” of the profile in 2010 and beyond.
Nonetheless, the year over year change in the 2013-2014 time frame can be compared with year over year changes in the 2005-2012 time frame.
Thus, having generated these “weighted average” bar passage projections for each entering class, starting with the class that began legal education in the fall of 2002 (class of 2005), we can compare them with the MBE Mean Scaled Score for each July in which a class graduated, looking particularly at the relationship between the change in relative strength and the change in the corresponding MBE Mean Scaled Score. Those two lines are plotted below for the period from 2005-2012. (To approximate the MBE Mean Scaled Score for graphing purposes, the strength of each graduating class is calculated by multiplying the weighted average predicted bar passage percentage, which has ranged from .801 to .826, by 175.)
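As a quick check of that scaling, using the endpoints of the range just noted:

```python
# Converting a weighted-average prediction to the scale used in the graph:
# class "strength" = predicted bar passage rate x 175.
low, high = .801 * 175, .826 * 175
print(low, high)  # 140.175 and 144.55, up to floating-point noise
```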
Comparison of Class Strength Based on Weighted Average Class Strength (Weighted Average Bar Passage Prediction x 175) with the MBE Mean Scaled Score for 2005-2012
What this graph highlights is that between 2005 and 2012, year-to-year changes in the MBE Mean Scaled Score largely “tracked” year-to-year changes in the “quality” of the graduating classes. Perhaps most significantly, the degree of change in “quality” from one year to the next generally is reflected in the degree of change in MBE Mean Scaled Scores. From 2008 to 2009, a drop in “quality” of 1.5 points (from 144.6 to 143.1) was reflected in a drop in MBE Mean Scaled Scores from 145.6 to 144.7, a drop of 0.9 points. Similarly, from 2009 to 2010, a drop in “quality” of 1.1 points (from 143.1 to 142) was reflected in a drop in MBE Mean Scaled Scores from 144.7 to 143.6, a drop of 1.1 points. This two-year drop in quality of 2.6 points (from 144.6 to 142) corresponded to a two-year drop in MBE Mean Scaled Scores of 2.0 points (from 145.6 to 143.6).
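For concreteness, the tracking claim reduces to a small calculation over the figures just quoted:

```python
# Year-over-year change in projected "quality" vs. the actual change in the
# MBE Mean Scaled Score, using the 2008-2010 figures quoted above.
quality = {2008: 144.6, 2009: 143.1, 2010: 142.0}   # scaled class strength
mbe     = {2008: 145.6, 2009: 144.7, 2010: 143.6}   # MBE Mean Scaled Score

for year in (2009, 2010):
    dq = round(quality[year] - quality[year - 1], 1)
    dm = round(mbe[year] - mbe[year - 1], 1)
    print(f"{year}: quality {dq:+}, MBE {dm:+}")
# 2009: quality -1.5, MBE -0.9
# 2010: quality -1.1, MBE -1.1
```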
How does this help us understand what has happened in 2014 relative to 2013? The decrease in quality of the class of 2014 relative to the class of 2013, using the “Weighted Average Bar Passage Projection” methodology above, reflects a change from 145.1 to 144.2 – a drop of 0.9 points (less than the year-over-year changes in 2009 and 2010). Accordingly, one might anticipate a decline in MBE Mean Scaled Scores, but probably a decline slightly smaller than the declines experienced in 2009 and 2010 – declines of 0.9 and 1.1 points, respectively.
Does the decline in quality between the Class of 2013 and the Class of 2014 explain some of the decline in MBE Mean Scaled Scores? Certainly. This analysis suggests a decline comparable to or slightly less than the declines in 2009 and 2010 should have been expected.
But that is not what we have experienced. We have experienced an historic decline of 2.8 points. Yet the NCBE tells us that the other indicators it looked at “all point to the fact that the group that sat in July 2014 is less able than the group that sat in July 2013.”
THE EXAMSOFT DEBACLE
What the NCBE fails to discuss, or even mention, is one other “indicator” that was a distinctive aspect of the bar exam experience for the group that sat in July 2014, one that the group that sat in July 2013 did not face – the ExamSoft Debacle.
For many of those in the numerous jurisdictions that used ExamSoft in July 2014, the evening between the essay portion of the bar exam and the MBE portion was spent in needless anxiety and stress over being unable to upload the essay portion of the exam. This stress and anxiety were compounded by messaging suggesting that failure to upload in a timely manner would mean failing the bar exam – messaging that was corrected only late in the evening in some jurisdictions.
In these ExamSoft jurisdictions, I can only imagine that some number of those taking the MBE on the second day of the exam were doing so with much less sleep and much less focus than might have been the case had there not been issues with uploading the essay portion the night before. If this resulted in “underperformance” on the MBE of just 1%-2% (perhaps missing two to four additional questions out of 200), that might have been enough to trigger a larger than expected decline in the MBE Mean Scaled Score.
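To see how little underperformance it would take to move the mean, consider a back-of-the-envelope sketch. Everything in it – the share of examinees affected, the extra raw questions missed, and the raw-to-scaled conversion factor – is an illustrative assumption, not an NCBE or ExamSoft figure; the actual equating is not public.

```python
# Back-of-the-envelope: how much could ExamSoft-related underperformance
# move the national MBE mean? ALL inputs are illustrative assumptions.
raw_questions_missed = 3     # within the 2-4 extra misses suggested above
scaled_per_raw = 0.7         # ASSUMED raw-to-scaled factor; actual equating is not public
share_affected = 0.5         # ASSUMED share of examinees in affected jurisdictions

mean_shift = raw_questions_missed * scaled_per_raw * share_affected
print(f"{mean_shift:.2f} scaled points off the national mean")  # 1.05
```

On these assumed numbers, the administration issue alone accounts for roughly a point, which, added to the expected decline of about 0.9 points from class quality, would close much of the gap to the observed 2.8 point drop.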
ONE STATE’S EXPERIENCE BELIES THE NCBE STORY
It will be hard to assess the full reality of the July 2014 bar exam experience in historical context until 2015, when the NCBE releases its annual statistical analysis with state-by-state breakdowns of first-time bar passage rates. It is very difficult to make comparisons across jurisdictions regarding the July 2014 bar exam at present because there is no standardized format among states for reporting results – some states report overall bar passage rates, some disaggregate first-time bar passage rates, and some report school-specific bar passage rates. To make meaningful year-over-year comparisons focused on the experience of each year’s graduates, the focus should be on first-time bar passage (even though, as noted above, that also is a little over-inclusive).
Nonetheless, the experience of one state, Iowa, casts significant doubt on the NCBE “story.”
The historical first-time bar passage rates in Iowa from 2004 to 2013 ranged from a low of 86% in 2005 to a high of 93% in 2009 and again in 2013. In the nine-year period between 2005 and 2013, the year-to-year change in first-time bar passage rates never exceeded three percentage points and was plus or minus one or two points in eight of the nine years. In 2014, however, the bar passage rate fell to a new low of 84%, a decline of nine percentage points – more than four times the largest previous year-over-year decline in bar passage rates since 2004-2005.
YEAR | First-Time Bar Passage Rate | Change from Prior Year
2004 | 87% | n/a
2005 | 86% | -1
2006 | 88% | +2
2007 | 89% | +1
2008 | 90% | +1
2009 | 93% | +3
2010 | 91% | -2
2011 | 90% | -1
2012 | 92% | +2
2013 | 93% | +1
2014 | 84% | -9
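The “Change from Prior Year” column is simply the first difference of the passage rates, as a quick check confirms:

```python
# Recomputing the "Change from Prior Year" column from the Iowa rates above.
years = list(range(2004, 2015))
rates = [87, 86, 88, 89, 90, 93, 91, 90, 92, 93, 84]  # percent, per the table

for year, prev, curr in zip(years[1:], rates, rates[1:]):
    print(year, f"{curr - prev:+d}")
# 2005 -1, 2006 +2, ..., 2013 +1, 2014 -9
```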
The NCBE says that all indicators point to the fact that the group that sat in 2014 was “less able” than the group that sat in 2013. But here is the problem for the NCBE.
Iowa is one of the states that used ExamSoft and in which test-takers experienced problems uploading the exam. The two schools that comprise the largest share of bar exam takers in Iowa are Drake and Iowa. In July 2013, those two schools had 181 first-time takers (out of 282 total takers), and 173 passed the Iowa bar exam (a 95.6% bar passage rate). In 2014, those two schools had 158 first-time takers (out of 253 total), and 135 passed the Iowa bar exam (an 85.4% bar passage rate), a drop of 10.2 percentage points year over year.
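For anyone who wants to verify those percentages, the arithmetic is straightforward:

```python
# Drake + Iowa first-time pass rates in Iowa, from the figures above.
passed_took = {2013: (173, 181), 2014: (135, 158)}  # (passed, first-time takers)

rates = {year: round(passed / took * 100, 1) for year, (passed, took) in passed_took.items()}
print(rates)  # {2013: 95.6, 2014: 85.4}
print(f"drop: {rates[2013] - rates[2014]:.1f} percentage points")  # 10.2
```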
Unfortunately for the NCBE, there is no basis to claim that the Drake and Iowa graduates were “less able” in 2014 than in 2013, as there was no statistical difference in the LSAT profiles of their entering classes in 2010 and in 2011 (the classes of 2013 and 2014, respectively). In both years, Iowa had a profile of 164/161/158. In both years, Drake had a profile of 158/156/153. This would seem to make it harder to argue that those in Iowa who sat in July 2014 were “less able” than those who sat in 2013, yet their performance was significantly poorer, contributing to the largest decline in bar passage rate in Iowa in over a decade. The only difference between 2013 and 2014 for graduates of Drake and Iowa taking the bar exam for the first time in Iowa is that the group that sat in July 2014 had to deal with the ExamSoft debacle while the group that sat in July 2013 did not.
TIME WILL TELL
This analysis does not “prove” that the ExamSoft debacle was partly responsible for the historic decline in the MBE Mean Scaled Score between 2013 and 2014. What I hope it does is raise a serious question about the NCBE’s claim that the “whole story” of the historic decline is that the class of 2014 is simply “less able” than the class of 2013.
When the NCBE issues its annual report on 2014 sometime next year, we will be able to do a longitudinal analysis on a jurisdiction-by-jurisdiction basis to see whether jurisdictions that used ExamSoft had higher rates of anomalous year-over-year changes in bar passage rates for first-time takers. When the NCBE announces the MBE Mean Scaled Score for July 2015 next fall, we will be able to assess whether the group that sits for the bar exam in July 2015 (which, using the weighted average bar passage prediction outlined above, is even more demonstrably “less able” than the class of 2014) generates another historic decline, or whether it “outperforms” its indicators by performing in a manner comparable to the class of 2014 (suggesting that something odd happened with the class of 2014).
It remains to be seen whether law school deans and others will have the patience to wait until 2015 to analyze all of the compiled data regarding bar passage in July 2014 across all jurisdictions. In the meantime, there is likely to be significant disagreement over bar passage data and how it should be interpreted.
November 11, 2014 in Data on legal education, Data on the profession, Scholarship on legal education, Scholarship on the legal profession | Permalink | Comments (4)