[The MPRE is required in most jurisdictions before admission to the bar, and each state sets its own passing score, usually between 79 and 86, out of a scaled 150 possible. Jeff posted some useful observations last year on the exam process, and also mused on its numerical realities -- with helpful comments by Bill Henderson of IU. Following up on that, we have a guest-post from John McSweeny,
Ph.D., a psych professor in the med school of the University of Toledo with expertise in stats. By night, he is also a UT law student and a recent taker (and passer) of the MPRE. Jeff reports that John's findings as reported below are in line with how Bill Henderson later explained it last year to Jeff on a cocktail napkin [how all great teaching is done]. I thought our readers and MPRE takers would want to see these numbers and John's observations. -- Alan Childress]
With a background in statistics and psychometric methods, I was curious about MPRE scores and what they meant, including where certain scores fell in the distribution of scores from all
of the candidates who took the test. Accordingly, I contacted the American
College Testing program (ACT), which developed the test, but they wouldn't tell
me anything beyond what was on the website, i.e., that the range was 50 to 150
and the mean [average] was 100. Of course, this irritated
me to no end and launched me on a quest to find the information myself. After a little bit of searching, I found that the National Conference of Bar Examiners [which apparently contracts the MPRE out to ACT] has a
website where you can download a PDF file with the statistics for the MPRE
including the means and standard deviations, on an annual basis. These are
excerpts from The Bar Examiner. If you want to go to the site it is
Looking at the last two years which have been reported,
we can see that in 2005 the actual mean was 98.72 and the standard deviation, or SD, was 19.72;
for 2006 the mean was 97.97 and the SD was 19.72 (same as in 2005). A weighted
average mean from these two years = 98.33 and the SD (of course) = 19.72. One can
use this information to estimate a Z-score with a mean of zero and an SD of one
and then go to a table to determine an approximate percentile equivalent. The
formula is Z = (X - 98.33)/19.72 where X is your scaled score. Thus, if you
received a score of 85 (the passing score in Ohio [and many other states]), Z = -0.676. This would be
equivalent to an approximate percentile of 25, i.e., 75% of the persons taking
the test did better than you did. Thus, it appears that the magic number
of 85 used in several states, including Ohio, is set at the 1st quartile. Of
course, I do not know this for a fact, but the evidence sure points in that direction. Two caveats are in order:
First, the distributions appear to vary slightly from year to year. Thus, the distribution
for 2008 may be slightly different from that in 2005 and 2006. Second, the
scores do not appear to be entirely normal (especially in 2006), and the
percentile equivalents for Z scores are based on the standard normal
distribution. Thus, these statistics give you only an approximation of where your score lies
relative to others.
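For readers who want to check their own scores, the calculation above can be sketched in a few lines of Python. The 98.33 mean and 19.72 SD come from the figures in the post; the normal-CDF conversion uses only the standard library, and, per the caveats, the result is an approximation that assumes a normal distribution:

```python
import math

def mpre_percentile(score, mean=98.33, sd=19.72):
    """Estimate the approximate percentile for a scaled MPRE score,
    assuming scores are normally distributed (they are only roughly so)."""
    z = (score - mean) / sd
    # Standard normal CDF expressed via the error function
    pct = 0.5 * (1 + math.erf(z / math.sqrt(2))) * 100
    return z, pct

z, pct = mpre_percentile(85)
print(f"Z = {z:.3f}, approx. percentile = {pct:.0f}")  # Z = -0.676, ~25th percentile
```

Plugging in 85 reproduces the post's figures: Z of about -0.676, or roughly the 25th percentile.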
A few more comments by John appear below the fold.
[UPDATE: Two follow-up posts here and here discuss the curve and numbers with more detail, link to state-by-state minimum scores (including news on NY and California), and note an article written by the head of number crunching for the NCBE about the exam. Plus more from John to follow in this post...]
Further observations [by John], cont.:
1) There appears to be a slight downward drift in the
mean since the test was standardized in 1999. I believe that the scores were
set to have a mean of 100 and an SD of 20 in 1999. If we use our Z-score
formula for a scaled score of 85 we get Z = -0.75 which is approximately
equivalent to the 23rd percentile. The difference in practical terms from
the figures I used previously is small, and people generally prefer whole
numbers to scores with decimal places.
2) I checked the 1999 stats and
interestingly, the actual mean listed was 99.07. No SD was listed, but the
percentage of candidates exceeding different scores was; 85 indeed was at
the 25th percentile.
3) It would be interesting to know if
there has been any attempt at a criterion validity study, i.e., does one's MPRE
score have any predictive value for one's tendency to be a recipient of
professional discipline or malpractice lawsuits in one's career as a lawyer?
Carrying out such a study would be a daunting (and expensive) task, however. We
will probably never see it.
4) The reason that ACT is reluctant to release
MPRE test score information may be its contract with the National
Conference of Bar Examiners (NCBE), which is
probably the organization that owns the intellectual property, i.e., the MPRE, and the rights to release information
about it. Thus, ACT is likely under
contract to provide testing services for NCBE. This is different from the situation with the ACT College Admissions
test, which is owned by ACT itself. Again, just a guess, but it seems plausible.