Saturday, November 5, 2022

Colorado Supreme Court Amends Bar Exam Passing Score

This past week the Colorado Supreme Court adopted a new Uniform Bar Exam (UBE) passing score that is more in line with the majority of jurisdictions. The previous minimum passing score was 276; the new cut score will be 270, starting with the February 2023 UBE. The decision is not retroactive, either for new Colorado UBE takers or for transfers into Colorado. Here's the link for the details, along with a report from the State of Oregon about cut score calculus:

https://www.courts.state.co.us/Media/release.cfm?id=2019

Note: The map below still shows CO as 276 rather than 270, but the table is correct. https://www.ncbex.org/exams/ube/score-portability/minimum-scores/

 

Map

November 5, 2022 in Bar Exam Issues | Permalink | Comments (0)

Wednesday, October 5, 2022

What to Expect When Your Loved-One is Expecting Bar Results!

Here's a terrific handout for bar takers to share with loved-ones as bar exam results are released (created by the Young Lawyer's Division of the Center for Legal Inclusiveness, Chair Marika Rietsema Ball, Esq.):

What to Expect - Bar Results

What to Expect - Bar Results  2

October 5, 2022 in Advice, Bar Exam Issues | Permalink | Comments (0)

Thursday, September 8, 2022

Impervious to Facts

"Too often facts around me change, but my mind doesn't.  Impervious to new information, I function like a navigation system that has missed a turn but won't re-route,"  writes attorney Mike Kerrigan in a story about "A Sweet Lesson From Pie," WSJ (Sep. 8, 2022).

I suspect that is true of most of us. But why? In my own case, my stubborn mind clings to the facts as I know them because to admit that facts have changed, and that a new course of "navigation" is required, is in some ways to admit that I'm a human being, frail in more ways than I care to admit.

I think that is especially a challenge in legal education and for bar exam authorities.  We cling to the past because that's all we know and, to be frank, sometimes all we want to know.

Take legal education. We know that learning requires much from our students and from us. But many of our classes go on despite the new facts that have emerged from the learning sciences. Louis N. Schulze, Jr., Using Science to Build Better Learners: One School's Successful Efforts to Raise Its Bar Passage Rates in an Era of Decline, 68 J. Legal Educ. 230 (2019), available at SSRN: https://ssrn.com/abstract=2960192.

Take the bar exam. The best available data suggest there is little evidence of a relationship between bar exam scores and competency to practice law. Yet we cling to the past. Putting the Bar Exam on Constitutional Notice: Cut Scores, Race & Ethnicity, and the Public Good (Aug. 31, 2022), Seattle University Law Review, Vol. 45, No. 1 (forthcoming 2022), available at SSRN: https://ssrn.com/abstract=4205899.

I've made lots of wrong turns in my career, my work, and in my life.  To keep on going in the wrong way gets me no closer to where I should be going.  So let's give ourselves and each other the freedom to be changed, the freedom to travel a new path, the freedom to, in short, be curious, creative, and courageous about our work in legal education, on the bar exam, and in life in general.  (Scott Johns).

 

September 8, 2022 in Advice, Bar Exam Issues, Encouragement & Inspiration, Study Tips - General | Permalink | Comments (0)

Thursday, June 30, 2022

Making Memories that Stick - At Least Thru the Bar Exam

You've heard the quip about "the chicken or the egg, which comes first?"  

Well, as the joke goes, "I've just ordered one of each from Amazon, so I'll let you know tomorrow!"

That got me thinking about memorization.  

Most bar takers are really concerned about memorization, particularly because most of their law school exams, unlike bar exams, were open-book/open-note exams. But take a look at the word "memorization." It's a word of action, of process, of recalling something previously learned. In other words, at its root the word "memorization" derives from creating "memories." So how do you create memories when it comes to learning rules of law?

Or, to ask it another way, which comes first, memorization or memories?  

Well, I think the answer to that question is in the question, because it's memories that we memorize. So the key to memorizing is to work through lots of problems, to test yourself with your study tools, to engage in retrieval practice, and, in short, to create lots of memories with the rules.

You see, memorization is just a fancy word for the process of experiencing memories through distributed and mixed practice over time.  So, instead of worrying about memorization as you prepare for your bar exam this summer, focus on making memories (and lots of them).   (Scott Johns)

June 30, 2022 in Bar Exam Issues, Bar Exam Preparation, Exams - Studying | Permalink | Comments (0)

Tuesday, May 31, 2022

Dear Practicing Attorneys:  Please Stop Giving Our Bar Students Inaccurate Advice. 

I still fondly remember the judge for whom I interned as a 3L.  Knowing that bar prep was coming up and sensing my anxiety, he asked me about my plan.  I told him that the bar prep company each day would provide lectures, outlines to read, some more outlines to read, and then finish things off with some outlines to read.  When I told him that the program started just after Memorial Day and ended the day before the exam, he was astonished.  His advice was to save myself all that money, take three weeks off from work, and study from July 4th until the exam.  He said that would be plenty.

Of all the advice my judge gave me, this was the one bit I did not take.  His guidance was well-intentioned, and I appreciated his attempts to calm me down.  But as the Type-A person that I am, I could not rest without feverishly checking off each scheduled study item.  His was advice I could not take.    

Twenty-something years later, students still receive that advice. They insist: “The partner at my firm said that she took just two weeks off for the exam and did just fine.” The partner professed: “You’re a smart kid. You don’t need to do all that work. Just watch the videos, read the outlines, and you’ll pass.” Happy to internalize a message that corroborates the partner’s flattering assessment, students let confirmation bias drive them toward suboptimal learning behaviors.

And then they fail the bar exam.

The practicing lawyers who give this advice sometimes believe that the bar exam world is a static place devoid of change.  However, recent substantial reforms severely limit the applicability of their experiences.  Below the fold, I describe those changes and how they require more careful advising. 

Continue reading

May 31, 2022 in Bar Exam Issues, Bar Exam Preparation, Bar Exams | Permalink | Comments (0)

Thursday, May 19, 2022

Crowd Control - Bar Exam Style

With the move to the Next Generation bar exam, here's an interesting chart, produced using data from the ABA and published by the State of California, that might caution against placing too much trust in numbers to do the hard work of measuring competency to practice law.

So let's take a closer look. What do you see? On the horizontal axis, we see cut scores for the jurisdictions, varying from low to high. On the vertical axis, we see attorney discipline rates corresponding with those jurisdictions. Except for two outliers, North Dakota and South Carolina, the chart suggests that bar exam cut scores have no apparent relationship to rates of attorney discipline claims. Or, as the State of California put it:

What the scatter plot shows is that attorney discipline – as measured by private and public discipline per thousand attorneys – appears to have no relationship to the cut score. With so many states using 135 for their cut score, the details of the Figure can be somewhat difficult to tease out. The big picture, however, is clear. At a cut score of 135 the rate of attorney discipline ranges from a low of 1.9 per thousand in West Virginia to 7.9 per thousand in Tennessee. Looking across the entire range of cut scores we see strikingly similar rates of attorney discipline in states with cut scores from 130 – Alabama – all the way to 145 – Delaware. California’s rate of discipline (2.6 per thousand) is just over one-half (55 percent) the rate of discipline in Delaware (4.7 per thousand). Given the vast differences in the operation of different states’ attorney discipline systems, these discipline numbers should be read with caution. But based on the data available, it raises doubts as to whether changing the cut score would have any impact on the incidence of attorney misconduct. As with the research conducted by professors Anderson and Muller, this measure of “misconduct” is admittedly limited to cases where misconduct is detected, reported, and sanctioned. There is however currently no better measure of the actual incidence of attorney misconduct or, more importantly, of public protection.[1]

Granted, as California recognizes, attorney misconduct is perhaps a poor proxy for whether bar exam cut scores relate to attorney competency. But make no mistake: bar exam cut scores, by the very arbitrariness of how they are chosen, are likewise imperfect proxies for measuring attorney competency.
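For readers who want to put numbers to that intuition, here is a minimal sketch of the kind of check one could run. The cut scores and discipline rates below are invented purely for illustration; the actual figures are in the California report cited in the footnote below.

    # Hypothetical data for illustration only; not the actual ABA/California figures.
    import numpy as np
    from scipy.stats import pearsonr

    cut_scores = np.array([130, 133, 135, 135, 135, 136, 139, 141, 144, 145])
    discipline_per_1000 = np.array([3.1, 2.4, 1.9, 7.9, 4.4, 3.5, 2.6, 4.2, 3.0, 4.7])

    # Pearson correlation between a jurisdiction's cut score and its discipline rate
    r, p = pearsonr(cut_scores, discipline_per_1000)
    print(f"Pearson r = {r:.2f}, p-value = {p:.2f}")

A correlation near zero with a large p-value would match the scatter plot's message: knowing a jurisdiction's cut score tells you essentially nothing about its rate of attorney discipline.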

Why is this important?  

Well, because state supreme courts are now in pitched conversations about whether to adopt the so-labeled "Next Generation" bar exam. The chart below suggests caution, because it's easy to place confidence in numbers, more confidence than they deserve, like trusting a friend who has betrayed one's trust. As a person trained in mathematics, count me as a skeptic. Just picking a number out of "thin air," as the range of cut scores suggests, doesn't compute, in my book, as the proper way to judge whether one is competent to practice law. So as jurisdictions contemplate big changes to a possible new exam format in several years, let's hold them accountable for the math. Our students and the public at large deserve nothing less. (Scott Johns)

ABA Chart

[1] Final Report 2017 Studies, The State Bar of California, available at: https://www.calbar.ca.gov/Portals/0/documents/reports/2017-Final-Bar-Exam-Report.pdf (last accessed Sep. 16, 2021). 

May 19, 2022 in Bar Exam Issues | Permalink | Comments (0)

Sunday, April 17, 2022

Thoughts on the NextGen Bar Content Outlines

The NCBE released the NextGen Bar's content outlines recently and asked for public comment. The comments are due Monday, April 18. The NCBE contacted me asking me to spread the word about the notice-and-comment process and also encouraged me to share my thoughts on the blog. I hesitated until now so as not to cloud others' thoughts during the comment process. I probably should have posted my comments prior to February bar results; I might (probably not) have been less harsh a few weeks ago.

I will start by noting that my overall view of the NextGen Bar is severely impacted by prior interactions with the NCBE. I sat in a room with roughly 150 ASPers (probably most of you reading this) at an AASE plenary session where ASPers were supposed to be able to interact with the NCBE regarding massive drops in MBE scores after adding Civ Pro as a topic. The NCBE representative spoke for about 55 minutes of the hour-long session. She took one question, regarding cognitive load, didn't provide a coherent answer, and the session ended. No meaningful interaction occurred. I attended two workshops in Madison with the NCBE, which they graciously provide at no cost to ASPers. Their hospitality was wonderful, but the substance of the first workshop didn't further schools' ability to assist students; most of the workshop justified their exams. The second workshop provided more information. They did a great demonstration of how they train graders and provided some information about MBE scores. While the second event still included justifications, it did provide more information for helping students prepare for the bar. Like many of you, I found the NCBE's claims about "less able students" offensive and lacking any self-reflection. I felt the NCBE's pandemic response and white paper justifying the bar exam lacked basic social responsibility to fellow suffering humans.

I could expound on other grievances with cathartic rants, but I should progress to the current topic. I did want to be transparent that my views are based on interactions beyond the current exam restructuring. 

One stated goal from the NCBE's communications is that the new bar exam will test fewer topics, less deeply. I think that is great. The current MBE tests a depth of knowledge that is significantly beyond minimum competence to practice law. The goal is great, but I don't trust the NCBE to execute it. I believe that testing the skills attorneys use on a daily basis, and only the law that is truly the minimum needed for competence, is an outstanding goal. I applaud the NCBE for undertaking a task that could radically change the bar exam. However, goals and ideas alone don't always produce the best results. Executing a plan to produce a good exam is critical because individuals' livelihoods are at stake. From my experience with how poorly the MBE tests competence, I worry the NextGen Bar will look different but not actually test minimum competence. My fear is also that the NCBE will continue to justify its exam without self-reflection. A different bar exam isn't inherently good if it continues to test irrelevant skills on a standardized test.

One way to fail in execution is content. The Content Scope Outlines illustrate my worry. I love that they are decreasing the number of subjects. Students across the country will rejoice when Secured Transactions falls off the exam. Also, no more Family Law or the UCCJEA rules. Decreasing the number of subjects should sharpen the focus on the topics new attorneys actually need.

In theory, substantive cognitive load decreases. However, I still see two problems. First, the new skills to be tested (Legal Research, Client Counseling, Negotiation, etc.) can't logistically be tested in a real-world environment. Texas, New York, or Florida can't watch a simulated negotiation or client counseling session for every taker. Those skills will be tested in some standardized format, which means students will have to learn the "best" answers for those sections. That still counts toward the cognitive load required to pass the exam.

My second problem relates to the content within the subjects. The content includes the traditional MBE subjects. The outline places an asterisk next to areas that must still be memorized. Glancing at the Contracts outline, nearly everything still must be memorized, including third-party rights, interpretation, and omitted terms. Business Associations also seems to require memorization of sections (e.g., LLCs) that should be state-specific statutory law. The amount of substantive memorization may decrease because there are fewer subjects, but some subjects still seem to require extensive memorization. I believe some of that memorization is still beyond what ordinary "competent" attorneys know.

My problems aside, I do love that common law crimes are no longer tested. Virtually none of Criminal Law must be memorized. Significant portions of Real Property don't need to be memorized, especially future interests. I would throw future interests out completely, but no memorization is a compromise. Civ Pro requires memorization, but most of Evidence doesn't. There does seem to be an effort to decrease content, but I think more could be taken out.

If I merely read the NextGen Bar's content scope outline with their goals, I think it could be a reasonable and relevant exam.  However, I am skeptical of the NCBE based on prior interactions.  I question whether the execution will follow the goals and whether this becomes another standardized mechanism to exclude diverse populations.  I hope I am wrong.

(Steven Foster)

April 17, 2022 in Bar Exam Issues | Permalink | Comments (0)

Saturday, March 26, 2022

Notice and Comment on NextGen Bar Content

The NextGen Bar Exam's preliminary Content Scope Outlines are out.  The announcement is below.  The NCBE is also asking for comments on the outlines.  You can use the links in the announcement to provide comments.  I encourage everyone to take advantage of this opportunity to advocate for our students.

 

"NCBE is pleased to announce the publication of the preliminary Content Scope Outlines for the next generation of the bar exam. NCBE requests public comment on the outlines from our colleagues in the legal community by Monday, April 18. The new bar exam is expected to launch in approximately five years. Please feel free to share this announcement and request for comment with your colleagues in practice, the judiciary, and legal education.
 
The Content Scope Outlines delineate the topics and lawyering tasks to be assessed within the eight Foundational Concepts and Principles (subjects) and seven Foundational Skills established through the work of NCBE’s Content Scope Committee, a group of 21 dedicated legal professionals, including legal educators, deans, practitioners, and bar examiners. The focus of the topics and skills to be tested are those that are most important to the practice of newly licensed attorneys.
 
These subjects and skills are based on the input of nearly 15,000 members of the legal community who responded to NCBE’s 2019 nationwide practice analysis survey. The subjects to be tested are Civil Procedure, Contract Law, Evidence, Torts, Business Associations, Constitutional Law, Criminal Law, and Real Property. The skills to be tested include legal research, legal writing, issue spotting and analysis, investigation and evaluation, client counseling and advising, negotiation and dispute resolution, and client relationship and management.
 
Thank you for your interest in the future of the bar exam. To submit comments by April 18, please visit the next gen website."

March 26, 2022 in Bar Exam Issues | Permalink | Comments (0)

Tuesday, January 25, 2022

Academic and Bar Support Scholarship Spotlight

Recent SSRN posts on academic and bar support:

1. Catherine Martin Christopher,  Modern Diploma Privilege: A Path Rather Than a Gate (SSRN Post, October 5, 2021). 

From the abstract:

This article proposes a modern diploma privilege, a licensure framework that allows state licensure authorities to identify what competencies are expected of first-year attorneys, then partner with law schools to assess those competencies. Freed from the format and timing of a bar exam, schools can assess a broader range of competencies over longer time horizons. This will allow the development of law school curricula aimed at preparing students to assist clients rather than to pass the bar exam. The modern diploma privilege is structured as an ongoing partnership between licensure authorities and schools, which means that changes can be easily made to the list of desired competencies and/or the assessment methods. This in turn allows for a more nimble licensure mechanism that can quickly adapt to changes in the evolving market for legal services.

2.  Katharine Traylor Schaffzin, First-Generation Students in Law School: A Proven Success Model, 70 Arkansas L. Rev. 913 (2018).  

From the abstract:

This article addresses the ever-increasing population of first-generation college students and the academic challenges they face both in undergraduate school and in seeking to matriculate to law school. This demographic has been heavily studied at the undergraduate level, but very little data is available about the challenges and success of first-generation college students in law school. The article describes the best practices for the academic success of first-generation college students as researched and implemented by various colleges and universities. It also summarizes the findings of the only study done on the experiences of first-generation college graduates who matriculated to law school.

This research serves as the backdrop for the description of a unique program with proven success directed toward securing the academic achievement of first-generation college students in law school. The University of Memphis School of Law Tennessee Institute for Pre-Law (TIP) program is decades old and has been recording the successful outcomes of such students. This article analyzes data collected since 2012 on the academic outcomes of first-generation college graduates who participated in TIP to conclude that the program leads to successful results for these students in graduating law school and passing the bar exam. The article details the program itself and explains how a law school can implement the promising methods uncovered at the undergraduate level. It offers TIP to readers as a proven intervention and success model for law schools seeking to ensure the academic success of first-generation college graduates in law school.

Recent book:

Charles Calleros (New Mexico),  Law School Exams 3rd edition, VitalSource (2021).

From the publisher:  

Law School and Exams: Preparing and Writing to Win, Third Edition is the third edition of a popular book whose first edition Bryan Garner reviewed and judged to be “the best on the market.” It combines:

    1. Clear and comprehensive explanations of study and exam techniques
    2. Numerous illustrative samples that are truly instructive
    3. Twenty in-class exercises or take-home assignments on everything from case briefs to essay and multiple-choice exam questions.

Comprehensive and self-contained, the Third Edition is suitable for use as the textbook for a sophisticated Prelaw course, 1L Orientation, or a 1L Academic Success course. Alternatively, incoming freshmen can work through it independently over the summer to be optimally prepared for law school in the fall.

(Louis Schulze, FIU Law)

January 25, 2022 in Bar Exam Issues, Bar Exams, Publishing | Permalink | Comments (0)

Friday, November 5, 2021

A Dean Speaks Out

I've been meeting with unsuccessful bar takers, and I'm finding that it is increasingly difficult to explain holistic relative-rank scoring, in which what appears to meet competency standards is judged incompetent.  

I realize that the NCBE and jurisdictions say "trust us" because they use statistical equating and scaling methods to standardize written scores based on MBE distribution data, including median and mean MBE scores and standard deviations.
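For readers who have not seen such scaling described before, here is a minimal sketch of the basic idea under simplifying assumptions: raw written scores are linearly rescaled so that their mean and standard deviation match the MBE distribution for the same group of takers. The NCBE's and jurisdictions' actual procedures are more elaborate, and the numbers below are invented.

    # Illustrative sketch only; not the NCBE's or any jurisdiction's actual procedure.
    import statistics

    def scale_written_to_mbe(raw_written, mbe_scores):
        """Linearly rescale raw written scores so their mean and standard
        deviation match the MBE score distribution for the same cohort."""
        raw_mean = statistics.mean(raw_written)
        raw_sd = statistics.pstdev(raw_written)
        mbe_mean = statistics.mean(mbe_scores)
        mbe_sd = statistics.pstdev(mbe_scores)
        return [mbe_mean + mbe_sd * (r - raw_mean) / raw_sd for r in raw_written]

    # Invented example: essay/PT grades on a 1-6 scale, MBE scores on the 200-point scale.
    raw = [2.5, 3.0, 3.5, 4.0, 4.5, 5.0]
    mbe = [120.0, 131.0, 138.0, 142.0, 149.0, 160.0]
    print([round(s, 1) for s in scale_written_to_mbe(raw, mbe)])

The practical upshot is that a written answer's converted score depends on where the entire cohort lands on the MBE, which is part of what makes an individual result so hard to explain to an unsuccessful taker.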

But, frankly, it seems unfair to toss some written exam answers, especially legal writing performance test answers, into the 1-or-2-out-of-6 "buckets" when the pool of applicants has already undoubtedly proven their merit by earning juris doctor degrees.

So, fancy this: a dean speaks out, suggesting that the bar exam as a rite of passage is not moored to its stated goal of measuring entry-level attorney competency but rather tied to 1920s-era exclusionary politics.

I'll let the dean speak for himself:

"When we started seeing diversity increase or people from underrepresented communities — mostly people of color and recent immigrants, trying to become lawyers — then all of a sudden the ABA (The American Bar Association) and other bar organizations were doing whatever they can to keep them from being lawyers," Niedwiecki explained. "The written bar exam became a requirement of the ABA at that time. So that's when we started seeing all these written bar exams. Before that there were oral exams... apprenticeships, there were other ways to become licensed. I think we have to go back to those days knowing that the bar exam really kind of was back in the '20s rooted in exclusion." Niedwiecki, A, "Why a Mitchell Hamlin's Dean is Calling for an End to the Bar Exam," KARE-11 TV (Sep. 30, 2021).  (S. Johns).

 

 

November 5, 2021 in Bar Exam Issues | Permalink | Comments (0)

Monday, November 1, 2021

Pull the Goalie

*I am going to preface this by clearly admitting that I am not someone who regularly teaches bar prep and I know that what I am saying may come from a place of relative ignorance on many issues. I am sure I have missed some important nuances here-and for that, I apologize in advance.*

Recently, we got the news that my youngest child has passed all his required MCAS exams for high school graduation (MCAS is the Massachusetts Child Abuse System according to my kids, but really the Massachusetts Comprehensive Assessment System). These are the standardized tests that students in public schools start taking in third grade and keep taking until they pass the required high-school-level exams for graduation. The elementary-grade exams do not have any impact on grades in classes or promotion between grades; they may indicate a need for other school-based interventions or testing, but that is it. I’ve never let any of my children even see the reports that are mailed to parents.

These yearly exams are meaningless…until they have ultimate, high-stakes meaning. Students cannot (without jumping through some significant, fiery hoops) graduate from high school without passing the English, Math, and Science exams. Some parents complain that “teaching to the test” ruins learning for their children, which is a valid point. Some parents worry about the achievement gap between various groups of children (mainly along racial and socio-economic lines), which is a complete and unavoidable truth. If a test cannot be administered fairly, what is it assessing at all? And why would we attach such significance to an instrument that is irremediable?

And so, we arrive at the current iteration of the Bar Exam. At times, it seems to test a student’s ability to take the exam more than it assesses the knowledge of the concepts, theories, and skills it purports to measure. The same criticisms that are true of the MCAS are relevant here. We should not teach to a test; we should be teaching for learning. The achievement gap has not been bridged despite being widely acknowledged. And yet, the Bar is the key that opens the gate to many careers in law. With COVID and remote bar exam issues (technical, physical, and psychological), can we really say that it is an accurate instrument of assessment for practice readiness?

Has it ever been?

My thought for this Monday morning is this: since we all know people (not students, but peers) who have passed the Bar and were not ready for primetime, and we all know people who did not pass but were born ready to practice law, how is passing the Bar a guarantee of anything? Think about it: (just about) every person who has ever been disbarred must have passed the Bar. So why not just pull the goalie here? What are we protecting when not every shot on goal goes in, even when no one is there? The fact that law school accreditation is in some part contingent on bar pass rate shows, at best, a lack of creativity in assessment. At worst, it shows that we do not really wish to welcome all the qualified potential members into the profession. We can do better.

(Liz Stillman)

November 1, 2021 in Bar Exam Issues, Bar Exam Preparation, Bar Exams, Diversity Issues | Permalink | Comments (0)

Monday, October 25, 2021

Gestalt

I was a social psychology major as an undergraduate and I remember studying the psychological theory of gestalt, which is defined as “something that is made of many parts and yet is somehow more than or different from the combination of its parts.”[1] Basically, if I had known about outlining back in those days, I would have written the rule as: the whole is greater than the sum of its parts. As bar exam results trickle in from parts near and far, I think it worth revisiting this idea with both students and colleagues.

To students who have passed the bar, I would say, “Wonderful! Remember, there is more to you than this one credential. As an attorney, you will bring your whole self to the table and that will always be more than the sum of your parts.” To the students who have not passed the bar this time, I might say the same thing. I do not want to be dismissive of how meaningful this one credential is for them after a three- (or four-) year journey that has already been fraught with confidence-crushing moments. I don’t want to toss out, “oh well, maybe next time” either, because right now I think these students may see “next time” as a craggy mountain to climb without any safety gear in truly inclement weather. I also know that social media means that students will know about their classmates’ successes almost immediately and that silence will be interpreted as failure. Literally. There really is no good answer other than “I’m sorry. How are you doing?”

I also worry about my colleagues who have poured every ounce of what they have into students to help them pass the bar (regardless of whether the students were willing vessels or not) and now have someone else’s success or failure be determinative of their worth. Is this how we value professionals?

When a football team loses a game, media outlets tend to blame everyone on the team, not just the quarterback or coaches, but the team as a whole: offense, defense, big guys, little guys. Even when one player makes an egregious error, the sportscasters tend to find additional reasons for the loss; even the weather or altitude can be roped in. When the team wins, the press is similarly wide in praise, as seen in today’s Boston Globe after the Patriots won a home game yesterday: “[e]veryone went home happy Sunday. Mac Jones got his first 300-yard game and hit a 46-yard deep ball. Damien Harris rushed for 100 yards. Eleven players made a catch, and five different players got in the end zone. The defense created two interceptions…Smiles all around.”[2] And remember, these guys probably each get paid more than all the ASP folks at a regional conference combined.

So, when bar results are good, ASP folks are part of the overall winning team with smiles all around. But when bar results are not what we are hoping for, why don’t our ASP colleagues get the same level of camaraderie? Why aren’t we always a team at that moment as well? ASP folks, and particularly those who work exclusively on bar support, need to be given the grace of gestalt. So I say to you, regardless of the bar results at your school, you are more than the sum of your parts. As an ASP professional, you bring your whole self to the table and you are mighty.

Judging someone’s competency or job security based on the performance of other people at a task that is not entirely knowable is something that is far above our pay grade.

(Liz Stillman)

 

[1] https://www.merriam-webster.com/dictionary/gestalt

[2] https://www.bostonglobe.com/2021/10/24/sports/who-cares-that-its-jets-patriots-needed-this-ego-boost/

October 25, 2021 in Bar Exam Issues, Bar Exam Preparation, Bar Exams, Encouragement & Inspiration, Professionalism | Permalink | Comments (0)

Wednesday, October 13, 2021

Bar Exam Pass Rates and Academic Support

Maya Angelou wrote “we are more alike, my friends, than we are unalike.”  One of my favorite songs right now is Bleed the Same by Mandisa where she conveys a similar message.  I believe the message from both of them would apply to the current discussion surrounding factors impacting bar passage rates. 

Most of you are aware Rory Bahadur wrote a series of articles examining the relationship between certain factors and bar passage rates.  He specifically questions whether FIU’s emergence as the leader in Florida’s bar pass rate is significantly impacted by factors such as involuntary attrition, incoming transfers, and incoming credentials.  An oversimplification of his conclusion is that these factors have a major impact on Florida’s bar pass rankings.  His 3 articles are on SSRN here:

  1. Blinded by Science? A reexamination of the Bar Ninja and Silver Bullet Bar Program Cryptics
  2. Reexamining Relative Bar Performance as a Function of Non-Linearity, Heteroscedasticity, and a New Independent Variable
  3. Quantifying the Impact of Matriculant Credentials & Academic Attrition Rates on Bar Exam Success at Individual Schools

FIU’s academic support team, which includes one of our editors Louis Schulze, responded last weekend in a series of blog posts.  You can read the posts here:

  1. Does Academic Support Matter? A Brief, Preliminary Response to Blinded by Science and its Progeny
  2. Does Academic Support Matter? A Brief, Preliminary Response to Blinded by Science and its Progeny, Part 2

Louis’ response questions the statistical methods used in the previous articles and posits that FIU’s new Academic Support program had a statistically significant effect on bar passage rates. Rory responded to the posts with a message on the ASP listserv/Google group. You should be able to access his message within that group.

Rory and Louis are engaged in a relevant and important discussion for ASP. I encourage everyone to read the articles and posts. AccessLex also published a brief post addressing this topic and one of Rory’s articles. The AccessLex authors state they are conducting a couple of projects that will provide even more insight.

The academic debate surrounding this topic is necessary, but we should also recognize the reason why the debate is important and sometimes personal.  While they disagree, both Rory and Louis are passionate about helping ASPers and students.  They both cite the lack of tenure for ASPers as a major concern.  They both argue for more resources for Academic Support.  Knowing them both, I truly believe they are trying to do what is best for both ASP and students.

As long as we are trying to figure out what helps students succeed, I do want this discussion to continue in an academic manner.  One of my major concerns is when schools/Deans evaluate whether ASPers are effective based primarily on bar pass rates.  Bar pass rates are an easy number to stamp on a department, almost treating bar pass numbers as wins and losses.  Media and other entities fuel that perception with articles about who had the highest bar pass rate in the state.  FIU’s success has brought national attention from the ABA journal and other legal news sources.  Deans around the country, especially ones in Florida, do specifically ask, “why isn’t [insert school] having the success of FIU?  Are our people doing their job correctly?”  Those outside ASP want to know, what is the secret sauce?

I also want the discussion to continue to demonstrate the impact ASP has on students. Both Louis/Raul and Rory have presented at regional and national ASP conferences about best practices in teaching. Many of us agree that law school education and pedagogy need improving. Most of us agree that better teaching would improve student learning and that we should use scientifically proven methods to teach students. We would also agree that improved student learning should have an impact on student success and bar performance. I want to know what everyone else does, including Louis and Raul, to improve student performance. I especially want to read studies that quantify the impact of Academic Support and/or specific Academic Support programs. Anecdotally, we know we have an impact on individual lives. That impact matters and should be measurable.

Promoting ASP is important to the majority of us.  We need ongoing projects to measure what works and how we can all improve our students’ chances to pass the bar exam.  I know we are all striving to promote each other and help students.   I hope we can continue to do that.

(Steven Foster)

October 13, 2021 in Academic Support Spotlight, Bar Exam Issues, Program Evaluation | Permalink | Comments (0)

Sunday, September 19, 2021

This Year's Explanation?

Another year, another set of results to explain away. The NCBE released the national MBE mean last week, and the change over the last two years is massive. The NCBE ignored the 2020 results and compared only to 2019. The "changed test" and "different sample size" are easy explanations. Those explanations also ignore a nearly 6-point drop from last year's July MBE score. Excuses abound, but if the test is reliable and easy to scale (or they could have used the same test as in 2019), the 6-point drop is inexcusable. 2020 graduates dealt with the immediate impact of a pandemic and social unrest. My rudimentary understanding of the LSAC reports indicates that 2020's graduates had worse LSAT scores than those of 2021 and 2019. Why the anomaly in scores?

I will humor the NCBE for my next query. If 2020 is an outlier and we ignore those statistics, the 2019 comparison also doesn't make sense. 2021 results were 0.7 points lower than in 2019. I may be wrong, but I believe the 2021 graduates had much larger numbers of high-scoring LSAT takers in the pool. They should have been closer to 2020 scores than to 2019. If the LSAT correlates to MBE scores (which both the LSAC and NCBE claim it does), then why did the 2021 national MBE mean drop? "Less able" test takers is no longer an acceptable answer.

I have a number of theories, and the real answer probably includes numerous factors.  Most of my theories revolve around the fundamental thought that the MBE tests more than the ability to practice law. 

2021 graduates endured longer COVID-19 interruptions. They may have been tired of Zoom and online education, and bar prep is primarily online. The online fatigue may have led to less work.

Students may have worked at law firms more last summer.  This thought comes from anecdotal conversations, but some jobs decreased in 2020.  When those jobs came back, students did what they could to keep jobs during uncertain times.  That may have included working at firms more and on bar prep less.  They could also make this choice because working is more fun than bar prep, and they didn't get to work as much in the previous year.  Students may have also been in harder financial times from not working the previous year.  

I could give many reasons, but overall, I believe students did less bar prep work last summer.  However, should the MBE really be a test of commitment over 10 weeks?  Why should students be required to devote that amount of time to a test regarding the jurisdiction of nowhere?  RAP, the rule of sevens, common law burglary, and many other rules provide an obstacle to practice.  The test seems to assess someone's ability to financially and emotionally devote extreme time to a task for 10 weeks.  Is that really what we should assess to become an attorney?  The NCBE's creation of a testing task force implicitly confirms the MBE is a poor instrument, but they continue to administer it.  Shouldn't we stop using a poor instrument even if the alternative isn't ready?  Many questions, but my guess is all we will hear is *crickets*.

(Steven Foster)

 

September 19, 2021 in Bar Exam Issues | Permalink | Comments (0)

Thursday, September 16, 2021

Ominous, says a Bar Exam Article

Hat tip to Professor Chris Newman (University of Idaho School of Law):

The NCBE has released information about the mean MBE score on this summer's July 2021 bar exam (140.4), and it is down significantly from the July 2020 mean MBE score (146.1) and down a bit from the previous national cohort taking the July 2019 bar exam (141.1). Sloan, K., "Ominous early signs emerge for July 2021 bar exam pass rates," Reuters (Sep. 15, 2021).

For a closer look, the NCBE posted a graph depicting the mean MBE scores for the past several years:

NCBE graph: national mean MBE scores for recent July administrations

https://www.ncbex.org/news/national-means-july-mbe-august-mpre/

Because it is likely that many bar takers either just barely fail or just barely pass, small differences in scores can result in dramatic differences in pass rates; Reuters reports that of the nine states that had released July 2021 bar exam results, only one had an increase in bar passage. Reuters. The article suggests, quoting in part Professor Derek Muller, that widespread technical difficulties, pandemic fatigue, and perhaps a loss of learning effectiveness with the significant transition to online learning may be contributors. Reuters.

One fact stands out to me. Small changes in mean MBE scores ought not to be indicative of attorney competency issues because, to be repetitive, they are only small differences. But because most bar exam cut scores center around the mean MBE score (and because most bar exams scale the written scores to the MBE scores), small differences can lead to big impacts.
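To see the mechanism in numbers, here is a minimal sketch that models scaled scores as normally distributed. The standard deviation of 15 points and the cut score of 135 are assumptions chosen only for illustration, and the normal model itself is a simplification; the two means are the July 2019 and July 2021 national figures noted above.

    # Illustration only: normal model with an assumed SD of 15 and cut score of 135.
    from scipy.stats import norm

    def modeled_pass_rate(mean_score, cut_score=135.0, sd=15.0):
        """Share of takers at or above the cut score under a normal model."""
        return norm.sf(cut_score, loc=mean_score, scale=sd)

    for mean in (141.1, 140.4):  # July 2019 and July 2021 national MBE means
        print(f"mean {mean}: modeled pass rate {modeled_pass_rate(mean):.1%}")

Under those assumptions, a 0.7-point drop in the mean moves the modeled pass rate by roughly two percentage points, all of it concentrated among takers sitting near the cut score.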

September 16, 2021 in Bar Exam Issues | Permalink | Comments (0)

Tuesday, September 7, 2021

Academic and Bar Support Scholarship Spotlight

A plethora of recent scholarship to report:

1.  B. Templin (Thomas Jefferson), Integrating Spaced Repetition and Required Metacognitive Self-Assessment in a Contracts Course (2021).

From the abstract:

This article provides an example for doctrinal law professors to integrate metacognitive exercises into their courses in order to increase student retention and understanding of the material as well as improve exam test-taking skills. Teaching metacognition is traditionally the domain of law school ASP departments. However, when ASP methods are supplemented with required exercises in a doctrinal course, student performance can improve measurably.

2.  S. George (Suffolk Law), The Law Student's Guide to Doing Well and Being Well (Carolina Academic Press, 2021). 

From the abstract:

The ABA and most state bar associations have identified a wellness crisis in the legal profession, and called for educating students on how to better cope with the challenges of law school and practice. At the same time, students must learn how to maximize their brain health so that they perform well in law school and on behalf of their clients in practice. The same way musicians would tune their instruments, or chefs would sharpen their knives, law students must sharpen their minds. This book aims to help students “do well” in their ability to learn, and “be well” in the process, by exploring the deep connection between brain health and wellness.

3.  A. Soled (Rutgers) & B. Hoffman (Rutgers), Building Bridges: How Law Schools Can Better Prepare Students from Historically Underserved Communities to Excel in Law School, 69 J. Legal Educ. 268 (2020).

From the introduction:

This article discusses the needs of law students whose circumstances—including but not limited to economic status, race, nationality, sexual orientation, gender identity, and/or educational background—disadvantage them in relation to their classmates whose privileged environment better prepared them for law school. This article first discusses factors that affect academic performance at law school. Second, it illustrates prelaw school and law school programs that target the needs of students from historically underserved communities. Finally, this article proposes ways law school faculty and administration can help these students succeed in law school and in their careers.

4.  K. Testy (Washington), Advancing an Evidence-Based Approach to Improving Legal Education, 69 J. Legal Educ. 561 (2020). 

From the article:

Student-centeredness should not be a remarkable idea for legal education. Yet, some educators resist student-centeredness on the grounds that such an approach sounds too much like “the customer is always king.” Under this line of thought, faculty members instead see their role as the expert with the duty of deciding what the student needs. As one of my faculty colleagues once explained to me, “Dean, you pay me to mold them, not to listen to them.”

In my experience, however, students usually do know what they need; we can learn a great deal by listening.

(Louis Schulze, FIU Law)

September 7, 2021 in Bar Exam Issues, Bar Exam Preparation, Diversity Issues, Publishing | Permalink | Comments (0)

Thursday, August 26, 2021

Two ASP Professors cited in recent ABA Journal Article

Two ASP Professors Marsha Griggs (Washburn Law) and Melissa Hale (Loyola University Chicago School of Law) are cited in an American Bar Association article detailing technical difficulties experienced by some remote bar exam takers with the July 2021 bar exam.  

In my opinion, these sorts of problems demonstrate that, for far too long, regulators and courts have been too reluctant, insular, and wedded to a one-size-fits-all approach as the only method to determine whether law school graduates are competent to practice law. It's like trying to fly an airplane regardless of the storm clouds and turbulence ahead. Our future graduates and our future communities deserve better.

For the article, please see the following link: https://www.abajournal.com/web/article/technical-problems-again-plague-remote-bar-examinees-who-blame-software-provider.

(Scott Johns)

August 26, 2021 in Advice, Bar Exam Issues | Permalink | Comments (0)

Wednesday, July 28, 2021

For Immediate Release - Re: Technical Issues for Remote Bar Exam (Statement from the Association of Academic Support Educators)

Friday, June 25, 2021

High Water Mark for UBE?

Hat tip to Greg Bordelon for sending out the link to New York's report on the UBE.  You can find the report here: 

https://nysba.org/app/uploads/2021/06/9.-Task-Force-on-the-New-York-Bar-Examination-with-staff-memo.pdf

I found this paragraph interesting:

The Task Force reaffirms its central recommendation that applicants for admission to practice law in New York be required to demonstrate basic knowledge of New York law. It remains our view that, if passage of a Bar Examination is either the exclusive, or an alternative, pathway to practice in New York, that examination should include a rigorous test on matters of New York law. We strongly believe that persons seeking admission to practice law in New York must be required to demonstrate that they are able to do so competently. Given the unique complexities of the New York legal landscape, including an elaborate court structure, a complicated civil practice code, and distinctive rules governing evidence, family law, and trusts and estates, among a myriad of legal principles unique to New York, it is not enough that an applicant show competence solely with reference to the “law of nowhere.”

 

This paragraph is the reason I erroneously believed New York would never adopt the UBE in the first place. Many argue that New York's adoption is what led many other states to follow, including Texas, and Texas's adoption is what led to Oklahoma, and so on. If this is a concern, New York probably shouldn't have adopted the UBE.

The other interesting conclusion from the report is the worry that the new bar exam, coming in about five years, will be even worse than the UBE. I will admit to being skeptical of the entire bar task force process, but I was actually impressed with a few of the choices on the drafting committee (I am sure I would be impressed with others; I just didn't know everyone). The new bar exam may be the best measure of minimum competence the legal profession has seen, and we may also see states flee from it. The next 4-5 years will be interesting.

(Steven Foster)

June 25, 2021 in Bar Exam Issues | Permalink | Comments (0)