Tuesday, April 26, 2022
Law schools have not yet fulfilled the Carnegie Report’s call for more formative assessment. One reason for falling short is conflicting narratives about what is “good” formative assessment. One specific narrative seems particularly troublesome: That the only legitimate method for providing formative assessment is for the instructor to sit down with each student and explain their errors. This post pushes back on that narrative.
- Self-critique is more effective than we appreciate.
Most would agree that individualized feedback from an instructor, the expert both on the subject and on the way the student will be graded, is most effective. But allowing students to compare their work against a model answer or a quality student answer not only assists with doctrinal comprehension, legal writing, and exam skills; it also builds metacognitive abilities. Having the capacity to determine one’s own weaknesses is crucially important, as demonstrated by countless studies showing the performance-enhancing effect of improving metacognition.
The assumption that all feedback must come from the instructor certainly undercuts the mission to improve students’ metacognition. When students find themselves professorless during bar study, they will scramble around helplessly if they have absorbed the legal-education fable that only the professor knows best. Moreover, new lawyers will struggle in the early years of practice if they need to run to the partner, division chief, or client to do the metacognitive work for them.
Although some students certainly will do a poor job of self-critiquing (“I mentioned res ipsa loquitur just like the model did! I should get full points!”), this is no reason to underappreciate self-critique. First, in my experience, most students DO figure out their weaknesses from this process. While before the self-critique process they think their C-minus should be an A-minus, seeing the student essay that booked the course tends to leave them thinking otherwise. Second, even if the student still does not see the problems, this is where academic support faculty come in. In partnering with doctrinal faculty, academic support faculty can meet with underperforming students and comment not on the law but on the student’s metacognition. This method distributes personnel resources in a way that makes robust feedback more possible, fosters metacognition, demonstrates to students the valuable connection between doctrinal and academic support faculty, and frees up time for doctrinal faculty.
- Calcifying the status quo.
A particular danger with the solely instructor-based feedback narrative is that it preserves the status quo. We all know that formative assessment is lacking in legal education. The principal argument against remedying that problem is that individualized feedback is so time-consuming that one can accomplish little else. When those inclined to pursue efficient formative assessment are then met with the chorus of voices claiming that self-provided feedback is inadequate, they throw the baby out with the bathwater, dismiss formative assessment, and turn back to the same one-final-exam process used since 1877. Therefore, creating this strict dichotomy between individualized feedback and self-provided feedback makes the perfect the enemy of the very good and leaves students with nothing instead of at least something.
None of this is meant to say that instructor-led feedback is unnecessary or inferior. Feedback from course instructors is crucial. But when that type of formative assessment is not feasible, self-critique is a solid option.
There is a lot more to discuss on this subject. Unlike almost all other graduate programs, why do we think that TAs providing feedback is an unspeakable heresy? Why do we almost never use summative assessment as formative assessment by improving the process of post-semester self-critique? (FYI, simply letting students see their exam answers does not accomplish this goal.) Why do we see testing only as formative and summative assessment but not as a learning tool in-and-of itself?
Unfortunately, the time constraints on writing about the time constraints of formative assessments are such that I have to stop typing now. Ironic.
Louis Schulze, FIU Law
 Self-critique without a model answer is possible, too, but I concede that having a model answer is preferable. To those who would avoid providing such an answer because doing so would take time to write or risk being imprecise, I would argue that a simple solution is to release the strongest student answer.
 Metacognition is the process of assessing one’s knowledge: Do I really know the felony murder rule, or do I just think I know it? As I tell my students, it is like hovering over one’s knowledge and objectively scrutinizing one’s real comprehension.
 See generally J.A. Gundlach & J. Santangelo, Teaching and Assessing Metacognition in Law School, 69 J. LEGAL EDUC. 156 (2019) (reporting on empirical study of first-year law students, finding that students who demonstrated strong metacognitive skills were more likely to perform well).
Monday, February 7, 2022
I have spent the last few weeks working with students who did not perform as well as they (or their doctrinal professors) thought they should on exams. While information on what went wrong on multiple choice exams is scarce (other than not choosing the correct answer), determining where things went awry on essay questions is diagnostic gold. So, like I am sure we all do, I tell students to go talk to the professors about their exams. This isn’t a novel idea to the students, but often the reaction I get is abject fear. Here are some things you can tell students about the necessary post-mortem conversation:
- The Professor isn’t going to lower your grade. There is a lot of paperwork and sometimes even a faculty vote involved, so it is extremely difficult to change a grade in any direction. You are more likely to have the U.S. Supreme Court grant certiorari on an appeal than for a professor to change your grade based on this conversation.
- The Professor is very, very unlikely to raise your grade unless there is a clear mathematical error. See above (less paperwork and no votes when it is math).
- There may be some fiery hoops to jump through to get to this conversation. There are no three-headed dogs guarding the gates, but you may need to pick up your exam, check the class-wide comment sheets, check your individual grading rubric, and read any posted sample answers before you even attempt to sign up for an appointment.
- Jump through the aforementioned hoops and then make the appointment.
- Keep the appointment. Trust me, there is someone who could have used this appointment and by making it and monopolizing this time, you have an obligation (unless it is an emergency) to not waste it.
- Prepare concrete questions for the appointment-not “what did I do wrong?” but more “I see the rubric indicated that you were looking for a different restatement section here; can you explain why that one was more applicable than the one I mentioned?” Write these questions down because you will get flustered. Also, write down the answers because your relief that this meeting is over will erase the answers from your memory.
- This is going to be hard. There is no way (and no need) to sugarcoat it. Facing the reality that you did not do well is really hard and hearing the details is even harder.
- It is worthwhile-because the information from this meeting will form the basis of the plan that we will work on together to prevent the issues from reoccurring. The emphasis is on “we,” because you are not alone in this journey. ASP is with you.
- It is also worth noting that while the Professor is very unlikely to yell at you, belittle you, mock you, or tell you to leave law school altogether, you may see it that way in the heat of the meeting. Try to distinguish your inner voice from the comments and critique they are giving you. If you have done this and realize that the call is not coming from inside the house, please come tell me and we will make a plan to address it.
- Someday you will laugh about the fear you had going into this meeting. I recognize that today is not that day.
So, as Irving Berlin said in 1936, “Before the fiddlers have fled; Before they ask us to pay the bill; And while we still have the chance, Let’s face the music and dance.”
Monday, January 31, 2022
I would imagine that almost all of us in the ASP world see students who are in academic difficulty: those who are on warning, probation, and even double secret probation now and then. I know I work with students in classroom settings and one on one who have been told they must seek my help to stay in school. Some students are right on the (wrong) edge of the standards that would exclude them from these graduation conditions, and they are, usually, unhappy about the circumstances. And they are right-because some of the requirements and limitations may only serve to dig them deeper into the hole they barely fell into in the first place.
One example of this is a list of required classes that students must take depending on their GPA. The classes are carefully curated to correlate to bar passage. Yet they also tend to be bigger upper-level classes (at least 40+ students), so chances are that there is a mandatory grading curve applied to these classes. Sometimes the grading curve (required by many schools, especially in the 1L year) may be the reason the student is in this predicament in the first place. And thus, students who might have easily dug themselves out of academic difficulty in their 2L year by being able to choose classes that are better suited to their interests and strengths find themselves further entrenched. To make matters worse, these classes also tend to have one summative assessment to earn that curved grade. Sometimes the issues students face are far more exam related than comprehension related.
These same students are also often locked out of, or put at the bottom of the list for, clinics and other programs that give them experience (needed for graduation) and confidence (also needed for graduation). This is exactly the kind of class experience that students who struggle with exams need. This is where they could shine, if they could just reach the light switch.
A student who is currently occupying this space met with me last week and told me that she felt, particularly in light of the pandemic and the chaotic atmosphere of her first year, that she was being kicked while she was down. Even more disheartening, she felt that she was still being kicked while on her way back up. It reminded me of the song Dirty Laundry (Don Henley-and if you also remember this song, we are both officially pretty old). The chorus of this song, “kick ‘em when they’re up, kick ‘em when they’re down, kick ‘em all around,” is what came to mind in that moment. Considering the NextGen bar exam that incoming classes (next year’s incoming evening students at my school, for example) will be taking, perhaps we need to rethink how we handle students when they’re down. The new edition of the bar exam will emphasize competencies over memorization. While we will still all encounter students who may not be up to the task, there are many students clinging to the edge of the cliff who are absolutely capable of finding solid ground-given the chance. Let’s throw them a rope. I don’t want students to think that we “love it when people lose.”
 Ok--that is not a real thing, but I thought it was clever to slip in a reference to the movie Animal House and see if anyone noticed. Of course, explaining the reference in a footnote kind of defeats the humor….
Monday, January 10, 2022
When I was in high school, and college and law school, I would tell my parents when I was nervous about exams like SATs, midterms, finals... And they would always answer, “you’ll be fine.” I’m not complaining about the faith they had in me, but even after I explained the reason for my extra concern, the answer remained the same. I was dismissed. It didn’t help me feel better in any way and certainly didn’t help me prepare for what was ahead.
Grades were released last week at my law school, and it has been…a lot. I can hear a lot of you nodding in agreement right now. In between extremely interesting AALS sessions, I spent hours speaking with students towards the end of the week. And like most of you, I met with students at all positions on the grade spectrum, from “I don’t know how I got an A-” to “Am I going to get dismissed?”
Our list-serv has also been full of amazing emails and messages we send students to get them through this time-all starting with the basic idea that “your grades do not define you.” I wholeheartedly agree that students are more than their grades and that their grades do not define them. Collectively, in the next few weeks, we will help students make study plans, assure them that they have more exam experience going forward, and remind them that we are here to help. We will give advice to talk to professors about exam performance, diagnose the issues or types of questions that plagued their exam, and offer practice materials. We will take action.
Yet, there is an elephant in the room: how can I tell students that they are not their grades while failing to acknowledge the reality that, until they have some legal work experience, they may in fact be defined by their grades? I am telling them to transcend the grades at the same time I am helping them make plans to get better ones. They know, and they know that I know, that potential employers do care about class rank, even if I don’t agree with that as a bright-line rule for granting interviews (and trust me, “don’t agree” is an extremely diluted way to express how I feel about that). I worry that I am being dismissive if I say it shouldn’t matter-or even worse-misleading some students to blame circumstances (or people) they cannot control for the grades they received. I absolutely know that some students are laid low by circumstances outside of their control (I had a student whose house burned down last year), but frequently students need to own (or adversely possess) the bad grades to make positive changes.
I think some of the hardest work I have done these past few days (and I assure you, my dance card is full today as well) is speaking to students who need to plead their case to a committee to be allowed to stay in law school (after one seemingly catastrophic semester). There is, per our academic rules, a presumption of dismissal (albeit rebuttable). We advise our students to share all the distractions, traumas, and circumstances that led to this situation. No doubt, this pandemic will be the underlying cause of trauma and academic distress long after we box up our masks and hope they get moldy in the basement from non-use. More importantly, students need to tell the committee about the plans they have made to deal with these issues. I remind them to tell the committee that they are taking control over what is in their power to control and talk about their plans to ask for help when what is uncontrollable becomes too much. I assure them that asking for, and receiving, help is a sign of maturity and resilience-not weakness. And we should not forget that the next time these students take an exam, they will have an extra layer of stress added because they need to do better and are still frightened by how things went last time.
I will definitely tell students that things are going to be okay (and more often than not, they will be)-but it cannot be the only thing I tell them. I know students need to hear those words in my voice, but I also need to be certain that they will benefit from hearing it more than I will.
 Thank you to Melissa Hale, Susan Landrum, and Kirsha Trychta!
 I don’t even agree with ranking them, but that will be another post.
Monday, December 6, 2021
Last night was the eighth and final night of Hanukkah (or Chanukkah, or even Hannukah). This year we had two different types of candles for our two menorahs. We had one box of artisanal long and graceful white to blue ombre candles. We also had a standard 99¢ little blue box of shorter, more colorful candles from the supermarket (or maybe a leftover box that one of our three kids brought home from Sunday school). We lit both menorahs each night: one with the pretty candles and one with the garish little blue box candles. The pretty candles burned and melted. The plain candles did as well. The bottom line was this: it was meaningful regardless of which candles we used.
Here comes the (possibly heavy-handed) link to law school exams. If students have an exam answer where they spotted the issues, used the correct rule, did both sides of analysis, and weighed the options before concluding, then it is meaningful even if it isn’t graceful (or long). There are all sorts of other holiday analogies I could make here…like remember to go one at a time when lighting your candles; remember that you need to light the helper candle first (that being the student’s knowledge and wellbeing); do not re-spin your answer to multiple choice questions; and, of course, the miracle of being asked eight multiple choice questions about one thing you know really, really well. Surely, miracles and light are what many students are asking for this time of year.
It is also important to remember, though, that like any ritual, exams have their traditions and practices. We should be sure to remind students that after each exam, they should scrape off the remnants of the last one and reload with one more point of light before moving on to the next one. Make this a tradition. Lamenting over what went wrong on the last exam is always going to create a barrier to going forward-and moving on to the next exam is part of the ritual. Remembering what went well (this year, none of our cats lit themselves on fire!) will be more productive. Make this a practice. After all, you cannot light fewer candles as Hanukkah progresses because you cannot travel through time (yet).
Finally, when exams are all over, students should be sure to clean up before putting their exam self away. No one wants to deal with a December mess in May. And for what it is worth, the fancy candles were a bear to clean up.
Happy Holidays to all!
Tuesday, August 10, 2021
Over the years, I have noticed that many legal educators and students have an imperfect understanding of the utility of using prior exams for practice. This misunderstanding usually holds that the purpose of such materials is for students to review the exams simply to see what topics professors test and methods with which they do so. In turn, faculty become leery of providing such materials, as doing so might create an unwarranted expectation on students’ part that their exam will test the same topics and use the same methods.
This impression is problematic. Both students and faculty are squandering the opportunity for students to use materials that will make them better learners, improve their performance in law school and the bar exam, and increase their knowledge and skills (both in classes and on the bar exam).
An important recent (methodologically sophisticated) study supports this claim. In Understanding the Metacognitive “Space” and its Implications for Law Students’ Learning, Professors Jennifer Gundlach and Jessica Santangelo found statistically significant evidence that: “Students who reported using active strategies at the end of the semester were more likely to succeed in the class … relative to students who never used active strategies.”
Faculty should better understand the use of prior exams and other materials that allow students to practice rather than re-read over and over again. Although many law professors relied on re-reading and re-reviewing prior exams in their own studies, their success quite possibly came despite, and not because of, those flawed methods. Faculty tend to have had elite educations, elite aptitude, and the socio-economic opportunities that favor academic success. They thus had a great degree of wiggle room in terms of the efficacy of their learning methods.
Many of our students are not so lucky. If we admit students with fewer socio-economic opportunities and with non-elite academic credentials, we should not erect further obstacles to their success by assuming that the methods we used in very different circumstances will be effective for them.
Especially given the recent findings quoted above, we should not rely on the anecdote fallacy (that because one person had success with a method all will have such success) and the post hoc ergo propter hoc fallacy (that because certain study methods preceded success, those methods must have caused that success). Instead, we should rely on the empirical evidence that shows that active learning, including taking practice exams, fosters success more optimally.
(Louis Schulze, FIU Law)
Thursday, February 25, 2021
This week the Association for Academic Support Educators ("AASE") published Best Practices for Online Bar Exam Administration. AASE President, DeShun Harris, says that the best practices advocate for "procedures that ensure a fairer test for online test takers." The organization, established in 2014, urges state high courts and bar examiners to adopt these procedures. The AASE Bar Advocacy Chair, Marsha Griggs, says "many of the best practices that we identified are things that bar examiners are already doing." Yolonda Sewell, Vice President for Diversity, adds: "In addition to the great strides that bar examiners have made in deploying an online exam, we seek to make sure that the online administration does not unfairly disadvantage any bar applicant on the basis of skin tone, race, gender orientation, biophysical conditions, disability, need for test accommodations, or socio-economic resources. The Best Practices are aimed to level the playing field, both among applicants of varied backgrounds and between the online and in-person versions of the exam."
One of several effects of the COVID-19 pandemic was that bar examiners and bar applicants questioned the wisdom and feasibility of administering in-person exams in the traditional large-group format. In response to COVID-19 limitations, the first online bar examinations in the United States were administered between July and October 2020.
With but a few exceptions, the online exams were remotely proctored using artificial intelligence technology provided by a commercial vendor. As the exam dates approached, many issues surfaced surrounding the use of facial recognition software and remote proctoring. One prominent issue was the number of complaints voiced by students who are people of color, asserting that the software did not recognize them. During and after the exam, other complaints sounded, ranging from data breaches and poor technical support to "flagging" hundreds or thousands of applicants for alleged cheating or "testing irregularities." At the extreme, some applicants reported having to sit in their own waste—as the exam instructions warned applicants about being out of view of the camera except during scheduled breaks—for fear of failing the exam. Additionally, there were reported issues with the technical delivery, submission, and scoring of the Multistate Performance Test, as well as jurisdictional scoring errors that wrongly identified applicants who earned passing scores as exam failures and falsely notified others who failed the exam that they had passed.
AASE lauds the efforts of bar examiners at the local and national levels for their flexibility and willingness to provide options for remote administration. While we defer to the proven expertise of the test-makers in determining matters related to exam content, scoring, accommodations and character and fitness eligibility, we add our collective expertise in assessment delivery, performance application, and enhancement pedagogies for non-traditional test takers. We recognize that online bar exam delivery will outlive the pandemic and current circumstances. We also believe all who play roles in the process of creating and delivering a bar exam, want the exam to be fair and effective. In light of those dual goals, we think the time is ripe for adoption of additional policies that are more than performative gestures toward a more diverse legal profession.
(Association of Academic Support Educators)
Monday, April 27, 2020
For the times they are a-changin’. -Bob Dylan
The times certainly have changed. Almost overnight, every facet of daily life has transitioned to online delivery. Telehealth and telemedicine are becoming the primary source for doctor-patient interaction during the pandemic. Law school classes are online. College classes are online. K-12 primary education is online. Church and religious services have moved to online formats. My grocery and organic farm-to-table products —gone online. Court hearings, also online. I can buy a car, entirely online. I can have legal documents notarized online.
But I cannot take the bar exam online. At least not yet.
The COVID pandemic has tested our resolve and our ability to utilize available technology. Almost every aspect of the legal profession, from court proceedings and probate administration, to law enforcement and legal education, has mobilized for remote administration. Bar examiners at the state and national levels should hang their heads in shame for not harnessing the available technology to deliver the existing exam remotely. It is an embarrassment of epic proportions that those at the helm of legal licensure are so behind the times that the pipeline for entry to the legal profession could be closed until further notice.
Relentlessly tethered to tradition, those insistent that 2020 law grads take an exam that may not be offered until early 2021 have either dropped the ball or are hiding it. It is fundamentally unfair to require an exam for licensure and at the same time withhold that exam from licensure candidates. The cries for diploma privilege and supervised practice options have sounded around the world. To which bar examiners and high courts have responded with either feigned indifference or a proposed solution that is no more than a band-aid for a gaping wound.
To become attorneys, bar candidates should not have to risk their health or the health of their vulnerable loved ones to the spread of the coronavirus. Even today, there are still more unanswered questions than answers. The majority of U.S. jurisdictions have made no announcement as to whether they will offer an exam in July or not. A number of states have canceled the July exam, but still have not announced definitive information about the date or form of the replacement exam. Candidates across the country remain in the dark as the bar exam becomes an archaic qualifier for competence. If the bar examiners hold fast to the pencil and scantron method of testing, we can expect to see it go the way of the pay phone, the answering machine, and the 8-track tape.
Two states, California and Massachusetts, have alluded to an online exam, but with little detail. It remains unknown what role the National Conference of Bar Examiners (NCBE), who produces the multistate exams used in all states except Louisiana, will play in the online exam. If the NCBE can provide an online exam for two states, why not do so for all UBE jurisdictions? And why make candidates in other states suffer the risk of exposure to COVID or career delays by withholding the online exam?
If the NCBE has not developed an online exam, we must ask “why not” and "where has it been for the last two decades?" And we must not accept “test security concerns” as a viable response. Test security is no less of a concern to law school faculty, and to those who administer admissions exams. Yet all law school exams and the LSAT will be offered online in May 2020. The MPRE (another NCBE exam) and other professional licensing exams are already online.
Whether the bar exam effectively assesses one’s competency to practice law is a recurring question that will continue to resurface. At a time when virtually every state, except maybe Utah and Wisconsin, is under fire for indecisiveness and poor communication regarding the fate of would-be July 2020 bar takers, bar examiners are justifiably under scrutiny. As is the bar exam. The future of the exam is in the examiners’ hands. We’ve only to watch and see if they’ll respond like Blockbuster or Netflix.
(Marsha Griggs© 2020)
Saturday, March 28, 2020
There is so much that goes into the making of a bar exam. There are layers of research, accountability, and quality control involved in the drafting of the questions. There is beta testing of the exam content. There is scoring, rescoring, and equating. And there are levels of exam security that rival Area 51. The parties involved range from statisticians to politicians, who cautiously weigh input from the podium, the bar, and the bench. To top it all off, the job of bar examiner – at least at the state level – is a modestly compensated appointment that is held all the while keeping a day job of managing a law practice, or ruling from the bench. Too little appreciation is shown to our almost volunteer bar examiners in times of rest and high passage rates. So, I sincerely and thankfully shout out bar examiners everywhere who discharge an office of such societal importance. And I use the term bar examiners in the collective to include every role, from essay graders to the character and fitness investigators, from the licensure analysts to the admission administrators and honorable members of the board.
Bar examiners have to operate independently and make decisions about scoring and bar admissions that will be unpopular to some. But the examiners must make decisions, and it is the failure or delay in reaching a particularly important decision that has placed examiners under fire across the country. That decision: what about the July 2020 exam?
It is understandable to the legal and lay public that a law license is a privilege not to be indiscriminately awarded. It is equally clear that security protocols must be in place to maintain the integrity of the exam. What is not understandable is how some examiners can fail to make adjustments in the face of the extreme and dire circumstances of the COVID pandemic. In less than two weeks’ time, the nation’s ABA-accredited law schools went entirely online, trained faculty (many with limited technology skills) for online teaching, and adopted pass-fail grading. There is simply no excuse for bar examiners to not be just as creative and as willing to implement emergency protocols for the prospective July 2020 examinees.
This week 1,000+ students, representing all of New York’s law schools, petitioned the New York State Bar Association’s Task Force on the New York Bar Examination for an emergency diploma privilege. Days later, New York canceled the July exam. Adding ambiguity to injury, the exam has been rescheduled to the fall, but no date is provided to examinees who need to make study, travel, and lodging plans for the two-day exam. Are you kidding us? It’s almost like the examiners are not listening. At all.
A reasonably prudent person will interpret the New York decision as a signal for other states to follow. New York is considered highly influential, as its 2016 adoption of UBE was followed by Illinois, Maryland, North Carolina, Ohio, Tennessee, Texas, and others. The 2020 bar takers are not asking the examiners to give away law licenses without merit. They —joined in large number by law faculty, deans and alumni— are asking for necessary emergency licensing measures. They are asking examiners to think outside of the traditional bar exam box. They are asking that fairness, humanity, and the chance to earn a living be prioritized over security worries. They are asking the examiners to listen.
Excerpted from An Epic Fail, Volume 64 Howard Law Journal _____ (2020)(forthcoming).
Monday, January 13, 2020
The Uniform Bar Examination (“UBE”) has juggernauted from an idea to the primary gateway for entry into the practice of law. To the resounding support of law graduates and law schools, a supermajority of states has abandoned individual state law exams for a uniform exam written by a private entity. The UBE is the exam of the future and I anticipate that at least three more states will have adopted the UBE by year end. The UBE remedies many voiced complaints about varying degrees of exam quality and exam difficulty across states. Perhaps the most touted feature of the UBE is score portability.
UBE takers may "port" or transfer their scores into other UBE states, thus relieving examinees from the arduous chore of having to sit anew for a bar exam. However, the promise of score portability is elusive at best. Transfer procedures vary by state. The fees to transfer one's UBE score may be as high as $1700, possibly more than the cost of taking the bar exam in the transferring state. For the majority of students who exit law school burdened with student loan debt, these transfer costs will make the promise of portability unrealizable.
According to attorney and bar prep professional Ashley Heidemann, “the UBE is not as portable as law students are led to believe.” Heidemann feels that the promise of portability is highly deceptive to law students who believe that a widespread uniform exam means that once licensed, UBE attorneys will be able to transfer into other states at any time. “The biggest misconception students have,” says Heidemann, “is that UBE scores can be transferred to a different UBE jurisdiction at any time. In reality, UBE scores are only good for generally two to five years, meaning one cannot transfer a score from one state to a different UBE state after their specified time period is over.”
Even staunch supporters of the UBE seem to think that the UBE has not yet reached its greatest potential. UNLV Professor Joan Howarth advocates for a uniform cut score, noting that a six-point score differential could effectively exclude hundreds of bar takers from the practice of law. Melissa Hale, Director of Academic Success and Bar Programs at Loyola University Chicago School of Law says, "I'd love to see a more uniform process [regarding admission and transfer policies]." Hale, who sees the UBE as an improvement over predecessor exams and self-identifies as pro-UBE, wants to make sure that students understand the score transfer process and that it is "not without hurdles."
As more and more states adopt the UBE, academic support professionals will need to stay in the know and keep students informed about the true costs and limitations of score portability. That is, until or unless a uniform cut score becomes a reality. Stay tuned, we may be closer than we think!
 Marsha Griggs, Building a Better Bar Exam, 7 TEX. A&M L. REV. 1 (2019).
 Interview with Ashley Heidemann, President, JD ADVISING LLC (Mar. 25, 2019).
 Joan W. Howarth, The Case for a Uniform Cut Score, 42 J. LEGAL PROF. 69, 72 (2017).
Monday, December 2, 2019
Follow (v): To act according to an instruction or precept; to pay close attention to; to treat as a teacher or guide.
While in law school, I never connected with any of my professors on social media. Let's pretend that's not because social media tools were not yet sufficiently developed to allow me to do so. Fast forward to the information age, where I've seen healthy discussions about whether law professors should encourage students to "follow them" on Twitter and other social media. Ultimately, every professor has the right to their own individual preferences; likewise, their students have the freedom to decide whether and how to interact with their professors online.
Many professors are kind enough to freely share words of wisdom regarding exam preparation, and the beauty of Twitter makes these gems available to all. University of North Carolina School of Law Professor O.J. Salinas tweeted some words of wisdom that I wish I had access to as a first-year (or even second-year) law student. Professor Salinas shared:
"Law students (particularly 1Ls): Finals are here. Remember to support your conclusions w/ analysis. Apply the law to the facts of the hypo for every issue you spot. Conclusory answers (conclusions w/out analysis) don’t get you a lot of points (if any). The facts of the hypo are your friends. The facts are there to help nudge you (sometimes quite directly) to your analysis. If you are stuck on the exam and don't know where to go, first take a couple of deep breaths. Then re-read the call of the question. Then revisit the facts. As you revisit each line of the facts, ask yourself: Why is this fact here? Have I applied this fact to any laws that we have covered in class? Does this fact or could this fact relate to something that we have covered in class?
Finally, make it easy for your prof. to read your exam. Aim for clear & concise writing. Short sentences. Paragraph breaks. Headings/subheadings. Walk the reader through your prediction by providing effective/complete legal analysis. And don't presume your reader knows anything. You can do this!"
I have a list of professors that I follow. Many of whom I know only through online interactions. I am grateful to be able to follow their wisdom and shared experiences. I benefit regularly from our exchanges. My daily takeaways include teaching tips, common struggles, and concise study and writing advice for my students. Thanks Professor Salinas for your exam writing wisdom. I remain a follower.
Monday, November 4, 2019
Logically it makes no sense that, in today’s world, failing at something because you tried will tarnish you with a negative social label. . . . [T]o continue evolving, the stigma associated with failure has to be shaken off and be replaced with positive personal development. When you fail at something, hopefully you can recogni[z]e why and where you failed, so that next time you can move forward accordingly. – C. Montcrieff
Bar takers in all but one state have received results from the July 2019 bar exam. Although California examinees may have to wait another week for results, with increased MBE scores reported nationally, bar passage rates (overall) are deliciously higher than recent past exams. What better way to transition to the semester wind down than with news of newly licensed attorneys joining the ranks of your alumni rosters!
I am elated and overjoyed for my students who find their names on the bar pass list. I understand the sacrifice, the grit, the fear, the pressure, the exhaustion, and the anxiety that are necessary conditions precedent to bar passage. I actually get teary-eyed as I scroll through the social media feeds of newly minted attorneys that contain expressions of joy and gratitude for the obstacles they overcame and support they received.
My joy is tempered by the heartache I feel for those who fought so valiantly and fell short of the state cut score. It never ceases to amaze me how a day that brings elation can, at the same time, end in devastation. Those of us doing ASP work must manage that range of emotions altogether in the same day. We collect data and publish articles on interventions that lead to bar success in licensure candidates with known failure indicators. We are experientially trained to manage bad news and to earnestly encourage unsuccessful students to try anew. But how does the reality of our calling square with the purpose of our profession?
We must examine the role and reality of stigma in bar exam failure and determine where, how, and if, it fits into the notion that diversity in the legal profession is not solely about racial and socio-economic inclusion. The diversity promoted by effective academic support programs includes intellectual disparities, physical and emotional disabilities, linguistic variations, and learning differences.
The definition of academic and bar success is changing. Success for some may be sitting through a two-day exam without the testing accommodations relied upon during law school. For others, it can be completing an exam scribed in a language other than the test-taker's native tongue. For many bar takers who graduated in the bottom quartile of their law school classes and/or with low entering LSAT scores, success may be coming within 5-10 points of a passing score that all published statistics said they could not achieve.
I dare not suggest that legal educators dismiss or ignore bar failure, but I challenge the status quo about how we frame bar failure as part of professional identity formation. Moved by the MacCrate Report, law teachers have become more intentional about teaching, and have begun to support law students’ professional identity formation inside and outside of the classroom.1 I see no reason for that support to end with the bar examination. As we normalize struggle,2 we must communicate bar failure as a temporary status and not as an indelible component of one’s professional identity.
1 Susan L. Brooks, Fostering Wholehearted Lawyers: Practical Guidance for Supporting Law Students' Professional Identity Formation, 14 U. ST. THOMAS L.J. 377 (2018).
2 Catherine Martin Christopher, Normalizing Struggle, ___ Arkansas L. Rev. ___ (2019).
Tuesday, September 24, 2019
Last year, one of my international students brought to me a response she had written to a mid-term exam question. She was wholly perplexed, because the professor had given her a low score on this particular response, and yet, even in looking at the notes the professor had written on her paper, she could not fathom where she had gone wrong. Bizarrely, the more the two of us discussed her essay, the more confused I became about why she had written what she had written. Finally, and wholly by accident, I stumbled across the source of the trouble. At one point the exam question referred to someone being "served", and my student had not recognized this usage as being connected with "service of process". The latter term she understood, but she read the off-hand and abbreviated statement that "X was served" as some form of hospitality, not legal action. ("Have some tea!") This was partly because English was her second language, and undoubtedly also partly because she did not grow up watching movies and TV shows in which frumpy anonymous operatives walk up to the protagonists, slap envelopes against their chests, and say, "You've been served!" For much of our discussion, it had not even occurred to me that this could be a source of confusion, and of course there was no way the student could have known it herself.
I thought about this episode last week, when I was attending a conference hosted by the NCBE, in which some of the presenters were discussing the ongoing evolution of MBE and MEE question development. Part of that evolution includes eliminating, or at least minimizing, terms whose meaning is not tied to the practice of law and might not be recognized by all examinees. One example involved a torts question about a car that had been damaged in a collision. In the original question, the defendant was identified as "Union Pacific", and it was apparent that the rest of the question was written on the assumption that examinees would recognize Union Pacific as a company that operates railroads, and therefore that the collision under consideration was between a car and a locomotive. The newer, improved version of the question simply referred to the defendant as "a railroad company", thus providing all examinees the information needed for proper analysis.
Discussion at that point livened up a bit, as presenters and participants brainstormed about other terminology that question writers should consider changing in order to make their questions more accessible. These tended to fall into a few categories:
- References to people, businesses, locations -- generally, things that could be identified with proper nouns -- that might be recognized by some people (but not all people) as possessing some characteristic relevant to the legal analysis. For example, a question that named Gregory Hines as a plaintiff in a case in which his feet were injured might reflect the expectation that examinees would recognize Hines was famously a dancer, and that therefore a foot injury might generate greater damages to him than to an average person. A question that mentions "Reno" might rest on the assumption that everyone knows Reno is in Nevada and gambling is legal there.
- References to technology, fads, or news items from two or more decades ago that most of us who were alive and adult at that time would instantly recognize, but the significance of which might be totally lost on people currently in their 20s. A question that depends on the operation of an answering machine or the effect of a slap bracelet may only be accessible to a portion of the testing population.
- Specialized terms for everyday objects that nevertheless are not commonly used in conversation. A question that depends on knowing the difference between a banister and a balustrade, or between a lintel and a gable, is probably going to lose a portion of the examinees.
It can be hard, when writing exam questions or practice questions, to resist the temptation to make a clever reference or to give examinees the chance for a moment of recognition. But our tests are not supposed to be tests of any vocabulary but legal vocabulary. If an examinee misses the opportunity to demonstrate that he knows the appropriate rule, and can apply it skillfully to relevant facts, because he did not have access to the full meaning of the fact pattern so that he could recognize the issue that leads to that rule, then the examinee has been unfairly denied a chance to shine.
Sunday, July 28, 2019
Social media timelines are aflutter since the California Bar Examiners released, days early, the question order and subjects for the July written exam. After someone “inadvertently transmitted” test information to “a number of deans of law schools,” the CA examiners disclosed the same information to all registered July 2019 California bar takers. The internet remains undefeated and the information now hovers in the public domain accessible to us all for comment and critique. The CaliLeaks, as I refer to them, sent ripples of shock, resentment, and gratitude throughout the community of future, past, and present bar takers.
Dear California Bar Examiners, you did the right thing. You responded to a mistaken disclosure by disseminating the same information to all bar takers, to prevent any actual or perceived unfair advantage. You made a mistake and you owned it. There is a lesson in every mistake and I hope that other bar examiners, and especially the NCBE, with its foot on the jugular of all but a few states, will learn from yours.
In an ideal scenario, the premature and selective leak of confidential information to some law deans would not have occurred. No student should be disadvantaged in terms of familiarity with the exam content, inside knowledge, or the opportunity to pass. We now know the identities and school affiliations of the receiving deans. I am naive enough to believe that respected academic leaders would not compromise the integrity of the bar exam by sharing confidential information about its content. I am also cynical enough to recognize the good reasons of those who question whether bar takers from some schools may have received information days before bar takers from other schools. Notwithstanding the many unanswered questions, California's disclosure (the one to all of its bar takers) is something that could have and should have happened long ago.
For goodness' sake, the bar exam is based, at least in theory, on fundamental legal principles learned in law school. Knowing the general subject area to be tested is not a dead giveaway to the question content. Bar examiners in Texas have provided general subject matter information for decades. It is a preposterous notion that knowing the subjects that will be tested will lead to a flood of unqualified lawyers. Consider the law school final exam as the loosest conceivable model. Law students know to expect Property questions on their Property final exam, but it still leaves them to their own devices to prudently review the full scope of course coverage from possessory estates and future interests, to conveyances, recording acts, and landlord-tenant rules. Disclosure of the tested question areas should not be Monday morning tea; instead, it should be the norm in bar examination. Telling would-be lawyers what they need to know to be deemed competent to practice law isn't a blunder or a gracious act. It is the right thing to do.
I challenge any lawyer, law student, or law professor to imagine the futility and frustration of completing a full semester of required first-year courses, spending weeks preparing for final exams, and then not learning until the beginning day of final exams which courses will be tested and which will not. As unthinkable as this notion may be, this precisely describes the current practice of bar examination in most states and under the UBE. Time will tell if California’s leak leads to a more reasonable exam process and to less arbitrary bar failure rates. If it does, then others should follow suit. We need a better bar exam and California’s error could be an accidental step in the right direction.
Thursday, February 7, 2019
Recently, I heard a discussion suggesting that bar passers do things differently in the final two weeks than those who are not successful on the bar exam. That got me thinking about what I've been seeing, at least anecdotally, in my 10-plus years working with students in preparing for their bar exams.
First, both groups tend to work extraordinarily hard in the last two weeks before their bar exams. So, what's the difference? It must be in the type of work that the two groups are doing. In short, during the final two weeks, it seems to me that bar passers tend to ramp up their practice with lots and lots of MBE questions and essays [while also creating super-short compact homespun study tools (2-3-page outlines, flashcards, or posters)]. In contrast, people who find themselves unsuccessful tend to focus on creating extra-bulky study tools and trying to memorize those study tools with very little continued practice of MBE questions and essays. In brief, one group is continuing to practice for the exam and the other group is focused on memorizing for the exam.
But, here's the rub:
It’s a perfectly natural feeling during the final two weeks of bar prep to want to focus solely (or mostly) on creating perfect study tools and trying to perfectly memorize all the law.
But, according to the educational psychologists, there’s something called “useful forgetfulness.” You see, when we jam-pack our study tools with everything, we aren’t learning much of anything because we haven’t had to make any hard decisions about what to let go (what to “forget”). We’re just typing or handwriting or flowcharting like a scribe. But, when we purposefully decide to make only super-short “starter” study tools (knowing that we can always add more rules as we work through more questions during the next couple of weeks), our decisions about what to put in our super-short study tools (and what to leave out) mean that we actually come to know both what we included and what we left out.
As a suggestion, tackle two subjects per day – one subject that is essay-only and one subject tested on both the essay and the MBE exam. Starting with one subject in the morning, using the most compact outline that your commercial course provides (and referencing the table of contents for each subject), create a super-short study tool with the goal of completing your study tool in 2 hours or less.
Here’s a tip:
If you merely think that you might need a rule, don’t put it in; you can always add it later. Instead, only add a rule that you’ve seen countless times over and over. Just get it done. Move quickly. Don’t get stuck on definitions of elements, etc. Stick with the big-picture umbrella rules. Think BIG picture. For example, be determined to get through all of contracts in 2 hours (from what law governs to remedies). As a suggestion, have just one rule for each item in the table of contents for your commercial bar review outline. Don't go deep sea diving. Stay on the surface. Then, in the remainder of the morning, work with your study tool through a handful of practice essays. In the afternoon, repeat the same tasks using a different subject (creating a snappy study tool and working through a few essays). Finally, in the evening, work through mixed sets of MBE questions.
In the last week before the bar exam, with most of your starter study tools completed, focus on talking through your study tool (for about one hour or so) and then working through lots and lots of essay problems and MBE questions. As you practice in the last week, feel free to add rules that come up in practice essays and MBE questions to your study tool. As I heard one person explain it, your study tool becomes sort of a "bar diary" of your adventurous travels through essays and MBE questions (thanks Prof. Micah Yarbrough!). In short, you've created a study tool that has been time-tested and polished through the hard-knock experiences of working and learning through lots of bar exam hypothetical problems.
So, for those of you taking the February 2019 bar exam, focus on practice first and foremost because you aren't going to be tested on your study tool. Rather, you're going to be tested on whether you can use your study tool to solve hypothetical problems. And, good luck on your bar exam! (Scott Johns).
P.S. For those taking the Uniform Bar Exam, there are 12 subjects as grouped by the bar examiners (I think there are 14 subjects in California, depending on how you count subjects):
* Business Associations (Corporations, Agency, Partnership, and LLC)
* Secured Transactions
* Federal Civil Procedure
* Family Law
* Wills & Trusts
* Conflicts of Law
* Constitutional Law
* Criminal Law & Procedure
Sunday, December 2, 2018
Thank you to Sandra L. Simpson, Co-Director of the Institute for Law Teaching and Learning at Gonzaga, for her email about a post written by Lindsey Gustafson of University of Arkansas Little Rock on the ILTL pages reviewing a 2016 article by Elizabeth Ruiz Frost entitled "Feedback Distortion: The Shortcomings of Model Answers as Formative Feedback." The review can be found here: Article Review. (Amy Jarmon)
Tuesday, November 13, 2018
I have learned probably hundreds of tips, tricks, and techniques to improve one's performance on examinations. But there is only one that I learned with ten million people watching.
In 2005, I took the Florida Bar Exam -- my second bar exam, after passing the DC Bar Exam seven years earlier. When I returned to my car, the lone message waiting for me on my cell phone was not the expected call from my family. Instead, it was Glenn, from Culver City, California, calling to inform me that I had been selected to be a contestant on Jeopardy! -- the fast-paced quiz show in which contestants vie to answer 61 questions in 22 minutes.
The taping was to be in a month, and so I went right from cramming for the bar to cramming for trivial warfare. I knew there was no way I could study every possible subject that might come up on the show. At the same time, I felt like I ought to be "training". Today, there are websites that archive years of Jeopardy! clues, and old episodes on demand on Netflix, but these weren't available in 2005, so my main source of practice was watching the daily broadcast of the show at 7:30 p.m. And, perhaps because I felt that it was a rather precious resource, I decided that I wasn't just going to casually sit on the couch and shout out responses with the contestants. I decided that I was going to act like a contestant. Each contestant stands behind a podium and holds in one hand a pen-sized electronic button, and the first person to press that button after host Alex Trebek finishes reading the clue gets the chance to give the response -- famously, in the form of a question (e.g., "Who is George Washington?"). So, for a month, I tried to simulate their actions. I watched the show standing up, behind a living room chair. I held a clickable ballpoint pen, and practiced pressing the top button after Trebek finished reading each clue, and only then did I allow myself to call out a response in the form of a question. From time to time, I would feel a little goofy doing this, thinking, Isn't the show really about what you know? But I kept at it, because it seemed like the only way to really practice.
Finally, I arrived in California for the taping. Jeopardy! tapes five episodes in one day, a couple days every few weeks, so on the day on which I was scheduled to tape, I was herded into the studio with about a dozen other contestants. We spent a few hours signing documents and having make-up applied and learning all the rules and, most important and exciting, playing a few practice rounds on the set to familiarize ourselves with the equipment. I noticed some of the other contestants -- all clearly bright and as delighted as I was to be there -- seemed slightly awkward behind the podium. We all knew intellectually what to do, of course; we had all been fans watching the show for years, and we had just received a thorough briefing on what was expected of us. Even so, some contestants struggled to push their electronic button at the right time -- pushing it before Trebek was done talking would lock you out so that you could not answer, but if you waited too long, someone else would get in before you. Others got the hang of the button, with concentration, but then could not remember the responses they were trying to give. And there were times when contestants would press the button correctly, and give the right response, but forget to give it in the form of a question.
But when I went up on stage to practice, it was like I was standing back in my living room. I had practiced the timing of pushing my pen button so many times that, when it came time to press the real thing, I did not even have to think about it. I rang in quickly, focused entirely on recalling the information needed, and then gave the answer automatically in the form of a question. It worked in practice, and it worked in the actual taping. Yes, the show is about what you know, but it's important that nothing hinder you from demonstrating what you know. I won four games, and eventually came back to be a finalist in the Tournament of Champions.
In the years since, I have learned that what I had stumbled onto is known as "simulation training". It is a kind of practice that is not unlike the physical training that athletes do to develop muscle memory and automatic responses. In the context of quiz shows and law examinations, though, what makes simulation training particularly useful is not just the physical skills that it develops. What makes it useful is that it frees up mental space and focus for more complex thought. Not having to think about when to push the button and how to phrase my answer enabled me to devote full attention to reading the clue and retrieving the correct response.
Practicing to take examinations -- whether final exams or Bar exams -- can provide the same kind of simulation training, under the right conditions. Of course, students should write practice exams for other very good reasons, like improving legal analysis and uncovering weaknesses in subject matter knowledge, because law examinations should also be about what you know. But there is an added benefit when practice exams are done under conditions that imitate expected exam conditions. There are dozens of details and stimuli that students encounter consistently during an actual exam that, if unfamiliar, can demand valuable thought or cause detrimental distraction: dressing comfortably, locating a seat, timing bathroom use, logging into ExamSoft, calculating timing targets, contending with silence or noise, reading and following directions, cutting and pasting text, properly submitting responses, etc. Encouraging students to incorporate attention to these elements during their practice work, even when they are not really necessary, can help them improve performance, not because performance depends on finding a proper seat, but because being able to do so with almost no thought allows them to devote their mental energies to the tasks that really need them. Exam performance is about what you know, but it is important that nothing hinder you from demonstrating what you know.
Tuesday, October 30, 2018
To lawyers, law students, and professors, the IRAC formula is as commonplace a tool as yellow highlighters or The Bluebook. Some may tout or prefer one of its dozens of variations, particularly in specific situations, but at heart, they all do the same basic job of providing a reliable structure for building an argument. It may take some time for students to internalize that structure and use it consistently. Once they do, however, some students lean on it heavily, as a way of making sure all the expected components of their analysis (Issue, Rule, Application, Conclusion) are included. Other students may see it with more anxiety, as a set of expectations imposed by certain professors; they may worry that if they don't use IRAC, they won't receive full credit in their essay responses.
In either case, students can sometimes be stymied when trying to adhere to IRAC format in an essay test response that requires multiple pieces of analysis, like a rule with multiple elements. For example, trying to fit a discussion of a negligence claim into one big IRAC paragraph -- as some students may feel they are required to do -- may start off well, as the student correctly identifies the question of negligence as the issue and the requirement to show duty, breach, causation, and damages as the rule. But then the application section may become messy, as the student tries to write about each element. If more than one element depends on tricky or subtle facts, or if there are multiple arguments and counterarguments to some elements, then the student may struggle to control multiple threads of analysis, without additional structure, in an enormous paragraph that spreads over two or three pages. The student may lose some of those threads, and so might the reader.
This is an unsurprising consequence of the emphasis on sticking to an overall IRAC format: students, for comfort or consistency, might feel compelled to turn every argument into a unitary IRAC. This may be less of a problem for long-term projects, like a legal research and writing memo, where a student may be given more instruction about formatting and will have opportunities to rewrite and edit their essays. But on a timed assignment, like a final exam, the urge to create one big IRAC argument -- or the fear of not doing so -- can slow students down and inhibit clarity.
One way to help students improve their relationships with IRAC is to point out that a well-reasoned argument can have layers of IRACs built into it. The Application portion, after all, is where the meat of the analysis appears, and if that analysis requires that the student examine multiple elements, each element could be discussed in its own separate sub-IRAC paragraph. To use the negligence example:
Issue: Negligence claim
Rule: Duty, Breach, Causation, Damages
    Issue1: Duty
    Rule1: [e.g., Obligation to act as reasonably prudent person under circumstances]
    Application1: [Application of rule to specific facts]
    Conclusion1 re: Duty
    Issue2: Breach
    [Rule2, Application2 . . .]
    Conclusion2 re: Breach
    [. . . and likewise for Causation and Damages]
Conclusion re: Negligence claim
This layering of IRACs allows students to take advantage of the order imposed by the format, while still providing the flexibility to address separate sub-issues separately. Theoretically, the layering could continue indefinitely, if certain elements have sub-elements to consider:
Issue3: Causation
Rule3: Actual cause and Proximate cause
    Issue3A: Actual cause
    [Rule3A and Application3A]
    Conclusion3A re: Actual cause
    Issue3B: Proximate cause
    [Rule3B and Application3B]
    Conclusion3B re: Proximate cause
Conclusion3 re: Causation
This layering of IRACs may not always be the most artful way to organize a legal discussion, but in an exam situation in which students are trying to maximize speed, completeness, and clarity simultaneously, it can provide an efficient way for them to put together a complex analysis.
Thursday, September 20, 2018
According to the American Bar Association (ABA), citing Law.com and TaxProfBlog editor Dean Paul Caron, the national average score on the MBE multiple-choice portion of the July bar exam dropped to its lowest level in 34 years. http://www.abajournal.com; https://www.law.com; http://taxprof.typepad.com. The National Conference of Bar Examiners (NCBE) reports that the average MBE score for July 2018 was just 139.5, while Law.com reports that the July 1984 average was similarly low at 139.21. http://www.ncbex.org/news; https://www.law.com.
In an article by Law.com, the President of the NCBE, Judith Gundersen, is quoted as saying that "they [this summer's lower MBE scores] are what would be expected given the number of applicants and LSAT 25th percentile means of the 2015 entering class." https://www.law.com. In other words, according to the NCBE, this summer's low average is the product of law school admissions decisions, as reflected in the 25th percentile LSAT scores of the entering class of 2015.
The NCBE advanced a similar theory back in 2015 (namely, that bar exam declines track LSAT declines). Nevertheless, prior empirical research found a lack of support for the NCBE's LSAT claim, albeit research limited to one jurisdiction and one law school's population, and admittedly not yet updated to reflect this summer's bar exam results. Testing the Testers.
As an armchair statistician with a mathematics background, I am leery of one-size-fits-all empirical claims. Life is complex and learning is nuanced. Conceivably, many factors might account for bar exam results in particular cases, and not all of them are ascribable to pure mathematical calculus, such as the leaking roof in the middle of the first day of the Colorado bar exam. http://www.abajournal.com/news/article/ceiling_leaks_pause_colorado_bar_exam.
Here are just a few possible considerations:
• The increase to 25 experimental questions embedded within the 200 MBE multiple-choice questions (up from 10 embedded experimental questions in previous versions of the test).
• The relatively recent addition of Federal Civil Procedure to the panoply of subjects tested on the MBE.
• The apparently rising incidence of anxiety, depression, and learning disabilities within law school populations and among graduates.
• The economic barriers to securing bar exam testing accommodations, despite longitudinal evidence of law school testing accommodations.
• The influence of social media, the internet age, and smartphones on the learning environment.
• The difficulty of equating previous versions of the bar exam with current versions, given changes in the exam instrument itself and in the scope of subject matter tested.
• The relationship between experiential, doctrinal, and legal writing courses and bar exam outcomes.
Consequently, in my opinion, there's a great need (and a great opportunity) for law schools to collaborate with bar examiners to hypothesize, research, and evaluate what's really going on with the bar exam. It might be the LSAT, as the NCBE claims. But, most problems in life are much more complicated. So, as a visual jumpstart to help law schools and bar examiners brainstorm possible solutions, here's a handy chart depicting the overall downward trend with respect to the past ten years of national MBE average scores. (Scott Johns).
Thursday, June 28, 2018
It's sweltering in much of the USA. And, the heat is only getting hotter for the many recent law school grads preparing for next month's bar exam.
So, I thought I'd offer a few "hot" tips on how to enhance one's learning this summer, based on a recently published study entitled "Smarter Law School Habits: An Empirical Analysis of Law Learning Strategies and Relationship with LGPA," by Jennifer Cooper, adjunct professor at Tulane University. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3004988.
In the article, which statistically analyzes study tactics and learning, Professor Cooper finds that two particular study strategies are positively correlated with law school grades.
The first is elaboration, i.e., explaining confusing concepts to others. So, be a talker this summer as you prepare for your bar exam. In short, be a teacher...be your own teacher!
The second is the use of practice questions to learn. So, grab hold of every opportunity you have this summer to learn by doing. Take every mock bar exam you can. Work through every bar exam practice problem available. Be tenacious in your practice. Learn by doing!
Finally, as documented by Professor Cooper, beware of reading and re-reading. It might make you feel like you are learning, but there is little learning going on...until you put down the book and start working on problems for yourself. And, that particularly makes sense with the bar exam...because...the bar exam is testing the "practice of law" not the "theory behind the law."
So, throughout this summer, focus less on reading and more on active learning - through lots and lots of practice problems and self-taught elaboration to explain the legal principles and concepts - as you prepare for success on your bar exam next month. (Scott Johns).