Tuesday, July 18, 2017
ILTL Cultural Competency Takeaways, 2 of 2
This blog post is the second in a two-part series detailing my takeaways from the Institute for Law Teaching and Learning’s “Teaching Cultural Competency and Other Professional Skills” conference which was held in Little Rock, Arkansas on July 7-8, 2017. For part one of the series, click here.
Professor Andrew Henderson from the University of Canberra in Australia discussed “The Importance of Teaching Self-Evaluation and Reflection in Law School,” especially in an ethics course. One study revealed that when presented with an ethical dilemma, law students tend to resolve the dilemma consistent with their personal beliefs and without regard for the rules of professional conduct. The students answered the same question the same way before taking an ethics course, while taking the course, and after successfully completing the course. In other words, professional responsibility courses do little to teach ethical judgment-making skills. Knowing this, Professor Henderson sought to design a course that would reframe the discussion entirely. He required students to identify their internal motivations: what makes them get up in the morning, what keeps them awake at night, why they want to be lawyers. He then used the students’ responses to jumpstart a conversation and to identify the intersection between the students’ self-identified motivations and the ethical rules. He reported that students have become more engaged in the ethics course and that the responses have also helped him provide more targeted academic advising and job placement advice. At the end of the discussion, a few attendees discussed how a similar exercise could be added to the start of the 1L year to help academic support professors provide more tailored advice to at-risk students.
Professor Benjamin Madison of Regent University School of Law and his colleagues developed a course to “Help[] Millennials Develop Self-Reflection.” The mandatory 1L class focuses on the development of problem-solving skills, emotional intelligence, responsibility, and “other” ABA-mandated skills. To begin, students may request a specific faculty coach. The school makes every effort, but does not guarantee, to match students with their top choice. Next, students meet with their designated faculty coach to complete an intake self-assessment or “roadmap.” After the student completes the self-assessment, two peers assess the student on those same skills. Professor Madison has already noted several trends at his school. First, 1L students frequently rate themselves quite high (i.e., at the mastery level) despite having little to no professional development training, and students rate their peers even higher. Essentially, students “don’t know what they don’t know,” a phenomenon commonly referred to in psychological circles as the Dunning-Kruger effect. Second, students gravitate toward peers who unequivocally support them, rather than peers who challenge them and hold them accountable. Lastly, students are more concerned with obtaining meaningful employment than with making a sufficient income, which is especially intriguing when you consider internal motivation as a component of self-reflection. (As an aside, their research concluded that the primary professional goal for 1L students is to pass the bar exam – whew!). Professor Madison said that schools interested in adopting a similar program should reach out to the St. Thomas School of Law Holloran Center, which “continues to focus on its mission to help the next generation form professional identities with a moral core of responsibility and service to others.”
Professor Christine Church of Western Michigan University’s Thomas M. Cooley Law School presented on how she “Immerses Students in Lawyering Skills.” Her nine-credit program is centered on all-day classes that simulate a law practice environment. During the 14-week semester, four distinct four-person law firms handle three cases: (1) a custody battle requiring intense interviewing, counseling, and negotiation skills, (2) a personal injury suit involving pretrial litigation skills, and (3) a DUI criminal trial. The clients are actually other law students, who are completing a 1-credit directed study and relying on the principles discussed in the book “Through the Client’s Eyes” for guidance. The “attorneys” within each firm exchange documents throughout the week using Google Docs and then meet on Saturdays to engage in simulation exercises. Professor Church commented that the unique course schedule—which is ABA Standard 310 compliant—has helped students develop the stamina needed to study for the bar exam and to actually practice law on a daily basis. The program now has a waitlist; students love it! She concluded the session by sharing a plethora of fact patterns, grading rubrics, and syllabi to assist participants in establishing their own litigation skills immersion programs.
After Professor Church’s session, I enjoyed a tasty Greek salad lunch. In my view, a good indicator of the quality of a conference is the quality of the breaks. ILTL did not disappoint. Not only was the host school welcoming and attentive, but all the attendees were more than willing to offer helpful suggestions at every turn—well beyond the theme of the conference. Many thanks to those who shared teaching tips, performance review and tenure advice, and general support to this junior faculty member. And, let me extend a special shout out to one colleague’s pet squirrel!
Before I wrap-up, let me share the most bizarre tidbit I heard while in Little Rock. One professor explained that one of her students genuinely believes that some version of the following conversation occurs routinely at her law school—Professor A to Professor B: “When Mary comes to your office to discuss her exam, tell her that her poor grade is due to an underdeveloped rule block. And, when you meet with John, tell him that he needs to work on his application. That’s what we’re all going with this semester.” The student came to this epiphany after every single one of her professors targeted the same exact exam skill for improvement. Feel free to insert the emoji of your choice here.
I wish I could tell you about all the concurrent sessions, but unfortunately my J.K. Rowling-approved Time-Turner is not TSA-approved. I heard chatter in the hallway suggesting that I missed several good sessions, but as author Ashim Shanker has noted, “freedom brings with it the burden of choice and of its consequences.” For those who are interested in learning more about the other sessions or about the Institute for Law Teaching and Learning’s larger mission, check out the Institute’s webpage. (Kirsha Trychta)
July 18, 2017 in Meetings, Program Evaluation, Teaching Tips
Tuesday, July 11, 2017
ILTL Cultural Competency Takeaways, 1 of 2
I attended the Institute for Law Teaching and Learning’s “Teaching Cultural Competency and Other Professional Skills” conference on July 7-8, 2017 at the William H. Bowen School of Law in Little Rock, Arkansas. The conference opened with a quick sticky dot poll of the attendees. The dots revealed that while most professors felt comfortable teaching skills like trial practice, negotiations, and document drafting, only a few were confident in their ability to teach cultural competencies in the classroom. In an attempt to ameliorate this (real or perceived) deficiency, the approximately fifty attendees—a surprisingly even mix of doctrinal, clinical, legal writing, and academic support professors—worked collaboratively for two days to develop a portfolio of concrete exercises to satisfy ABA Standard 302. What follows are some of my major takeaways:
The first suggestion was to “[Bring] Marginalized Populations into the [Legal Writing] Classroom.” Elon Law Professors Thomas Noble, Patricia Perkins, and Catherine Wasson explained how they each drafted a legal writing factual scenario involving a potentially unsympathetic and culturally diverse plaintiff: an Egyptian immigrant, a convicted felon, and a mentally ill survivalist, respectively. These plaintiffs’ legal claims were then further complicated by the intentional inclusion of gender-neutral names, ethnic-sounding names, ambiguous facts, and words with strong connotations (think: “fetus” versus “the child”). These professors crafted case files that not only required students to learn the mechanics of legal writing, but also forced students to confront their biases in a thoughtful and controlled way. Occasionally students made unwarranted assumptions, which allowed the class to discuss the importance of cultural sensitivity and implicit bias. Other students wanted to “help” by taking action contrary to the client’s expressed desires, creating a great opportunity to talk about the ethical complexities of being a counselor-at-law. The presenters reported that many students came to realize that there might not always be a “right” answer, especially when dealing with legal issues that intersect with human dignity and diverse cultural norms.
Next we discussed the importance of “Building [a] Student[’s] Capacity for Self-Evaluation” with the use of a robust “soft skills” rubric. Before the presenters shared their rubric, Professors Lauren Onkeles-Klein and Robert Dinerstein used Mentimeter’s in-class polling software to highlight that professors view self-assessment as an opportunity for student “reflection,” but students view self-assessment exercises as “painful busywork”—regardless of whether the assessment process occurs in a doctrinal class, legal writing course, or the clinical setting. The question then became: how do we shift students’ mindset about self-assessment? Their response was to create a rubric that establishes expectations early and often, introduces a common language around measuring skill, and reframes the connection between self-assessment and grades. Professor Dinerstein discussed the rubric’s evolution from a one-page outline to an unwieldy 15+ page document, before he finally settled on a streamlined 10-page student self-assessment form, which borrows heavily from assessments commonly used in medical residency training. Throughout the academic year, supervising professors repeatedly remind students that the goal during law school is “competence,” not “mastery.” The current form also highlights long-term patterns within the individual student’s self-assessment, clarifies conflicts between student partners, and frequently invites a dialogue about the importance of teamwork in a law firm setting. The presenters reported that students do, in fact, get better at self-assessment over time through the interactive and frequent assessment process. Anyone interested in reviewing, or possibly adopting, the presenters’ rubric handouts is invited to reach out to the authors directly for permission.
After a delicious taco lunch break, we went back to work “Grow[ing] Future Lawyers in the Image of ABA Standard 302…”. Three professors from Western Michigan University’s Thomas M. Cooley Law School explained how they successfully embedded the same acquaintance rape fact pattern in all three years of law school. In Professor Tonya Krause-Phelan’s 1L criminal law course, students learned the elements of rape before conducting an in-class jury trial. In Professor Victoria Vuletich’s 2L evidence course, students reexamined their 1L trial with fresh eyes, having now learned the rape shield laws. Then, when the students reached Professor Tracey Brame’s 3L public defender clinic, she set aside time to talk about the unique cultural sensitivities required to competently represent a defendant or victim in a sexual assault case. Reusing the same factual scenario in each year enabled the same students to see the same story from a variety of different legal angles. In addition to reusing the same hypothetical, the three professors created a long-term structure of evolving course rules to better reflect the students’ growth from year to year. During the first year, Professor “K-P” drafted and enforced detailed course rules, with no input from the students. She was careful, however, to relate the classroom rules to the real practice of law, such as why it is critical to be able to take handwritten notes. Then in the second year, the students were allowed to establish the classroom rules, including the sanctions for rule violations. For example, students opted to impose a “must bring treats” penalty on anyone who was late to class without good cause. Then in the final year, the same cohort had to compare and contrast the rule-following required in 1L year with the rule-making privileges of 2L year.
CUNY School of Law Professors Deborah Zalesne and David Nadvorney offered suggestions on how to help “underprepared law students” acquire the “other” skills mentioned in ABA Standard 302(d). Session attendees read a few pages of a Contracts case and quickly identified legal terms that could be troublesome for any first-year student. The presenters then pointed out numerous non-legal terms (e.g., “paradigm” or “doctrine”) that also have the potential to hinder an underprepared student. To combat this problem in their own classrooms, the presenters have made a conscious effort to introduce a new concept in the students’ everyday language before layering on the more professional vocabulary. Avoiding lawyer-dominant language at the outset enables students to focus on the larger legal framework (i.e., to think big) without getting bogged down in the line-by-line details of the case. The presenters then systematically work through the case with the students, helping them to understand each line and each new term. The presenters also stressed the importance of being sensitive to students’ wrong answers. In short, taking the time to mentor these students at the start will allow them to make larger long-term gains during the semester.
I attended several other sessions. I’ll give you the details of those sessions in part two of this two-part series. Coming soon!
(Kirsha Trychta)
July 11, 2017 in Meetings, Program Evaluation, Teaching Tips
Tuesday, May 9, 2017
Study Questions Student Recognition of Good Teaching
Today’s issue of The Chronicle of Higher Education references a study of nearly 340,000 mathematics students at the University of Phoenix that questions whether students can recognize good teaching. The link to the post is here: Student Evaluations Study.
May 9, 2017 in Miscellany, Program Evaluation
Sunday, December 11, 2016
The College Experience: Why Students Lack Critical Thinking Skills
Faculty and academic support professionals often comment that students enter law school without solid critical thinking skills. An Inside Higher Ed post by Ben Paris considers why colleges fail at teaching students critical thinking skills: Failing to Improve Critical Thinking.
December 11, 2016 in Learning Styles, Program Evaluation
Sunday, July 10, 2016
Threat to ABA Accreditation Powers
Hat tip to Mark Wojcik, John Marshall Law School (Chicago), of the Legal Writing Prof Blog for providing the link to an ABA posting about this matter. The link is: here.
July 10, 2016 in Miscellany, Program Evaluation
Friday, September 11, 2015
If You Build It, They Will Come
Maybe.
All of us in academic support and bar prep offer a variety of resources to our students. At times it is discouraging when fewer students than we hoped take advantage of a particular service that was offered.
But wait. Do we need 100% participation for an event or resource to have a positive impact? Sure, it is great if we can have mandatory programs. But few of us have that luxury for all students; usually only a portion of our students are required to attend.
Some students will complain that they are adults and argue against mandatory events. They would argue it is their choice to decide what to attend, what to access on-line, what to pick up as a hard-copy packet, or what to hit the delete button on. Until their grades flip them into a narrow mandatory category of at risk/probation, these students want to decide independently on their academic actions - not just whether to use ASP or bar prep resources but whether they will read for class or go to see a professor for assistance.
Mandatory versus voluntary is an on-going question because the students who most need to use resources often are the ones who do not use them. We all have students on probation who comment that they wish they had used resources the prior semester/year/years. The reasons why they did not use resources run the gamut: thought they were doing fine; thought everyone else needed the help but not them; did not like the day/time the workshop was held; forgot about the resources; had boyfriend/girlfriend/family/medical/work/other issues; could not find the office; did not want anyone to know they were struggling; were just lazy.
ASP'ers offer a variety of resources and formats to provide services in ways that might appeal to different learners and student needs. Below are just a few of the common options we offer:
- Voluntary summer programs
- Mandatory summer programs
- On-line summer programs
- Live workshops
- Videoed workshops
- For-credit courses - voluntary or mandatory
- Non-credit courses - voluntary or mandatory
- Writing across the curriculum with an ASP component
- Mandatory study groups
- Voluntary study groups
- Upper-division teaching assistants/teaching fellows/tutors
- Facebook information
- Twitter information
- Internet and intranet web pages
- Email study tips
- Official law school announcements
- Stand-alone ASP/bar prep workshops
- Workshops with student organization co-sponsors
- Workshops with bar review company co-sponsors
- Electronic packets of topical information
- Hard copy packets of topical information
- PowerPoint slide shows
- Formats with exercises, pair-and-share, and more
- Student panels on topics
- Faculty panels on topics
- Podcasts
- Blogs
- Links to Internet resources
- And more
Boosting attendance? Food bribes work well until the budgets are cut (or students complain about too much pizza). Door prizes work well until the swag becomes same old-same old. And so forth.
So, here is the reality. 100% is not the only measure that matters. Having a positive impact for the students who choose a particular format/resource is legitimate. By providing options for a variety of consumers, we reach students where they are and when they want to partake.
My survey last spring on academic success resources reminded me that there are more students using resources each day than I may realize. There are a lot of "silent consumers" out there who use digital/hard copy packets and intranet/email resources; they just are not as visible as those who want appointments or attend workshops. The survey registered their appreciation for academic success services. It was a good reminder that options are important, and that the impact on each individual student through less visible methods is just as important. (Amy Jarmon)
September 11, 2015 in Miscellany, Program Evaluation
Wednesday, January 14, 2015
Southwestern Consortium of Academic Support Professionals Workshop March 6th, 2015
3rd Annual Southwestern Consortium of Academic Support Professionals Workshop
March 6th, 2015
Assessing Students and Programs to Develop
a Targeted Approach to Academic Support
at
Texas A&M’s School of Law
in Ft. Worth, Texas
The Southwestern Consortium of Academic Support Professionals will host a one-day workshop focused on targeting our efforts for maximum efficiency. Decreased enrollment has created a budget crunch for most schools. Decreased budgets can fall disproportionately on ASP shoulders, yet we are still expected to provide the same level of support. To provide a high level of service with fewer resources, we must be efficient: we need to assess where to deploy resources and whether our programs are making an impact. This year’s workshop will include programs to help us assess which students need our help, from pre-matriculation through the bar exam. We will also discuss ways to determine whether our programs are working and more efficient ways to deliver our services.
Registration is open to anyone interested in academic support. There is no registration fee. If you are interested in attending, please fill out the attached form and return it to Camesha Little, Assistant Director of Academic Support, at [email protected]. Forms will be accepted through February 27th.
Hotel Information:
A block of rooms has been reserved at the Sheraton Ft. Worth Hotel and Spa, 1701 Commerce St., Ft. Worth, TX 76102. This hotel is located right across from the law school. We negotiated a rate of $139.00 per night. Please be advised that this block will release and the price will expire on February 20, 2015. You can book your room online at https://www.starwoodmeeting.com/StarGroupsWeb/res?id=1409306215&key=216B6F3F, or by phone by calling (800) 325-3535 and referencing Southwest Consortium of Academic Support Professionals.
Schedule:
March 5th:
6:30 – Dinner for anyone arriving early.
March 6th:
9-9:50 – Assessing Students before they enter – Marta Miller, Director of Academic Achievement at Texas A&M School of Law
10-10:50 – How to use LSSSE Data in ASP – Dr. Evan Parker, Director of Analytics at Lawyer Metrics
11-11:50 – Developing a targeted class to improve academic performance – John Murphy, Associate Professor of Law at Texas A&M School of Law
12-12:50 – Lunch
1-1:50 – Assessing the effectiveness of Voluntary ASP Programs – Rebecca Flanagan, Assistant Professor of Law, Director of Academic Skills Program at UMass School of Law
2-2:50 – Determining who is at-risk for Bar Struggles and creating a program to improve success – Jamie Kleppetsch, Assistant Professor, Associate Director, Academic Achievement Program at The John Marshall Law School
2:50-3 – Closing Remarks
If you have any questions, please feel free to contact:
Steven Foster ([email protected])
Director of Academic Achievement at Oklahoma City University
Marta Miller ([email protected])
Director of Academic Support at Texas A&M School of Law
January 14, 2015 in Meetings, Program Evaluation
Saturday, January 3, 2015
Sexism in ASP
This semester has been eye-opening for me. I haven't spent a lot of time thinking about sexism in ASP. Although I am a dyed-in-the-wool, true-blue feminist, I've been lucky that I haven't faced much individual sexism (as opposed to institutional or systemic sexism, which I think are endemic to the academy). In the past, it's been one-off incidents, nothing that made me really question whether ASP fosters sexism. ASPs are predominantly run by untenured women, teaching in second-class roles. While more men have joined our ranks, many of the (admittedly talented, committed) men who have been in ASP for more than 5 years have moved into tenured or high-level administrative positions, while I see equally talented, committed women stuck in the same second-class positions, without promotions or recognition, year after year.
I don't think this is solely due to institutional sexism. Studies have shown that women receive lower course evaluations than men. A tiny, needs-to-be-replicated study out of North Carolina State demonstrated that students will give higher course evaluations if they believe their instructor is a man--whether or not the instructor actually is a man or a woman. (See study here)
This semester I co-taught an ASP course with a fantastic, very talented male (tenured) professor. Mid-semester, we asked students to fill out qualitative evals, asking them to tell us what we should do and how to improve. While the majority of the surveys were helpful and fair, a disconcerting minority used the evaluations to make personal, sexist comments that had nothing to do with the substance of the course. Not one evaluation made personal comments about my male co-teacher.
I spoke with several experienced female professors after I read the evaluations. Everyone had a similar story; students feel it's okay to attack a female professor's attire, posture, hair style, or tone of voice in evaluations meant to measure teaching.
These attacks on female professors are damaging careers. Student evaluations are regularly used to renew contracts and earn tenure. The best administrators know to ignore these damaging comments in evaluations. But many evaluations are on a 1-5 scale, with female professors losing valuable points for things that have nothing to do with their ability to teach. And administrators can't distinguish between someone who needs help in the classroom and someone who is receiving low scores because "their voice hurts my ears" or "their clothes are too bright for my taste."
ASP is integral to the success of the legal academy. It is time we started looking at the reasons why we are still second-class citizens.
(RCF)
January 3, 2015 in Current Affairs, Diversity Issues, Miscellany, Program Evaluation
Monday, March 3, 2014
The Variety in ASP
I am off to the Southwestern Consortium Workshop in a few days. One of the things that I like most about ASP workshops and conferences is hearing from colleagues what they do at their schools under the ASP umbrella.
I find it fascinating that there are so many great ways to accomplish our objectives. Although programs have many commonalities, the best way to do it at a particular law school depends on philosophy, historical structures/events, student characteristics, academic standards, staffing levels, facilities, budget, ASP status, faculty interface, administrative support, and so much more.
Here are some of the ways that we differ while everyone is working toward the goals of retention and improving student skills and bar studier success:
- Areas covered: academic success - bar preparation - legal writing - some combination
- Staff size: one-person offices - under 5 people - 6 or more people
- Appointments: administrative dean - 9/10/12-month staff - faculty on tenure track - non-tenure-track faculty - hybrids
- Budgeting: budget funded by a fee - defined budget process with discussion - defined budget process without discussion - ask and hope to receive per item
- Facilities: director's office - director's office plus another secretary/library/teaching fellow space - dedicated office suite
- Study aids library: publisher complementary copy arrangement - budgeted study aids library - donated used study aids - director's own collection - hybrids
- Office equipment/files: dedicated workspace - shared space with others - in director's office - hybrids
- Secretarial support - dedicated ASP secretarial support - shared secretarial support - work study or other student support
- Duties of law student workers: teaching fellows/tutors who present doctrinal reviews and study skills - teaching fellows/tutors who present only study skills - research assistants to assist with assessment and statistics - hybrids
- Faculty involvement: ASP across the curriculum - first-year faculty involved - upper-division faculty for required courses involved - as needed/asked basis with select faculty - little/no faculty involvement - hybrids
- Summer course for invited "at risk" first-year students: length varies - number of students varies - criteria for admission vary - graded or pass/fail - conditional admission based on grade achieved or enrolled upon completion - required curriculum course or elective course segment - hybrids
- Pre-orientation ASP: pre-orientation workshops/online tutorials for all first-year students on academic skills - pre-orientation workshops/online tutorials for invited first-year students on academic skills - required/suggested summer reading for first-year students - no pre-orientation
- Orientation sessions: orientation sessions on academic skills presented by ASP - orientation sessions on academic skills presented by faculty - orientation sessions on academic skills presented by upper-division students - hybrids
- Study groups for first-years: mandatory structured study groups for first-year students - voluntary structured study groups for first-year students - freelance study groups for first-year students - hybrids
- Extended orientation: extended orientation sessions on academic skills - extended orientation sessions on non-academic topics - mandatory with consequences/"mandatory" without consequences/voluntary - hybrids
- Courses: academic support course for first-year students - academic support course for upper-division students - bar preparation course - mandatory/voluntary - credit/non-credit - graded/pass or fail
- Workshops: workshops for first-year students - workshops for upper-division students - no workshops - hybrids
- Probation or "at risk defined by cum GPA" students: mandatory with consequences meetings/workshops/course - "mandatory" without consequences meetings/workshops/course - voluntary meetings/workshops/course - hybrids
- Academic dismissal process: ASP has official role in the process - ASP has informal role in the process - no ASP role in the process
- Academic advising: ASP has official role in the process - ASP has informal role in the process - no ASP role in the process
- Other duties: adjunct professor - university committee work - law school committee work - advisor to student organization/competition team - pre-law advising - K to 12 pipeline outreach
There are many other facets of our work on which we have variations. One of the joys of ASP work is that colleagues are willing to share their expertise, materials, and suggestions. ASP'ers are a refreshing group of legally-trained folks because everyone is willing to learn from others and contribute positively to the dialogue. I am looking forward to seeing you in San Antonio, Indianapolis, or at another workshop where we can exchange ideas. (Amy Jarmon)
March 3, 2014 in Program Evaluation
Monday, October 4, 2010
Starting a new semester
As we move past the beginning and approach the middle of the semester, we are trying new things and experimenting with new formats. We are learning what works and what needs some tweaks. Some of us are teaching new classes; others are teaching the same classes in a new way. This is my second year of teaching Remedies as an ASP course, and here are some of the new things I am trying. Some are going well; others need more tweaks in the coming weeks:
1) My students don't use a traditional casebook (until Mike Schwartz's comes out), so I send them their reading in chunks. I don't know how this will work, but my rationale for the change is that I can better tailor the reading to the movement of the class if I periodically review where we are and where we want to go throughout the semester, rather than give them everything at once. I add questions and comments to the reading, and this way I can tailor my questions and comments in the text to what the students are struggling with in the material.
2) I am definitely using handouts to go with my PowerPoints. I know, I should have been doing this from the start. I would love to say my rationale was that I researched the science and saw that handouts scaffold the material learned in class and therefore make for better learning by students. That is 75% of my rationale. The other 25% has to do with attention in class. I really don't like giving away my PowerPoints because I believe it reduces the motivation to be alert and attentive in class. I teach at night, and I could be Robin Williams and students would still want to zone out. If I create a handout that acts as a roadmap to where we are going, they can fill in the pertinent information. I am hoping this method also helps students start to see what they should be taking notes on in their other classes. If I give them a template, they will (hopefully) extrapolate the important headings to their other classes.
3) I am trying a slower movement through the material. I am trying to go one step deeper with the material, making deeper connections between the material and what students should be thinking. This is an ongoing metacognitive process for me. I am not only re-reading the material, but stopping to ask myself "why?" when I write notes on the case.
(RCF)
October 4, 2010 in Program Evaluation, Teaching Tips
Friday, August 20, 2010
ASP survey coming soon to an inbox near you
So often we wish we had information about what other law schools are doing in the ASP area. For some time there has been discussion encouraging data collection from all of us about our staffing patterns and program details. The Law School Academic Success Project is undertaking a survey of academic support programs/staffing with assistance from LSAC.
The following announcement about the survey was posted by John Mollenkamp on the ASP listserv today. Please help us by providing the contact information requested so that the correct person at your law school will receive the upcoming survey. (Amy Jarmon)
Colleagues,
As you may already know, we're trying to develop a Survey of Academic Support Programs in hopes of gathering data about what programming different law schools offer (and what staffing those programs have, among other things). Those familiar with the Legal Writing Survey may be glad to know that our planned survey is MUCH shorter.
But, to have a similar response rate (even for a shorter survey), we're going to need to find out who should answer the survey at each law school. Thus, starting next week and lasting through Labor Day, we're going to begin sending out individual e-mails to the folks that we THINK might be the right person to answer the survey. Then, once we've built that list, we'll send survey information to those folks.
You can help us now, though, by coming forward (by reply e-mail OFF-LIST to [email protected]) and giving your name, school, and e-mail for purposes of getting the survey answered. If you have multiple folks at your school who might be interested in answering the survey, you'll need to collaborate and decide which person will be the contact person and the one to receive the survey (though you can all work on answering it). If we don't know the "right" person, we'll probably ultimately send it to a Dean found via a web search with hopes of it getting forwarded. This is not nearly as good, of course, as getting it directly to the person who knows the answers already.
Thank you for your help in getting the ASP Survey off to a great start. I'm also glad to answer any questions you might have about this project.
John Mollenkamp
Clinical Professor of Law
Director of Academic Support
Cornell Law School
(607) 255-0146
http://www.lawschool.cornell.edu/faculty/bio.cfm?id=275
August 20, 2010 in Program Evaluation
Tuesday, February 9, 2010
Please Fill Out The Survey!
A request from Emily Randon at UC-Davis. If you haven't taken the survey, please do!
We are in the process of collecting data to determine the use of and need for a Law of Agency Casebook to be used in law school academic support courses. Generally, and for the purposes of the course, the Law of Agency focuses and expands on concepts of agency in Torts, Contracts and Property. Many law schools offer courses to assist struggling students with reading, analyzing and study skills. Current studies indicate that combining a substantive law course with skill building is one of the most effective ways to assist these students. Many law schools are looking for a substantive law course “vehicle” in which to teach skills.
With that in mind, I am hoping you will please fill out this quick (really quick!) survey, by clicking here: http://www.surveymonkey.com/s/D2B9JP9 and respond to questions related to Academic Support courses at your law school. PLEASE FILL THIS OUT EVEN IF YOU HAVE NO SUCH COURSE OR ARE NOT CONTEMPLATING ONE.
Thank you in advance for your help! If possible, please respond by Monday, February 15th.
(RCF)
February 9, 2010 in Program Evaluation
Monday, February 8, 2010
Different Schools, Different ASP
We accept that different schools have different characteristics, different personalities, different cultures and histories. This is an important thing to consider when designing or re-designing your ASP program. One size does NOT fit all in ASP; to be successful, a program must meet the needs of a unique student body and also reflect the culture of the school. That is not to say that there are no best practices in ASP (the subject of a different post), but it does mean that a program that might be stellar at one school can be lackluster (or harmful) at another.
1) What is the culture of the school? Competitive? Community-oriented?
My experience is that the more competitive the school, the more invisible ASP should be to the general student body. This sounds counter-intuitive to many, but there is significant experience behind this opinion. At highly competitive schools, the students who most need academic help will avoid anything that makes them look weak. Because these students feel weak, they will walk to the other side of the building to avoid being seen near the place where people get help with problems. However, students in the top of the class looking for any extra edge will seek out ASP and monopolize resources. This upside-down appeal exacerbates problems rather than helping students in need. The best ASP at highly competitive schools is still ASP, but it looks like something else. These schools do best with a class-based program, where students meet on a regular basis, but the class looks like any other class at the school, not a class for people who are struggling. The ASP Director should NOT be called an ASP Director; they should have a position and a title similar to other academics at the school. ASP classes at these schools should be intensive skills courses, preferably hybrid doctrinal-ASP; students want to know their time is being well-spent doing something about their grades. Students are identified for ASP by professors or administrators, or through student-disclosed information; the more concrete the referral, the better.
Community-oriented schools will miss significant numbers of students if they adopt this model. At schools that appeal to students with a less-competitive, community-minded approach to legal education, students are more likely to reliably self-identify. ASP looks less academic, more administrative, and has a hybrid student-services approach. ASP is not shameful, because it is a resource for all students, and going to ASP is less stigmatizing for students in distress. ASP can be a drop-in center with regular hours for students to ask questions, check out extra resources, and come in for help.
There are in-between models for schools that mix and match elements of either model. I don't believe that any one model is ideal; all models should reflect student needs and practices. The problem is when a school adopts a program because it has been shown to be effective at a school completely unlike their own. This can happen for any number of reasons. Before you plan an ASP program, it's best to know the population you will be working with.
I realize that I am going to get flak about the vagueness of the terms "highly competitive" and "community oriented." To quote Justice Potter Stewart..."I shall not today attempt further to define [what] I understand to be embraced . . . [b]ut I know it when I see it . . . "
2) What are the student needs? Do you have an evening program? A large number of non-traditional students?
Non-traditional students will have non-traditional ASP needs. A class-based system for non-traditional, part-time, or evening students is not feasible during their first year. Every hour outside of work and classes is occupied with another high-priority commitment, like family. Resources and help need to be available to these students. A robust website with PowerPoints, self-help guides, and referral information is important, so that students can get information on their own schedule. Working with professors of doctrinal classes is also helpful; even if ASP is integrated only a few hours a semester through doctrinal courses, it will help students in need. No one is stigmatized, and no one feels left out, because everyone is getting the same material. Caveat: For the ASPer's own health and well-being, they cannot be available to both day and evening students for all their needs.
3) What is the history of the school with ASP? Have they ever had a program? Why didn't it work?
Some faculty and schools with reticence toward ASP have simply had the wrong model of ASP at their school. ASP can look ineffective and costly if it does not reflect the culture, history, and needs of the students and the school. But when ASP fits a school, it hits a sweet spot: students in need receive help to achieve their potential, no one is labeled or stigmatized, it builds goodwill with students which can aid in alumni development, and faculty get better exams because students have a better idea of what is expected of them. (RCF)
February 8, 2010 in Program Evaluation
Wednesday, January 13, 2010
The Numbers Game
All of us in academic support want to know what is working and what is not. Our very nature as ASP'ers is that we want to improve our work so that we can help our students more effectively.
There are multiple ways to evaluate our programs: questionnaires/surveys, suggestion boxes, unsolicited e-mails and notes, academic improvement of students with whom we work, bar passage improvement (for those who do bar prep), attendance at programs. Some of these are qualitative, and some are quantitative. Some might use both indicators.
I agree that numerical data can be helpful. However, in nearly 25 years of working in higher education, I have never been won over by the view that numbers alone tell the story. Why do I take that view? Let me give you some examples:
- Our work matters one student at a time. I do more than just parrot study advice. I listen. I encourage. I challenge. I confront. I pick up the pieces. I search for answers to that student's success. I rejoice in breakthroughs. A number does not evaluate those encounters.
- Rarely is my work a "one hit wonder" phenomenon. My work is based on assessment of issues, individual plans for improvement, monitoring of progress, and relationships with students. One appointment as a number means little in that on-going process.
- A student who achieves a modest grade increment may be succeeding at a level that was a mountain climb away from the student's starting point. Another student may "knock the ball out of the park" because the problems were study management issues rather than understanding how to analyze legally. Both students have shown success for their abilities and their issues.
- Ultimately, my students must want to succeed by doing the hard work that is needed in law school. I cannot impact their success with strategies and techniques if they choose to not implement them. A small increase (or even a decrease) is not about me. Likewise, I cannot take all the credit for major successes.
- Academic success is a work in progress. One semester's improvement may not show the full story. I have students who have worked with me over a series of semesters. By the time we are done, their efforts have maximized their success with A's and B's rather than their initial D's and F's.
- Students vary in the number of appointments that they need to improve. One student may need four or five appointments during the semester to achieve more effective and efficient time management. Another student might be able to implement techniques fairly quickly.
- Some of the ways we help students may not be easily calculated. For example, I send out weekly e-mails of study tips to all law students. I know that students benefit from the advice even though they might never attend a workshop or ask for an appointment. I get enough informal feedback to know the service makes an impact. But, a question on a survey about "how often do you read the weekly study tips e-mails" or "how many times have you implemented a technique from the e-mails" would probably garner little useful (or accurate) information.
- Numbers do not necessarily reflect quality. If I held 2,000 30-minute appointments instead of 1,000 1-hour appointments in 12 months, would it really reflect anything positive? I doubt it. In some cases, it would merely indicate that I saw the same student twice to cover the material that I used to cover in one longer appointment. It would often mean that I was short-cutting on assistance to the student (handouts that do not match the individual problems; a canned speech on a topic that did not consider the individual learning styles; a lecture rather than a discussion).
- Numbers in bar success can monitor trends when a full bar study is done (that is those who passed as well as those who failed). We can find out lots of quantitative indicators: bad grades, 1L gpa, bar courses taken, and so forth. That information is helpful. But only by talking to each person individually would we ever truly know what equaled success or failure for that graduate beyond mere numbers: techniques that worked; bar review courses used; any medical or personal obstacles; diligence in study; timing of study; work while studying.
- Four students at a workshop who delve deeply into strategies and ask questions focusing on their issues with the topic may make for a hugely successful workshop. Thirty students at a workshop who barely pay attention and do not really want to be there may produce little improvement.
So, I would say we should look to both quantitative and qualitative measures when we seek to find out how we can improve our programs. (Amy Jarmon)
January 13, 2010 in Program Evaluation
Monday, October 20, 2008
More on evaluating research on new programs
Based on some of the feedback I have received about my post from Wednesday, I am expanding my discussion about why we need to consider our students when we are implementing new programs based on research. This is an area where I have personal experience, in two different ways. I am in the unique position of having attended classes at or worked at six different law schools in four areas of the country: UNC Law (my alma mater), Duke Law, UCONN Law, Whittier, ASU--Sandra Day O'Connor, and Vermont Law School. But I also have personal experience evaluating the research from peer institutions when making decisions about new initiatives and classes. Based on my experience, the single most important variable when evaluating whether a program will achieve desired results is the students. Students aren't a monolithic, one-dimensional variable. There are multiple sub-variables to consider.
It would take more than a blog posting, more like a journal article, for me to detail why I would suggest considering each and every factor I listed in my last post. Instead, I will examine a couple of key factors as examples of how and why to carefully evaluate research before implementing new programs based on research from other schools. I will be making some generalizations based on my experience; your experience may be different. My goal is to encourage you to look carefully at how students may impact the research results, and how this may impact the success of a new program.
One of the critical factors to examine is whether the research was conducted at a school with day students only, or day and evening students. Day and evening students have some dramatic differences. Demographically, evening students tend to be older, have more work experience, are more likely to be supporting a family, and are much more likely to be working while in law school. There are great benefits to schools having evening programs; my experience is that they are more focused students, devoted to becoming lawyers, and more mature than their daytime counterparts. But time is at an even greater premium for these students than for day students. Evening students with families, or those working even part-time, don't have any extra time to relax, let alone participate in supplemental programs, even when it would be of great benefit in the long run. Time and money constraints have a dramatic impact on the programs they will attend, how they respond to new programs, and the time they can put into extracurricular programs, such as Bar/Bri and PMBR. I haven't seen any research on the success of evening students as compared to day students on the bar exam, but my guess is there would be a difference. If a supplemental bar program or bar prep class is evaluated using evening students or day and evening students, I would expect the success rate to be much lower than if the same class is evaluated using day law students only. It's not a measure of the program, but an outcome of the time constraints of the students.
The location of the law school is also a significant factor to be considered when evaluating research. A city school with numerous other schools in the area is going to be very different from a rural school with few or no other law schools within hundreds of miles. A school without other schools in the area is more likely to serve a student body with a diverse range of abilities. If a law school is the only one within a hundred mile radius, some students will attend, even if they could have gone to a higher-ranked law school, because they are locked to the region. Evaluating programs in a school that has an LSAT range of 147-165 is different from evaluating programs at a school where the LSAT ranges from 150-153. Let me emphasize that LSAT is not destiny, but it is a factor when evaluating whether a program will work with your students. Teaching to a wide variety of abilities results in different teaching methods, and in some cases, different outcomes. This factor overlaps with the public/private issue; if the only other law school in the area is private, or much more expensive, you will see some of these effects as well.
The history of the law school is a very important factor, with multiple variables. A new law school is creating a culture and a legacy. It doesn't have alumni war stories about the bar exam to rely on for student buy-in of programs. Without a strong culture and legacy, students also don't have misinformation to the same degree as students who have a wealth of bad advice built into the culture of the school. New law schools also don't have the stigma, or burn-out, that comes from a poor record with the bar exam. An older law school with more than a few lackluster years can develop a culture of failure that sends self-defeating messages to the students. One such message is that no one from Law School X passes the bar exam on the first try, so take it the first time as a trial run, or just for practice. If a school is implementing a new program while simultaneously trying to overcome the burden of law student stigma, the results of the new program will not be reliable for a couple of years. The results of the program need time to be decoupled from the efforts to change the law school culture.
Another variable relating to the history of a law school is the history of the academic success program. A law school with a well-established, reputable ASP program that has outreach during the 1L year will find it much easier to implement a program for 3L's. When the students already trust ASP, they will buy in sooner and put more effort into what you are asking them to achieve. Similarly, if a law school has not had ASP but is looking to establish a 3L bar prep program for the first time, it needs a different marketing strategy and should expect a more conservative student response. I am a strong advocate for starting ASP programs incrementally, beginning with 1L's and gradually introducing programs for upperclass students. The other effect ASP will have on the success of a new program relates to the skills base of the students. Law schools with a well-established 1L ASP that focuses on basic skills will have 3L's with a better foundation for bar courses. It's hard to build a foundation when students have already made it through 2-3 years of law school; you wouldn't try to pour a foundation after building a superstructure. Any program that starts with 3L's without a 1L program will need more time to achieve results, and an even longer time if the school isn't planning on creating a 1L program to introduce skills to students at the beginning of law school.
This is not an exhaustive list of factors to consider when thinking about implementing a new program. I hope I have provided an illustration of the kinds of factors to weigh when implementing new programs based on the research of other schools. I made some generalizations about students based on my experience, and you may disagree with some of them.
Lastly, if your school is considering implementing a new program, and would like to talk to me about some of the things to consider, I would be happy to chat with anyone on this topic. (RCF)
October 20, 2008 in Advice, Bar Exam Issues, Program Evaluation