Law School Academic Support Blog

Editor: Amy Jarmon
Texas Tech Univ. School of Law

Monday, December 3, 2018

Use Finals Period to Reflect on Courses

Finals are starting, and if your office is like mine, most students only come around for emergencies.  I may have a few students asking doctrinal questions or for a few more tips, but in general, my office is quieter during finals weeks.  I use that time to finish grading and reflect on my classes to try to improve them for the next iteration.

I tell students about self-regulated learning at orientation.  I implore them to constantly evaluate their progress and make improvements.  Studying isn’t the only area where the steps of self-regulated learning are applicable.  We can use those same steps when developing and improving our classes.

Finals weeks and the week before Christmas break are a good time for reflection.  With a quieter office, analyzing courses is easier than when attending to constant emergencies.  Finals time is also good because classes just ended.  You may remember a little better what worked and what didn’t.  I find it difficult to remember what I didn’t like if I don’t teach the course again until the following year.  Right now is a much better time for evaluating courses.

I suggest analyzing the course structure, in-class exercises, and the homework.  Categorize each activity or course choice as worked great, was decent, or failed miserably.  I know variations among those categories exist, but the idea is to identify what you must keep, what must go, and what could be better but doesn’t need to change now.  I provide a few considerations below for each of the three areas.  Make sure to write down the assessment and note the changes now, before forgetting them.

Course structure is the big picture of the class.  Some considerations are:

  • Did the course achieve its objectives?
  • Did the course flow logically through the semester?
  • Should the topics be in a different order?
  • Do students need context or other knowledge to better prepare for the topics?

In-class exercises are great when they work well, but sometimes exercises fail miserably.  Think about each exercise and consider:

  • Did the exercise achieve its purpose?
  • Did the exercise further the lesson/topic of the day?
  • Did the exercise need additional instructions to run more smoothly?
  • Did the setup take too long?
  • Did it take too long to get the class back on task after completing the exercise?
  • How many students completed the exercise poorly or failed to complete the exercise?
  • Was there ample time to achieve the goal of the exercise?
  • In a perfect world, what would I change about the exercise?

Many professors, including myself, spend significant time preparing for class instruction but don’t think as much about homework.  Sometimes homework is reading cases or rewriting essays.  Homework should further our goals within the class.  Being deliberate with each homework assignment can help support learning in the classroom.  Analyze:

  • Does the homework flow with the class discussion?
  • Is there good formative assessment in the homework?
  • Did the homework integrate spaced repetition?
  • Did the homework further the class discussion or improve skills?
  • Did the course assign so much written homework that the instructor couldn’t reasonably provide feedback on the work?
  • Was the instructor able to provide any feedback using homework?
  • Did students understand the homework’s purpose?

Now is the time to evaluate our courses and write down what we should change.  I forget the changes I want to make until I see the problem again the next year, so I start making notes and changes earlier.  My suggestions are not a comprehensive list.  The goal is continued evaluation to make courses better.  We can all do that.

(Steven Foster)

December 3, 2018 in Program Evaluation, Teaching Tips | Permalink | Comments (0)

Monday, October 8, 2018

Japanese Math Teachers Could Be a Model for Improving Legal Education

Sputnik changed teaching forever.  Falling behind the Soviet Union in the race to space caused people throughout the US to evaluate how we were teaching science and math.  Numerous theories ignited new thinking, and many individuals wanted the US to be the world leader in technology.  Unfortunately, we never fully realized our potential.  The US continually lags behind on international math exams, and we are at fault.

Japan is widely seen as the technology innovator.  It continually scores higher than the other countries on the international math exam.  Japan uses a unique form of teaching focused on a single problem, but the hardest pill to swallow is that Japan’s success is primarily built on the US theories developed after Sputnik.  The US failed to deploy the new theories throughout the country.  Japan capitalized on Americans’ work to produce a technologically advanced society.  Sputnik changed teaching, but unfortunately, the changes happened in Japan.  Now, we need to look to Japan to train our teachers.

Elizabeth Green describes the American failure and Japanese success in an article in the New York Times Magazine.  The Japanese practice of jugyokenkyu, translated as “lesson study,” could help law schools improve.  Jugyokenkyu is when “[a] teacher first plans lessons, then teaches in front of an audience of students and other teachers along with at least one university observer. Then the observers talk with the teacher about what has just taken place. Each public lesson poses a hypothesis, a new idea about how to help children learn. And each discussion offers a chance to determine whether it worked.”  Jugyokenkyu approaches teaching as a collaborative effort with feedback.

Obviously, law schools don’t need specifics for teaching math.  However, numerous reports, recommendations, and standards haven’t changed legal education.  Maybe it is time for law schools to embrace jugyokenkyu.

The foundation for jugyokenkyu is deliberate preparation with goals; performance for students and colleagues; and feedback from experts.  In ASP, we know that process works.  We tell students to take practice exams, seek feedback, and make changes for the next exam.  In LRW, professors tell students to set papers aside for a few days because individuals tend to read over errors in their own work.  If those statements are true for our students, then they are true for us.  We need feedback from someone who understands teaching law students to know whether our methods are working.  We will miss our own mistakes, just like reading over an error in a brief.  We need deliberate practice with feedback as much as our students do.

The amazing transformation of Japanese math teaching is an anomaly, but it is one we should attempt to replicate in legal education.  Theories, ideas, and published articles didn’t change America after Sputnik, so continuing that failed practice won’t change legal education.  I know I am saying this in a blog.  However, let’s consider how we can take steps to make lasting improvements to help our students.

My first suggestion is to work within our own law schools.  Find a group of professors who are determined to help students learn better.  Start small, with each person in the group deliberately planning a lesson.  The rest of the group observes the lesson, or someone can record the class for later observation.  Everyone should then meet and talk about the lesson.  If each person in the group does that twice during a semester, the evaluation and critiques would help everyone.

My next suggestion is to work with ASPers at other schools.  I know the quickest response to the last suggestion is “no one at my school would do that.”  While I believe there are at least a couple professors who want to improve teaching at every school, inter-school feedback can work.  We could create a TWEN page or page on the AASE site where we post videos of our teaching.  Others within the community could then watch and provide feedback.

ASPers posting lectures would provide an additional benefit for the annual conference.  We could see the lectures we hear about at AASE.  Presenters often talk about how they teach students a particular concept.  If that lecture were already posted, we could watch it prior to the presentation and have a deeper discussion of teaching.  We could also have round-table feedback sessions on the posted lectures.  As we change our area, we could talk about it in our law schools to get other professors on board.  We can spread jugyokenkyu throughout law schools.

We continually hear that legal education needs to change.  Similar to K-12 education, entities demand we use better practices.  Demands generally don’t lead to widespread change.  Feedback from experts, who are our colleagues, is how Japan became the best country in the world for math.  We should try a model that works instead of continually following the same failed practice.

(Steven Foster)

October 8, 2018 in Professionalism, Program Evaluation, Teaching Tips | Permalink | Comments (0)

Monday, September 17, 2018

Do Incentives Work?

“If you build it, he will come,” is a line from great American folklore, or just an ’80s sports movie.  Kevin Costner’s character in Field of Dreams hears one of the classic sports movie lines of all time.  He builds a baseball field, and players from the past (ghosts) come to play ball.  He didn’t invite or encourage any of the players.  They just showed up.

Many ASPers, including myself early on, have a Field of Dreams mentality for programs and workshops.  We build the most innovative workshop with great pedagogy.  We advertise a little so students know about it, and then, we expect everyone to show up.  Sometimes that works, but many times, the students who need the workshop the most aren’t in the room.  We then reevaluate to determine the best way to get at-risk students in the room.

As an early ASPer, my next idea was to bribe students to show up.  I thought if I raffled a nice item to students who attended most of the bar review workshops, students who needed help would show up.  I was right.  Over the next few years, I raffled iPads, full bar review scholarships, Apple TVs, and other new tech on the market.  Students who needed help showed up more.  I started reaching more students, but a huge problem arose.  Over 75% of the raffle winners failed the bar exam.  Each winner seemed to be cursed with a new gadget and no bar license.  Bribes produced my basic goal, but they did not produce the ultimate goal of helping students succeed.

I stopped offering incentives after a few years.  Law school budgets grew tighter, and I changed my program to include more for-credit offerings, so I wasn’t incentivizing attendance anymore.  I always wondered whether the incentives really failed or whether the low pass rate was a coincidence, since my sample size was small.  I didn’t think free items could possibly hurt someone’s chance of passing the bar.

Helping Children Succeed by Paul Tough provided a small glimpse into what may have occurred with my incentives.  Tough cites Roland Fryer’s research, in which he paid kids to do educationally beneficial activities like reading books.  Fryer concluded after 4 years that incentives didn’t change long-term student behavior or improve test scores.  Jonathan Guryan paid students to read books over one summer.  After the program, most students’ reading comprehension levels stayed the same.  Students who were high achievers prior to the study saw moderate increases in comprehension, but the most at-risk students didn’t improve.  The incentives failed to produce long-term educational improvement for both Fryer and Guryan.

The findings sound eerily familiar to my experience.  Students who needed help showed up, but they didn’t end up improving very much.  Most of my award winners failed the bar.  Tough would probably argue that while students are exposed to the material, the lack of intrinsic motivation to do the tasks in the first place makes long-term improvement unlikely.  Once the incentives cease, students stop working.  In the studies he cites, some students even adopted the mindset that work wasn’t worth completing unless it was rewarded.  The reward system didn’t work.

I watched that happen to my students numerous times.  The incentives or drawings stopped, so they stopped attending additional workshops.  They didn’t pay as much attention as they should have during the workshops, and many times, those students didn’t complete the work required for the bar.  Unfortunately, the incentives I tried did not lead to lasting improvement.

Simon Sinek’s marketing perspective may offer an additional answer to the incentive puzzle.  He discusses why companies need a “why” to inspire employees and build brand loyalty with customers.  He argues that constant discounts and coupons can generate some short-term sales, but discounts never make customers loyal enough to wait in line for hours for a brand-new phone that is full of glitches.  Discounts lead to commodification, and customers don’t become brand loyal to basic commodities.  Once the discounts end, customers find a new product.

Providing incentives for our programs can have the same commodification problem.  I believe the key to success in ASP is not getting students into workshops.  The key is getting students to take what we teach in workshops home and use it on their own time.  What students do when we aren’t looking has the biggest impact on their chances of success.  Students won’t be loyal to our program, vision, or idea if they are showing up for a t-shirt.  They won’t follow our lead if the only reason for showing up is winning an iPad.  Incentives run the risk of making our program a commodity, and students won’t do the necessary work outside the classroom if our program is a commodity.

Incentives may not always be bad.  Incentives to fill out surveys or complete simple tasks may not risk the same problems as the studies.  Providing food prior to an event can build relationships among the students.  If the incentive isn’t attempting a long-term behavior change, then the incentive is probably fine. 

The studies were also conducted on school-age children, so the applicability to adult learners may be limited.  Comparing these results to incentive studies for employees could help; some of those studies produced better results.

Raffles and drawings with great prizes seem like a great idea.  I thought the same thing and gave away thousands of dollars in prizes.  In my experience, the incentives didn’t work.  The recent research seems to indicate that long-term improvement requires internal motivation, which cannot be created by paying someone to study.

(Steven Foster) 

September 17, 2018 in Program Evaluation, Teaching Tips | Permalink | Comments (0)

Thursday, September 6, 2018

A Digital Detox Intervention Across France (and Potential Benefits for Learners)!

Radical.  Bold.  Ambitious.  And shocking too.  Until I read the research. But first, the country-wide experiment in learning...

As reported by CNN, starting earlier this month with the new school year, France has banned, and I mean completely banned, student cell phone use on all primary, middle, and high school campuses throughout the country, for the entire school day (lunch included):  https://www.cnn.com/france-smartphones-school-ban-intl/index.html 

As detailed by CNN, there's research to back up the educational benefits.  The research evaluated the relationship between cell phone use and academic achievement for 130,000 UK students.  The researchers "found that following a ban on phone use, the schools' test scores improved by 6.4%. [And,] [t]he impact on underachieving students was much more significant -- their average test scores rose by 14% (emphasis added)."  https://money.cnn.com/smartphones-schools-ban/index.html.  Citing research authors Dr. Richard Murphy and Dr. Louis-Philippe Beland, CNN reported that just by prohibiting cell phone use in schools, "[s]chools could significantly reduce the education achievement gap...."  

That's big news, and it ought to make a big splash in legal education, because the research suggests that a low-tech solution might also help law schools narrow the achievement gap for those most at risk of not doing well in law school.  So, as you meet with students who are struggling this semester, you might ask your learners about their cell phone habits.  No need to be pushy.  Instead, just show them the research and then let them make a decision.  http://cep.lse.ac.uk/publishedresearch   

Based on my own review of the research, here's my recommendation to my students:  "For one week, just leave your mobile phone at home...or in your school locker...or tucked away with the power off in your backpack.  Even if it doesn't lead to better learning, you'll find that you'll quickly quash those never-ending furtive glances at your phone to see if someone has tried to connect with you.  And, more importantly, you might find that you are actually making better connections with the materials (and with others) by not connecting to the digital world while at law school.  In short, you might reap the same educational benefits as those documented in the UK."  That's a great educational goal for all of us.  (Scott Johns).

 

September 6, 2018 in Advice, Encouragement & Inspiration, Learning Styles, Program Evaluation, Study Tips - General | Permalink | Comments (0)

Tuesday, August 7, 2018

Please help! It's Not Too Late to Complete the AASE Survey!

Law school contacts who had not completed the AASE survey for their law schools prior to the AASE conference were emailed in June with information about the restructuring of the survey to make it easier to complete. The new deadline was set for 11:59 p.m. on Friday, August 10, 2018. All schools should still use 2017-2018 information to complete the survey.

Thank you to the 60 law schools that have completed the survey already! A list of the schools that had completed the survey by Wednesday, August 1st is below. If your school completed the survey after that date and is not listed, thank you as well!

If you do not see your school on the list of completed schools, please ask the ASP/bar person at your school to complete the survey. The data collected will be most useful if a high number of law schools complete the survey.

Remember all information from the survey will be reported in the aggregate; no individual school's information will be identified. Also, if there is a question that you are unable to answer, just leave that question blank and complete the remainder of the survey.

Problems in completing the survey? If your ASP/bar person has changed over the summer, we can re-send the survey to the new person if you notify Amy Jarmon of the change. If you have other problems or questions about the survey, we can also help you with those. Just contact Dr. Amy L. Jarmon at amy.jarmon@ttu.edu for any assistance you need to complete the survey for your school.

Best regards,

Amy L. Jarmon, Chair AASE Assessment Committee, Texas Tech School of Law

 

Law Schools Completed as of 8.1.18:

Albany

Benjamin N. Cardozo

Brigham Young

Brooklyn

Buffalo

California Western

Case Western Reserve

Catholic University

City University of New York

Cleveland Marshall

Elon

Florida International

Golden Gate

Gonzaga

Indiana - McKinney

Loyola - Los Angeles

New England

New York Law School

North Carolina Central

Northeastern

Northern Illinois

Oklahoma City

Pace

Quinnipiac

Rutgers - Camden and Newark

Santa Clara

Seattle

Seton Hall

Southern Illinois

Stetson

St. John's

St. Louis

St. Mary's

Stanford

Suffolk

Texas Tech

Thomas Jefferson

UC Davis

UC Irvine

UNT Dallas

U of Chicago

U Denver

U of Florida

U of Houston

U of Idaho - Boise and Moscow

U of Kansas

U of Louisville

U of Massachusetts

U of Miami

U of Minnesota

U of Nebraska

U of Nevada Las Vegas

U of New Mexico

U of North Carolina

U of Pittsburgh

U of Tennessee

Valparaiso

Washington U

West Virginia U

Whittier

 

 

 

 

August 7, 2018 in Miscellany, Program Evaluation | Permalink | Comments (0)

Monday, July 23, 2018

Planning for the Fall

July is almost over.  The hard work over the summer comes to an end, which means it is time to ramp up for the fall semester!  As one chapter closes, I will usher in a new 1L class and begin bar prep with the rising 3Ls.  I must have ignored the post from a few weeks ago about taking a break.

The last few days have not felt like the end of summer in Oklahoma with a triple-digit heat index every day, but I consider the bar exam the end of summer.  I will teach legal analysis to all the entering 1Ls and also a year-long bar prep class to rising 3Ls.  This will be my 4th year teaching legal analysis and my 10th year teaching a version of the 3L bar class.  After that many years, the easy route is to pull last year’s syllabi, change the dates, and post them for students.  However, I encourage everyone to consider adding something new.

Adding new items to a course or program seems daunting.  There are always more pieces than originally considered.  Between meetings, normal preparations, and taking a breath before the semester begins, adding something new seems difficult.  I have a couple suggestions that may help all of us do a little more this year.

  1. Schedule time for new ideas. We implore our students to schedule everything.  I encourage all of us to do the same.  Block out 30 minutes to an hour each day prior to school beginning.  Use that time to implement 1-2 new ideas.
  2. Look back through AASE materials. The great ideas from AASE sometimes get lost in the summer shuffle.  Make a deliberate effort to look at those materials for new ideas.
  3. Check your sticky notes. This may be more for me, but when I think of new ideas, I write them down on sticky notes on my desk.  Looking through those may jog your memory of what to do.  I also write down activities that didn’t work as well or slight modifications needed for class.  Keeping a running list is helpful because remembering a year later is difficult.
  4. Choose something small. You don’t have to transform your class, workshops, or department in one semester.  Most of us tell students to get gradually better through practice.  Getting 1% better every day makes a huge difference in the long run.  The same is true for our courses and workshops.  A little better each time will make a huge impact.
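The compounding behind that last suggestion is easy to check.  As a purely illustrative sketch (the numbers are hypothetical, and the function name is mine, not from any study), a few lines of Python show why 1% a day adds up to far more than 365%:

```python
# "1% better every day" compounds: after a year the gain is
# roughly 37x the baseline, not a mere 3.65x.
def compound(daily_rate: float, days: int) -> float:
    """Growth factor after `days` of improving by `daily_rate` per day."""
    return (1 + daily_rate) ** days

factor = compound(0.01, 365)
print(f"1% better each day for a year: about {factor:.1f}x the baseline")
```

Even a much smaller, more realistic rate of improvement compounds meaningfully over several semesters, which is the point of starting small.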

The last few weeks before classes begin are normally a mad dash to get everything ready.  Try to spend a little time adding a few new ideas to make the coming year just a little better.  Enjoy the next few weeks.

(Steven Foster) 

July 23, 2018 in Program Evaluation, Teaching Tips | Permalink | Comments (0)

Saturday, May 5, 2018

What is AccessLex Institute?

There has been a good bit of buzz about AccessLex, a nonprofit in Washington, D.C. AccessLex has been mentioned in some postings on the listserv in the past months. Kirsha Trychta recently posted on the Blog about her takeaways from the AccessLex bar exam research forum; the link to her post is here.

You may remember Sara J. Berman, the Director of Programs for Academic and Bar Success at the Center for Legal Education Excellence at AccessLex Institute. Sara wrote the book published by the ABA, Pass the Bar Exam. Sara was part of the ASP/bar law school community for a number of years having worked at Nova Southeastern and at Whittier. She started her position at AccessLex this spring and is working hard to bring to the forefront issues that concern the ASP/bar profession.

The URL for the AccessLex Center for Legal Education Excellence is https://www.accesslex.org/accesslex-center-legal-education-excellence.

There is a new Bar Success Research Grant Program accepting letters of inquiry May 1-31, 2018: https://www.accesslex.org/bar-success-grant-program. See the website or the May 1st posting to the ASP listserv for more information.

And there has even been a recent job posting for an academic and bar success research analyst at AccessLex: https://accesslexinstitute-openhire.silkroad.com/epostings/index.cfm?fuseaction=app.jobInfo&version=1&jobid=76.

There seem to be a number of potential resources for the ASP/bar profession that AccessLex can provide. (Amy Jarmon)

May 5, 2018 in Academic Support Spotlight, Jobs - Descriptions & Announcements, Program Evaluation | Permalink | Comments (0)

Monday, April 30, 2018

Modeling Failure

The best plans don’t always work out as intended.  Trying something new with a course or activity may sound groundbreaking.  However, the reality is that sometimes it doesn’t work.  Students may dislike the program and not engage in the work, or the message may not click with them.  Our response to those difficulties can help train our students to overcome similar setbacks.

I had one of those groundbreaking failures this year.  I planned to create super-learners.  I completely agree with Louis Schultz’s arguments in his article and have implemented similar programs throughout my tenure at OCU.  The art of learning can make a huge impact on students, and the earlier students understand how to learn, the better they can perform in school and on the bar.  I took that idea a step further.  I heard presentations and read articles about Millennial students.  One tidbit I latched onto was the notion that Millennials won’t do what they are told “because I said so”; they want to know why they are told to do something.  I knew I could provide them that information, so I started planning to assign learning articles.

I teach Legal Analysis to every 1L.  I found good articles about spaced repetition, testing effect, reading on a screen, self-regulated learning, mindfulness, and growth mindset.  I thought reading the articles combined with short discussions and activities related to those topics would produce better learners that remembered significantly more than ever before.  I was wrong.

Students despised the new readings.  To be fair, I chose longer articles that took a while to read.  Legal Analysis is a 1-credit-hour, credit/no-credit course, so students felt the reading load was disproportionate.  My philosophy was that the reading benefitted them and provided the why when I told them to start outlining early in the semester or to study a certain way.  However, the students were probably correct.  The reading was long, so many of them didn’t do it.

In essence, my new idea and integration failed.  I am sure that happens to everyone.  However, our response to our own failures is the best way to model improvement to our students.  As a former type A law student who did well in law school, I don’t handle being wrong very well (or at all really).  My frustration was that I knew the science, which is clear that certain activities are best for students.  Anecdotally, I have seen our best students use these methods for years.  From a learning science perspective, I did know more than most of the students, as do many of you.  That knowledge doesn’t matter though if the students don’t receive or internalize it.  Being substantively correct doesn’t help students succeed if they ignore the message.  Frustration or complaints about students not showing up to sessions, doing the reading, or putting in the effort are legitimate, cathartic, and unproductive.  If we want students to overcome their failures, creating a new solution can model that behavior.

Constant improvement is critical to success in law school and the practice of law.  We all know that is true in Academic Support as well.  New students, research, and technology make change inevitable.  I will rely on much shorter articles or more excerpts next year to decrease the amount of reading.  I will utilize more of the learning science during the spring after students receive a set of grades and realize they need help.  My hope is to balance the need to convey the information with the willingness of students to acquire the information.

My planned changes will help the new group of 1Ls but also show the 2Ls that their opinion matters.  I ask students every July to analyze their own BARBRI MBE report to find improvement areas before the bar.  They are much more likely to follow that advice if they already saw me make changes based on their experience and suggestions.  Modeling improvement can encourage others to also seek improvement, which can make a huge difference in whether some students succeed.

(Steven Foster)

April 30, 2018 in Advice, Program Evaluation, Teaching Tips | Permalink | Comments (0)

Tuesday, October 17, 2017

Surveying the Room of Requirement

During the first week of class I asked my students if they had any lingering questions that weren't resolved during Orientation. Several students inquired, "Where is the student lounge?" Admittedly our student lounge is somewhat difficult to find, with the entrance tucked between two vending machines on the second floor. I gave them directions and then jokingly described the student lounge as a place that only appears to those law students who already know of its whereabouts—which incidentally helps keep the room secreted from non-law students looking for a cool new spot to relax. Students aptly pointed out that I had also inadvertently described a key aspect of the Room of Requirement, a magical all-purpose space that featured prominently in the latter half of the Harry Potter series.

[Sidenote: For those non-magical folk who aren’t familiar with Harry Potter, the Room of Requirement “only appears when a person has real need of it – and always comes equipped for the seeker's purpose. Any purpose.” For example, the Room of Requirement took the form of a bathroom for the headmaster when he was most in need, a training facility for Harry and the other members of his Army, and a storage room for many other students wishing to hide certain nefarious objects.]

The Potterheads were right, but if I had to pick the real Room of Requirement within the law school, it would undoubtedly be the Academic Excellence Center, especially in October. We never know who is going to walk through our door or what issue, question, or request they might bring with them. Just last week we fielded questions about academic advising, studying for midterm exams, debriefing after midterm exams, outlining, time management, moot court, legal writing, seminar papers, mental health resources, financial aid, new attorney swearing-in ceremonies, and summer employment, just to name a few.

I believe that my colleagues, while supportive of the Center, don't really comprehend the varied roles that academic support professors play in the law school at any one time. To better capture the ever-evolving list of activities within the Center, we recently installed a Survey Kiosk. The kiosk is actually an iPad mounted on a chest-high stand near the door to the Center.  The iPad is locked using Apple's Guided Access feature so that visitors can only access one webpage, namely the survey link.


We then created a 15-second survey that relies heavily on skip logic. We now ask everyone to complete the survey following their visit to the Center.  We also posted the survey link to our Facebook page, just in case someone forgets to complete the questionnaire before leaving the Center.  The survey allows us to quickly capture the following information about each visit:

  • Visitor’s class year (prospective student, 1L, 2L, 3L, or graduate)
  • Who they visited within the Center
  • Whether the meeting was a walk-in or by appointment
  • Nature of the visit, i.e. the topic that was discussed
  • Overall usefulness of the meeting, rated on a Likert Scale; and
  • Any additional comments 

In just two months, we have received roughly 200 real-time responses. This data has already allowed us to track which days of the week and weeks within the semester generate increased foot traffic, how well the Dean’s Fellows and Peer Writing Consultants are connecting with their classmates, and which services are used most heavily. Unsurprisingly, 1Ls continue to make up the bulk of our client base. But we anticipate a sharp increase in 3L foot traffic in the spring semester, when the 3Ls turn their attention to applying for and sitting for the bar exam.
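For anyone curious how this kind of kiosk data can be summarized, here is a minimal sketch in Python. The field names and sample rows are hypothetical illustrations, not the Center’s actual survey schema:

```python
from collections import Counter
from datetime import datetime

# Hypothetical survey responses; in practice these would be exported
# from the survey tool. Field names here are assumptions.
responses = [
    {"timestamp": "2017-10-02 10:15", "class_year": "1L", "usefulness": 5},
    {"timestamp": "2017-10-02 14:40", "class_year": "1L", "usefulness": 4},
    {"timestamp": "2017-10-03 09:05", "class_year": "3L", "usefulness": 5},
    {"timestamp": "2017-10-04 11:30", "class_year": "2L", "usefulness": 3},
]

# Foot traffic by day of the week
by_weekday = Counter(
    datetime.strptime(r["timestamp"], "%Y-%m-%d %H:%M").strftime("%A")
    for r in responses
)

# Visits broken down by class year
by_year = Counter(r["class_year"] for r in responses)

# Average usefulness rating (Likert scale, 1-5)
avg = sum(r["usefulness"] for r in responses) / len(responses)

print(by_weekday)   # Counter({'Monday': 2, 'Tuesday': 1, 'Wednesday': 1})
print(by_year)      # Counter({'1L': 2, '3L': 1, '2L': 1})
print(avg)          # 4.25
```

The same tallies extend naturally to walk-in versus appointment counts or per-consultant totals.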

This real-time kiosk system will replace our end-of-semester evaluation, which historically has suffered from low response rates. The data should also be immensely helpful when we are tasked with completing annual Faculty Activity Reports and Performance Reviews next summer. Previously, we relied on a much less empirical system, consisting primarily of fuzzy memories, email inbox search results, and painstaking calendar reviews.

All in all, the Survey Kiosk has been a successful experiment thus far. If you’re interested in doing something similar at your institution, you can purchase a basic iPad and stand for under $1,000.00—making this an ideal project to submit for a technology grant, especially in light of its relatively low cost and easy implementation. Finally, we are also happy to share our survey setup with you; just ask. Unfortunately, we can’t post the survey link here for you to view, because all of your curiosity clicks would create false responses in the data.  (Kirsha Trychta)

October 17, 2017 in Program Evaluation, Television, Web/Tech | Permalink | Comments (0)

Monday, October 9, 2017

Dare to Disclose?

The counseling field has often highlighted the benefits of some personal disclosure from therapists to their clients. Cited benefits include increased trust and rapport, as well as validation of the clients’ experiences.

Join me this week at the Inaugural Diversity Conference for the Association of Academic Support Educators (AASE) in Baltimore, Maryland, for a moderated discussion on the benefits of academic support professionals sharing personal stories and struggles with their students.

Participants will be encouraged to share their experiences relating to diversity and inclusion or to their law school experience in general. These may be personal stories or struggles, or stories about students that the participants have worked with in their capacity as academic support professionals. As presenters and participants share their stories, the “listening” participants will be modeling and reviewing some of the same active listening skills and nonverbal behaviors that academic support professionals should engage in when they work with students in individual or group conferences.

Hope to see you in Maryland! (OJ Salinas)

October 9, 2017 in Advice, Disability Matters, Diversity Issues, Encouragement & Inspiration, Learning Styles, Meetings, Miscellany, News, Professionalism, Program Evaluation, Stress & Anxiety, Teaching Tips | Permalink | Comments (0)

Monday, September 18, 2017

1L Enrichment Groups

I am having an Enrichment Group Leaders training meeting today at noon. So, I have enrichment groups on my mind (hence, the blog post!). Perhaps many of you are also working with enrichment groups or are thinking about developing them. I am sure many of us would love to chat and learn more about our various programs and how we can continue to best serve our students. We can continue the conversation via email or on Twitter (tweet me @ojsalinas, and use #lawschoolASP).

Like many law school academic success programs throughout the country, we provide an opportunity for our 1L students to get additional training and support from upper level students. One way that we provide this opportunity to our 1Ls is through participation in Enrichment Groups.

Every 1L student at Carolina Law is invited to participate in our Academic Excellence Program Enrichment Groups. These groups are run by upper-level law students who have done well academically and have shown an aptitude for mentoring and meeting with students. 1Ls are assigned to their groups based on their 1L professors, and the groups are “tied” to two of the 1L casebook classes—with one upper-level student “Enrichment Group Leader” often taking the lead on one of the two casebook classes.

The groups typically meet once a week for about 50 minutes starting in late September. During the meetings, the groups alternate between ASP topics related to each of their two casebook classes. These topics change as the 1Ls advance through the semester. So, the initial meeting may simply focus on developing rapport within the group and identifying each member’s goals for participating. The next meetings may focus on taking notes and case reading for the particular casebook classes. Later meetings may introduce outlining and the use of study aids to help review practice questions related to the casebook classes. And, finally, we try to end the semester with a practice exam for each of the two casebook classes.

We generally receive strong positive feedback from our 1Ls on the Enrichment Groups. Students typically feel that the groups are a great way to get additional support and guidance in their classes. They also like that these study groups are voluntary and already formed for them—the students don’t have to worry about not getting “chosen” or “asked” to join a particular study group.

As I mentioned, I am having a training session for our Enrichment Group Leaders this afternoon. One thing that we try to emphasize with our leaders and their group participants is that the leaders are not “tutors.” They are not there to teach the 1Ls the substantive law, and they certainly don’t replace their law school professors. While the leaders have done well in the casebook class that they are “leading” (and, many of them actually had the same professor for that particular casebook class during their 1L year), our Enrichment Group Leaders are there to help facilitate learning. They are there to provide further support for our students. They are there to “enrich” the students’ 1L academic experience. And we believe a more enriched 1L experience is a better 1L experience. (OJ Salinas)

September 18, 2017 in Advice, Diversity Issues, Encouragement & Inspiration, Exams - Studying, Learning Styles, Meetings, Miscellany, Program Evaluation, Reading, Stress & Anxiety, Study Tips - General, Teaching Tips | Permalink | Comments (0)

Monday, August 28, 2017

Energized After Pre-Orientation

I have returned to some normalcy after the conclusion of our two pre-orientation programs.

Our Legal Education Advancement Program (“LEAP”) is a voluntary pre-orientation program available to every incoming 1L student at Carolina Law. Faculty members participating in LEAP help students transition to the study of law by introducing them to a variety of topics, including jurisprudence, case briefing, exam writing, and the Socratic class. We had 56 incoming 1Ls who chose to participate in our first LEAP session a week and a half ago. We had another 47 incoming 1Ls who chose to participate in our second LEAP session last week. The total was nearly half of our incoming 1L class!

I am sure many ASP folks will agree that it can be an interesting feeling running these pre-orientation programs: it’s weirdly both draining and energizing. You can feel really drained from the immense amount of work that goes into preparing for and delivering the program. Yet, you can also feel energized when a new set of students enters your law school building. You feel a certain thrill and special motivation knowing that you get to be a part of the start of the students’ successful transition into the study of law. You know that your students are going to do great things during and after law school, and you are lucky to help train them on this wonderful marathon. Seeing light bulbs start to go off in your students’ minds during your programming, and receiving positive responses from faculty, staff, students, and administrators are icing on the cake.

Like many of you, I had a great group of folks who helped out during our pre-orientation programs (many of whom I thanked and tweeted about @ojsalinas). I also appreciated how many faculty, staff, and administrators came out to meet and have lunch with our LEAP students.

Wishing everyone a great start to another academic year!

-OJ Salinas

[Photo: Ready to Go]

August 28, 2017 in Advice, Diversity Issues, Encouragement & Inspiration, Miscellany, Orientation, Program Evaluation, Stress & Anxiety | Permalink | Comments (0)

Tuesday, August 15, 2017

ASPers Should Consider the Southeastern Association of Law Schools (SEALS) Conference

I recently went to the Southeastern Association of Law Schools (SEALS) conference for the first time. SEALS is different from most (all?) other conferences that I have attended as an academic support professor. Although the conference is not specifically focused on academic support, SEALS has a variety of sessions that will interest any ASPer, including legal writing topics, effective teaching strategies, formative assessment techniques, balancing dual administrative and faculty appointments, and the like. Plus, if you also focus on a doctrinal area, SEALS has numerous sessions for that too. (You can view the full 2017 schedule here.)

SEALS consists primarily of three presentation formats: (1) panel presentations, (2) roundtables, and (3) moderated discussion groups. The panels consist of three or four structured, job-talk-esque presentations followed by a question-and-answer session. While intriguing and thoughtfully presented, the panels are not what makes SEALS a draw for attendees. Meanwhile, the roundtables function much like a typical “What I Wish I Would Have Known” event during a law school’s orientation week. For example, I attended a roundtable discussion where a dozen new professors were able to chat with current and former law school deans about what a typical dean expects of newer professors.

The most interesting format, however, is the moderated discussion group. The moderator invites roughly 10 different individuals to pitch their projects or ideas, all of which are at varying stages of development. Each pre-selected “discussant” talks for 5-10 minutes, and then the other attendees ask questions and provide feedback in a very low-stakes, supportive environment. This continues for two or three hours. Most discussion groups encourage discussants to focus on a pre-selected theme, but the conference rules tend to be loosely enforced in a way that encourages innovation and brainstorming. Anyone can attend a discussion session and participate in the responsive comment period, but if you want to guarantee yourself a few spotlight minutes to pitch your idea, then you should get on the discussant list by reaching out to the moderator. I attended several discussion groups and even got to pitch an idea at one session, despite not being on the pre-selected list, simply by emailing the moderator a few days before the event. A pre-selected discussant had to cancel at the last minute, and I was permitted to use their designated slot. I was told my email strategy (which was suggested to me by a seasoned SEALS participant) is somewhat common at SEALS. Thus, I encourage you to consider the same approach if you find yourself at SEALS without a specific invitation to speak.

Another feature that makes SEALS unique is the family-friendly atmosphere. Likely because SEALS is hosted in a warm-weather, beachy environment, many attendees opt to bring their friends and families. In fact, SEALS actually encourages guests by providing each person with an official conference name tag and an invitation to numerous receptions throughout the week.

Lastly, if I were asked to describe SEALS in a word, I would say “relaxed.” Few attendees attend all of the sessions; rather, most balance work and play very nicely. There is no pressure to attend the entire event. The conference is long enough (10 days) that you can pick the few days that interest you most. SEALS planners even send all participants a special link to a Crowd Compass app to encourage everyone to create their own personal conference itinerary. The app allows you to set session reminders, prompts you with presenters’ names, and lets you search for other attendees. All in all, SEALS was a nice break from the more traditional academic conference. (Kirsha Trychta)

August 15, 2017 in Meetings, Program Evaluation, Travel | Permalink | Comments (0)

Tuesday, July 18, 2017

ILTL Cultural Competency Takeaways, 2 of 2

This blog post is the second in a two-part series detailing my takeaways from the Institute for Law Teaching and Learning’s “Teaching Cultural Competency and Other Professional Skills” conference which was held in Little Rock, Arkansas on July 7-8, 2017.  For part one of the series, click here.

Professor Andrew Henderson from the University of Canberra in Australia discussed “The Importance of Teaching Self-Evaluation and Reflection in Law School,” especially in an ethics course.  One study revealed that when presented with an ethical dilemma, law students tend to resolve the dilemma consistent with their personal beliefs and without regard for the professional rules of conduct.  The students answered the same question the same way before taking an ethics course, while taking the course, and after successfully completing the course.  In other words, professional responsibility courses do little to teach ethical judgment-making skills.  Knowing this, Professor Henderson sought to design a course that would reframe the discussion entirely.  He required students to identify their internal motivations: what makes them get up in the morning, what keeps them awake at night, and why they want to be a lawyer.  He then used the students’ responses to jumpstart a conversation and to identify the intersection between the students’ self-identified motivations and the ethical rules.  He reported that students have become more engaged in the ethics course and that the student responses have also helped him provide more targeted academic advising and job placement advice.  At the end of the discussion, a few attendees discussed how a similar exercise could be added to the start of the 1L year to assist academic support professors in providing more tailored advice to at-risk students.

[Photo: Professor Henderson]

Professor Benjamin Madison of Regent University School of Law and his colleagues developed a course to “Help[] Millennials Develop Self-Reflection.” The mandatory 1L class focuses on the development of problem-solving skills, emotional intelligence, responsibility, and “other” ABA-mandated skills.  To begin, students may request a specific faculty coach; the school makes every effort, but does not guarantee, to match students with their top choice.  Next, students meet with their designated faculty coach to complete an intake self-assessment or “roadmap.”  After the student completes the self-assessment, two peers assess the student on those same skills.  Professor Madison has already noted several trends at his school.  First, 1L students frequently rate themselves quite high (i.e., mastery level) despite having little to no professional development training, and students rate their peers even higher.  Essentially, students “don’t know what they don’t know”—a phenomenon commonly referred to in psychological circles as the Dunning-Kruger effect.  Second, students gravitate toward peers who unequivocally support them, rather than peers who challenge them and hold them accountable.  Lastly, students are more concerned about obtaining meaningful employment than making a sufficient income, which is especially intriguing when you consider internal motivation as a component of self-reflection.  (As an aside, their research concluded that the primary professional goal for 1L students is to pass the bar exam—whew!)  Professor Madison said that if other schools are interested in adopting a similar program, they should reach out to the St. Thomas School of Law Holloran Center, which “continues to focus on its mission to help the next generation form professional identities with a moral core of responsibility and service to others.”

[Photo: Professor Madison]

Professor Christine Church of Western Michigan University’s Thomas M. Cooley Law School immerses students in lawyering skills.  Her nine-credit program is centered on all-day classes that simulate a law practice environment.  During the 14-week semester, four distinct four-person law firms handle three cases: (1) a custody battle requiring intense interviewing, counseling, and negotiation skills, (2) a personal injury suit involving pretrial litigation skills, and (3) a DUI criminal trial.  The clients are actually other law students who are completing a 1-credit directed study, relying on the principles discussed in the book “Through the Client’s Eyes” for guidance.  The “attorneys” within each firm exchange documents throughout the week using Google Docs and then meet on Saturdays to engage in simulation exercises.  Professor Church commented that the unique course schedule—which is ABA Standard 310 compliant—has helped students develop the stamina needed to study for the bar exam and to actually practice law on a daily basis.  The program now has a waitlist; students love it!  She concluded the session by sharing a plethora of fact patterns, grading rubrics, and syllabi to assist participants in establishing their own litigation-skills immersion programs.

[Photo: Professor Church]

After Professor Church’s session, I enjoyed a tasty Greek salad lunch. In my view, a good indicator of the quality of a conference is the quality of the breaks.  ILTL did not disappoint.  Not only was the host school welcoming and attentive, but all the attendees were more than willing to offer helpful suggestions at every turn—well beyond the theme of the conference.  Many thanks to those who shared teaching tips, performance review and tenure advice, and general support to this junior faculty member.  And, let me extend a special shout out to one colleague’s pet squirrel!

Before I wrap-up, let me share the most bizarre tidbit I heard while in Little Rock.  One professor explained that one of her students genuinely believes that some version of the following conversation occurs routinely at her law school—Professor A to Professor B: “When Mary comes to your office to discuss her exam, tell her that her poor grade is due to an underdeveloped rule block.  And, when you meet with John, tell him that he needs to work on his application.  That’s what we’re all going with this semester.”  The student came to this epiphany after every single one of her professors targeted the same exact exam skill for improvement. Feel free to insert the emoji of your choice here.

I wish I could tell you about all the concurrent sessions, but unfortunately my J.K. Rowling-approved Time-Turner is not TSA approved.  I heard chatter in the hallway suggesting that I missed several good sessions, but as author Ashim Shanker has noted, “freedom brings with it the burden of choice and of its consequences.”  For those who are interested in learning more about the other sessions or about the Institute for Law Teaching and Learning’s larger mission, check out the Institute’s webpage.  (Kirsha Trychta)

July 18, 2017 in Meetings, Program Evaluation, Teaching Tips | Permalink | Comments (0)

Tuesday, July 11, 2017

ILTL Cultural Competency Takeaways, 1 of 2

I attended the Institute for Law Teaching and Learning’s “Teaching Cultural Competency and Other Professional Skills” conference on July 7-8, 2017 at the William H. Bowen School of Law in Little Rock, Arkansas.  The conference opened with a quick sticky dot poll of the attendees.  The dots revealed that while most professors felt comfortable teaching skills like trial practice, negotiations, and document drafting, only a few were confident in their ability to teach cultural competencies in the classroom.  In an attempt to ameliorate this (real or perceived) deficiency, the approximately fifty attendees—a surprisingly even mix of doctrinal, clinical, legal writing, and academic support professors—worked collaboratively for two days to develop a portfolio of concrete exercises to satisfy ABA Standard 302.  What follows are some of my major takeaways:

The first suggestion was to “[Bring] Marginalized Populations into the [Legal Writing] Classroom.” Elon Law Professors Thomas Noble, Patricia Perkins, and Catherine Wasson explained how they each drafted a legal writing factual scenario involving a potentially unsympathetic and culturally diverse plaintiff: an Egyptian immigrant, a convicted felon, and a mentally ill survivalist, respectively.  These plaintiffs’ legal claims were then further complicated by the intentional inclusion of gender-neutral names, ethnic-sounding names, ambiguous facts, and words with strong connotations (think: “fetus” versus “the child”).  These professors crafted case files that not only required students to learn the mechanics of legal writing, but also forced students to confront their biases in a thoughtful and controlled way.  Occasionally students made unwarranted assumptions, which allowed the class to discuss the importance of cultural sensitivity and implicit bias.  Other students wanted to “help” by taking action contrary to the client’s expressed desires, creating a great opportunity to talk about the ethical complexities of being a counselor-at-law.  The presenters reported that many students came to realize that there might not always be a “right” answer, especially when dealing with legal issues that intersect with human dignity and diverse cultural norms.

[Photo: the Elon professors]

Next we discussed the importance of “Building [a] Student[’s] Capacity for Self-Evaluation” with the use of a robust “soft skills” rubric.  Before the presenters shared their rubric, Professors Lauren Onkeles-Klein and Robert Dinerstein used Mentimeter’s in-class polling software to highlight that professors view self-assessment as an opportunity for student “reflection,” but students view self-assessment exercises as “painful busywork”—regardless of whether the assessment process occurs in a doctrinal class, legal writing course, or the clinical setting.  The question then became: how do we shift student mindsets about self-assessment?  Their response was to create a rubric that establishes expectations early and often, introduces a common language around measuring skill, and reframes the connection between self-assessment and grades.  Professor Dinerstein discussed the rubric’s evolution from a one-page outline to an unwieldy 15+ page document, before he finally settled on a streamlined 10-page student self-assessment form, which borrows heavily from assessments commonly used in medical residency training.  Throughout the academic year, supervising professors repeatedly remind students that the goal is “competence,” not “mastery,” during law school.  The current form also highlights long-term patterns within the individual student’s self-assessment, clarifies conflicts between student partners, and frequently invites a dialogue about the importance of teamwork in a law firm setting.  The presenters reported that students do, in fact, get better at self-assessment over time through the interactive and frequent assessment process.  Anyone interested in reviewing, or possibly adopting, the presenters’ rubric handouts is invited to reach out to the authors directly for permission.  (Sorry about the sideways picture below; I am still learning the blog-posting ropes.)

[Photo: Professors Onkeles-Klein and Dinerstein]

After a delicious taco lunch break, we went back to work “Grow[ing] Future Lawyers in the Image of ABA Standard 302…”.   Three professors from Western Michigan University’s Thomas M. Cooley Law School explained how they successfully embedded the same acquaintance-rape fact pattern in all three years of law school. In Professor Tonya Krause-Phelan’s 1L criminal law course, students learned the elements of rape before conducting an in-class jury trial. In Professor Victoria Vuletich’s 2L evidence course, students reexamined their 1L trial with fresh eyes, having now learned the rape shield laws. Then, when the students reached the 3L public defender clinic, Professor Tracey Brame set aside time to talk about the unique cultural sensitivities required to competently represent a defendant or victim in a sexual assault case.  Reusing the same factual scenario each year enabled the students to see the same story from a variety of legal angles.  In addition to reusing the same hypothetical, the three professors created a long-term structure of evolving course rules to better reflect the students’ growth from year to year.  During the first year, Professor “K-P” drafted and enforced detailed course rules, with no input from the students.  She was careful, however, to relate the classroom rules to the real practice of law, such as why it is critical to be able to take handwritten notes.  Then in the second year, the students were allowed to establish the classroom rules, including the sanctions for rule violations. For example, students opted to impose a “must bring treats” penalty on anyone who was late to class without good cause.  Then in the final year, the same cohort had to compare and contrast the rule-following required in 1L year with the rule-making privileges of 2L year.

[Photo: the Cooley professors]

CUNY School of Law Professors Deborah Zalesne and David Nadvorney offered suggestions on how to help “underprepared law students” acquire the “other” skills mentioned in ABA Standard 302(d).  Session attendees read a few pages of a Contracts case and quickly identified legal terms that could be troublesome for any first-year student.  The presenters then pointed out numerous non-legal terms (e.g., “paradigm” or “doctrine”) that also have the potential to hinder an underprepared student.  To combat this problem in their own classrooms, the presenters have made a conscious effort to introduce a new concept in the students’ everyday language before layering on the more professional vocabulary.  Avoiding lawyer-dominant language at the outset enables students to focus on the larger legal framework (i.e., to think big) without getting bogged down in the line-by-line details of the case.  Then they systematically work through the case with the students, helping them to understand each line and each new term.  The presenters also stressed the importance of being sensitive to students’ wrong answers.  In short, taking the time to mentor these students at the start will allow them to make larger long-term gains during the semester.

[Photo: the CUNY professors]

I attended several other sessions as well; I’ll give you the details in part two of this two-part series.  Coming soon!

(Kirsha Trychta)

July 11, 2017 in Meetings, Program Evaluation, Teaching Tips | Permalink | Comments (0)

Tuesday, May 9, 2017

Study Questions Student Recognition of Good Teaching

Today The Chronicle of Higher Education references a study of nearly 340,000 mathematics students at the University of Phoenix that questions whether students can recognize good teaching. The link to the post is here: Student Evaluations Study.

May 9, 2017 in Miscellany, Program Evaluation | Permalink | Comments (0)

Sunday, December 11, 2016

The College Experience: Why Students Lack Critical Thinking Skills

Faculty and academic support professionals often comment that students enter law school without solid critical thinking skills. An Inside Higher Ed post by Ben Paris considers why colleges fail at teaching students critical thinking skills: Failing to Improve Critical Thinking.

December 11, 2016 in Learning Styles, Program Evaluation | Permalink | Comments (0)

Sunday, July 10, 2016

Threat to ABA Accreditation Powers

Hat tip to Mark Wojcik, John Marshall Law School (Chicago), of the Legal Writing Prof Blog for providing the link to an ABA posting about this matter. The link is: here.

July 10, 2016 in Miscellany, Program Evaluation | Permalink | Comments (0)

Friday, September 11, 2015

If You Build It, They Will Come

Maybe.

All of us in academic support and bar prep offer a variety of resources to our students. At times it is discouraging when fewer students than we hoped take advantage of a particular service.

But wait. Do we need 100% participation for an event or resource to have a positive impact? Sure, it is great if we can have mandatory programs. But few of us have that luxury for all students; usually only a portion of our students are required to attend.

Some students will complain that they are adults and argue against mandatory events. They would argue it is their choice to decide what to attend, what to access online, what to pick up as a hard-copy packet, or what to hit the delete button on. Until their grades flip them into the narrow mandatory category of at-risk/probation, these students want to decide independently on their academic actions - not just whether to use ASP or bar prep resources but whether they will read for class or go to see a professor for assistance.

Mandatory versus voluntary is an on-going question because the students who most need to use resources often are the ones who do not use them. We all have students on probation who comment that they wish they had used resources the prior semester/year/years. The reasons why they did not use resources run the gamut: thought they were doing fine; thought everyone else needed the help but not them; did not like the day/time the workshop was held; forgot about the resources; had boyfriend/girlfriend/family/medical/work/other issues; could not find the office; did not want anyone to know they were struggling; were just lazy.

ASP'ers offer a variety of resources and formats to provide services in ways that might appeal to different learners and student needs. Below are just a few of the common options we offer:

  • Voluntary summer programs
  • Mandatory summer programs
  • On-line summer programs
  • Live workshops
  • Videoed workshops
  • For-credit courses - voluntary or mandatory
  • Non-credit courses - voluntary or mandatory
  • Writing across the curriculum with an ASP component
  • Mandatory study groups
  • Voluntary study groups
  • Upper-division teaching assistants/teaching fellows/tutors
  • Facebook information
  • Twitter information
  • Internet and intranet web pages
  • Email study tips
  • Official law school announcements
  • Stand-alone ASP/bar prep workshops
  • Workshops with student organization co-sponsors
  • Workshops with bar review company co-sponsors
  • Electronic packets of topical information
  • Hard copy packets of topical information
  • PowerPoint slide shows
  • Formats with exercises, pair-and-share, and more
  • Student panels on topics
  • Faculty panels on topics
  • Podcasts
  • Blogs
  • Links to Internet resources
  • And more

Boosting attendance? Food bribes work well until the budgets are cut (or students complain about too much pizza). Door prizes work well until the swag becomes same old-same old. And so forth.

So, here is the reality. 100% is not the only measure that matters. Having a positive impact for the students who choose a particular format/resource is legitimate. By providing options for a variety of consumers, we reach students where they are and when they want to partake.

My survey last spring on academic success resources reminded me that more students use resources each day than I may realize. There are a lot of "silent consumers" out there who use digital/hard-copy packets and intranet/email resources; they just are not as visible as those who want appointments or attend workshops. The survey registered their appreciation for academic success services. It was a good reminder that options are important, and that the impact on each individual student through less visible methods is just as important.  (Amy Jarmon)


September 11, 2015 in Miscellany, Program Evaluation | Permalink | Comments (0)

Wednesday, January 14, 2015

Southwestern Consortium of Academic Support Professionals Workshop March 6th, 2015

3rd Annual Southwestern Consortium of Academic Support Professionals Workshop

March 6th, 2015


Assessing Students and Programs to Develop

a Targeted Approach to Academic Support

at

Texas A&M’s School of Law

in Ft. Worth, Texas

The Southwestern Consortium of Academic Support Professionals will host a one-day workshop focused on targeting our efforts for maximum efficiency.  Decreased enrollment has created a budget crunch for most schools, and those cuts can fall disproportionately on ASP shoulders even though we are still expected to provide the same level of support.  To deliver a high level of service with fewer resources, we must be efficient: we need to assess where to direct our resources and whether our programs are making an impact.  This year’s workshop will include programs to help us identify which students need our help, from pre-matriculation through the bar exam.  We will also discuss ways to determine whether our programs are working and more efficient ways to deliver our services. 

Registration is open to anyone interested in academic support, and there is no registration fee.  If you are interested in attending, please fill out the attached form and return it to Camesha Little, Assistant Director of Academic Support, at cflittle@law.tamu.edu.  Forms will be accepted through February 27th.

Hotel Information:

A block of rooms has been reserved at the Sheraton Ft. Worth Hotel and Spa, 1701 Commerce St., Ft. Worth, TX 76102.  This hotel is located right across from the law school.  We negotiated a rate of $139.00 per night.  Please be advised that this block of rooms will be released and the rate will expire on February 20, 2015.  You can book your room online at https://www.starwoodmeeting.com/StarGroupsWeb/res?id=1409306215&key=216B6F3F, or by phone at (800) 325-3535, referencing the Southwest Consortium of Academic Support Professionals.

Schedule:

March 5th:

6:30 – Dinner for anyone arriving early.

March 6th:

9-9:50 – Assessing Students before they enter – Marta Miller, Director of Academic Achievement at Texas A&M School of Law

10-10:50 – How to use LSSSE Data in ASP – Dr. Evan Parker, Director of Analytics at Lawyer Metrics

11-11:50 – Developing a targeted class to improve academic performance – John Murphy, Associate Professor of Law at Texas A&M School of Law

12-12:50 – Lunch

1-1:50 – Assessing the effectiveness of Voluntary ASP Programs – Rebecca Flanagan, Assistant Professor of Law, Director of Academic Skills Program at UMass School of Law

2-2:50 – Determining who is at-risk for Bar Struggles and creating a program to improve success – Jamie Kleppetsch, Assistant Professor, Associate Director, Academic Achievement Program at The John Marshall Law School

2:50-3 – Closing Remarks

If you have any questions, please feel free to contact:

Steven Foster (sfoster@okcu.edu)

Director of Academic Achievement at Oklahoma City University

Marta Miller (marta.miller@law.tamu.edu)

Director of Academic Support at Texas A&M School of Law

January 14, 2015 in Meetings, Program Evaluation | Permalink | Comments (0) | TrackBack (0)