Saturday, January 16, 2010

Finding Jobs in ASP: Where to look

This is a short post for those of you who are looking for positions in ASP or who are looking to make a switch to another school. If you made it here, you are off to a great start in your job search. However, there are other helpful places for you to look:

SALT (Society of American Law Teachers) Job Postings

Chronicle of Higher Education Job Site

The ASP listserv

Consortium of Higher Education (by region)

Additionally, if you are geographically limited or want to work at a particular school, search the school's internal job site and, if they have a job listserv, sign up. Personally, I found my position at the University of Connecticut on their internal job site (with help from word-of-mouth); it did not show up on mass listservs.

Go to conferences; you never know who you will meet or who knows of a job that will be opening up soon. I got every academic support position I ever held through word-of-mouth; someone met me or knew me and gave me a call or a link about a position. The ASP community is enormously friendly, and almost everyone will be very happy to help with a job search.

For those of you out there looking for a new position, good luck! (RCF)

January 16, 2010 in Job Descriptions | Permalink | Comments (0) | TrackBack (0)

Thursday, January 14, 2010

Report From AALS-ASP Section

It was another successful year at AALS. Here are some of the highlights of our section and the section program:

1) The section program featured:

    Barbara McFarland (NKU) and Sophie Sparrow (Franklin Pierce) on team-based learning

    Susan Keller (Western State) and Hillary Burgess (Hofstra) on peer feedback

    Pavel Wonsowicz (UCLA) on engaging students in class*

    Tanya Washington (Georgia State) on using bar questions in doctrinal classes

    *Please see the Chronicle of Higher Education piece on Pavel's presentation

2) New Section Board:

    Chair: Robin Boyle (St. John's)

    Chair Elect: Michael Hunter Schwartz (Washburn)

    Secretary: Paula Manning (Whittier)

    Treasurer: Rebecca Flanagan (UConn)

    Executive Board: Jeff Minetti (Stetson), LaRasz Moody (Villanova), Emily Randon (UC-Davis), Herb Ramy (Suffolk)

3) New Section Committees:

    The section added these new committees:

    Exploratory Committee on a Mid-Year AALS Meeting for ASP

    Poster Committee

4) Hillary Burgess and Corie Rosen (editors) distributed copies of the newest edition of The Learning Curve, the AALS ASP section newsletter. Please look in your inboxes for electronic distribution of the newsletter in the near future.


January 14, 2010 in Current Affairs | Permalink | Comments (0) | TrackBack (0)

Wednesday, January 13, 2010

The Numbers Game

All of us in academic support want to know what is working and what is not.  It is in our very nature as ASP'ers to want to improve our work so that we can help our students more effectively.

There are multiple ways to evaluate our programs: questionnaires and surveys, suggestion boxes, unsolicited e-mails and notes, academic improvement of students with whom we work, bar passage improvement (for those who do bar prep), and attendance at programs.  Some of these measures are qualitative, and some are quantitative.  Some programs might use both kinds of indicators.

I agree that numerical data can be helpful.  However, in nearly 25 years of working in higher education, I have never been won over by the view that numbers alone tell the story.  Why do I take that view?  Let me give you some examples:

  • Our work matters one student at a time.  I do more than just parrot study advice.  I listen.  I encourage.  I challenge.  I confront.  I pick up the pieces.  I search for answers that will lead to that student's success.  I rejoice in breakthroughs.  A number does not evaluate those encounters.
  • Rarely is my work a "one-hit wonder" phenomenon.  My work is based on assessment of issues, individual plans for improvement, monitoring of progress, and relationships with students.  One appointment as a number means little in that ongoing process.
  • A student who achieves a modest grade increment may be succeeding at a level that was a mountain climb away from the student's starting point.  Another student may "knock the ball out of the park" because the problems were study management issues rather than difficulties with legal analysis.  Both students have achieved success given their abilities and their issues.
  • Ultimately, my students must want to succeed by doing the hard work that is needed in law school.  I cannot impact their success with strategies and techniques if they choose not to implement them.  A small increase (or even a decrease) is not about me.  Likewise, I cannot take all the credit for major successes.
  • Academic success is a work in progress.  One semester's improvement may not show the full story.  I have students who have worked with me over a series of semesters.  By the time we are done, their efforts have maximized their success with A's and B's rather than their initial D's and F's. 
  • Students vary in the number of appointments that they need to improve.  One student may need four or five appointments during the semester to achieve more effective and efficient time management.  Another student might be able to implement techniques fairly quickly.
  • Some of the ways we help students may not be easily calculated.  For example, I send out weekly e-mails of study tips to all law students.  I know that students benefit from the advice even though they might never attend a workshop or ask for an appointment.  I get enough informal feedback to know the service makes an impact.  But, a question on a survey about "how often do you read the weekly study tips e-mails" or "how many times have you implemented a technique from the e-mails" would probably garner little useful (or accurate) information.
  • Numbers do not necessarily reflect quality.  If I held 2,000 30-minute appointments instead of 1,000 1-hour appointments in 12 months, would it really reflect anything positive?  I doubt it.  In some cases, it would merely indicate that I saw the same student twice to cover the material that I used to cover in one longer appointment.  It would often mean that I was short-cutting on assistance to the student (handouts that do not match the individual problems; a canned speech on a topic that did not consider the individual learning styles; a lecture rather than a discussion).
  • Numbers in bar success can monitor trends when a full bar study is done (that is, one including those who passed as well as those who failed).  We can find out lots of quantitative indicators: bad grades, 1L GPA, bar courses taken, and so forth.  That information is helpful.  But only by talking to each person individually would we ever truly know what made the difference between success and failure for that graduate beyond mere numbers: techniques that worked; bar review courses used; any medical or personal obstacles; diligence in study; timing of study; work while studying.
  • Four students at a workshop who delve deeply into strategies and ask questions focusing on their issues with the topic may be hugely successful as a workshop.  Thirty students at a workshop who barely pay attention and do not really want to be there may produce little improvement.

So, I would say we should look at both quantitative and qualitative measures when we seek to find out how we can improve our programs.  (Amy Jarmon)

January 13, 2010 in Program Evaluation | Permalink | Comments (0) | TrackBack (0)