Call for Talks – Igniting Law Teaching 2015

LAW PROFESSORS: Are you doing innovative things in the classroom? I would love to showcase your ideas at Igniting Law Teaching, a TEDx-styled conference on law school innovations.

The Call for Talks for Igniting Law Teaching 2015 is out. We’ll be reviewing proposals on a rolling basis until January 15th.

The conference is March 19-20, 2015, at American University Washington College of Law in Washington, DC (stay tuned for registration information).

Last year’s conference brought together more than 40 law school academics in a TEDx-styled conference to share ideas on law school innovations. LegalED’s Teaching Pedagogy video collection includes many of the talks from last year’s conference (others are being produced and will be available soon).

The topics we addressed last year were: Flipping a Law School Course, Using the Classroom for Active Learning, Simulations, Feedback and Assessment, The Craft of Law Teaching, Applying Learning Theory to Legal Education, Beyond Traditional Law Subjects, and Teaching for the 21st Century.

We would love to hear more on these topics and also expand the horizons a bit. We designed the conference to create a forum for professors like you who are experimenting with cutting-edge technologies and techniques in law teaching, with the goal of spreading your ideas to the broader community. We see the conference as a way to showcase you as a leader in teaching innovation and to inspire innovation by others as well.

The Igniting Law Teaching conference is unlike other gatherings of law professors. Here, talks will be styled as TEDx Talks, with each speaker on stage alone, giving a well-scripted and well-rehearsed talk about an aspect of law school pedagogy. In the end, we will create a collection of short videos on law school-related pedagogy that will inspire innovation and experimentation by law professors around the country, and the world, to bring more active learning and practical skills training into the law school curriculum. The videos will be available for viewing by the larger academic community on LegalED, a website developed by a community of law professors interested in using online technologies to facilitate more active, problem-based learning in the classroom, in addition to more assessment and feedback.

This is a great opportunity to showcase your innovations to the legal academy. Consider joining us for Igniting Law Teaching 2015!

Cross-posted on the LegalTech Blog

The Baby Has Finally Been Birthed!

Comprehensive revisions passed

The ABA House of Delegates passed the comprehensive revisions with “minimal fuss,” according to the ABA Journal linked above. One area, however, garnered significant attention and also resulted in an odd, though perhaps meaningless, procedural move. The House voted to send back to the Section of Legal Education, for further consideration, the comment to Standard 305, which prohibits payment to students for credit-based courses.

What does this mean? Law schools that have not already done so must start publicly identifying, articulating, and assessing student learning outcomes; providing every student six credits of clinic or clinic-like experiential courses; and requiring students to take two credit hours’ worth of professional responsibility coursework.

Well, it’s a start…

Assessment Tales: The Bluebooks That Stayed

It’s that time of year when we all have the grading of last semester’s bluebooks well behind us and the last few students have come in to review their exams. So we have packed up the bluebooks to be archived, and they are out of sight and out of mind.

But wait! In the world of assessment, the bluebooks come back! These papers and exams have a wealth of assessment information for us to mine if we only take the time to gather, reflect and use that information.

How can you use your bluebooks for efficient assessment and improvement of student learning?

Many faculty gather holistic impressions as they grade about the performance of the students overall and the areas of difficulty and strength. To improve on this reflective process, faculty can take a few more simple steps:

1. Improve your data collection.

Rather than gathering general impressions as we grade bluebooks, we can mine them for more concrete data. Examine the distribution of performance on individual questions or issues. Note that you need not gather every possible data point from the bluebooks; it is often helpful to begin with two or three items to analyze. For example, what is the one thing that nearly every student did well on the exam? What were the one or two questions, issues, or approaches that many students struggled with? What percentage of the students had these problems?

2. Analyze your data.

For issues students appear to have learned well, look again at your questions. How confident are you that the question truly tested the students’ understanding? In this respect, essay questions are often easier to evaluate than multiple-choice questions, because you can see the students’ reasoning on the former, whereas consistently correct answers on the latter can be the result of distractors that are patently wrong. What materials and techniques did you use to prepare the students for that question? When during the semester did you teach those matters? If student performance has improved from prior exam administrations, what, if anything, did you change that may have caused the improvement?

For issues or questions on which a significant percentage of students performed poorly, again begin by reexamining the question, its placement in the examination, and the time allocated for responses, to identify possible reasons for poor performance that are less related to student learning and more related to exam conditions. Look for patterns in the students’ errors or misconceptions that can help you diagnose what learning conditions led to the poor performance. What materials and methods did you use to teach this doctrine?

3. Plan for the next class.

When students are performing well on a doctrine or concept, especially when that competent performance appears to have been the result of your prior efforts to target and improve learning activities for that material, you may be tempted to rest on your (and your students’) laurels. However, consider that any change to one part of a course can affect other parts and each class brings with it different experiences and preparation.

To improve student learning in areas that have presented difficulties, consider not only improving the teaching materials or methods for that area, but also incorporating more formative assessments during the term to help you and the students identify learning deficiencies earlier and more clearly.

4. What my bluebooks told me this semester:

To illustrate this process of mining bluebooks for assessment, I will discuss this semester’s Professional Responsibility exam. From this semester’s bluebooks, I gathered a range of data on material that was well understood and material that was poorly understood. I will share three examples to illustrate the process.

The doctrinal winner this year in terms of student performance was multijurisdictional practice of law. Is this because the students understood this aspect of the course better than others? Reviewing the exam, I noticed that the question testing this subject called for a fairly low level of mastery (basic issue spotting and knowledge of the rule) without any sophisticated analysis required. It was a topic for which I had provided a number of practice problems, and I had tested the issue in a similar fashion on a prior year’s exam, which I had made available for student review. Moreover, because my law school is located on a state line between jurisdictions with dramatically different versions of this rule, the students understood that the rule would affect their immediate future as they chose which state’s bar exam to take first. What I learned from this is the fairly unremarkable insight that my law students can and will master, at a knowledge level, those topics on which they know they will be tested and which they have a more personal motivation to learn well. I concluded that I could generalize these understandings: not only raising the bar on testing this doctrine to require a more sophisticated understanding, but also looking for other areas in which I could improve student motivation by identifying the specific need-to-know circumstances looming in their immediate futures for other rules.

A second topic on which I have been tracking student performance for many semesters is students’ understanding of the distinction between the evidentiary attorney-client privilege and the ethical duty of confidentiality (among other doctrines). When I first began tracking, as many as 30% of students were demonstrating fundamental confusion on this topic – using the language of “privilege” when the subject was confidentiality (or vice versa), or confusing the exceptions to the ethical duty with the crime-fraud exception to privilege. I knew from speaking with other Professional Responsibility teachers that this is a common area of confusion for students. Over several semesters, I worked to improve student learning in this area: including more problems in course materials, writing and assigning a CALI lesson on the subject, and explicitly telling the students that this is something I am tracking, cheering them on to “make this the 100% mastery year.” The efforts are bearing fruit. This semester was the best yet – only four of 72 students used the vocabulary of the two doctrines improperly, and three of those applied the correct rule even though they were not using the correct terminology.

An area in which I had thought I was making progress turned out to be a continuing problem. Students are commonly confused by the rule governing an attorney’s right to withdraw from representation. I have made the same efforts on this doctrine as with the privilege-versus-confidentiality confusion: increasing problems, providing additional outside resources (again, I wrote a CALI lesson on the subject), and providing in-class quizzes to assess understanding while there was still time to improve learning. However, I was puzzled to see 13 students declare that an attorney may not withdraw from representation if it would harm the client. What could have been the source of this confusion? Searching through my course materials and lesson plans, I uncovered the problem. A PowerPoint lecture on withdrawal from representation when the client fails to pay the attorney contained a slide with a bullet-point list of reasons that courts might deny an attorney permission to withdraw even though the rules would permit the withdrawal. One of the bullet points listed “degree of harm to the client” as a factor the court would consider. Obviously some students had transferred the slide into their notes on the general withdrawal rule rather than recognizing that these factors related only to the judicial discretion to deny an otherwise permissible withdrawal. Again, a well-worn lesson learned anew: as helpful as PowerPoint slides can be for organizing discussions and providing visual cues for learning, students will study the text of the slides as definitive statements of law rather than as thumbnails of larger discussions and understandings. Conclusion: no shortcut summary slides!

Interactive Assessment Program at the AALS Annual Meeting

I am really excited about moderating and serving as commentator for the second session of the Joint Program sponsored by the Clinical and Professional Responsibility Sections, to be held in San Diego on Wednesday, January 7. It is on assessment and is going to be very interactive. At the International Clinical Conference this summer, I had the distinct pleasure of meeting Professor Kevin Kerrigan and Dean Philip Plowden of Northumbria, who were the organizers of the conference in Cork, Ireland. (I discussed this in an earlier post, “Two American Keynotes.”) At the AALS session they will discuss how learning is assessed across the law school curriculum in the UK, and they plan to involve the group in an interactive assessment exercise, asking the group to use a criteria-referenced protocol used at Northumbria. I have blogged about how important it is to measure students’ learning by their performance against a specific desired outcome rather than simply ranking them against each other, so I am very interested in seeing how a group of clinicians will use the protocol. In addition, Lawrence Grosberg of New York Law School will discuss multiple assessment methods in clinical and skills courses growing out of his work using “standardized clients,” and he too will involve the group in an interactive assessment exercise. Since I have been working with the UNM Medical School using standardized clients in joint training about domestic violence, I am very interested in seeing how Professor Grosberg’s assessment techniques have developed. I get to play “Oprah” and get the audience involved, and hopefully, in my best Oprah-like way, ask provocative and interesting questions! And if you have been following my posts, you know that I am very interested in the impact of outcome-based assessment on law students of color.
My theory is that clearly articulating learning outcomes and evaluating the extent to which students achieve those objectives will benefit all students. And I hope that measuring performance against criteria-referenced objectives will prove to be a much better way of evaluating students than ranking them against each other on their performance of the same skills they demonstrated on the Law School Admission Test. And, true to a foundational principle of clinical teaching, this session will not involve “talking heads” – we will all learn by doing!
