Competencies-Based Legal Education

[This was originally posted by the Clayton Christensen Institute on Disruptive Innovation]

Last week, I discussed why law schools need to respond to the changing marketplace for legal services and legal education.  In thinking about how best to prepare for that changing world, law schools need to consider how competency-based educational models can be employed to advance educational objectives for students seeking to enter the market for legal services.  As Michael Horn and I explain in our new whitepaper, Disrupting Law School, regulatory protections that have sheltered law schools from competition will continue to subside.  In this new environment, law schools need to reimagine themselves as educators for students interested in learning about the legal services sector, not simply those seeking a JD.

One way to do this is to think about legal education from a blank slate.  Rather than try to retrofit our current pedagogy to address 21st-century needs, we need to think about it from its inception: if one were to start a school today to educate those who want a career in the legal services field, what would that school look like?

Upstart competency-based education programs have done just that in other parts of higher education.  They provide at least three new considerations for traditional law schools as they begin to think about and prepare for the future.

1. Time is no longer the measure of accomplishment

Online competency-based learning reverses the traditional relationship in education between time and student learning. In the traditional educational model, time is fixed while each student’s learning is variable. With online competency-based learning, the relationship is reversed: time becomes the variable and each student’s learning becomes essentially fixed. Students progress at their own pace, moving from topic to topic upon mastering each. Those who need more time to master a concept before moving on take the time they need, while others move ahead to the next set of material and learning objectives.

2. Centrality of competencies, learning outcomes, and assessments

Online competency-based programs shift the teaching pedagogy toward student-centered learning. In an online, competency-based program, faculty and instructional designers start by identifying the competencies students must master to achieve the desired learning outcomes and then work through each to understand how a student would demonstrate mastery of those objectives. Through constant feedback, students know how they are doing and what they need to do next, and teachers can determine when students have mastered competencies and are ready to move forward. The assessments, in other words, are both forward-looking (helping determine what a student studies next) and backward-looking (indicating whether a student has mastered the material).

3.  Modularization of course material provides more flexibility and different business models

Online competency-based learning is also changing key elements of the traditional higher education business model. Online technologies make it possible to modularize the learning process—that is, to break usual semester-long courses into shorter learning units or modules, which can be studied in sequence or separately. When material is packaged in online modules, it is easier to use for multiple educational purposes and multiple audiences in different combinations.

Stackable modules allow students to create individualized curricula based on their own learning goals and objectives. For students who attend law school knowing the area of law in which they want to practice—a segment of the student body currently underserved due to limited course offerings in any one topic at any one law school—modules open up opportunities to stack credentials from multiple sources. The long tail of the Internet opens up these opportunities; there may be sufficient student demand if online courses can aggregate demand and serve students from around the country or even the world.

Modules also eliminate duplication and optimize teaching resources. This flexible architecture can create an entirely new business model for law-related education. When learning is broken down into competencies—rather than semester-long courses—modules of learning can be packaged into different scalable programs for very different audiences—for example, paralegals, legal technicians, law students, lawyers (CLE), judges, administrative agencies, non-JDs working in law-related fields, foreign students, high school/college moot court teams, undergraduate students, journalists, clients, life-long learners, and so forth.  The possibilities abound.

This exercise can take us in a lot of different directions.  Every direction, though, will ask us to change and move beyond the status quo.  While change is hard, it is also necessary.  I hope our whitepaper provides sufficient impetus to get started.

New York Proposes “Experiential Learning Requirements” as Condition of Licensure: CLEA and NYS Bar Committee Respond

Readers of this blog and followers of the NCBE’s expansion will remember that this past spring New York became the 16th state to adopt the Uniform Bar Examination (UBE), changing its longstanding bar admission requirements.  Many voices opposed adoption, including the New York State Bar Association (NYSBA) (see the Committee on Legal Education and Admission to the Bar (CLEAB) report of 10-29-2014 and the vote of the House of Delegates), the Clinical Legal Education Association (CLEA), and the Society for American Law Teachers (SALT).  Despite these and other opposition voices, the proposal was adopted, with the new changes going into effect for the July 2016 bar examination.

During discussion of the adoption of the UBE, the Court was encouraged to include clinical or experiential requirements for licensing so that lawyers admitted to the New York Bar would be ahead of the curve, a position I firmly support.  On the opposite coast, California had been engaged in a multi-year process examining licensure and professional readiness, which resulted in a proposal requiring 15 credits of experiential learning before admission.  In response to the movement to incorporate experiential learning in bar admission, the New York State Court of Appeals formed a Task Force on Experiential Learning and Admission to the Bar.  Just last month, that Task Force requested comments on its proposal that

New York adopt a new mechanism for ensuring that all applicants for admission to the bar possess the requisite skills and are familiar with the professional values for effective, ethical and responsible practice. In light of New York’s diverse applicant pool, and in an effort to accommodate the varying educational backgrounds of applicants, the Task Force suggests five separate paths by which applicants for admission can demonstrate that they have satisfied the skills competency requirement.

The New York Law Journal examined the proposal in an article found here.  In addition, the Honorable Jenny Rivera, chair of the Task Force, attended a meeting of NYSBA’s Committee on Legal Education and Admission to the Bar (CLEAB) to explain the proposal and answer questions.

It is heartening that the Court is concerned about and wants to require the development of essential lawyering skills and the acquisition of professional values. However, without more, Pathway 1 of the current proposal will not actually ensure that applicants to the bar experience the kind of skill development and value formation that the Task Force desires.  Pathway 1, referencing new ABA standards, requires schools to confirm that they have published their “plan for incorporating into their curriculum the skills and professional values that, in the school’s judgment, are required for its graduates’ basic competence and ethical participation in the legal profession.” It also requires law schools to certify that law graduate applicants for admission “have sufficient competency in those skills and sufficient familiarity with those values,” which are publicly available on the law school’s website.  Although Judge Rivera believes that the certification process described in Pathway 1 can have some real bite, as pointed out in comments submitted by the Clinical Legal Education Association (11.9.15 CLEA Submission on Experiential Requirement), Pathway 1 simply mirrors the experiential training requirements already mandated by the American Bar Association.

New York’s law school deans, not unexpectedly, submitted comments supporting the “flexibility” of Pathway 1.  The CLEAB report to the Experiential Task Force expressed concern that without additional content to Pathway 1 “little will be accomplished” by the proposal.  And as one member of the NYS bar committee argued, “what law school is going to admit that one of its graduates did not acquire the skills or values promised on its website?”

In my opinion, the most important concern is whether applicants to the bar have ever represented or interacted with a client, or operated as a lawyer, in a live setting under guided, experienced supervision before admission.  In its comment to the Task Force, CLEA urges that a “three-credit clinical training requirement” be added for all J.D. applicants to the New York Bar.  This makes sense.  Law school clinics and faculty-supervised externships are designed to create the very kind of skill development and value acquisition with which the Court is concerned.  And clinical faculty have developed the formative assessment tools to maximize skill and professional identity formation.

I am hopeful that, in its next iteration of the proposal, the Task Force will heed CLEA’s and CLEAB’s comments and come back with recommendations that will ensure applicants for the bar are ready to engage in competent, ethical and professional representation of New York’s citizenry, corporations, and not-for-profits.

Teaching Legal Reasoning More Efficiently?

Teaching the traditional analytical skills more efficiently and effectively could provide a much-needed opening for broadening the range of skills taught to all law students. In the legal academy’s version of the “Socratic method,” law teachers historically taught the analytical skills “implicitly.” They demonstrated legal reasoning by pushing students away from their raw intuitions of fairness and justice to articulate rules and exceptions, while attending carefully to the inevitable ambiguities of language.

Some law teachers suggest that the process of learning to “think like a lawyer” fundamentally requires time and practice and therefore cannot be significantly speeded up.

Yet the implicit approach has been repeatedly challenged by scholars seeking to teach legal reasoning more explicitly, by naming and explaining how it works.*  (An obsession with the goal of teaching legal reasoning more efficiently was a major thread in two phases of my own legal career when I taught first-year civil procedure. I struggled both to teach skills more explicitly and to provide students with opportunities to practice them.)

A recent contribution to this quest comes from my colleague Jane Winn and grows out of her experiment teaching common law legal reasoning to undergraduates. Students were randomly assigned to use either a well-regarded study aid or Winn’s own materials. The materials were also leavened by her own and colleagues’ experiences teaching foreign LL.M. and J.D. students coming from legal systems rooted in the European continental legal tradition.

Winn’s effort, aimed at law students, is notable in three respects. First, at twenty-nine pages it fills an intermediate-length niche: longer than a typical class “handout,” but shorter than the various book-length alternatives. Second, it covers case briefing, outlining, and exam questions, demonstrating how the three are related. Third, it grew out of an attempt to test her teaching method empirically using random assignment to a control group. Both law students and legal educators should find it a useful contribution.

The 2015 ABA accreditation standards may provide a laboratory in which to test efforts such as Winn’s. Standard 302 now requires law schools to adopt learning outcomes that, under subsection (b), must include legal analysis and reasoning; Standard 314 requires law schools to provide students with both formative assessments (feedback) and summative assessments (final “grades”); under Standard 315, law schools must engage in “ongoing evaluation of the program of education, learning outcomes, and assessment methods.” At its best, this combination of more intentionally articulated outcomes, feedback to students, and program evaluation could prompt law schools to evaluate the potential for greater efficiency and effectiveness in teaching legal reasoning. I remain hopeful that enough schools will approach this task rigorously and in good faith that at least some progress can be made.

*Winn’s illustrious predecessors include:

  • Leading Legal Realist Karl Llewellyn, whose The Bramble Bush: Classic Lectures on Law and Law School has been assigned to generations of law students;
  • University of Chicago Professor and President and U.S. Attorney General Edward H. Levi, author of An Introduction to Legal Reasoning, originally published in the University of Chicago Law Review and then in book form;
  • Critical Theorist and Harvard Professor Duncan Kennedy, who took the decidedly un-Harvard step of visiting at New England School of Law in his attempt to reach beyond elite students and sharpen his skill at teaching students about the “gaps, conflicts and ambiguities” that underlie the development of the common law. He shared his insights widely with former students moving into teaching careers;
  • My former colleagues Pierre Schlag and David Skover, who produced a short volume early in their careers that catalogued the Tactics of Legal Reasoning (1985).
  • Richard Michael Fischl and Jeremy Paul, Getting to Maybe: How to Excel on Law School Exams (1999)
  • Leading clinical teachers Albert J. Moore and David Binder, Demystifying The First Year of Law School: A Guide to the 1L Experience (2009)

In recent decades much of the heavy lifting in legal reasoning has devolved upon teachers of legal analysis, research and writing. Among the results is a burgeoning literature proposing variations on the syllogistic Issue-Rule-Analysis (or Application)-Conclusion approach to analyzing and writing about legal problems, as well as a variety of textbooks.


Unmasking Assumptions about Employment Outcomes and Legal Education

In an upcoming Wisconsin Law Review article, Robert Kuehn, Associate Dean for Clinical Education and Professor of Law at the Washington University Law School, presents a cogent, well-supported and thoughtful account of the limitations of, and the lessons we can learn from, the existing empirical analysis correlating student enrollment in clinical education with employment outcomes.  Kuehn’s article, entitled Measuring Legal Education’s Employment Outcomes, is particularly powerful because it provides a thorough empirical rejection of the claim, asserted by Professor Jason Yackee, that clinical coursework might actually harm employment outcomes, a claim that attracted some sound-bite attention earlier this year. In what is perhaps an unexpected twist, Kuehn demonstrates that using Yackee’s statistical assumptions and methodology would also produce negative correlations for students who participate on law journals or in moot court competitions.  Kuehn argues that one can’t draw any reliable conclusion from Yackee’s 2013 model, and perhaps not from any nationwide statistical model (as opposed to a particularized analysis of one school), on the likely effect of clinical courses (or other activities like law journal or moot court) on employment, and surely not the negative effect Yackee posits. Kuehn points out that, as to clinical coursework, the available evidence (through surveys) indicates that such experiences do aid some students in securing employment.

If you, like me, still become a bit nervous about how much you actually remember from undergraduate statistics courses, do not be alarmed by this post!  You will find Kuehn’s article accessible and a good, quick read, even when he is using terms like “regression analysis,” “granular data,” and “variable choices.”  Here are the points made in Measuring Legal Education’s Employment Outcomes which I found most helpful:

  1. Kuehn’s reminder that when one confuses correlation with causation, one is bound to come up with a “misdiagnosis.” One problem with Yackee’s analysis is the lack of granular data to calculate the true employment rate for those who took a clinic (or who did not).  In fact, the data is so poor that “the results never account for more than half of the variability in employment across schools.”
  2. Kuehn’s explanation of the “confounding effect of prestige” and bar passage on employment outcomes.
  3. The problems of validity and reliability raised by analyses which employ information from ABA questionnaires, particularly those self-reports submitted prior to 2014.
  4. The fact that “13% of law schools” provide 80% of the school-funded jobs to law graduates. Not surprisingly, Kuehn found that this factor biases many results if you examine nationwide statistics. And when Kuehn removes those jobs from the statistical analysis, Yackee’s correlation with clinical education falls apart, even using his own assumptions and methodology.
  5. Yackee’s model yields completely different results if one uses the U.S. News lawyers/judges data versus academic peer data to control for the possible influence of perceived prestige.
  6. Application of Yackee’s model to “Law Journals,” “Skills Competition,” and U.S. News sub-groups also shows no relationship to employment outcomes!
  7. In Yackee’s model, a better ranking is “strongly associated with improved employment outcomes.” However, Kuehn points out that a “closer examination of the relationship between rank and employment indicates that this positive association, although statistically significant when applied across the entire range of top 100 schools, does not hold true for schools ranked 51 through 100 (emphasis added).” 
  8. Kuehn’s documentation of employers who require, “strongly prefer,” or identify law clinic experience as a positive factor in hiring, such as the U.S. Department of Homeland Security, legal services and legal aid offices, district attorneys’ and public defenders’ offices, fellowships, and private law firms.
  9. Kuehn’s description of existing National Association for Law Placement (NALP) information, such as the 2011 survey of lawyers with non-profit and government offices; the NALP survey of lawyers in firms of predominantly more than 100 attorneys; the NALP survey of public interest legal employers; and the NALP 2013 presentation on the employment market, reporting that “law firms say they want new graduates to have ‘more experiential learning, client-based and simulation.’”
  10. Kuehn’s provision of good information on other employer sources, such as the Lexis-Nexis white paper Hiring Partners Reveal New Attorney Readiness for Real World Practice; Professor Neil Hamilton’s employer survey to determine the relative importance of twenty-one different competencies in employer hiring decisions; and Professor Susan Wawrose’s legal employer focus groups, which found that employers prefer new hires with “well-developed professional or ‘soft’ skills” along with “strong fundamental practice skills.”

Professor Kuehn concludes by recommending that studies would best be done on a school-by-school basis by “surveying likely employers to find out what educational experiences of students are most valued.”  He also recommends that schools “retrospectively look at various employment outcomes for graduates and any relationship” to students’ experiences while in school.

I agree with Professor Kuehn and am happy to report that Albany Law School, through its faculty Assessment Committee and Admissions Office, is currently engaged in conducting employer focus groups and analyzing what best helps our students obtain employment in their desired career paths.  Until good data and information suggest otherwise, Professor Neil Hamilton’s advice to law students, which Professor Kuehn quotes in his “must read” article, bears repeating:

In this challenging market for employment, a law student can differentiate herself from other graduates by demonstrating to legal employers that the student both understands the core competencies that legal employers and clients want and is implementing a plan to develop these competencies, including an ability to demonstrate that the student has experience with these competencies.

Building on Best Practices now available as eBook

Are you trying to:

  • Develop a meaningful law school mission statement?
  • Understand new accreditation requirements, learning goals, and outcomes assessment?
  • Expand your experiential offerings?  Decide whether to use modules or courses?  An on-site clinic, an externship, or a community partnership?
  • Teach ALL of your students in the most effective ways, using a full range of teaching methods?
  • Add to your curriculum more of the professional identity, leadership, intercultural, inter-professional and other knowledge, skills, and values sought by 21st century legal employers?
  • Lead thoughtfully in the face of the challenges facing legal education today?

These and other topics are addressed in Building on Best Practices: Transforming Legal Education in a Changing World, now available in eBook format from LexisNexis at no charge.

The print version is not yet out.  LexisNexis is taking advance orders for $50, plus shipping.  BUT we understand that they will make one copy available to every US legal educator for free upon request.  Details on this and on international availability are still to come.

Thanks, and congratulations, to book project sponsor Clinical Legal Education Association (CLEA), the more than fifty legal educators who participated as authors, and the countless others who assisted as readers and in numerous other ways.

And, a huge shout-out to my wonderful and talented co-editors, Lisa Radke Bliss, Carrie Wilkes Kaas, and Antoinette Sedillo Lopez.

Using Portfolios for Assessment

A few years ago I started to use student portfolios as part of the end-of-semester evaluation of my students. I have found that portfolios can be an excellent vehicle both for the student’s own self-reflection and for providing summative feedback.

Here is how I use them. At the end of the semester, I ask each student to prepare a portfolio of the written work the student did over the course of the semester and to bring it to an individual meeting with me. In preparing the portfolio, each student is asked to reread the first and final versions of the principal documents that the student drafted during the semester (in the context of my cases, these include the client’s affidavit, any witness affidavits, and a brief).

I also ask them to bring the drafts and final versions to the meeting. During the meeting, each student is expected to reflect on his or her writing, consider how it progressed over the semester, and point out two or three improvements that he or she made. Students are also expected to use the drafts to illustrate that progress.

My students find that the act of assembling the portfolio and rereading their own written work serves as a reminder of how far they have come in crafting a legal theory or developing a factual account of the relevant events, and even of some of the obstacles they encountered along the way and how they managed to overcome them. I like this method of assessment because it is mainly about self-reflection. Each student is learning from his or her own work. The portfolio is simply a vehicle to make that learning tangible, a wonderfully concrete way to show someone how much he or she has improved over the course of a semester.

I was recently speaking with Larry Farmer from Brigham Young University School of Law. He mentioned that he uses portfolios too. But in his case, they are videos. At the beginning of his course on Interviewing, before any class has been conducted, he asks each student to conduct a mock interview, which is videotaped. The students then spend the semester learning about, practicing, and refining their interviewing techniques.

Then, at the end of the semester, they are asked to review that first interview and to reflect upon their own improvement over the semester. Like the written portfolio that I use, this one also uses a student’s own work to demonstrate learning and progress. I plan to try it next semester.

Are there other ideas out there? Do you use portfolios? If so, how? How can I improve my process? I’d love to hear your thoughts in the comment section below.

Starting with WHY — Building Curriculum for Clinic-wide Orientation

My clinical colleagues and I are planning to convert an Orientation that we currently jointly teach into a 2-credit Clinic Orientation module. The Orientation typically includes a mixture of joint classes and smaller individual clinic-focused sessions.

Since we are developing this new course from scratch, it provides an opportunity to think deliberatively about how we design the course and to clarify our objectives and learning outcomes. In light of the changes in ABA accreditation standards, including the need to define learning outcomes and to assess according to our stated objectives, I thought it could be helpful to document the process we are taking as we develop the course.

My faculty colleagues and I met for the first time this week to start brainstorming for development of this new Clinic Orientation course. We started by brainstorming about WHY we want to develop the course. (I was inspired to Start with Why by Simon Sinek. Here is his inspiring TED talk on that topic).

Here is what we came up with as to WHY we want to develop a new jointly-taught, credit-bearing, Orientation module:

  1. Students need to be able to do certain activities early in the semester/hit the ground running:
  • Interviewing
  • Office procedures
  • Reflection/self-critique
  • Professional responsibility 101 (when working with clients)
  • Research
  • Fact investigation (including reading/maintaining files)
  • Working with interpreters
  • Persuasion
  • Attention to cultural difference/competency/empathy
  2. Explain the WHY of our pedagogy (explain clinical pedagogy to students)
  • Active and engaged learning
  • Direct responsibility – WHY? Autonomy, mastery, purpose
  • Collaboration – across the board, with team, fellow clinic students, students in other clinics, support staff, faculty
  • Acting for Lawyers
  3. Reinforce “one firm” culture – clinical courses are different, collegial, work together, spaces where you can learn while having fun!
  4. Service Mission of Clinics
  5. Set our expectations for students
  6. Efficiency of teaching resources

As we developed this list, our goal was to brainstorm and include as wide a scope of objectives as possible. We decided to leave for another set of meetings the tasks of thinking about how to achieve these goals and what the classes designed to achieve them would look like. Keeping the conversation on task was a challenge; the temptation was to move on to thinking about how or what. We found it easiest when we designated a person to draw us back to the WHY task when the conversation started to branch off into thinking about HOW or WHAT.

Our next step is drawn from the IDEO Design Thinking for Educators Toolkit. IDEO is a design firm that looks at systems from a design perspective. I am excited to start applying its theories and practices to legal education. I’ll keep you posted as that project develops.

Call for Talks – Igniting Law Teaching 2015

LAW PROFESSORS: Are you doing innovative things in the classroom? I would love to showcase your ideas at Igniting Law Teaching, a TEDx-styled conference on law school innovations.

The Call for Talks for Igniting Law Teaching 2015 is out, http://legaledweb.com/ilt-2015-call-for-talks. We’ll be reviewing proposals on a rolling basis, until January 15th.

The conference is March 19-20, 2015 (stay tuned for registration information) in Washington DC at American University Washington College of Law.

Last year’s conference brought together more than 40 law school academics in a TEDx-styled conference to share ideas on law school innovations. LegalED’s Teaching Pedagogy video collection includes many of the talks from last year’s conference (others are being produced and will be available soon).

The topics we addressed last year were: Flipping A Law School Course, Using the Classroom for Active Learning, Simulations, Feedback and Assessment, The Craft of Law Teaching, Applying Learning Theory to Legal Education, Beyond Traditional Law Subjects, and Teaching for the 21st Century.

We would love to hear more on these topics and also expand the horizons a bit. We designed the conference to create a forum for professors like you who are experimenting with cutting edge technologies and techniques in law teaching with the goal of spreading your ideas to the broader community. We see the conference as a way to showcase you as a leader in teaching innovation and to inspire innovation by others as well.

The Igniting Law Teaching conference is unlike other gatherings of law professors. Here, talks will be styled as TEDx Talks, with each speaker on stage alone, giving a well scripted and performed talk about an aspect of law school pedagogy. In the end, we will create a collection of short videos on law school-related pedagogy that will inspire innovation and experimentation by law professors around the country, and the world, to bring more active learning and practical skills training into the law school curriculum. The videos will be available for viewing by the larger academic community on LegalED, a website developed by a community of law professors interested in using online technologies to facilitate more active, problem-based learning in the classroom, in addition to more assessment and feedback.

This is a great opportunity to showcase your innovations to the legal academy. Consider joining us for Igniting Law Teaching 2015!

Cross-posted on the LegalTech Blog

The Baby Has Finally Been Birthed!

Comprehensive revisions passed

The ABA House of Delegates passed the comprehensive revisions with “minimal fuss,” according to the ABA Journal article linked above.  One area, however, garnered significant attention and also resulted in an odd, though perhaps meaningless, procedural move.  The House voted to send back to the Section on Legal Education, for further consideration, the comment to Standard 305 that prohibits payment to students for credit-based courses.

What does this mean? Law schools that have not already done so must start identifying, publicly articulating, and assessing student learning outcomes; providing every student six credits of clinic or clinic-like experiential courses; and requiring students to take two credit hours of professional responsibility coursework.

Well, it’s a start…

Assessment Tales: The Bluebooks That Stayed

It’s that time of year when we all have the grading of our last semester’s bluebooks well behind us and the last few students have come in to review their exams. So we have packed up the bluebooks to be archived and they are out of sight and out of mind.

But wait! In the world of assessment, the bluebooks come back! These papers and exams have a wealth of assessment information for us to mine if we only take the time to gather, reflect and use that information.

How can you use your bluebooks for efficient assessment and improvement of student learning?

Many faculty gather holistic impressions as they grade about students’ overall performance and their areas of difficulty and strength. To improve on this reflective process, faculty can take a few more simple steps:

1. Improve your data collection.

Rather than gathering general impressions as we grade bluebooks, we can mine the bluebooks for some more concrete data. Examine the distribution of performance on individual questions or issues. Note that you need not gather every data point possible from the bluebooks. Often it is helpful to begin with two or three items to analyze. For example, what is the one thing that nearly every student did well on the exam? What were the one or two questions/issues/approaches that many students had problems on? What percentage of the students had these problems?
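To make this concrete, the tallying step can be sketched in a few lines of Python. This is only an illustration: the question labels and error counts below are hypothetical, not drawn from any actual exam data.

```python
# Hypothetical tallies: for each exam question, the number of bluebooks
# (out of a class of 72) showing a significant error on that question.
total_students = 72
errors_by_question = {
    "Q1 multijurisdictional practice": 3,
    "Q2 privilege vs. confidentiality": 4,
    "Q3 withdrawal from representation": 13,
}

# Convert raw counts to the percentage of the class that struggled,
# sorted so the most problematic questions appear first.
error_rates = sorted(
    ((q, round(100 * n / total_students, 1)) for q, n in errors_by_question.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for question, pct in error_rates:
    print(f"{question}: {pct}% of students had difficulty")
```

Even a small table like this makes year-over-year comparison possible: keep the tallies from each administration and you can see whether a teaching change actually moved the numbers.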

2. Analyze your data.

For issues students appear to have learned well, look again at your questions. How confident are you that the question truly tested students’ understanding? In this respect, essay questions are often easier to evaluate than multiple choice questions, because you can see the students’ reasoning on the former, whereas consistently correct answers on the latter can be the result of distractors that are patently wrong. What materials and techniques did you use to prepare the students for that question? When during the semester did you teach those matters? If student performance has improved from prior exam administrations, what, if anything, did you change that may have caused the improved learning?

For issues or questions on which a significant percentage of students performed poorly, again begin by reexamining the question, its placement in the examination, and the time allocated for responses, to identify possible explanations that are less related to student learning and more related to exam conditions. Look for patterns in the student errors or misconceptions that can help you diagnose what learning conditions led to the poor performance. What materials and methods did you use to teach this doctrine?

3. Plan for the next class

When students are performing well on a doctrine or concept, especially when that competent performance appears to have been the result of your prior efforts to target and improve learning activities for that material, you may be tempted to rest on your (and your students’) laurels. However, consider that any change to one part of a course can affect other parts and each class brings with it different experiences and preparation.

To improve student learning in areas that have presented difficulties for students, consider not only improving the teaching materials or methods for that area, but also incorporating more formative assessments during the term to help you and the students identify learning deficiencies earlier and more clearly.

4. What my bluebooks told me this semester:

To illustrate this process of mining bluebooks for assessment, I will discuss this semester’s Professional Responsibility exam. From this semester’s bluebooks, I gathered a range of data on materials well understood and poorly understood. I will share three examples of data to illustrate the process of using bluebooks for an assessment process.

The doctrinal winner this year in terms of student performance was multijurisdictional practice of law. Is this because the students understood this aspect of the course better than others? Reviewing the exam, I noticed that the question testing this subject called for a fairly low level of mastery (basic issue spotting and knowledge of the rule) without any sophisticated analysis required. This was a topic for which I had provided a number of practice problems, and I had tested the issue in a similar fashion on a prior year’s exam, which I had made available for student review. Moreover, because my law school is located on a state line between jurisdictions with dramatically different versions of this rule, the students understood that it would impact their immediate future as they chose which state bar exam to take first. What I learned from this is the fairly unremarkable understanding that my law students can and will master at a knowledge level those topics for which they know they will be tested and for which they also have a more personal motivation to learn well. I concluded that I could generalize from this in two ways: not only will I raise the bar on testing this doctrine, requiring more sophisticated analysis, but I will also look for other rules where I can improve student motivation by identifying the specific need-to-know circumstances looming in students’ immediate future.

A second topic on which I have been tracking student performance for many semesters is the distinction between the evidentiary attorney-client privilege and the ethical duty of confidentiality (among other doctrine). When I first began tracking, as many as 30% of students were demonstrating fundamental confusion on this topic – using the language of “privilege” when the subject was confidentiality (or vice versa), or confusing the exceptions to the ethical duty with the crime-fraud exception to privilege. I knew from speaking with other Professional Responsibility teachers that this is a common area of confusion for students. Over the course of several semesters, I worked to improve student learning in this area: including more problems in course materials, writing and assigning a CALI lesson on the subject, and explicitly telling the students that this is something I am tracking, cheering them on to “make this the 100% mastery year.” The efforts are bearing fruit. This semester was the best yet – only four out of 72 students used the vocabulary of the two doctrines improperly, and three of those four applied the correct rule even though they were not using the correct terminology in doing so.

An area in which I had thought I was making progress turned out to be a continuing problem. Students commonly are confused by the rule governing an attorney’s right to withdraw from representation. I have made the same efforts on this doctrine as I have with the privilege v. confidentiality confusion: increasing problems, providing additional outside resources (again, I wrote a CALI lesson on the subject), and providing in-class quizzes to assess understanding while there was still time to improve learning. However, I was puzzled to see 13 of the students declare that an attorney may not withdraw from representation if it would harm the client. What could have been the source of this confusion? Searching through my course materials and lesson plans, I uncovered the problem. A PowerPoint lecture on withdrawal from representation when the client fails to pay the attorney contained a slide with a bulleted list of reasons that courts might deny an attorney permission to withdraw even though the rules would permit the withdrawal. One of the bullet points listed “degree of harm to the client” as a factor the court would consider. Obviously some students had transferred the PowerPoint slide into their notes on the general withdrawal rule rather than recognizing that these factors were connected only to the judicial discretion to deny an otherwise permissible withdrawal. Again, a well-worn lesson learned anew: as helpful as PowerPoint slides can be for organizing discussions and providing visual cues for learning, students will study the text of these slides as definitive statements of law rather than as thumbnails of larger discussions and understandings. Conclusion: no shortcut summary slides!

Interactive Assessment Program at the AALS Annual Meeting

I am really excited about moderating and serving as commentator for the second session of the Joint Program sponsored by the Clinical and Professional Responsibility Sections, to be held in San Diego on Wednesday, January 7. It is on assessment and is going to be very interactive. At the International Clinical Conference this summer, I had the distinct pleasure of meeting Professor Kevin Kerrigan and Dean Philip Plowden of Northumbria, who were the organizers of the conference in Cork, Ireland. (I discussed this in an earlier post, “Two American Keynotes.”) At the AALS session, they will discuss how they assess learning across the law school curriculum in the UK, and they plan to involve the group in an interactive assessment exercise, asking the group to use a criteria-referenced protocol used at Northumbria. I have blogged about how important it is to measure students’ learning by their performance against a specific desired outcome rather than simply ranking them against each other, so I am very interested in seeing how a group of clinicians will use the protocol. And Lawrence Grosberg of New York Law School will discuss multiple assessment methods in clinical and skills courses, drawing on his work using “standardized clients.” He will involve the group in an interactive assessment exercise. Since I have been working with the UNM Medical School using standardized clients in joint training about domestic violence, I am very interested in seeing how Professor Grosberg’s assessment techniques have developed. I get to play “Oprah” and get the audience involved, and hopefully, in my best “Oprah-like” way, ask provocative and interesting questions! And, if you have been following my posts, you know that I am very interested in the impact of outcome-based assessment on law students of color.
My theory is that clearly articulating learning outcomes and evaluating the extent to which students achieve those objectives will benefit all students. And I hope that measuring performance on criteria-referenced objectives will prove to be a much better way of evaluating students than ranking them against each other on their performance of the same skills they demonstrated on the Law School Admissions Test. And, true to a foundational principle of clinical teaching…this session will not involve “talking heads”…we will all learn by doing!
