Expanding the Curriculum to Build Better Lawyers

Just finished reading a fascinating piece in the Journal of Legal Education by Pamela Bucy Pierson, Bainbridge Mims Professor of Law at the University of Alabama School of Law. Her article, Economics, EQ, and Finance: The Next Frontier in Legal Education, describes the results of her survey of how law schools are covering three topics: (1) economic trends in the legal profession; (2) emotional intelligence in matters such as managing stress and building resilience in the practice of law; and (3) personal financial planning for tomorrow’s lawyers. The article outlines why each of these topics is essential to the preparation of tomorrow’s lawyers. Ninety-one law schools responded to the author’s survey, and the results are encouraging. The descriptions of courses and ancillary programs addressing these topics provide a rich source of ideas for building these topics into the curriculum. So grab that copy of the Journal that is likely piled in your mailroom, or read the article on SSRN.

A Reluctant Assessor’s Conversion

At the Conference on Building an Assessment Plan from the Ground Up, Professor Judith Daar of Whittier Law School presented her journey from “Reluctant Assessor” to a professor who values assessment. Professor Daar described how she first began thinking about assessment in the context of Assisted Reproductive Technology. She described for the audience statutory requirements that doctors report multiple births resulting from IVF, and the effect of that “assessment” in lowering IVF multiple-birth rates. Having seen that assessment makes a difference in medical settings, she was ready to try some assessment in her own classroom. Drawing on the model required to award continuing education credit in medical education, she created a pre- and post-test survey instrument for her adverse possession unit, intended to assess preparedness and comprehension. In the pre-test, many students accurately stated the holding in an adverse possession case. In class, she then criticized the court’s opinion, with the intent that students would learn critical analysis. Her post-test, however, demonstrated that many students took the critique to mean that they had misstated the holding in the case. She was surprised: she thought she was modeling the skill of critical analysis, but the students thought she was teaching substance. This caused her to realize how much students rely on the didactic of the classroom to “deliver the law” – that they do not trust their own ability to learn. A fascinating challenge!

A Dean’s Perspective on the Costs and Benefits of Assessment

More from the Building an Assessment Plan from the Ground Up Conference:

Dean Susan Duncan, Dean and Professor of Law at the University of Louisville Louis D. Brandeis School of Law, presented the dean’s perspective on assessment. She began her remarks by reinforcing the notion that all faculty have to be involved. She noted her concern about the disproportionate number of legal writing professors and women attending the conference and urged attendees to be sure that assessment does not become a service obligation that is “farmed out” to female faculty and legal writing faculty. She also noted the small number of deans in the audience, reiterating the message that deans must be fully supportive of assessment for it to be successful.

Dean Duncan noted that the new accreditation requirements (referring to both assessment and the experiential course requirement) will have an impact on resources, including:
• Devoting new faculty positions to teach the experiential learning courses
• Paying stipends for field-work supervisors
• Training faculty for increased skills teaching or for better assessment
• Increasing teaching assistants (research assistants are a usual part of the law school environment, but she suggested that teaching assistants may become a more important part)
• Bringing in outside advisors or consultants
• Reducing teaching and service loads to assist faculty in transitioning to the new environment of assessment and skills instruction
In addition, a positive resource result may be that deans can use the results from assessment plans in fundraising.
She suggested that one of the many benefits of assessment may be the increased formation of professional learning communities in law schools, in which faculty with shared interests will work together to share assessment practices and data.

Advice on Building an Assessment Plan from Dean Maryann Jones

More good stuff from the Building an Assessment Plan conference today: Maryann Jones, Dean Emerita, Western State College of Law, and Educational Consultant, spoke of the “paradigm shift” that legal education faces from a focus on teaching and faculty to a focus on student learning and accomplishments. Dean Jones related her experience in having to make that shift early on because her law school was an independent free-standing law school accredited by the regional accreditor and so was not “given a pass” on assessment as are many law schools that are part of larger university systems. She noted, however, that increasingly regional accreditors are paying attention to law schools, with more than one accreditation report noting that “The law school lags substantially behind the rest of the institution in terms of assessment of student learning.”

She provided a list of very helpful hints in building an assessment plan:

  • The plan has to be an integral part of the whole institution: not housed in an “assessment office,” and covering not only the curriculum but also student services and co-curricular programs.
  • There must be an institutional commitment of resources.
  • The faculty must own assessment. The plan must be built from the ground up, from the beginning, led by the faculty. You cannot simply hand the faculty a plan and tell them to “implement it.” She emphasized the importance of one-on-one and small group conversations to develop that plan.
  • One size does not fit all. An assessment plan cannot be “plug and play” but must address the school’s own mission within its own resources and culture. On the other hand, you don’t have to reinvent the wheel. There are many good models that can provide a starting point.
  • Be creative. Work with what you have.
  • Do not make too many learning outcomes. Keep it manageable.
  • Get training. Regional accreditation programs often provide good training. There are great books available.
  • When you start the dialogue on assessment, bring in a facilitator who is familiar with assessment in the law school context.
  • Do not underestimate student involvement. Talk about student learning outcomes in classes. Widely disseminate your program learning outcomes.
  • Include institutional research – assessment is data driven, so you have to determine how you are going to gather that data from the beginning.
  • Close the loop on assessment. How will you use your results? Is assessment on the agenda of faculty meetings? Do you have a quality improvement or action plan at the end of an assessment cycle? Do you regularly conduct program review (a periodic comprehensive review)?

Thank you Maryann!

Resources from the Building an Assessment Plan Conference

In the morning session of the conference on Building an Assessment Plan from the Ground Up, presenters shared an overview of assessment in action.

Professor Andrea Curcio, Professor of Law and Co-director of the Externship Program at Georgia State University College of Law, presented on “The Purposes of Assessment,” providing two powerful examples of how assessment challenges faculty to change and improve their teaching. The first targeted a learning outcome of developing cultural sensitivity. She described challenging survey results from her law school regarding students’ ability to perceive their own cultural lens and the teaching challenges those results presented. The second was the learning outcome of analytical reasoning and her efforts in her civil procedure class to improve student skill through formative feedback. For both of these assessments, Professor Curcio has published articles that provide resources and analysis.

Professor Vicki VanZandt of the University of Dayton School of Law and Professor Kelley Mauerman of Whittier Law School presented a primer on “Assessment 101,” covering much of the language of assessment and providing examples from their own schools of their curricular planning processes. They emphasized the importance of starting small and building an assessment plan from activities faculty already engage in. They reminded the audience that not every outcome needs to be assessed every year, providing examples of their law schools’ own curriculum maps and assessment plans. Read the learning outcomes and performance criteria from Dayton here: https://www.udayton.edu/law/_resources/documents/academics/learning-outcomes-and-performance-criteria.pdf and from Whittier here: https://www.law.whittier.edu/index/experience/mission-and-goals/institutional-goals/

Professor VanZandt is the co-author of Student Learning Outcomes and Law School Assessment: A Practical Guide to Measuring Institutional Effectiveness (Carolina Academic Press 2015) available for purchase here: http://www.cap-press.com/books/isbn/9781611632668/Student-Learning-Outcomes-and-Law-School-Assessment

More resources will be available, including slideshows and handouts from the conference.  Keep watching!

Building an Assessment Plan Conference off to a great start!

The conference on “Building an Assessment Plan from the Ground Up” is being held today at Whittier Law School. Professor Andrea Funk, Associate Dean for Lawyering Skills and Institutional Assessment and Professor of Lawyering Skills at Whittier, has created a comprehensive program for the sold-out crowd of faculty, most of whom have some responsibility for assessment planning at their law schools. For your use and enjoyment, I will be blogging from the conference today. Professor Judith Wegner, Burton Craige Professor of Law at the University of North Carolina School of Law, began by asking us all to think about words that begin with “A” and reflect on how many of those are disagreeable: aspirin, ache, anger, accreditation, assessment! She provided a powerful overview of these “A-words” that many faculty find very foreign: pressures for accountability in accreditation processes that require assessment of student learning outcomes. You can read more about the history and context of accreditation here. Download her slideshow here (scroll to “Assessment”).


What Makes Your Subject Distinctive?

As law schools continue to develop their learning outcomes, an important question we all should consider is, “what makes my course distinctive?”  For example, in my research on assessment in legal research courses, I was struck by how much the analytical and problem solving skills developed by legal research instruction are the same as those developed by many other courses in the law school curriculum.  That led me to ask, “what makes legal research instruction distinctive?”  The answer was not simply, as an outsider might suggest, that legal research classes teach tools for finding law (digests, Westlaw, etc.).  Rather, I was struck that legal research instruction is distinctive in the extent to which an effective legal researcher must have an appreciation for the power of taxonomies, must exercise imagination within realistic boundaries of time, cost, and purpose, must be able to ask for help, and must develop strong metacognitive practices (continually questioning, “is this process working?”).  The difference is of degree rather than kind, of course, but it is a distinctive difference nonetheless.

Given the narrow focus of legal education, it seems that this question of distinctiveness or “value added” is the most critical question I can ask in planning my courses.  Not that the distinctive outcomes of my courses should be the sole, or even dominant outcomes.  Legal education outcomes require an iterative process and cross-curricular experiences for students to become competent and to enable transfer of learning to new settings.  Yet, understanding what makes my outcomes distinctive forces me to justify my outcomes and consider their connections with other law school outcomes.

So what makes my outcomes in Professional Responsibility distinctive?  Certainly the identity of the anticipated users of the doctrine we are learning leads me to emphasize professional identity formation outcomes as important, if not distinctive.  In most law school courses, students are learning the law to serve others and are encouraged to use, interpret, and advocate about the law to achieve a client’s objectives.  In Professional Responsibility, the students will be using the law to advise themselves.  My outcomes include expecting that students will be able to clarify their observational standpoint when considering issues of professional ethics; recognize that self-interest clouds judgment and identify ways to gain more objectivity; and differentiate the approaches to interpreting law that one might use to advocate for a client regarding past conduct from approaches that are wise, ethical, and effective when interpreting the law to guide our own future conduct.  Finding effective methods to assess students’ development of these perspectives is a challenge, but I have found that simply asking students to read cases of attorney discipline and answer, “what went wrong with the attorney’s thinking?” is a good place to start.

What makes your course outcomes distinctive?  How has that led to distinctive assessment practices?

Workshop on Measuring Learning Gains June 22-24

Registration is now open for the AALS Midyear Workshop on Measuring Learning Gains. The Workshop will address assessment of learning and evaluation of programs.  The workshop promises to be a hands-on program for legal educators to develop assessment plans and the tools and techniques to make those plans a success.
Register at the AALS website.

Five Problems to Avoid in Writing Student Learning Outcomes

As law faculty across the country strive to improve student learning and meet ABA standards of accreditation through the assessment process, it is perhaps appropriate to stop and assess our efforts in that regard.  Here are five common problems that occur when first writing learning outcomes for a course:

1. Don’t focus on you – focus on the students
Student learning outcomes are designed to give students an idea of what they will be learning.  Avoid learning outcomes that describe what or how you will teach and instead focus on what the students will be able to know, do, or believe.

NOT: UMKC457  Trees as Thought
Student learning outcome:  In this course, I will be exploring the philosophical thought experiment “If a tree falls in a forest and no one is around to hear it, does it make a sound?”  I will explain my book “Trees as Focal Points for Reality” and refute critics of the proposals presented therein.

BETTER: UMKC457  Thought Experiments
Student learning outcome:  At the end of this course, students will be able to think critically and communicate effectively the metaphysical theories regarding the existence of that which cannot be perceived. Students will be able to describe how the theory of subjective idealism has impacted religious and scientific philosophy.  Through discussion and written reflection, students will demonstrate clarification of their individual values.

2. Avoid Vague Verbs
Probably one of the most common verbs found in student learning outcomes is “understand,” as in “students will understand [course content].” The problem with this as a learning outcome is that it is difficult to know what evidence would demonstrate that understanding.  A student learning outcome that uses more active and concrete verbs can unpack the type and degree of “understanding” that a professor expects.

NOT:   LAW8000  Family Law
Student learning outcome:  Students will understand the law regarding marriage regulation and the constitutional constraints on that regulation and the law of divorce, including child custody.

BETTER:  LAW 8000 Family Law
Student learning outcome: At the end of this course, students will be able to:
• identify the legal issues raised by a fact pattern involving a marriage regulation, make critical and effective arguments regarding the meaning of that regulation and its constitutional validity, and confidently predict the outcome of a challenge to that regulation
• identify relevant facts necessary to gather from a client seeking a divorce and child custody with property including real estate and pensions; draft a complete and legally effective petition for that divorce and custody action, including a parenting plan; and identify legal issues and make critical and effective arguments, applying the statutory and case law, to determine the divorce, property division, child custody and economic support in the case.
To read more about it, see Chapter Two, “Understanding Understanding,” of GRANT WIGGINS & JAY MCTIGHE, UNDERSTANDING BY DESIGN (2d ed. 2005).

3. Avoid “elementitis”
A student learning outcome should not merely summarize the syllabus or be a list of topics the course will cover.  Rather, the student learning outcomes should focus on thematic elements that tie these topics together or ways in which the students will be able to use this knowledge.  As David Perkins of the Harvard Graduate School of Education notes:
We educators always face the challenge of helping our students approach complex skills and ideas. So what to do? The two most familiar strategies are learning by elements and learning about. In the elements approach, we break down the topic or skill into elements and teach them separately, putting off the whole game until later — often much later….to have a little fun I call it ‘elementitis.’
DAVID PERKINS, MAKING LEARNING WHOLE: HOW SEVEN PRINCIPLES OF TEACHING CAN TRANSFORM EDUCATION (2010).  Avoid student learning outcomes that are plagued by “elementitis”; describe instead what students will be able to do with course coverage.

4. Don’t Always Expect Mastery
Student learning outcomes should indicate not only the content the students will learn but how well they will learn it.  We cannot aim for mastery of all aspects of the course.  Rather, learning outcomes in some courses are necessarily going to be at an introductory level (students will “recognize” or “describe” or “identify”) while other outcomes may be aimed at higher levels of mastery.  An effective tool to determine the proficiency level of your learning outcomes is Bloom’s Taxonomy of Educational Objectives, which provides a hierarchy of increasingly sophisticated learning outcomes.  To read more about it and see a list of verbs associated with differing levels of learning, see Rex Heer, A Model of Learning Objectives from Iowa State University Center for Excellence in Learning & Teaching (2012). To read an application of this model to law school, see Paul Callister, Time to Blossom: An Inquiry into Bloom’s Taxonomy as a Hierarchy and Means for Teaching Legal Research Skills, 102:2 LAW LIBRARY JOURNAL 191 (2010).

5. Don’t Avoid Outcomes that May be Difficult to Measure
Student learning outcomes for a classroom rarely will focus entirely on the acquisition of knowledge.  At a minimum, most classes expect students to develop their cognitive and communication skills in using the knowledge base of the course.  Courses may also help students to clarify values, reconsider beliefs, appreciate new perspectives, or develop greater self-awareness.  Some faculty recognize that these skills and values are some of the most important benefits that students take away from the courses, but are reluctant to state these as learning outcomes because they are unable to “test” these outcomes.  However, any important skill or value can be assessed – even if there is a good deal of subjectivity involved in that assessment.  By stating these objectives as learning outcomes, faculty members can challenge themselves and their students to more clearly describe the dimensions of this learning.  Measurements of this learning may be through written reflections, observations of performance, or surveys of opinions.  These are perfectly valid assessment tools.

Unlearning as Learning Outcome

As the newly revised ABA accreditation standards 301 and 302 now require law schools to clearly articulate and publish their learning outcomes for their students, so individual faculty members must do likewise. Yet it is not uncommon to see learning outcomes statements that read like the table of contents of the textbook used to teach the course. To truly be effective in driving learning and teaching, learning outcomes must be targeted, concrete, measurable, and active (not “learning about” but “learning how to”).

How do we most effectively choose and articulate these learning outcomes? In MAKING LEARNING WHOLE: HOW SEVEN PRINCIPLES OF TEACHING CAN TRANSFORM EDUCATION 83-89 (2010), educational specialist David Perkins emphasizes that learning is most effective if learners “work on the hard parts.” Similarly, the UNDERSTANDING BY DESIGN framework, originally developed by Grant Wiggins and Jay McTighe, emphasizes beginning the search for course goals by looking for the “Big Idea” in the course. These are the ideas or themes that can be used throughout a legal career and that require a lot of work to master.

One of the most effective ways to uncover these “big ideas” or “hard parts” is to focus first on unlearning outcomes – that is, preventing and addressing predictable misunderstandings in the course. Thus, for example, much of the first year of law school is devoted to “unlearning” the positivist philosophy of students who believe the law is resolutely determinate. These fundamental misunderstandings are persistent, difficult to overcome, and block learning of new ideas. Students construct knowledge by building on prior understandings. If those prior understandings are incomplete or incorrect, new learning will be flawed as well. As summarized by NATIONAL RESEARCH COUNCIL, COMMITTEE ON DEVELOPMENTS IN THE SCIENCE OF LEARNING, HOW PEOPLE LEARN: BRAIN, MIND, EXPERIENCE, AND SCHOOL: EXPANDED EDITION 11 (2000), “teachers need to pay attention to the incomplete understandings, the false beliefs, and the naive renditions of concepts that learners bring with them to a given subject.”

In her new book, Building a Better Teacher: How Teaching Works (and How to Teach It to Everyone) ( 2014), Elizabeth Green reviews the research concluding that effective teachers (as measured by student learning gains) are those who are able to identify the reasons that students misunderstand and help them to unlearn those misunderstandings.

Some of the most fundamental misconceptions that students bring to a subject from their own experience (or from bad course outlines passed around from prior semesters) must be discovered in the classroom. Brief classroom assessment devices such as “minute papers” or statements for the students to complete can easily generate a range of incorrect or incomplete understandings for any given topic.  The mission to discover student errors leads faculty to many of the best practices in teaching: regular interaction with students, frequent and meaningful feedback, and active learning strategies.

The power of an “unlearning” perspective on assessment improves student learning, but also quickly leads faculty to a deeper understanding of what assessment of student learning outcomes means.  Assessment is not an end-point, a box to be checked, reported, and forgotten, but an iterative process of discovery and experiment that drives student and faculty learning alike.  Assessment tools (such as quizzes, Socratic dialogue, essays, simulations, and reflections) might be used to unearth student misconceptions.  These misconceptions then become the basis for the learning outcomes around which one can build a course, and assessments can then be used to determine the extent to which one is successfully dislodging misunderstandings and misconceptions and replacing them with a solid framework of mastery.

A rose by any other name: Evaluation and Assessment at Cross Purposes

A barrier to developing, improving, or sharing our assessment practices is the confusion surrounding the vocabulary of assessment.  Whenever it occurs or by whatever method, assessment is simply the process of discovering what and how well students have learned and then using that information to improve. One can quickly become mired in a sea of words that feel like jargon, with assumptions that confuse and distort the real meaning of this otherwise familiar practice of all good instructors. Part of the problem is that the language is not our own and so, by its very adoption, reinforces the impression that assessment is an intrusion into our classrooms.  Because these confusions are so destructive to the ability of an institution to move forward with assessment, we must either work to make this vocabulary our own or develop a different vocabulary for the same ideas.

Even though discovering what students have learned in order to improve teaching is a natural part of a good teacher’s practice, law schools are having difficulty knowing exactly what this talk of assessment means. Faculty frequently mistake outcomes assessment for something more complex, unusual, or even sinister. “Assessment” becomes confused with “evaluation” (as in program or teacher evaluation) or “standardized testing,” and, before long, we are thinking of K-12 school district funding decisions based on “No Child Left Behind”-style external control of education.

There is a fundamental difference between assessing student learning for the purposes of program or teacher evaluation and assessing student learning for the purposes of improving that learning.  If we are assessing for accountability, we collect data (e.g., pass rates) about student learning outcomes that we do not necessarily control (e.g., bar exams) so that we can report that data to external constituencies (e.g., accreditors). In contrast, if we are assessing for student learning, we observe evidence (e.g., essays, performances) of student learning outcomes that we have designed ourselves so that we can interpret and use that evidence to improve the learning of our students.  When accountability to those outside the learning process is the driving force behind assessment, the temptation may be to assess only those learning outcomes that we know students have mastered and avoid looking for places where learning could be significantly improved.  We might skew our teaching and curricula away from learning outcomes we truly care about to more closely match the learning outcomes we believe outsiders consider important.  Of course, that already does happen to some degree.  The influence of ABA standards of accreditation and bar examinations on curricula is so obvious that we may not even recognize the degree to which our faculty control of the program of legal education is directed by these learning outcomes and assessment methods.

It is against this backdrop of fear that some law teachers approach the topic of outcomes assessment.  However, resisting assessment out of a concern that others will rob law faculty of their freedom means giving up one of the most powerful tools to protect that freedom. If a faculty can clearly communicate the learning goals they have for their students, and can demonstrate how their program of legal education leads to more students accomplishing those learning goals at higher levels of mastery, that proof of learning can become a powerful tool for demonstrating accountability: to the students, the academy, the bar, and the public. That is not to say that assessment for accountability will not be required or should not be undertaken with seriousness of purpose and honesty in method.  Assessment for improving student learning, however, should be just as important, if not more so, so that we can be accountable to ourselves and our students.

Best Practices in Teaching Legal Research (The Boulder Conference)

The American Association of Law Libraries is meeting this weekend in Boston, Massachusetts, and best practices in teaching legal research is a hot topic.

In the two days preceding the AALL meeting, an ad hoc working group of librarians (which I was privileged to join as an emissary) met at Harvard Law School for the Fourth Conference on Legal Information: Scholarship and Teaching (known as “The Boulder Conference”). The first Boulder Conference was held to respond to the Carnegie and Best Practices reports, neither of which addressed information literacy and legal research education in any depth. The group produced the Boulder Statement on Legal Research Education and the Signature Pedagogy Statement (which you can find at http://www.utexas.edu/law/faculty/pubs/bb26663_pub.pdf). In subsequent conferences, the group has gone on to encourage significant scholarship on the pedagogy of legal research education and will be producing a book on the subject, which will include a template for creating legal research classes based on the signature pedagogy. For more information, contact Susan Nevelow Mart at susan.mart@colorado.edu or talk to your local law librarian!

Curriculum Materials for Access to Justice

A recent white paper  authored by Professor Deborah Rhode and Dmitry Bam for a consortium on access to justice concluded that access to justice issues are insufficiently covered in many law school curricula. For example, one national survey found that only one percent of law school graduates recalled coverage of pro bono obligations in their professional responsibility class or orientation program. Although many students are exposed to access issues in their clinical courses, rarely do these classes find time to “provide in-depth coverage of structural concerns in the delivery of assistance.”

To help address this situation, the Stanford Law School Center for the Legal Profession is compiling on its website law syllabi and course material relevant to key access to justice issues, including (but not limited to):

• Limitations in the right to counsel and its enforcement;

• Landlord tenant, environmental justice, consumer, discrimination, immigration, and urban development concerns;

• The role of alternative delivery structures and non-lawyer providers of assistance;

• Professional responsibility, poverty, and public interest law;

• Pro bono responsibilities.

The Access to Justice Project welcomes contributions ranging from standard first year courses (such as civil procedure, torts, contracts, property, criminal law, and constitutional law) to advanced core courses such as corporations, family law, administrative law, civil rights, and professional responsibility, as well as clinics and skills classes. In other words, any curricular offerings that either incorporate access to justice as a central theme or thread the issue into a course that may not typically cover such issues.

Professors with questions or materials that are relevant to our project can contact the project’s research assistant LaToya Baldwin Clark at lbclark@stanford.edu.

An Experiment in Laptop Usage Policies in the Classroom

My colleague David Achtenberg has used a unique policy for regulating student use of laptop computers in his large-enrollment Civil Procedure class.  He designates the back three or four rows in the classroom as an “Internet Usage Zone,” where students can, within the limits of reason and legality, use their laptops in any way they wish.  The rest of the classroom is designated an “Internet Free Zone” — students can use their laptops only for class notes and for referring to class-related materials on their laptop or a flash drive.  These students are directed that they may not access the internet for any reason and may not use their laptop for recreation, diversion, or “doodling.”  He emphasizes to the students that there is no relative advantage or disadvantage to their choice (and in fact he was unable to find a difference in grade outcomes related to the choice in prior years).

This is Professor Achtenberg’s third year implementing this policy. He conducts a survey after grades are submitted.  The survey results this year indicated that approximately:

  • 71% thought the policy made the learning environment slightly (46%) or significantly (25%) better,
  • 25% thought it had no effect
  • 3.5% thought it made the learning environment slightly worse.  (No one responded that it made the environment significantly worse.)

Asked whether they would favor such a policy in future classes

  • 93% would somewhat (36%) or strongly (57%) favor such a policy
  • 3.5% didn’t care
  • 3.5% somewhat disfavored such a policy. (No one responded that they strongly disfavored the policy.)

More information about Professor Achtenberg’s approach, a copy of his policy and the way in which he introduces it to the class, and the complete survey results, including student comments, are available at http://law2.umkc.edu/faculty/profiles/glesnerfines/laptopusage.pdf

Tales from the Assessment Trail

Like many schools, here at UMKC Law we have been working steadily on our assessment plan.  After two retreats, six focus group meetings with attorneys, countless meetings and even more emails, we have narrowed our outcomes down to 126 skills and values outcomes.  Each faculty member has exercised their six “votes” on those outcomes that they would like to first target for comprehensive assessment across the curriculum. 

It will come as no surprise that outcomes in the category of legal analysis garnered the most votes.  Devising comprehensive assessment for this outcome will simply be a matter of some conversations to ensure we agree on a shared rubric.

But coming in a close second in the polling was the outcome “be able to listen actively.”  It is a fascinating outcome on which to focus.  Apart from courses providing clinical skills training, active listening isn’t taught as pervasively as analysis is.  Still less often do we assess our students’ ability to listen actively.  When one considers the amount of time students sit in classrooms listening (or at least hearing), it seems there could be ample opportunity to test the “listening” part of active listening.  The empathy part, on the other hand, could prove to be a game changer in our curriculum development.

Our next task, then, will be to determine where we currently teach this skill, where else we will want to incorporate this teaching, and – perhaps most challenging of all – how we can assess the skills of the entire student body.  Somehow the image of massive piles of bluebooks, most of which say “What I hear you saying is…” doesn’t quite cut it.  Suggestions?