Why Bar Examiners Should Eliminate Essay Questions and Focus on Performance Tests

The NCBE has announced that in four to five years, there will be a different bar exam.  In the interim, why not make changes to the existing exam so that it better reflects the skills needed in law practice?  One easily implementable change: eliminate the essay questions and use the time allotted to those questions for the performance test questions.

This idea was raised by Professor Deborah Merritt in her talk at the BYU Annual Law and Leadership conference. It stems from her groundbreaking study of the skills new lawyers actually need to competently represent clients.  Professor Merritt and researchers at IAALS analyzed data from fifty focus groups, conducted in twelve states, with a diverse group of new lawyers and their supervisors across a range of practice areas.

The study data confirmed what we intuitively knew: of all the components of the bar exam, the performance test [the MPT] most resembles what new lawyers do in practice.

The essay questions, by contrast, require examinees to memorize significant amounts of information beyond what they must already memorize for the multiple-choice questions [the MBE].  Many of the subjects memorized for the current essay questions will be eliminated from the NCBE’s proposed new exam.

The essay questions also do not allow examinees to research or review the applicable legal rules before answering, and they have little relationship to how new lawyers approach problems in practice.  On the other hand, the performance test questions do relate to the skills new lawyers use.

As Sara Berman notes in her book, Bar Exam MPT Preparation & Experiential Learning for Law Students, mastering the performance tests is an opportunity to practice skills necessary for law practice.  The book offers a logical, easy-to-follow process for studying for, and taking, the performance tests.

Given the existing bar exam format, Berman appropriately devotes a significant portion of the book to strategies that help examinees write an effective answer in the time allotted.  Her advice is particularly useful given the tight time frame examinees face, and she provides many sample questions for practicing skills such as reading all the materials and drafting a document in ninety minutes.

Berman’s strategies for managing the MPT time constraints make sense because, as Professor William Henderson’s study confirmed, test-taking speed on exams such as the bar exam is an independent variable in test performance.  Thus, it is important to teach students to answer test questions quickly.  But is that the message we should be sending?  Should we encourage speed over careful reading and accuracy?

If the NCBE eliminated the essay questions and allotted that time to the MPTs, or even if it simply gave states the option to adopt this change, it could study the impact.  Currently, we do not know whether having three hours rather than ninety minutes to answer an MPT question makes a difference, and if so, for whom.  Does the difference matter for those who come close to, but do not achieve, the arbitrarily set passing cut scores?  These questions are particularly important given the significant effect cut scores have on the profession’s racial and ethnic diversity.

This minor change, which could have a major impact, is just one of the many innovative ideas presented by Professor Merritt at the BYU conference on law licensing reform.  Her remarks, as well as the remarks of other conference speakers, are available if you scroll to the bottom of this website.

Washington’s Emergency Diploma Privilege: A Practical Solution for Troubled Times

The Washington Supreme Court stands as a model for other courts to follow.  Recognizing the “extraordinary barriers” facing bar applicants this year, it recently issued an order allowing for licensure via diploma privilege as an option for graduates of ABA accredited law schools who are currently registered for either the July or September 2020 Washington bar examination. The order encompasses first time and repeat takers.  Examinees may still take the UBE if they want portable scores.

This order comes from the same court that recently wrote a powerful letter to the judiciary and legal community about our responsibility to recognize systemic issues that underlie and perpetuate racial injustice and to dismantle and disavow those systems.  As the justices noted:

“Too often in the legal profession, we feel bound by tradition and the way things have ‘always’ been.  We must remember that even the most venerable precedent must be struck down when it is incorrect and harmful.”

Numerous scholars have long argued that the existing bar exam, with its discriminatory impact, and its lack of relationship to skills needed for law practice, is a tradition that should be abandoned in favor of a licensing scheme that better measures minimum competence to practice law.

Scholars also have argued that we particularly need this year’s new lawyers because under-served communities have significantly increased pandemic-related legal needs and new lawyers disproportionately serve those needs.

As Dean Annette Clark wrote in an eloquent letter explaining why her faculty unanimously voted in favor of asking the Court to adopt a diploma privilege – this year is unlike any other.  This year, she writes, law graduates face pandemic-related health and financial issues  – issues that disproportionately impact communities of color.  This year, she notes, the murders of George Floyd, Ahmaud Arbery and Breonna Taylor, and the social unrest and calls to action that followed have affected graduates and have taken a particularly heavy emotional and mental toll on graduates of color.

This year, the mere act of taking the bar exam endangers examinees’ and their loved ones’ health – something states recognize, because they are requiring examinees to sign waivers.  This year, graduates worry that after months of studying, public health issues may cause states to cancel the exam at the last minute.

While a pure diploma privilege may not be an appropriate long-term law licensing solution for most states, it is a practical solution given today’s world.   As Dean Clark states:

“While requiring a bar exam for licensure is supported by long precedent, doing so now may be insupportable given the extraordinary circumstances these graduates are facing. The burdens of the coronavirus pandemic and the racial unrest we are experiencing are being disproportionately borne by our graduates of color as they struggle to prepare for the bar exam.  Removing the exam barrier to admission would be a step in responding to our graduates’ concerns and in bringing ‘greater racial justice to our system as a whole.’”

Dean Clark’s letter should be read in its entirety.  It sets forth strong arguments that should be presented to every state supreme court with a request to reconsider the decision to hold a traditional bar exam this year.  It also lays the groundwork for why each state should establish a post-pandemic study group to evaluate whether the existing bar exam is the best way to determine admission to the bar.

Kudos to the Washington Supreme Court, as well as to the deans, faculty and students at the Washington law schools for their advocacy.  Their solution addresses problems faced by all of this year’s graduates while also addressing inequities that cannot be denied.

Risking Illness and Death for the Chance to Become a Lawyer


In this blog post, Professor Allie Robbins eloquently sets out why, during this crisis, states should implement alternative law licensing methods. She argues states should forgo in-person exams this year to avoid the inevitable health risks to bar takers, their families, and the members of the public who come into contact with them. The entire post is set out below. It is a powerful piece that hopefully will find its way to state boards of bar examiners and state supreme court justices as “food for thought.”

“Please do not hold the bar exam this year. Do not hold it in July. Do not hold it in September. Maybe not even next February. Figure out a better way to license attorneys. There are many options.

Recent liability waivers from Mississippi and North Carolina demonstrate that you understand the serious dangers posed by taking a two-day exam in person surrounded by a lot of other people. The fact that you are finding it difficult to find proctors and are asking for help from “young attorneys” says that your own people are telling you it is too risky.

Putting hundreds, or even dozens, of stressed-out people in a room together for hours and hours in the midst of a global pandemic is a public health disaster. Temperature checks only tell you if someone has a fever, not whether someone has the virus. Many transmissions of COVID-19 occur from asymptomatic individuals. Masks are helpful, but not 100% effective. Some of you are permitting masks but not requiring them, leaving many potential carriers the option of exposing those around them. Have you tried to do 100 MBE questions with a mask on? The anxiety surrounding the bar exam is immeasurably high in normal times. This is taking it to an unimaginable new level.

There doesn’t have to be a bar exam. But if you think there does, you can do it online, as Nevada and Indiana will be doing. Those of you in UBE states may feel that you no longer have the capacity to administer your own exam. But you did it not that long ago. You can do it again. There are expert exam writers at your local law schools who can help. You could even work together and have each state draft one essay question. There are many options. Please explore them.

There are concerns about exam security. But if that is the driving force for having an in person exam, then you are prioritizing fears about cheating over the protection of human life. Do you really think so little of the next class of attorneys that they must risk their lives to prove to you that they would not cheat on an exam?

Ask yourself honestly, would you take the bar exam during this pandemic? Would you encourage your child to? Do you really want to be responsible for the next surge in your community?

You may counter that hundreds of applicants have registered so they must want to take it. But what they want is to be a licensed attorney. Their registration is a product of employer pressure, fear, and the perverse incentives of the legal profession. Despite their registration, they are terrified.

We owe it to the newest members of our profession not to sacrifice their health for the sake of the status quo. Nothing is the same right now, and it shouldn’t be. We are in the midst of a worldwide health crisis. Is the bar exam actually worth risking their lives for? Is there really no way to determine who is minimally competent to be an attorney, other than to have them all gathered together in a room for two days?

The purpose of the bar exam is, ostensibly, to protect the public. What about protecting law graduates?

These waivers may protect you from legal liability, but if you go forward with an in-person bar exam, you will still be responsible for spreading COVID-19 throughout the population of new law graduates, their friends, families, and communities. How can this possibly be the right decision?”



A NY Bar Exam Option That Doesn’t Require Fighting for Seats

In this article, a group of scholars and educators identify a practical and viable alternative licensing path for this year’s New York bar takers. The authors suggest that New York use the MPRE, its existing New York exam, an expanded Bridge the Gap program, and supervised practice to license new lawyers.  They posit that this alternative path would ensure knowledge of New York law, the Rules of Professional Conduct, and key subjects covered in Bridge the Gap courses, and would serve as a true performance test of minimum competence.  Their plan, and its underlying reasoning, is worth a read and merits serious consideration for several reasons.

First, it sets out a way to avoid constitutional challenges to New York’s recent announcement that there is limited seating for the 2020 bar exam [assuming the exam can be safely administered] and that in-state law school graduates will have registration priority.   Second, it provides a way to ensure that New York has a sufficient number of attorneys to address the increasing pandemic-related legal needs people face now and in the immediate future.   Finally, and most importantly, the suggestions lay out a viable alternative path for New York to ensure newly licensed lawyers possess minimum competence in the range of skills necessary to represent clients.

New York is not the only state facing limited seating issues.  Massachusetts recently announced a seating plan that mirrors New York’s, and other states have noted that seats at their exams are already full.

The limited seating dilemma many states face because of public health issues suggests that states need to seriously consider ways beyond the bar exam to license this year’s law graduates.

The creative thinking these scholars demonstrate is the kind of work we need during this pandemic. The model they suggest for New York could be adapted by other states.  States could offer an option like this in addition to, or in lieu of, the traditional bar exam if public health reasons make it unsafe to administer the exam.

It’s a whole new world out there, and sticking to the status quo just isn’t going to work – as the current competition for limited bar exam seats illustrates.

Care-Taking, Gender & Scholarly Productivity During the Pandemic

Research suggests that, in normal times, women faculty take care of the academic family, providing more institutional service and student care work than their male colleagues.  This cuts into time for scholarship.  In a world where scholarship leads to rewards and advancement, studies document the problematic nature of this division of responsibilities.  Emerging research suggests that pandemic-related care-taking responsibilities exacerbate the problem.

In many families, women carry a heavier burden when it comes to childcare, housework, grocery shopping, meal planning, and other household responsibilities. During the pandemic, these responsibilities have increased. While both spouses may be home, anecdotal reports suggest that employer expectations are producing gender inequities in pandemic-era household and child-care responsibilities.

The pandemic also has not alleviated the need to work with students or perform other institutional service.  In fact, for some faculty, those responsibilities have grown because many of us have seen an increase in student care needs.  While no pandemic-era data yet exist, pre-pandemic literature suggests students expect more nurturing from female faculty.  There is no reason to expect the pandemic turned the tables on this gendered expectation.

The upheaval caused by the pandemic, including increased care work, likely is hampering some faculty members’ ability to engage in scholarship, and the impact may not be evenly distributed.  Initial data in higher education, based on journal submissions, suggests that the pandemic has had a more profound effect on women’s scholarly productivity than on that of their male colleagues.

The inability to be as productive a scholar as colleagues less burdened by familial, student, and institutional needs and expectations is admittedly a small problem compared to the deaths, significant health challenges, lack of food and shelter, and host of other issues the pandemic has created.

It also is important to note that in some cases, the care burdens are falling to male faculty and that many women faculty have been incredibly prolific in the last couple of months.  Nonetheless, initial data suggests that, on average, women faculty members’ ability to produce scholarship during this pandemic is not equivalent to that of their male colleagues, thus potentially exacerbating a system already rife with gender inequities.

While the data looks only at gender, given the pre-pandemic literature on heightened student care work and institutional service expectations for faculty of color, it is reasonable to think that further study will show disparities there, too.

Below are some suggestions for addressing the added disparities created by this pandemic.

First, many universities have extended the tenure clock by one year, and in some cases up to two.  Faculties should press universities that have not yet done so to change their tenure clock policies. To help persuade those reluctant to make the change, it may be useful to point to emerging studies on how the pandemic exacerbates existing gender inequities in the time available for scholarship. Pushing back the tenure clock acknowledges the unique circumstances all faculty may face and helps lessen some disparities resulting from pandemic-related care-taking.  Faculties also should ensure that those who take the extra time are not judged poorly for doing so.

Second, faculties and universities, in assessing tenure and promotion portfolios, should generally acknowledge that biases exist in evaluating faculty with similar qualifications.  In the Covid-19 era, it is even more important to identify the realities facing many faculty members, and the implicit biases that affect judgments about scholarly quality and productivity.

Third, in making promotion, tenure, or merit award decisions, those assessing scholarly productivity need to consciously fight the instinct to engage in comparisons of faculty members’ scholarly productivity during the pandemic.

Finally, all faculty need to take a deep breath and recognize the stress we all are under.  We need to give ourselves, and each other, permission just to get through this and not also feel the need to be prolific scholars in a time when, for some folks, making it through the day takes everything they have.

From Professor Griggs: A Thank You and a Request to Bar Examiners

Professor Marsha Griggs wrote an eloquent post on the ASP blog.  She thanks bar examiners for their hard [and often under-appreciated] work, and she urges them to listen to ideas from students, deans, and the legal community when it comes to thinking outside the box during this pandemic.  The post is worth a read.

Covid-19 and the 2020 Bar Exam

See this link for a short policy paper on alternatives to licensing this year’s law graduates and why state supreme courts should consider extraordinary measures in extraordinary times. If you know a state bar examiner, state bar leader, or state supreme court justice, please consider sharing this with them. 

Thinking outside the box for this year makes sense on many levels. As the paper notes, legal needs will increase as a result of this crisis. Today’s graduates have a technological savvy that makes them particularly well equipped to handle the move to online delivery of legal services. Additionally, delaying their entry into the profession will cause them to suffer serious financial and emotional harms beyond those they already confront because of this virus.  And any change to state licensing requirements is a relatively low-risk proposition, given that it would be a one-time emergency action.

The paper hopefully lays the groundwork for discussions at the state level about options beyond postponing this year’s bar exam [an option that, as the paper explains, carries its own risks and uncertainties].  Again, if you think that at least discussing options has merit, please share widely.

Dean Gerken’s Vision Versus Malcolm Gladwell’s Experience

“When we decide who is smart enough to be a lawyer, we use a stopwatch.”           Malcolm Gladwell

“Law school should be a time to luxuriate in ideas, to test their principles, and to think critically about the law and the profession.”  Dean Heather Gerken

On the same day I listened to Malcolm Gladwell’s fascinating podcast about the LSAT and test-taking speed, I also read Yale Dean Heather Gerken’s insightful commentary, “Resisting the Theory/Practice Divide: Why the ‘Theory School’ Is Ambitious About Practice.” Both are wonderful.  Together, they shine light on a dialectic tension within legal education.

Dean Gerken’s article inspires us to think about legal education in its biggest and broadest sense.  She posits that, “At its best, a J.D. is a thinking degree, a problem-solving degree, a leadership degree” and she notes that for students, “law school should be a time to luxuriate in ideas, to test their principles, and to think critically about the law and the profession.”

She envisions law school as a place where students engage in deep critical thinking about the law and the profession – both in the classroom and in clinics, and she discusses the interdependent relationship between the deep learning that should occur in both.

Dean Gerken eschews a mechanistic approach to both classroom and clinical teaching.  She points out that, as doctrinal and clinical faculty, our collective and symbiotic goal should be to train our students to read closely and to think deeply, skeptically, and critically.  We should help our students learn to question legal rules and principles in the context of messy facts, to challenge existing rules, and to develop new rules or new applications of those rules – or, as Dean Gerken puts it, to spend as much time thinking about “the ought as the is.”

Contrast Dean Gerken’s understanding of legal education with Gladwell’s podcast about his experience taking the LSAT.  In it, he posits: “when we decide who is smart enough to be a lawyer, we use a stopwatch.”   He notes that who gets into law school, and what law school they get into, rests largely on LSAT score differences – differences that may depend in part upon one’s ability to answer questions quickly rather than thoughtfully.

Gladwell recounts his experience with an LSAT test prep coach who urged him not to read the passages closely because he had no time to do that.  Amazed, Gladwell reflects on how, to get the best score, he must not spend time truly thinking about the issues raised by the problems he must answer.

In the podcast, Gladwell talks to Professor Bill Henderson, the author of a seminal article providing empirical evidence that test-taking speed is an independent variable in both the LSAT and timed law school exams.  Henderson, a former firefighter, talks about the times in his life he felt most time pressured.  As Gladwell remarks, Professor Henderson’s most time-pressured performances were not when responding to life-threatening emergencies.  Instead, they were when he took the LSAT and law school exams.

Gladwell’s podcast then meanders into the world of championship chess.  Gladwell recounts how the chess world decided not to value speed, and how that decision changed who becomes a top-ranked international chess champion.  He notes that the arbitrary value placed on speed in the LSAT and law school exams defines who we consider smart, and he wonders what would happen if the ability to answer questions quickly were not in the mix.  The podcast then returns to Professor Henderson, who explains how giving law students more time on exams can change who earns the best grades in a law school class – and hence who thinks of themselves as smart, and who gets hired by top law firms.

As I listened to Gladwell’s podcast, I thought about the bar exam.  In an article Professors Chomsky, Kaufman and I wrote, it took us nearly 500 words to deconstruct the analytical process one must go through to answer one tort multiple choice bar exam question.  That analytical process begins after examinees read a question.  Bar examinees have approximately 1.8 minutes to read and answer each of the exam’s 200 multiple choice questions.

While quickly identifying the correct response may be a necessary skill for some litigators, speedy answers to legal problems are not the cornerstone of most good lawyering.  Yet, starting with the LSAT, continuing in law school, and ending with the bar exam, as Gladwell observes, we reward the hare instead of the tortoise.  He asks why.

Dean Gerken’s vision speaks to why I became a law professor.  Gladwell’s observations speak to the experience of my students.  I am not sure how to reconcile the two beyond noting that we must first acknowledge the dialectic.  Only then can we decide whether we want to judge future lawyers’ potential and abilities based upon Gerken’s vision or Gladwell’s experience.

Letters raise concerns about changes to the bar pass accreditation standard

Early next week, the ABA House of Delegates will again vote on whether to approve a revised bar passage accreditation standard [Standard 316]. The Society of American Law Teachers and the ABA Diversity Entities both have written to the ABA House of Delegates setting forth significant concerns about the proposed standard change.  Both letters are worth a full read.

Amongst the issues the letters raise about the proposed change are the following:

1.  There is incomplete data about how it will affect HBCUs and other law schools with significant enrollment of people of color;

2.  It fails to account for state bar exam cut score differences and differences in state bar exam pass rates;

3.  It may result in even greater reliance on LSAT scores in the admissions process despite studies showing the scores’ limited predictive value for academic or bar exam success at many schools and despite warnings from the LSAC about how to use the scores properly in the admissions process;

4.  It may negatively impact schools willing to take a chance on students who are poor standardized test takers but who will be excellent lawyers and leaders if given the opportunity to attend law school and the coaching necessary to pass the bar exam;

5.  It does not consider the effect of transfer students on bar pass rates for schools that admit students who otherwise would not be admitted to law school, who perform well, and who then  transfer to other institutions;

6.  It eliminates some important aspects of the current Standard that take into account varying state pass rates, a school’s mission, the transfer issue, and the fact that improving bar passage is a complex and nuanced issue that requires study and experimentation [something currently underway at many schools];

7.  Now is not the right time for change given current studies about the validity of the bar exam as a licensing method and work being done to explore law licensing assessments that better measure who will be a competent attorney.

Proponents of the proposed change to Standard 316 believe it is necessary to protect consumers from law schools that admit students without devoting the necessary resources to ensure bar passage or that admit and retain students who have no chance of obtaining a law license.  The letters cited acknowledge the importance of the consumer protection issue but argue that issue can, and should, be addressed in other ways.

If you have concerns about the proposed change to Standard 316, contact your state ABA delegate.  The delegate information starts on page 13 of the ABA 2018-2019 Leadership Directory.

Concrete Suggestions for Bar Exam Reform

Many of us have spent decades critiquing the bar exam, and particularly the MBE’s multiple-choice question content and format. In How to Build a Better Bar Exam, a short essay written with Professors Carol Chomsky and Eileen Kaufman, we discuss two law licensing exams that provide concrete examples of how to address some of those critiques.

Abstract

“As a licensing exam, the purpose of the bar exam is consumer protection–-ensuring that new lawyers have the minimum competencies required to practice law effectively. As critics point out, however, the exam, and particularly the multiple-choice question portion of the exam, has significant flaws because it assesses legal knowledge and analysis in an artificial and unrealistic context, and the closed-book format rewards the ability to memorize thousands of legal rules, a skill unrelated to law practice.

This essay discusses how to improve the exam by changing its multiple-choice content and format. We use two law licensing exams to illustrate how bar examiners could utilize an open-book format and develop multiple-choice questions that assess a candidate’s ability to engage in legal reasoning and analysis without demanding unproductive memorization of so many detailed rules of law. The first example, the case file approach, is drawn from a 1983 California “Performance Test” in which test-takers received a case file and a series of multiple-choice questions testing the candidates’ ability to read, understand, and use cases to support their legal positions. The second example discusses the current licensing exam administered by The Law Society of Upper Canada (LSUC), an open-book multiple-choice exam that tests the use of doctrinal knowledge in the context of law practice.

These two licensing exams demonstrate how we could re-structure the bar exam’s multiple-choice questions to measure legal analysis and reasoning skills as lawyers use those skills to represent clients. They also demonstrate that we can do a better job of testing some aspects of minimum competence, while still using a multiple-choice exam format.”

Andi Curcio

Gathering Institutional Learning Outcomes Data

As law schools engage in outcomes assessment, a key question is how to collect institutional data on student achievement.  In A Simple Low Cost Institutional Learning Outcomes Assessment Process, I suggest a data collection method that requires little additional faculty time, minimal expense, and no change to how we teach or assess in our own courses.

Law school institutional learning outcomes require measuring nuanced skills that develop over time. Rather than look at achievement just in our own courses, institutional outcome-measures assessment requires collective faculty engagement and critical thinking about our students’ overall acquisition of the skills, knowledge, and qualities that ensure they graduate with the competencies necessary to begin life as professionals. Even for those who believe outcomes assessment is a positive move in legal education, in an era of limited budgets and already over-burdened faculty, the data collection necessary to engage in the new mandated outcomes assessment process raises cost and workload concerns.

To address those concerns, the article describes a process being used by Georgia State University College of Law [GSU COL] to collect institutional learning outcomes data.  GSU COL has developed a rubric method to assess a wide array of learning outcomes.

We modeled our process on work being done by both the American Association of Colleges and Universities [AAC&U] VALUE Rubrics project and medical educators’ Milestones Project.  Those educators use rubrics to assess a wide range of nuanced skills such as critical thinking, written and oral communication, problem-solving, intercultural competence, teamwork, and foundations and skills for lifelong learning.

Below, I briefly describe  GSU COL’s process for collecting institutional learning outcomes data.

The Institutional Data Collection Process

After identifying our institutional learning outcomes, we developed a five-step institutional outcomes assessment process to collect data from GSU COL faculty.  The faculty data focus on law student performance in various courses.

  1. Draft rubrics

First, we engaged our assessment committee and, in some cases, ad hoc faculty committees, in drafting rubrics. The rubrics had to be general enough to be usable across a wide range of courses and adaptable to various types of course assessments. To draft the rubrics, we looked to our own experience and to sample rubrics such as those developed by the AAC&U, by medical educators, and by legal educators.  The article’s appendix contains GSU COL’s draft rubrics for our eight learning outcomes.

  2. Pilot test rubrics

Second, we identified courses that would use each rubric – courses where the skills being measured were already being assessed.  For example, for the basic legal knowledge and analysis outcomes, we chose first-year and upper-level doctrinal courses; for the self-reflection and client interaction outcomes, we chose clinics.

We are pilot testing each rubric with the faculty who will use it and using their feedback to refine it.  Because our assessment process is cyclical, each year we pilot two rubrics and use two others for actual data collection.  Thus, our rubric development process remains a work in progress, and it engages a significant number of faculty members.  This helps ensure validity, involves faculty outside the assessment committee, and, we hope, builds faculty buy-in.

  3. Use the rubrics

Third, every year, we ask faculty in designated courses to assess and grade as they usually do, adding only one more step – completion of a short rubric for each student.  Most faculty members have said this process adds very little additional time to grading.

Given our different outcomes and the cyclical nature of our assessment process, different faculty will use the rubrics each year.  For example, one year, doctrinal faculty will complete the legal knowledge and analysis rubrics.  The next year, legal research and writing faculty, as well as seminar faculty who assign papers, will complete rubrics focused on legal research and writing.  Thus, we spread the workload and engage as many faculty members as possible in the institutional outcomes assessment process.

  4. Enter the data

Fourth, we enter the rubric data from each course into electronic form. The data entry process can be as simple as an Excel spreadsheet or an SPSS file, or it can be more elaborate.  For example, we worked with graduate research assistants [GRAs] from GSU’s computer science program to develop software compatible with the university computer system that allows us to manipulate the data in numerous ways. We are currently working to develop the software so that it can be used by other institutions.
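To make the simple end of that spectrum concrete, here is a minimal Python sketch of the kind of tabulation a spreadsheet or statistics file supports.  The file layout, criterion names, and 1–4 rating scale are invented for illustration; they are not GSU COL’s actual format.

```python
import csv
import io
from collections import defaultdict
from statistics import mean

# Hypothetical rubric export: one row per student per criterion,
# with performance levels scored 1 (novice) through 4 (exemplary).
sample_csv = """course,student_id,criterion,score
LAW101,S001,Legal Analysis,3
LAW101,S002,Legal Analysis,2
LAW101,S001,Legal Knowledge,4
LAW101,S002,Legal Knowledge,3
"""

# Group scores by rubric criterion across the course.
scores = defaultdict(list)
for row in csv.DictReader(io.StringIO(sample_csv)):
    scores[row["criterion"]].append(int(row["score"]))

# Average performance level per criterion.
for criterion, values in sorted(scores.items()):
    print(f"{criterion}: mean {mean(values):.2f} (n={len(values)})")
```

Per-criterion summaries like these are the kind of raw material that can then feed institution-level reports.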

  5. Use the data to analyze student learning and make changes if necessary

Finally, we are using the data to prepare reports about institution-level student outcome achievement.  To increase the validity of our findings, our reports contain information collected from multiple sources. For example, for each institutional outcome, we have data from the rubrics faculty complete and from externship site supervisor evaluations. Additionally, LSSSE survey data contains information relevant to many of our outcomes.  The results from all of that data are included in the faculty learning outcomes assessment report.

This fall, our faculty will discuss our findings on the first two outcomes we measured – legal knowledge and legal reasoning and analysis.  While I noted at the outset that the data collection process does not require faculty to change how we teach or assess, our discussions of the data we have gathered may lead to collective decisions that prompt some of us to adjust our teaching and assessment in an effort to improve student learning.  That is the entire point of the learning outcomes measurement process. Before we can begin that work, however, we had to figure out how to get information that allows us to have informed discussions.  The steps summarized above, and described in more detail in the article, are one way to do that.

Other resources

The data collection method above can be used to measure both institutional and course-level learning outcomes.  However, multiple ways to collect data exist.  Other good resources with concrete data collection methodologies include Andrea Funk’s excellent book, The Art of Assessment, and Lori Shaw and Victoria VanZandt’s seminal work, Student Learning Outcomes and Law School Assessment.

 

Drafting Exams With Test-Taking Speed in Mind

It’s time to write final exams again. It’s also time to struggle with what role test-taking speed should play in our assessments.

William Henderson’s Study

As Professor William Henderson’s ground-breaking study demonstrates, test-taking speed – how fast students can read and answer test questions – is often an independent variable when students take timed, in-class law school exams.

As Professor Henderson cautions, given the high-stakes nature of law school exams, we need to be cognizant of the test-taking speed variable and consciously decide whether it is one we think is important in terms of the substance or skills being assessed. As he notes, test-taking speed may have a limited relationship to how lawyers use doctrine and legal analysis, and it may also have a discriminatory impact.

Potential Solutions

Some faculty members address the test-taking speed issue by giving take-home exams or papers. Others do not want to give take-home exams for a host of reasons, including the belief that the material does not lend itself to a take-home exam or concerns about policing academic honesty.  For those of us giving timed, in-class exams, are there ways to decrease the impact of test-taking speed? Below, I share a few ideas. I also invite those of you who have grappled with this issue to share what you are doing.

A. Pre-release exam instructions

One way to help students prepare for time constraints is to release exam instructions ahead of time. In many classes, the exam instructions identify how many questions the exam contains and provide suggested time allocations. Sharing this information in advance helps students plan their time before they take the test.

B. Pre-release a couple of questions

In addition to pre-releasing instructions, I release a couple of short answer questions [worth 10-15% of the total raw score points] 24 hours before the exam. This allows students to prepare answers to those questions ahead of time and minimizes the test-taking speed issue for those questions.

I allow collaboration on the pre-released questions so I do not need to police students. I also warn students about the dangers of collaboration, including letting others lead them down the wrong path.

C. Time yourself

I also take the test myself, timing how long it takes me to answer each question or set of questions. I then double or triple the allotted time based upon how long it took me or, in some cases, how long it took a colleague who gave my exam a test run. For example, if it takes a professor about two minutes per question on a set of multiple choice questions, I allot four to five minutes per question for my students.
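That rule of thumb amounts to simple scaling.  The sketch below is purely illustrative – the function name and the 2x–2.5x multipliers are just my own framing of the approach described above:

```python
def student_allotment(expert_minutes: float, cushion: float) -> float:
    """Scale the time an expert (the professor, or a colleague doing a
    test run) needed per question by a cushion multiplier to set the
    students' per-question allotment."""
    return expert_minutes * cushion

# A professor needs about 2 minutes per multiple choice question;
# a 2x-2.5x cushion yields 4 to 5 minutes per question for students.
for cushion in (2.0, 2.5):
    print(student_allotment(2, cushion))  # prints 4.0, then 5.0
```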

Some may argue that allowing five minutes per multiple choice question does not prepare students for the bar exam – an exam in which students have under two minutes to answer often fairly complex multiple choice questions. That point is correct, and it raises the interesting question of whether one’s course grade should capture bar exam taking skills.  It also raises the bigger question of whether test-taking speed is a variable that we, and bar examiners, should be assessing.

Reliability and Validity concerns

Allotting significant time per question necessarily means a test with fewer questions. For some, this raises reliability and validity concerns [fewer questions may reduce the reliability, and thus the validity, of an exam]. On the other hand, test-taking speed operating as an independent variable also raises questions about test validity.
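The bracketed point about test length and reliability has a classical quantitative expression in the Spearman-Brown prophecy formula from test theory.  The Python sketch below applies it with illustrative numbers of my own choosing, not figures from any actual exam:

```python
def spearman_brown(reliability: float, length_ratio: float) -> float:
    """Predicted reliability after multiplying test length by length_ratio,
    per the Spearman-Brown prophecy formula from classical test theory."""
    return (length_ratio * reliability) / (1 + (length_ratio - 1) * reliability)

# Illustration: cutting a 60-question exam with reliability 0.80
# down to 30 questions (length ratio 0.5) lowers predicted reliability.
print(round(spearman_brown(0.80, 0.5), 3))  # prints 0.667
```

Longer tests push predicted reliability up and shorter tests pull it down, which is why trading many short questions for a few long ones exchanges one measurement concern for another.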

Making Sure We Understand the Issues

These are complicated issues and different faculty may have different answers to the test-taking speed questions. The key is not necessarily how we answer these questions – it is that we are asking ourselves the questions as we draft our exams.

Podcasts on Law Student Well-Being

Dean David Jaffe shares exciting news about a new podcast series on law student mental health.  Below is his announcement:

“I write to share with you a new podcast series, The Path to Law Student Well-Being, sponsored by the Law School Assistance Committee to the American Bar Association Commission on Lawyer Assistance Programs (CoLAP).

The inaugural two-part episode is available here, just below the live Twitter Town Hall taking place this Wednesday.

This episode features two short conversations with Dean & Professor of Law Michael Hunter Schwartz of the University of the Pacific’s McGeorge School of Law and Professor Larry Krieger of the Florida State University College of Law and is moderated by Professor Susan Wawrose of the University of Dayton School of Law.

  • In the first part of this episode, Dean Schwartz and Professor Krieger suggest ways individual faculty members can notice, engage with, and support students they suspect are in distress.
  • The second part identifies steps faculty can take to promote student well-being through their teaching in the classroom and includes simple actions for law school administrators.

The podcast series is a response to the call for action in the 2017 National Task Force Report The Path to Lawyer Well-Being: Practical Recommendations for Positive Change, which was sent to all law schools last fall and sets out specific action items for the legal community, including some specific steps for judges, regulators, employers, bar associations, lawyer assistance programs, and law schools.

We hope you will share the news of these new podcasts widely with your faculty, staff and colleagues to help bring awareness to the ABA’s Law School Mental Health Day tomorrow, March 28, and to encourage discussion and action within the law school community around the critically important topic of law student well-being.”

 

 

Disparate Institutional Service Workloads: Recognizing and Addressing the Problem

Faculty Governance and Academic Freedom Require Committee Work

“Because faculty self-governance is integral to the effective functioning of law schools, and because that self-governance requires productive committees, the “reward” for efficient and strong faculty service work performance is often more service work. The opposite is also true. Faculty members who demonstrate lack of competence or responsibility when engaging in committee work are not called upon to serve.”

This quote comes from Addressing Social Loafing on Faculty Committees, an article Professor Mary Lynch and I recently published in the Journal of Legal Education.  The article identifies an institutional equity problem with significant career consequences.

Some Colleagues Do the Work While Others Reap the Rewards

As we note, “Socially responsible faculty members who fully engage in committee work help sustain a robust system of faculty governance. However, they do so at the expense of time available for their own scholarly pursuits. By ensuring the work gets done, they also provide some colleagues the freedom to disengage and focus on individual career-enhancing scholarly endeavors with no penalty and potentially significant individual rewards. This can create significant institutional inequities.”

The time spent on legal scholarship frequently results in significant rewards.  Merit raises often depend largely upon scholarly productivity.  Prolific scholars get speaking invitations and enhance their national reputations, leading to additional job prospects.  Productive scholars also often receive course releases, and some are light-loaded on committee work.  These workload releases give faculty members the opportunity to enhance their scholarly productivity and continue the cycle of rewards based upon that scholarship.

These benefits seldom inure to those who engage in the institution-sustaining work necessary to support faculty governance and the academic freedoms it protects.  In fact, the opposite is true.  Stellar committee work performance often results in additional committee work assignments, thus cutting ever more deeply into time available for scholarship.

Women Faculty Shoulder A Disproportionate Share of “Institutional Housework”

Institutional inequities in the allocation of committee work and other internal service activities also raise potential gender equity issues.  Although many men do more than their fair share of committee work, multiple studies have found that the lion’s share of what some have dubbed “institutional housework” falls on women faculty members’ shoulders.

For example, one study found that although male and female associate professors at undergraduate institutions both averaged a sixty-four-hour work week, institutional service work and other institutional commitments left the women professors with 220 fewer hours than their male counterparts to devote to scholarly endeavors during the academic year.  Another study found that women faculty at undergraduate institutions at all levels of their careers, and across disciplines, on average spent more time on internal service work than their male counterparts.

I don’t know of any formal studies focused on legal education.  However, over ten years ago, Nancy Levit gathered anecdotal evidence suggesting that law schools are not strangers to “institutional housework” gender equity issues.  And those equity issues may be on the rise.  At many schools, committee workloads have increased due to rapidly changing legal education models.  Some schools have also seen a reduction in faculty.  Thus, many schools now look to clinical and legal writing faculty to take on significant committee work responsibilities.  This inclusiveness in faculty governance is important.  However, clinical and legal writing faculty members are disproportionately women.  Asking these faculty members to shoulder a growing share of the institutional housework means they have less time to spend on the external work and scholarship upon which their promotions, and their academic reputations, often hinge.

Proposed Solutions

Disparate workloads often are not about who is assigned to particular committees; rather, they arise in the context of who actually does the committee’s work.  In Addressing Social Loafing on Faculty Committees, we explore some of the potential reasons for internal committee workload disparities and propose solutions.  For example, we suggest using “committee work contribution evaluations to highlight communal responsibilities, set clear expectations, communicate that certain behaviors are valued and important, and motivate change by setting normative standards for committee work participation.”  The article provides a sample rubric faculties could use to set normative expectations about committee work contributions for all faculty members.

We also suggest recognizing committee workhorses with more than a round of applause at a faculty meeting.  We encourage faculties and deans to consider the contributions to faculty governance made by committee workhorses and to reward those who perform outstanding service with release time from committee work to ensure they have time to engage in scholarship.

These are just two potential solutions. Many more likely exist.  The question is not whether there is a way to address the workload inequity problem.  Rather, it is whether deans and faculties are willing to openly admit the problem exists and to take the steps necessary to remedy it.

Scholarship on Bar Exam Alternatives Needed

This week, our students sat for bar exams across the country.  Many of us may once again have been thinking about the myriad critiques of the current bar exams – critiques that have been published time and again.  Over the years, we hoped those critiques would motivate the National Conference of Bar Examiners [NCBE] to explore and pilot test alternatives.  Thus far, that has not happened.

While there likely are many reasons the NCBE has failed to develop and pilot test alternatives, one oft-heard reason for not changing bar exams is that while the current exams are not perfect, they are the best that can be done.

As Professors Eileen Kaufman, Carol Chomsky and I recently wrote elsewhere, this is simply not true.  We note that for the last ten years, the Law Society of Upper Canada has administered an open-book multiple choice test that covers a much wider range of competencies than is currently tested in the U.S. and that asks questions in the context of how lawyers use information when representing clients.  That is just one example of a viable alternative.

As academics, while most of us don’t have the same psychometric background as NCBE employees, we do have the ability to engage in scholarly research and publish what we find.  What are other countries doing?  What licensing methodologies from other professions could we adapt? What are we doing in our own courses that could be adapted to a law licensing process?

There is momentum for change. States such as California have begun to examine the bar exam’s content validity as well as bar exam passing scores.  Professor Deborah Merritt and Dean Nick Allard have both made persuasive arguments for why bar leaders should convene a national task force to examine potential bar exam reforms.

The current momentum for change recognizes that existing bar exams have fundamental flaws that should be addressed.  As academics, we can build upon this momentum by researching and writing about alternatives. We can encourage our law review student editors to consider symposium issues focused on bar exam alternatives. We can create pressure for meaningful change by showing that change is possible. Let’s do it.