Menstrual Products and the Bar: Advocacy Seeks to Create Equal Bar Exam Testing Conditions for Menstruators

By: Elizabeth B. Cooper, Fordham Law School; Margaret E. Johnson, U. Baltimore Law (visiting at American); and Marcy L. Karin, UDC David A. Clarke School of Law

We can all recall the stress, fear, and worry that accompany taking the bar exam.  About half of us also were anxious we would have to manage our period in the middle of this awful two-to-three-day ordeal.  Bar examiners across the country have made this prospect far more daunting than it needs to be by not treating menstruation as the natural biological process that it is.

Without proof of any test-taker having ever cheated using a tampon or pad, some states have chosen to stigmatize and potentially penalize people who have their periods with draconian policies prohibiting bar examinees from bringing their own menstrual products with them.  Other states have failed to adopt or communicate clear policies on the subject, increasing test-takers’ anxiety: one should not have to waste time researching the Bar Examiners’ hard-to-find policies or calling their offices for answers—which may, or may not, yield consistent information.

The harm here is four-fold: 1. It is wrong to make test-taking conditions more challenging for people based on the fact that they menstruate; 2. It is wrong to limit test-takers to random products selected by Bar Examiners that could put test-takers’ health and menstruation management at risk; 3. It is wrong to deny test-takers access to menstrual products simply because they do not use the women’s restroom; and 4. It is wrong to convey the harmful message that all people who menstruate are untrustworthy and do not belong in the legal profession.

Some states, including Texas and Pennsylvania, prohibited exam-takers from bringing in their own menstrual products, offering to provide a limited and unpredictable set of products in the women’s bathroom.  (After much advocacy, Texas changed its rule for the September exam, though it is unclear if this is a permanent change.)  This does not solve the problems these states created in the first place by banning test-takers from bringing in their own products.  People who menstruate need their own products because menstrual products are not “one size fits all”: menstruators require different sizes and levels of absorbency in their products to best fit their body and menstrual flow.

Use of the wrong size product can lead to everything from pain and discomfort to toxic shock syndrome (if too large) and time-consuming, uncomfortable, and disruptive leaks (if too small). Further, some individuals require hypoallergenic products to protect against allergic reactions.  If not provided, applicants may experience vaginal itching or other problems caused by using allergen-containing tampons or pads inside or adjacent to their bodies.  All of these consequences are awful enough on their own; here, they create an unconscionable risk of derailing exam performance.

In addition, by limiting test-takers from bringing in their own products and then providing products only in the women’s restrooms, Bar Examiners relegate transgender men and nonbinary persons who may menstruate, and who may use the men’s restrooms or all-gender restrooms, to having no access to menstrual products during the bar exam.

Other states allow test-takers to bring their own products, but require them to be packaged in a clear plastic bag—with some states mandating that the product be unwrapped.  This last requirement makes no sense: the wrapper both keeps the product hygienic before being inserted into or placed adjacent to one’s body and provides an efficient way to safely dispose of used products, reducing janitorial staff’s exposure to bodily fluids.  Further, removing the wrapping exposes the adhesive on the bottom of some pads, rendering them practically useless when the menstruator tries to unstick them from the clear plastic bag.

As much as we want to destigmatize menstruation and eradicate the embarrassment and taboo of being seen with a tampon or pad, it remains an invasion of privacy to require test-takers to carry their products in a clear plastic bag, revealing to a proctor (and possibly classmates, colleagues, or future opposing counsel) that one has or expects to get their period during the exam.  (One North Carolina bar exam test-taker reported that a proctor asked her if she “really needed those” while inspecting her plastic bag of menstrual products.)  Finally, this intrusion is even more painful for, and potentially outs, transgender men and non-binary law graduates who may not be public about their biological sex.  It may even set them up for bigoted harassment—during the biggest exam of their lives.

Other states allow test-takers to bring their own products and do not require them to be carried in a clear bag—but they must check them with a proctor or retrieve them outside the exam room before heading to the restroom.  This “solution” means that a menstruating person will have to take vital time away from the exam (or a break between sections of the exam) to obtain their menstrual products before using the restroom.  This “time tax” is as unacceptable as the other approaches described above.

At least some states treat people who menstruate without such bizarre suspicion, allowing them to bring in and keep their own products with them during the exam, and use them as needed during the test—without having to ask a stranger for their own personal possessions.  To date, there have been no known accusations of test-takers trying to do the impossible: write helpful information on a pad or tampon to give them an edge on the exam or smuggle in written answers inside the product’s wrapping.

The lack of uniform, equity-based rules permitting access to one’s own menstrual products is unacceptable and must be changed. Thankfully, in the age of social media, law graduates have taken the lead on this advocacy, publicly sharing the hurdles they face and asking state Bar Examiners to eliminate these outrageous rules, largely under the #bloodybarpocalypse hashtag.

Once we saw their posts, the three of us, working with fantastic former clinic students of Fordham and UDC, began advocating that all state Bar Examiners adopt better menstrual products policies.  We drafted a letter to the National Conference of Bar Examiners (NCBE)—co-signed by over 2800 law professors, law students, law school graduates, and lawyers in under 24 hours.  We also sent letters to Bar Examiners in each state that administered an in-person July bar exam and did not have a clear, acceptable policy in place.  All of these efforts led to some quick changes. 

The NCBE contacted state Bar Examiners and informed them that menstrual products were not included in the NCBE’s “prohibited paper” category and that test-takers should be able to bring in their own products.  The press started asking questions of the state Bar Examiners.  And state Bar Examiners began changing or clarifying their policies, with some confirming to examinees that they could bring personal menstrual products to the exam.  For instance, West Virginia Bar Examiners insisted that they permitted products in the exam room, even though their website said otherwise. Texas state Bar Examiners changed their policy from not permitting products to permitting them at its September exam.  (The state has issued contradictory statements, however, about whether this change is permanent.)

This positive change is not, however, uniform: even those states that have adopted equitable policies must be monitored to ensure they are adopting best practices.  In our efforts to get accurate and honest information from state Bar Examiners across the country, it has been deeply disconcerting to learn how many jurisdictions are silent on whether examinees may bring in their own menstrual products; have informal policies that contradict written statements about what items are allowed in the exam (e.g., not listing menstrual products in the list of items test-takers can bring in, but informally allowing them); or have stubbornly held onto their recalcitrant policies.  

Equally unacceptable, many Bar Examiners will not share the documentation that they say embodies their policies (e.g., generic letters to test-takers informing them what they can and cannot bring into the exam; postings on their websites behind a security wall).  Without this proof, there is no accountability, and the true practices of these states remain unknown.

As we reach out to jurisdictions administering in-person exams in the coming months, our demands are clear: Bar Examiners must issue explicit policies permitting examinees to bring their own menstrual products into bar exams, in an opaque container or on their person, and must publish these policies on their websites.  Other bar-related policies that can have disproportionate effects also must be changed.  For instance, examinees needing to pump their breastmilk must be given ready accommodations, and bathroom access must not be limited, as it affects both pumpers and menstruators.

To learn more about all of the advocacy efforts in this area, check out Menstrual Equity and the Bar Exam: Round Up of Op-Eds and Other Media Coverage on the Feminist Law Professors blog and follow the hashtag #MPandTheBar.  If you want to get involved in this work, let us know. And no doubt other activists working on the pumping and bathroom access issues would welcome assistance too. There is, unfortunately, plenty of work to be done.

Let’s Take This Period of Unprecedented Change to Consider How Grading Practices Can Affect Issues of Diversity and Inclusion in Our Law Schools

Jennifer S. Bard, Visiting Professor, University of Florida, Levin College of Law

For the last half of spring semester 2020, law schools all over the country were forced to change their method of instruction, delivery of final exams, and (in many cases) grading practices because of the demands for physical isolation following the outbreak of Covid-19.  Now that the semester is over, there is a further round of disruption as many states have delayed or even cancelled their bar exams, some have granted graduates diploma privileges, while others bravely go ahead in the face of a possibility that they will have to cancel at the last minute because of ever-rising rates of infection. 

Like the opportunities that may arise when a river is drained and a ship revealed, there may never again be such an opportunity for us to consider what role we play in the glacially slow diversification of the legal profession and how we can make our law schools more equitable, inclusive, challenging, and effective for all of our students—not just those for whom it has been particularly well suited.

With many things to choose from, my starting point for looking at things we rarely question is the marrow-deep belief that we owe it to our students to sort them for the benefit of large law firms—even when our employment profile shows that very few of our students will ever work at such a place.  Since the threshold for this opportunity is a top 5 or perhaps 10 percent class rank, it may seem odd, on reflection, that we have built a curriculum designed to compare students, one that may have many undesirable consequences, including undermining self-esteem, discouraging learning for learning’s sake, and contributing to the lack of diversity in the legal profession.

Over the years, other justifications have been added, such as the need to motivate students or assess their progress, but never have we had such a good opportunity to see what law school is like without grades or, more to the point, comparative curves.

Here are some Practices We Might Question

The Primacy of First Semester Grades

One result of the decision to go pass/fail (or some variation on those words) was to “freeze” first-year, first-semester class ranks because it was impossible to produce comparative curves.

The resulting phenomenon gives us a chance to ask ourselves some tough questions:

  1. Do First Semester Grades Reflect What Students Bring to Law School Rather Than What We Bring to Them? OR Do Students Who Come in Knowing the Rules Get Better First Semester Grades?

Many students, very often first-generation students, but also some facing discrimination based on race, gender identity, or gender expression, frequently tell us (and the many researchers who study first-generation college students) some version of “everyone knew the game but me, and by the time I figured it out, it was too late.” And while students living with disabilities might intersect with any of these groups, they also are often using new adaptive equipment and certainly facing new challenges that they may have been able to mitigate in college.

Certainly many of our students do know the game from the start.  The recent AALS survey “Before the JD” found that a disproportionate number of students who ended up going to law school had parents who were either lawyers or professionals. While students have themselves created organizations to support each other, usually with the enthusiastic support of the law school, it may not be enough.

Our challenge going forward is that history is told by the victors.  We can see the students who were not comfortable the first semester but then continued to graduate “at the top of their class” (a vague term that usually means somewhere in the top 20%), but we don’t hear from the ones who didn’t draw attention through academic distress, but also didn’t thrive.

It would be helpful to know more–and many schools do know more about their own students.  But so little of this information is published.

Much is being done in supplemental programs (to name them is to leave many out), such as pre-first-semester programs, orientation programs, and excellent pre-law institutes like the Tennessee Institute for Pre-Law, and in wonderful conferences organized by the National Black Law Students Association, Lavender Law, the National Association of Law Students with Disabilities, and so many others.

But how much more effective would it be to have a curriculum that was truly equitable and inclusive – all the way through?

2. Did Pass/Fail Grading Help Learning, Hinder Learning, or None of the Above?

Across-the-board pass/fail grading that makes no effort to compare students to each other is so unusual as to make any observations worth considering. The expectation was a distressing list of bad results (students putting in less effort during class, performing worse on exams), but did that really happen?

3. Ditto Open Book Exams

As above, it would be interesting to test, in the fall, the content knowledge of students who took open-book exams.  Not so much to compare them with past classes, but to see how much they learned.

4. What Will Be the Long Term Effect of the Delayed or Cancelled Bar Exams–and How Might that Change Our Curriculums?

The opportunity presented by the necessary changes to the bar exam is already in very good hands (thank you, AccessLex), but it’s still worth considering what the future will look like in states that choose provisional or full licensure.  Even decisions to delay the bar exam could raise issues of an ongoing, career-long licensing process, much as many doctors (but not all) must take requalifying exams every ten years to retain their “Board Certificate.” What would that mean for law schools?

To Be Continued: Part II: What Can We Learn from the Delay of Fall On-Campus Interviewing?   

Disparate Impact Magnified: Holding a Bar Exam During the COVID-19 Pandemic Year of 2020

Yesterday the Harvard Law Review blog posted an excellent piece by a powerhouse group of legal educators who describe the prospect of a “licensing abyss” just when non-privileged folks and small businesses will need extra legal assistance to navigate the health, employment, housing and government benefits legal landscape.  On the same day, the ABA also urged states that cancel or delay the bar exam to license law grads on an emergency basis “to help address the increase in legal needs for individuals and businesses caused by this pandemic.”

The Harvard blog authors note, in addition, the reluctance of bar examiners and courts to find alternatives to the “closed-book, two-day exam anchored in 200 multiple-choice questions” despite the option of so many good alternatives that may well better predict competence to practice law. The authors ask,

Why do our courts and bar examiners place so much faith in this high-stakes exam to predict who is competent to practice law?

This question has puzzled readers and contributors of this blog, particularly in light of the discriminatory nature of “speeded” exams and the economic call for practice-ready lawyers. It is also puzzling when the profession itself is so deficient in diversity and standardized tests are used in ways that preference the privileged.

For 2020, the issue of disparate impact with respect to timed, closed-book exams anchored in multiple-choice questions is further exacerbated by law students’ quarantine and sheltering conditions while studying for the bar exam (see the excellent piece in the NYT on how students returning home to attend classes removes the veneer that all are equal). Even more disturbing and heartbreaking is the information surfacing this week about the horrific disparate impact of COVID-19 deaths on Americans of color.  Pre-existing disparities in trauma, housing, employment, healthcare, opportunity, discrimination and historical DNA exacerbate the distress and fatalities for communities of color and for those whose families and friends are populated by people of color.  Some of us – particularly our students of color – will be affected in disproportionate ways and in ways no one can predict or control over the course of the coming months.

As the authors of the Harvard Law Blog wrote, “Crises challenge assumptions and demand action. For this year, emergency licensing based on diplomas and periods of supervised practice would offer proof of competence.”  To do otherwise would demonstrate an inability of our profession to adapt and experiment, and a shocking refusal to recognize and correct disparate impacts.

NYS Law Students Urge Highest Court to Expeditiously Announce Alternatives to September Bar Exam

Throughout the country all of us are being asked to change, adapt and respond in unprecedented ways as we experience global pandemic, quarantine, loss, fear, empathy and grief.  New York’s situation seems at this moment most dramatic as the deaths due to the virus surpass those from September 11th.

Two days ago, on April 1st, law students from the 15 New York law schools eloquently and compellingly argued for the highest court to recognize this unprecedented moment and act accordingly in their Student Letter to Chief Judge DiFiore. In addition, the 15 deans of New York law schools co-wrote and submitted a similarly persuasive Letter from New York Law Deans to Chief Judge DiFiore.

Yesterday, April 2nd, the National Law Journal published Judith Wegner’s An Immodest Proposal. Professor Wegner, the innovative co-author of the pathbreaking Carnegie report Educating Lawyers, calls for state bars and courts to:

  1. Recognize the factors that are impeding meaningful responses;
  2. Mitigate immediate problems through supervised practice rules;
  3. Consider adopting an emergency provisional licensing system; and
  4. Recognize that other options also deserve attention.

It is incumbent upon the New York Court of Appeals to act swiftly and with innovative measures to effectively respond to the informed voices of New York’s law students and law deans.

Is Mandatory P/F An Opportunity to More Accurately Assess Competency to Practice Law and For Bar Admission?

As our knowledge of COVID-19 and its impact becomes more extensive each day, each workplace, profession and community is facing some common and some unique questions. Those working on the front lines in hospitals – such as several of my relatives in NYC and NJ – are experiencing the kind of trauma, shortages, emotional overload and duress that is usually experienced in wartime. It can only be weakly imagined by the rest of us.  For those of us not experiencing people suffering and dying in front of us on a daily basis, some less horrific choices are before us:  How do we modify “business as usual”?  How do we evolve and adapt with each day’s new tsunami of information and data?  How do we support our best selves and our core values in this historically momentous time on our shared planet?

Before turning to the topic of grading and assessment, I want to pause to give a shout-out to my home institution. Our multi-talented leader Dean Alicia Ouellette has been holding community town halls every day since Friday, March 20th. (BTW, Dean Ouellette just shared on Facebook that she had been suffering from “presumptive COVID-19” fever and symptoms but thankfully is now symptom-free.) During our daily town halls, my faculty colleagues and I have expressed our wonder and gratitude for the character, resilience and grit of our law students, who are balancing so much right now and facing so many financial, tech-related, health and extended-family burdens. Our students’ engaged and forgiving response to “tech-curious but not necessarily tech-savvy” teachers and their community-minded empathy for those hardest hit keep the faculty motivated and inspired.

One of the COVID-19 decisions for legal educators involves whether and how we assess and sort — which in reductive vernacular means “grade and rank.”  Maintaining appropriate expectations, options, rigor and excellence in law teaching may assume primacy for those who have been long focused on ensuring that law students receive real value for the time, talent and treasure they expend on law school.  For others focused on fairness in law placement, transparent employer signals about how they will view Spring 2020 legal education may be most influential.  For those concerned about our profession’s reputation for lack of wellness and lack of diversity, those concerns are elevated at this moment when those least advantaged are most hard pressed.  For those struggling with equity, there are so many permutations and consequences of COVID-19 – whichever choice a school makes – that voting faculty could become as immobilized as Chidi Anagonye on THE GOOD PLACE. (BTW, a good idea for escape television for those who love philosophy or Kristen Bell.)

On the other hand, might this be a moment to look for the opportunities for reform and improvement that only come when the status quo is disturbed and rocked to its foundations, as is happening now?  Here is what I am thinking:

Might Mandatory P/F force educators and employers to admit that traditional law school grading and ranking is a misleading and reductive proxy for measuring potential success as a lawyer?

Could it force employers to use other ways to learn about the WHOLE STUDENT with all her strengths, gaps, and individual aptitudes including the situation she faced during law school?

Might it accelerate a move to a more qualitative than quantitative assessment of each law student? Or, at least, might it prioritize learning which enables a school to assemble a portfolio of student recommendations (demonstrations of knowledge, skills, aptitudes, and professionalism)?

Foundational resources include of course Educating Lawyers, Best Practices in Legal Education, and Building on Best Practices: Transforming Legal Education in a Changing World, which also provide helpful wisdom points. In addition, looking back through the dozen or so years of this blog’s existence, there are lessons from which we can pull core knowledge and core values to assist in our continued educational deliberations at this turbulent time. 

CORE KNOWLEDGE AND REFLECTIONS

Valuing Legal Education over Sorting – For example, focus on the difference between assessment and grading.  Educating Tomorrow’s Lawyers conferences have brought employers, law schools, and legal education stakeholders together to tackle the disconnect between our current sorting systems (primarily used to help elite employers looking for a simple and reductive initial screening system) and the needs of society and most employers for competent new attorneys and the needs of students and the profession for fairness.

Focus instead on formative and evaluative assessment of law students and graduates

Focus on growth mindset, on reflection and learning from mistakes or experience

Recognize the limits and problems with GPAs or LSAT scores to create a more competent profession with more able and diverse learners.

Acknowledge that the media and the academy are still stuck in a mindset that focuses on sorting methods rather than on better preparation and assessment of law students to serve clients and society.

Class rank does not predict who will become a competent, healthy and ethical lawyer

Effective Education includes:

CORE LEARNING VALUES

Growth Mindset 

Inclusion and Diversity

Student-centered Learning and the Introduction to the original Best Practices – “One of our basic tenets is that law schools should become more student-centered”

Wellness  

Collaboration and Innovation

Integrity 

Character 

Justice

Situational Excellence

There is a common theme here: P/F with alternative assessment information and measures should be seen not as temporary emergency expedients to “sort and rank”, but rather as long overdue components of a better educational program and more nuanced assessment paradigm.

I would love to hear your thoughts in the comments below.  I wish all our readers and citizens of our little blue planet moments of peace, love, safety, and compassion. May someone be kind to you today and let’s pay it forward.


After All These Years: Another Bar Exam Over, Another Entering Class, but Still a Disconnect between the Licensing Exam and What We Need Lawyers to Be and Do

I was never a Journey fan, but I truly am astonished that, after all these years of preparing lawyers for practice, and after two years of an unprecedented undermining of the rule of law in our nation, law schools still live with a disconnect between the profession’s licensing exam and what business, government and society need lawyers to be and do, which includes protecting the rule of law.

The National Law Journal recently discussed two major new studies that will analyze whether the current exam is the best measure of new-lawyer competence.  The National Conference of Bar Examiners (NCBE) is in the midst of a three-year study to “ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in the 21st century” (hmm, “continues”? that’s a bit biased) and has already held 30 listening sessions.

The second study, “Building a Better Bar: Capturing Minimum Competence,” is an initiative of the Institute for the Advancement of the American Legal System (IAALS) in partnership with Ohio State law professor Deborah Merritt, and aspires to develop a “fair, evidence-based definition of minimum competence” to improve the current licensing process.  Funded by AccessLex, the researchers:

will be holding 60 focus groups in 12 locations around the country. While these focus group participants will primarily be new lawyers, we will also hold a number of specialized groups with supervisors. Additional specialized groups will include only women and only people of color, as well as groups in rural areas; traditional job analyses can mask the views of these lawyers, yet their perspectives are essential to create a more fully representative view of minimum competence and how to test for it effectively. Through these focus groups, we will be able to capture key information from a diversity of perspectives and provide concrete data on the definition of minimum competence that the profession can use to improve the bar exam and how lawyers are licensed.


Readers may remember that IAALS has provided helpful research in the past through its Foundations for Practice research, which identified the competencies that over 24,000 legal employers value in new hires (most of which go untested by the current licensing process), as well as its evaluation of the graduates of the Daniel Webster Scholar Honors Program, an alternative to the bar exam, in “Ahead of the Curve: Turning Law Students into Lawyers.”

I suppose I should be delighted that more studies are being launched. They are addressing the exact issues so many of us have raised for decades. However, my reaction is uncharacteristically pessimistic.  (Readers here who have tolerated my enthusiastic use of exclamation points and emphasis will agree it is uncharacteristic.)  Perhaps it is the August humidity. Perhaps it is the sorrow surrounding our nation after a week of grief from senseless gun violence. But more likely, it is the fact that I am feeling frustrated that we have already studied this to death! For example, working with state bar associations, the Foundations for Practice project already studied new-lawyer competencies, with 24,000 lawyers from all 50 states participating, and found

… the foundations that entry-level lawyers need to launch successful careers in the legal profession.

In a first-of-its-kind survey, we asked, “What makes a new lawyer successful?” More than 24,000 lawyers from all 50 states answered.

What we learned is that new lawyers need more than IQ and EQ to be successful. They also need CQ: Character Quotient. In fact, 76% of characteristics (things like integrity, work ethic, common sense, and resilience) were identified by a majority of respondents as necessary right out of law school.

Beyond character, new lawyers are successful when they come to the job with a broad blend of legal skills, professional competencies, and characteristics that comprise what we call the “whole lawyer.”

So why is the NCBE, which clearly has a stake in the outcome, refusing to respond to that three-year-old study and instead promising only to do its own? JEESH! We tweak here and there, we add more pro bono or experiential requirements, but no one truly influential will admit that our insistence on anchoring the gateway to the profession to a timed, written exam instead of clinical excellence is the problem.

Starting as early as 2008, this blog has discussed the problems with the bar exam and its role as an unhelpful, anxiety producing, discriminatory, skewed, and unnecessarily speeded, gate-keeping device.  For a sporadic history of posts between then and now, in fairly chronological order, click on the links below.

Did You Know That “Bar Courses” Don’t Matter? 

New Article: No Excuses Left for Failing to Reform Legal Education

Working with State Bar Associations on Best Practices

Bar Passage and Best Practices for Legal Education

One BAR to rule them all?

The Daniel Webster Scholar Honors Program

NYSBA Task Force on the Future of the Legal Profession Report

New Requirements for Bar Exam Stress Clinical Education

Existential Crisis and Bar Exams: what is really cruelest?

The Bar Exam Inhibits Curricular Reform

NEW YORK STATE BAR ASSOCIATION VIGOROUSLY OPPOSES PROPOSAL TO BRING UBE TO NY THIS JULY

Preparing Students for the Multistate Bar Exam

Musings on the Bar Exam and Legal Education’s Attitude toward it

Bar Exam Musings, Part II: Skillfully Changing the Bar Exam Narrative

Experts in the Legal Field Question the Bar Exam…

What’s going on in California? “TFARR- recommended” 15 credits of competency training

New York Proposes “Experiential Learning Requirements” as Condition of Licensure: CLEA and NYS Bar Committee Respond

Examining the Bar

Keeping an experiential identity in bar passage reform

Whither Clinical Courses and Bar Passage – by Prof. Robert Kuehn

DO LAW SCHOOLS ADEQUATELY PREPARE STUDENTS FOR PRACTICE? SURVEYS SAY . . . NO! – Robert Kuehn, Washington University School of Law

Professor Merritt’s Blog post on attorney discipline and bar exam WORTH A READ!

Studying Better Ways to Test Bar Applicants for Minimum Competence: Another Reason to Care about the California Bar Exam (Besides the Cut Score Debate)

Scholarship on Bar Exam Alternatives Needed

ABA Commission on Future of the Profession & ABA Vote on Bar Passage Proposal

Drafting Exams With Test-Taking Speed in Mind

Concrete Suggestions for Bar Exam Reform

We have to talk about the bar exam

What can Law Schools Learn about Bar Passage from Medical Schools’ Approach to Studying Students Who Struggle with Licensing Exams?

More Resources Re Teaching, Learning, and Bar Passage

A Fresh Look at the Uniform Bar Examination

Letters raise concerns about changes to the bar pass accreditation standard

Time to Remedy the Ills Afflicting ABA Council’s Standard 316 Proposal

Are the Students Failing the Bar Exam Today Canaries in the Coal Mine warning us of a More General Need to Change Legal Education?

Shifting the Focus of Legal Education Back to Just That: Education

How Practice Tests Reduce Anxiety in Bar Preparation and the Exam

Quite a listing, huh? I suspect that the IAALS and Merritt project will provide us with extraordinarily helpful insights into measuring minimum competence. But political clout is also needed. Will this blog simply be adding more posts for years to come on the unfairness and inappropriateness of a slightly modified, unnecessarily stressful, timed bar exam, a continued hazing tradition?  I hope the NCBE and other institutional influencers prove me wrong.

How Practice Tests Reduce Anxiety in Bar Preparation and the Exam

Sara Berman and I recently did a podcast in the ABA's Path to Law Student Well-Being podcast series. See https://www.spreaker.com/show/path-to-law-student-well-being. Anyone associated with helping applicants prepare for the bar exam knows that the challenges they face can affect their well-being.  In the podcast, we share our experience that applicants who take practice tests regularly learn not only content and skills, but also the ability to manage anxiety as they approach and take the exam.

            In bar preparation, students take seemingly endless sets of multiple-choice Multistate Bar Exam questions. In addition, their bar preparation companies provide opportunities to practice essays and Multistate Performance Tests (MPTs).  Applicants need to follow the bar company's suggestions and to get feedback on submitted work.  They should welcome critiques and suggestions, assess their own strengths and weaknesses, and improve by building on strengths and addressing weaknesses.  If allocation of time to different study methods is an issue (and it always is), applicants need to do more practice testing, not less, relative to reading and re-reading outlines, flash cards, and the like.  Cognitive science indicates that people retain information better when they learn a concept by applying it in a problem-based approach. See Dani Brecher Cook & Kevin Michael Klipfel, How Do Our Students Learn? An Outline of a Cognitive Psychological Model for Information Literacy Instruction, 55 Reference & User Services Quarterly 34 (Fall 2015).  In studying legal rules, applicants retain more by doing practice essays or MPTs because they are learning and applying rules in the context of the facts that trigger them.

            This message may not be welcomed by an applicant who, when she writes a practice essay or MPT answer, experiences anxiety during the practice test.  Applicants often do not want to face the reality that they do not know a rule.  They falsely believe that they must have mastered a subject area before doing practice tests.  The podcast encourages applicants to practice essays and MPTs regularly and often, even if they do not feel that they have fully mastered a subject.  Using the open-book method for practicing can help someone get the process going.  The key is to break through the resistance to doing practice tests.

            If applicants get past the reluctance to embrace practice tests, they can experience reduced anxiety as they move forward.  Again, our recommendation presumes that an applicant is receiving feedback from a bar company representative, an academic support advisor, or both.  When applicants respond to feedback in a new practice test answer and see their work product improving, their anxiety decreases.  The anxiety does not go away, but it remains at a reduced level, one at which it can motivate performance rather than interfere with it.  At that point, it is fair to say that the applicant is managing anxiety.

            In the podcast, Ms. Berman implored law students who might be listening to apply these principles in law school.  Practice tests, whether essays, multiple-choice, or other formats, will benefit a student.  The student of course needs to seek feedback, recognize areas in which she can improve, and work toward improvement.  Students whom I have seen take such an approach report (1) less anxiety on graded tests and (2) a belief that they performed more effectively.  Although the days of a class hinging on one grade at the end of the semester seem to be fading, the final exam still forms a major part of a student's assessment in many courses.  ABA Standard 314, of course, encourages formative and summative assessment so that students receive meaningful feedback.  By doing practice tests, such as writing an answer to a potential essay, the student can apply what she has learned from feedback and seek more.

            An excellent article on practice tests concluded that such tests may improve student performance.  See Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Does Practice Make Perfect? An Empirical Examination of the Impact of Practice Essays, 35 Fla. St. L. Rev. 271 (Winter 2008).   The question explored in the article is whether practice essays improve performance.  The inquiry in our podcast is different.  We ask whether practice tests allow students to manage anxiety.  We entitled our podcast “Practice Makes Passing,” to counter the view that applicants must be perfect (or have completely mastered) most subjects.  Applicants need to do their best. However, they will increase their chance of passing by recognizing that practice may well be what gets them to “good enough”—i.e., a passing score.

            The ABA's series on student well-being is an important look at a problem once viewed solely as an attorney well-being problem. Many now accept that law school is an environment that can diminish or enhance student well-being, depending on choices made by the school and by the students.  By learning to manage anxiety through practice tests, law students can choose to improve their well-being. Bar applicants can do the same. By spending their time wisely in bar preparation, including a healthy dose of practice tests, applicants will ultimately experience less anxiety and likely perform more effectively.

Are the Students Failing the Bar Exam Today Canaries in the Coal Mine warning us of a More General Need to Change Legal Education?

Thank you so much to Best Practices for Legal Education for inviting me to blog again and to Elizabeth Murad for her remarkable work in keeping contributors in touch and on track.  So much is written about the very real decline in bar passage that it is easy for schools with high pass rates, or at least rates high in relation to other schools in their state, to ignore the need to change what goes on in the classroom and to dismiss the excellent work being done in effective law teaching as a problem for "lesser schools" in "lower tiers."

We know, as legal educators, members of the bar, and even members of the public, that bar passage rates have been falling.  And we also know that many, if not most, law schools are admitting students today with LSAT scores lower than those they admitted ten years ago. So it's easy to see a correlation between lower scores and falling rates.  After all, the bar exam is a test much like the LSAT; why wouldn't there be a relationship?  But even if students are failing the bar exam for the same reasons they are getting low LSAT scores, we still have the opportunity to intervene in ways that we know raise pass rates.  This blog contains so many resources for those who want to teach more effectively.  Why wouldn't we want this for all our students?

Everyone at a school with a "bar passage problem" is well aware that we cannot continue to do the same things we always have when they are no longer working the way they used to.  But we hear this less at schools satisfied with their bar passage rates.  Perhaps the students who are failing are really canaries in the coal mine, warning all of legal education that today's law students, regardless of LSAT score, find it more difficult to translate their legal education into the very peculiar format required for bar passage.  Everyone who has ever studied for the bar exam remembers it as a grueling, unpleasant, and highly intensive process, but until very recently that process started after graduation and, barring personal disaster, almost always resulted in passage.  Even when it didn't, the consequences of failure were lower.  Today, students safely employed in September find themselves fired if October brings news of failure.  We need to consider bar passage an issue both for students who fail and for those who pass; after all, both groups spend the same three years in law school.

Anecdotal evidence (which we could easily replace with actual data by doing some surveys) suggests that bar passage anxiety spreads well beyond those students most at risk.  All students know that the stakes are high, and many believe that their chances of passing are lower than those of students in the past.  Does that affect their choices while in law school?  Could they be doing more to prepare for their future careers if we could provide them more effective instruction?

Medical students and educators are expressing the same kinds of concerns about their curriculum being shaped by a test as we should be expressing about ours.  We can't easily change the bar exam, but we can adopt more direct methods of instruction that support bar passage while creating time for the more complex, less exam-focused thinking that we want to be going on in class.

I hope over the week to share resources that will encourage everyone to consider how studying for a very old-fashioned test is negatively shaping the education of all of today's law students.  And because it always warrants reposting: here is a recently revised article by Louis Schulze on what they have done at FIU to apply the "science of learning" across the curriculum in support of higher bar passage.

A Fresh Look at the Uniform Bar Examination

The bar exam is back in the news. Later this month the ABA House of Delegates will consider a proposal to raise the bar-passage standard. The proposal would require that 75% of an accredited law school’s graduates pass a bar exam within two years of graduation.

In most states, bar exam means the National Conference of Bar Examiners’ Uniform Bar Exam. The UBE has been adopted by 35 jurisdictions and is under consideration in others. It carries the endorsement of many ABA-related entities. They include the Conference of Chief Justices, the Law Student and Law Practice Divisions, the Section of Legal Education and Admissions to the Bar, and the House of Delegates.

As a reminder of the features of the UBE, and an update on the pace of its adoption, I have accepted Mary Lynch’s gracious invitation to reblog my recent post “Will the Uniform Bar Exam Come to Michigan?” from the Western Michigan University-Cooley Law School blog. (Spoiler alert: no time soon.)

More Resources Re Teaching, Learning, and Bar Passage

Thank you to Best Practices for Legal Education Blog for having me as a blogger this week.  I hope the examples I've provided about the methods medical schools use to evaluate their curriculum, test the effect of new programs, and look for factors that affect success on licensing exams have been helpful.  As I mentioned at the end of my last post, the most comprehensive source of research-based information about bar passage programs, as well as a source of funding for such research, is AccessLex.  There is a growing literature of articles from schools that have implemented successful bar passage programs.  Here's an article by Louis Schulze about his work at FIU.

You might also be interested in a series of articles from back in 2009-2010, when those at the front lines of legal education, first-year faculty and legal writing and research faculty, began to see significant differences in performance between the students they were teaching and those of the past.  These articles provide information about how substantial changes to the K-college education system in the U.S. impact law students' transition to law school. This article by Rebecca Flanagan is a good overview.  Prof. Patricia Grande here.  A literature review of law learning strategies by Profs. Jennifer M. Cooper and Regan A.R. Gurung.  One more by Profs. Susan Stuart and Ruth Vance.

Here are the proceedings of a 2014 symposium entitled "Teaching the Academically Underprepared Law Student," and I invite readers to take advantage of the comments section of this blog to share other publications, including the many more recent ones.  My point here is historical, not bibliographical.  And here, as a quick reminder of one of the crucial skills the bar doesn't test (research), is an article by Caroline L. Osborne.

Finally, something I've mentioned elsewhere: the new collaboration between LSAC and Khan Academy providing free, online, very high-quality LSAT preparation may have something to offer law students.  The skills underlying LSAT performance, close reading and legal reasoning, are not immutable; students can get better at them after enrolling in law school and may find some time with these materials a helpful and interesting way to brush up on these skills.


What can Law Schools Learn about Bar Passage from Medical Schools’ Approach to Studying Students Who Struggle with Licensing Exams?

It's not unusual for a provost or a colleague or a relative at Thanksgiving to ask a legal academic why law students have so much trouble passing the bar exam when the pass rates for medical students are usually above 90 percent.  The short answer to that question is that the two processes are completely different, and there is no obvious trick, technique, or intervention that could convert our bar passage rates into their licensure pass rates.  For one thing, it's the wrong question.  "Passing" the medical licensing exams is certainly important, but unlike the "all or nothing" process of passing the bar exam, the score achieved on Step 1 affects medical students' entire career path.  Still, there is a lot to learn from the methods medical schools use to study the very few students who have trouble, as well as from how they evaluate the effect of curriculum changes on licensing exam scores.

Quick recap on professional licensing: future doctors take a series of three exams over the first six years of their undergraduate medical education and the start of their residency (more links in a post I wrote earlier this year here).  The exams are almost entirely national, although the actual process of being licensed is conducted on a state-by-state basis.  Law students take a licensing exam in the state where they intend to practice upon graduation.  For purposes of this post, the closest analogy to the bar exam is the more academic Step 1, which students take during their second year of medical school.  Like our NCBE, the National Board of Medical Examiners, which produces the United States Medical Licensing Examination, works with medical licensing boards and depends on their confidence.  It issues annual reports.

The focus of this post is on the methods that medical schools use to study the small number of their students who have trouble passing the licensing exams, as well as the factors that can affect the scores students achieve.  I've tried to focus on articles outside of paywalls, and would certainly encourage you to conduct your own searches in the various databases to which you have access.  There are several journals devoted directly to studying medical education, although these articles can pop up anywhere.

Medical educators use a wide range of research techniques to learn more about students who struggle with licensure exams.  Like us, medical schools would prefer that students pass the first time, and many articles like this one look for characteristics of students who fail the first time but eventually pass.  Others look for characteristics of students at risk of failure here and here, or even ask what students think of the exam.  Another area of inquiry involves the role stress plays in the scores students achieve.  In partnership with social scientists at our schools or in our communities, we too could be conducting studies to help us learn more about students who face difficulty passing the bar exam.  These studies can be part of graduate student work or may even be funded by groups like AccessLex, which is making money available to study bar passage.

The actual reason the medical school pass rates are so high, though, may not be all that helpful.

It's not just because they are able to limit admission to students who have already demonstrated an ability to score very highly on the MCAT, a test that is much more similar to Step 1 than the LSAT is to the bar exam.  Indeed, medical schools have direct input into both the MCAT and the licensing exams, so when one changes, the other can too. And it's not clear that anything in the curriculum makes a difference at all: the industry offering study aids and licensure prep courses dwarfs the bar prep and study aid market, to the point where students often start studying for the licensing exams before the first day of medical school.

But if it is the curriculum, it's important to remember the vast difference in time scale between medical and legal education.  We have students for three years post-B.A.; medical schools in the U.S. plan their curriculum around eight-plus years of increasingly specialized medical education.  They are therefore comfortable holding off on the direct teaching of practice skills for the first two years while they align their curriculum with the content of the Step 1 exam.

Even Step 1, though, is far more focused on practice than on the knowledge accumulation or deliberately confusing question formulations that characterize the bar exam. Step 2, the second round of licensing exams, taken prior to graduation from medical school, goes past paper and pencil in that it actually tests students' ability to conduct exams and exercise medical judgment.  Another reason for the high pass rate is that most medical schools have stopped developing their own tests and instead use assessment instruments (shelf exams) provided by the same organization that produces the licensing exam.  Sure, there is grumbling and criticism about the content and timing of the licensing exams, but medical schools work hard to make sure that their curriculums are aligned with the content of the exams.  Finally, medical education is extremely self-reflective: educators are constantly aware of the risks that come from confusing correlation and causation.  How do you know that a change in one part of the curriculum is the cause of a change in test scores?  You run Pearson correlations followed by stepwise linear regressions.  Seeing is not believing when it comes to identifying factors that affect performance on licensure exams.  Look here, here, here, and here for studies evaluating curriculum changes.  They take nothing for granted: does attendance make a difference? Does flipping classrooms really work? Does reducing the number of hours spent in the anatomy lab reduce USMLE scores?

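For readers curious what that correlation-then-regression workflow actually looks like, here is a minimal sketch in Python. All of the data is invented for illustration (the factor names, effect sizes, and the 0.3 correlation cutoff are assumptions, not figures from any study): it screens candidate curriculum factors by their Pearson correlation with exam scores, then fits a single ordinary least-squares regression on the factors that survive.

```python
import numpy as np

# Hypothetical cohort data: hours of anatomy lab, hours of flipped-classroom
# instruction, and attendance rate for 50 cohorts. All numbers are invented.
rng = np.random.default_rng(0)
anatomy_hours = rng.uniform(40, 120, 50)
flipped_hours = rng.uniform(0, 60, 50)
attendance = rng.uniform(0.6, 1.0, 50)
# Simulated exam scores: driven mostly by attendance, plus noise.
scores = 180 + 40 * attendance + 0.05 * anatomy_hours + rng.normal(0, 3, 50)

predictors = {
    "anatomy_hours": anatomy_hours,
    "flipped_hours": flipped_hours,
    "attendance": attendance,
}

# Step 1: Pearson correlation of each candidate factor with exam scores.
correlations = {name: float(np.corrcoef(x, scores)[0, 1])
                for name, x in predictors.items()}

# Step 2: crude forward selection -- keep factors whose correlation clears
# an (assumed) threshold, then fit one OLS regression on them plus an
# intercept column.
selected = [name for name, r in correlations.items() if abs(r) > 0.3]
X = np.column_stack([predictors[n] for n in selected] + [np.ones(50)])
coefs, *_ = np.linalg.lstsq(X, scores, rcond=None)

for name, r in correlations.items():
    print(f"{name}: r = {r:+.2f}")
print("selected factors:", selected)
```

A real stepwise procedure would add or drop predictors based on significance tests rather than a fixed cutoff, and the studies linked above use dedicated statistical packages, but the two-stage logic (screen by correlation, then regress) is the same.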
Another standard practice in medical schools is curriculum mapping, an essential first step for any school that wants to understand what it is teaching, let alone make changes.  Like all maps, curriculum maps are DESCRIPTIVE, not PRESCRIPTIVE.  Here is Harvard's curriculum map, but you can find examples on the home page of just about every U.S. medical school. This is an article walking through how to map a curriculum.

So what’s helpful to us isn’t so much what medical schools are doing, but how they are evaluating themselves. 

To recap, neither I nor anyone else who has ever practiced law thinks it would be a good idea to emulate medical schools by fully aligning our curriculum with the bar exam so as to turn the three years of law school into one extended bar prep course.  Among other reasons, the material tested on the bar is quite static and doesn't reflect the realities of today's law practice.  It also wouldn't make much sense for schools whose students take the bar exam in many different jurisdictions.  And the bar exam is simply not equivalent to the three rounds of USMLE exams in testing both the knowledge and the application of knowledge needed to be a successful lawyer.  If it were, we wouldn't hear so many complaints about how students who have passed the bar are nevertheless not "practice ready."

Tomorrow: where can we get the help we need to find out this information, and who is going to pay for it?  Spoiler: AccessLex has a program.

We have to talk about the bar exam

Thank you very much to the team at Best Practices for Legal Education for inviting me to blog this week.  My particular thanks to Elizabeth Murad for administering the blog, Professor Mary Lynch, Kate Stoneman Chair in Law and Democracy & Director, Center for Excellence in Law and President & Dean Alicia Ouellette of Albany Law School for hosting this blog.  It is an honor to join such a distinguished group of scholars and teachers.

We knew it was going to be a bad bar year when, on September 14, 2018, the NCBE announced that the national average MBE score for July 2018 had decreased about 2.2 points from the July 2017 average.  And, indeed, as states have announced pass rates based on their own individual formulas of MBE plus essays plus the MPT (Multistate Performance Test) plus their own individualized questions, the results were bad.  A lot of our students failed the bar exam.  Pass rates were down in New York, California, Georgia, Florida, Texas, and lots of other places.  Yet at the same time, individual schools saw significant success in raising pass rates in the face of grim news all around them.  All of this makes for glib headlines and much handwringing, but in the context of a blog post on "Best Practices for Legal Education" it is more helpful to take a step back and assess the tools we, as legal educators, have available to us in addressing bar passage at our individual schools.  I do so from my Ph.D. studies in higher education as well as from my experience as a dean, associate dean, law professor, and medical school professor.

One of my main themes this week will be to argue for individualized problem solving.  If anyone comes to you with a product to solve all your bar passage problems, I hope after this week you will be able to ask some questions about the data on which they base their claims.  A productive discussion of bar exam passage really rests on two questions: 1. Why aren't the students at your law school passing the bar exam at the rate they "should"? 2. What should you do about it?

I am going to use this week to share with you some of the resources available to law schools, to individual faculty members, and even to law students who want to increase their chances of passing the bar the first time.  Along the way, I hope to address some of the unhelpful myths that have arisen and to endorse a very old idea borrowed from the then-revolutionary 1960s-era child-rearing advice of Dr. Benjamin Spock: these are your students, and you know more than you think you do.  Trust your judgment.  Ask questions.  That doesn't mean you can do everything yourself; it's fine to consult with experts.  But in the end, addressing bar exam passage is a school-wide effort, and everyone has relevant information to add and a valuable role to play.

To get started, it's helpful to have an overview of the players.  As a matter of foundational constitutional law, each state retains the power to license and regulate professionals (more detail here).  As a result, every state and territory has its own process for setting criteria for bar passage.  Almost every state contracts with the National Conference of Bar Examiners, which develops the exam, grades it, and spends a lot of time explaining itself.  If you have any interest in this topic, a free subscription to The Bar Examiner will quickly bring you up to speed.

Tomorrow–how a test from the 1950’s trips up today’s digital natives (or “Do we need a Tardis to match law school curriculum to the bar exam?”)

Studying Better Ways to Test Bar Applicants for Minimum Competence: Another Reason to Care about the California Bar Exam (Besides the Cut Score Debate)

In addition to her post on Law School Café about alleged correlations between bar exam scores and lawyer discipline (discussed on this blog here), Professor Deborah Merritt recently offered another bar exam-related post. This one provides intriguing historical perspective on the current need to expand the range of skill sets tested on the bar exam. Following up on points made by Professor Derek Muller, Professor Merritt discusses a 1980 study by the California Committee of Bar Examiners, cosponsored by the National Conference of Bar Examiners (NCBE), on adding a clinical component to the bar exam. Several hundred applicants who had taken the July 1980 California Bar Exam volunteered to complete an additional clinical evaluation requiring them, among other things, to interview a client and examine witnesses. Professional actors played the role of clients, akin to the standard patient role that actors perform for clinical evaluations in medicine. The applicants were scored based on an elaborate protocol.

Delving into the statistical results of the study, including comparisons between outcomes on the conventional bar exam and outcomes on the clinical evaluation, Professor Merritt illuminates how crucial it is nearly 40 years later for bar examiners to study and implement alternative assessments of skills not currently evaluated by the bar exam. She points out that, while the study’s results were by no means definitive, they at least suggest “the disturbing conclusion that a significant percentage of conventional bar passers (about two of every five) lack basic practice skills that are essential in representing clients.”

I find this discussion particularly apt in 2017, the 20th anniversary of the first administration of the Multistate Performance Test (MPT), the written skills test now a part of the bar exam in 40 states and D.C.  What started the path toward written performance testing and the MPT? A study conducted by the California Committee of Bar Examiners (cosponsored by the NCBE), possibly the same one referenced by Professor Merritt.  On the occasion of the MPT’s 10-year anniversary in 2007, the Bar Examiner, a magazine published by the NCBE, briefly described the California-based origins of the performance test and indicated that the MPT was ultimately based largely on “the California model.” (The piece, in the November 2007 edition of the Bar Examiner, is apparently not retrievable online.)

Written performance testing was the last meaningful innovation in bar exam testing. In thinking about who might lead an effort toward the next one that introduces greater clinical evaluation, including possibly of oral skills, I think not of a top-down effort from the resolutely conservative NCBE. It is focused on getting as many jurisdictions as possible to adopt its Uniform Bar Exam (26 and counting as of today). Rather, I think of a bottom-up effort by individual states—perhaps with California in the lead—serving as laboratories for testing methods that could ultimately spread to other jurisdictions, thereby persuading or forcing the NCBE to join.

The history of written performance testing is illustrative of my point. Long before the NCBE went forward with the MPT in 1997, not just California but also Alaska and Colorado devised performance tests of their own and administered them on the bar exam. Indeed, those three states were administering performance tests in the early 1980s, playing an important initial role in advancing the cause of a needed bar exam reform. Here, for example, is a follow-up study of the 1983 California Bar Exam, discussing its two performance tests.

The biggest barrier to innovation at the state level is the NCBE's influence, which increases with each state that adopts the UBE and thereby constrains itself to offering the conventional bar exam that the NCBE requires. Indeed, both Alaska and Colorado, two of the original performance test states from the 1980s, have adopted the UBE, meaning neither of those states will be doing any more bar exam innovation. That leaves California (and any of the other 23 states that have yet to join the UBE, none of which matches the influential profile of California).

Why the California study or studies did not lead to some form of clinical evaluation beyond written performance tests is unclear, though two obstacles that come to mind are expense and testing reliability. Indeed, the 1980 study that Professor Merritt references summarized one of its findings as follows: “[T]he relatively low reliability, administrative difficulties, and high costs associated with most (but not necessarily all) standardized oral tasks probably precludes even considering them as possible components of a general bar examination. Written tests of clinical skills, on the other hand, are relatively easy to construct, administer, and score. Further, unlike oral tasks, the score on written tasks are moderately correlated with one another.”

It seems worthwhile to revisit those conclusions, given the passage of time and possible advances in testing methods, and given that the medical profession requires clinical evaluation of its applicants.  Today, 24 years after the MacCrate Report, 20 years after the advent of the MPT, and 10 years after the Carnegie Foundation Report, the legal profession needs a better bar exam.  I join Professor Merritt’s call for a national task force on the bar exam, sponsored by AALS, the Conference of Chief Justices, the ABA Section of Legal Education and Admissions to the Bar, and maybe even the NCBE.  As Professor Merritt writes, such a task force could “study current approaches to the bar exam, develop a more realistic definition of minimum competence, and explore best practices for measuring that competence.”

But I also come back to the states, and to California specifically. There is a vigorous debate going on about whether California should lower its bar exam cut score. That’s an important discussion to have. But I might suggest another discussion to have about the California Bar Exam: Shouldn’t California resist the UBE and instead conduct a new study of alternative methods for assessing today’s relevant lawyering skills that are not encompassed by the UBE?

Professor Merritt’s Blog post on attorney discipline and bar exam WORTH A READ!

Our blog has often posted about many issues related to licensing lawyers, experiential requirements for admission, the monopolizing power of the NCBE, and the pros and cons of the UBE.  Thus, I recommend to our readers an excellent post by our blogger friend Professor Deborah Merritt over at Law School Cafe on bar exam scores and lawyer discipline. Professor Merritt analyzes an article by Pepperdine Professors Robert Anderson and Derek Muller entitled The High Cost of Lowering the Bar Exam.  Professors Anderson and Muller opine that "lowering the bar examination passing score will likely increase the amount of malpractice, misconduct, and discipline among California lawyers." Merritt objects to any causal inference, noting,

Two key facts, however, weigh strongly against drawing that type of causal inference. First, as Anderson and Muller point out, “[t]here is virtually no discipline in the first 10 years of practice.” If the bar exam measured qualities related to attorney discipline, one would expect to see disciplinary cases emerge during those 10 years. Wouldn’t attorneys with marginal competency (as measured by the current bar exam) reveal their deficiencies during their early practice years?

Second, attorney discipline almost never rests on lack of knowledge about legal doctrine, poor reasoning skills, or bad writing–the skills currently measured by the bar exam. Levin and her colleagues reported that attorneys most often received discipline for failing to communicate with clients (20.0%), lack of diligence (17.93%), and failure to safeguard client property (11.26%). Only 4.14% of disciplinary sanctions related to “competence”–and even some of those cases may have reflected incompetence in areas that are not tested by the bar exam.

My favorite comment by Professor Merritt provides another example from which we should not infer causality (however tempting it might be to some of us who have been hurt by patriarchy):

We should not exclude individuals from a profession based on qualities that merely correlate with misconduct.

To underscore that point, consider this: The strongest predictor of attorney discipline is the y chromosome. Male attorneys are much more likely than female ones to be disciplined. If we want to use correlations to reduce instances of attorney discipline, it would be much more efficient to ban men from the profession, subject them to special character exams, or require them to achieve a higher bar exam score than women. Those actions, of course, would raise special issues of gender discrimination–but they illustrate the drawbacks of predicting malfeasance based on correlations.

These questions and assumed correlations are important ones. Many defend declining bar passage rates as an appropriate market correction that prevents “undesirables” from entering the profession — a consumer protection argument. However, as Professor Merritt points out, there is much more to unpack here. For example, most misconduct challenges are brought against solo practitioners or small firms. This raises overlapping socio-economic questions: which lawyers could be perceived as easiest to challenge, which lawyers have the best legal defense teams, and which kinds of clients have the most reason to complain.

After teaching for over 28 years and observing which graduates pass the bar on the first try and which do not, I am skeptical of the Anderson-Muller argument. I would love to see the NCBE and other scholars engage in a socio-economic analysis of bar passage and of disciplinary misconduct.

Legislation & Regulation and the Bar Exam

Most readers of this blog will be familiar with the performance test (PT), a portion of the bar exam in 42 states and D.C. (Forty states use the Multistate Performance Test (MPT); examiners in Pennsylvania and California write and administer their own PT.) For states using the Uniform Bar Exam (UBE), the MPT counts for 20 percent of the overall exam score.

I wrote about the performance test previously here. I extolled its virtue as the only part of the exam that exclusively tests lawyering skills, requiring zero memorization of legal rules, and I bemoaned its status as the ugly stepchild of the bar exam that gets next to no attention in conversations about exam reform.

Over time, bar examiners have concluded that certain substantive subjects have grown or lessened in importance to law practice, adding subjects to the MBE (e.g., Federal Civil Procedure) and dropping subjects from essays (e.g., Secured Transactions, in some jurisdictions). Why not do the same with skills on the PT? Is it not fair to say, for example, that a greater percentage of beginning lawyers today work in fields dominated by regulations than did in 1997 when the MPT was born? Yet the vast majority of PTs to this day test the ability to reason from cases, not from statutes or regulations without the aid of cases.

The anti-regulation bent of the current administration notwithstanding, we live in a heavily regulatory state. For lawyers in numerous specialty areas, including health care law and environmental law; lawyers working for government agencies; and lawyers serving as in-house compliance officers, among the most important skills are reading, interpreting, and applying statutes and regulations. (Compliance, by the way, has been a growing field, and positions in compliance are J.D.-preferred jobs increasingly being filled by newly licensed lawyers.) Many law schools have responded to this reality by adding a 1L course on legislation and regulation to provide law students the foundation needed for practicing law in our heavily regulatory state. (A running list, accessible from here, indicates that about 30 law schools offer a course of this nature in the first year.)

In reviewing summaries of the last 28 MPT items (covering the last 14 exams back to February 2010), I found only one among the 28 that provided only statutes and regulations and no cases as part of its law library. Typically, PTs presenting issues of statutory application have both statutes and cases in the library, and the cases provide the statutory interpretation needed to answer the issue posed. That’s still common law reasoning—a very important skill, to be sure, but not very helpful for a lawyer when the only applicable law is a statute or a regulation.

All of the above helps to explain how pleasantly surprised I was to see a purely statutory issue on the February 2017 performance test on the Pennsylvania Bar Exam. The assigned task was to write a memorandum analyzing and supporting the client’s position on three legal issues raised by opposing counsel in a motor vehicle accident. One of the issues was whether a driver had violated the state’s law banning texting while driving. The text of the law appeared in the materials, and applicants had to dissect its language and apply it to the facts—all without the aid of cases in the materials, each of which was relevant only to other issues. This is basic stuff, but exactly the kind of basic stuff that beginning lawyers must be able to do well.
