Racial Inequity on the Bar Exam

By Professor Deborah Jones Merritt, The Ohio State University Moritz College of Law

Stark racial disparities mark our profession’s licensing system. Last year, 88% of White candidates passed the bar exam on their first try. For BIPOC candidates, pass rates were significantly lower: 66% for Black candidates, 76% for Latinx candidates, 78% for both Hawaiian and Native American candidates, and 80% for Asian candidates. These racial disparities have existed for decades. Why do they persist? And why do we, as a profession, tolerate them?

New research from the AccessLex Institute confirms what many scholars have suspected: candidates who have the resources to devote extensive time to bar preparation are more likely to succeed than those who lack those resources. The bar exam is a test of family and financial resources, rather than minimum competence to practice law. White candidates are more likely to benefit from those resources than BIPOC ones, and the bar exam mirrors those differences.

But the AccessLex study goes further: it shows that, even after controlling for a host of factors (including study time and LSAT score), White candidates are more likely than candidates of color to pass the bar exam. What explains that racial tilt?

In a recent op-ed, Claudia Angelos, Carol Chomsky, Joan Howarth, and I suggest that at least part of the answer lies in the nature of high-stakes testing. Research repeatedly demonstrates that stereotype threat affects performance on these exams: test-takers who belong to groups that our culture stereotypes as low-performing on a particular test perform less ably than they would absent that stereotype. The threat, notably, hits high-achieving individuals particularly hard. And it stretches across race and gender. White men, for example, perform less well on math tests when they are reminded that “Asians outperform Whites” in math.

Our profession, in sum, has created a licensing system that predictably and inexorably favors White candidates. At the same time, it overlooks knowledge and skills that we know are essential to protect clients: we don’t test client counseling, fact investigation, cultural competence, or negotiation. Nor do we test doctrinal knowledge in the subjects most likely to assist clients of color: immigration, juvenile law, employment law, fair housing, and other civil rights statutes. It’s time to acknowledge and rework the racist impact of our licensing system.

Could We Create a New Bar Exam?

Deborah Jones Merritt, Distinguished University Professor and John Deaver Drinko/Baker & Hostetler Chair in Law, The Ohio State University.

A few weeks ago, Logan Cornett and I published a major study of the work that new lawyers do—as well as the knowledge and skills they need for that work. John Lande has already offered two thoughtful posts about the study (thanks, John!). We think the research offers important information for legal educators, bar examiners, and workplace supervisors: we need to align education, licensing, and supervision to improve the service that new lawyers provide clients.

Will it be possible to achieve that alignment? In particular, can we make the bar exam a more valid assessment of the knowledge and skills that new attorneys need? I hope so. Neither the states nor the National Conference of Bar Examiners (NCBE) has ever validated the current exams; that means we have no evidence that the skills and knowledge assessed on the exams match the ones that new lawyers use in practice. NCBE’s current attempt to validate the exam—through survey evidence—does not capture sufficient detail about the ways that new lawyers use their knowledge and skills in the workplace. Our study, based on 50 focus groups held in locations across the country, provides those more detailed insights. Combining our findings with those from NCBE’s study and other research could yield a valid licensing process.

An invalid exam would be embarrassing enough for a profession that prides itself on logic and reason, but our current bar exams have another flaw: they pass White candidates significantly more often than candidates of color. An exam that has never been validated, yet discriminates against candidates of color, is unthinkable in the modern age—and yet, we have tolerated these exams for decades.

It’s time to change, but will we have the courage to do so? Our profession has a deep attachment to closed-book exams, multiple choice questions, and time-pressured tests. The research that Logan Cornett and I did convincingly shows that none of these assessment methods are appropriate for the legal profession. Entry-level lawyers work from sources, not memory; they gather information to solve open-ended problems rather than choosing one canned answer from four; and, although they often practice under time pressure, the time constraints on the bar exam are dangerously unrealistic.

We need to challenge all three of these obstacles to a valid licensing process. Our report suggests many ways to accomplish that end. Here’s just one modest proposal that would significantly improve the validity of the bar exam:

  • Maintain the Multistate Professional Responsibility Exam (MPRE), which tests knowledge of basic principles of professional conduct, but make the exam open book. No one can wind their way through the dense rules of professional conduct and commentary without previous study, so an open-book exam won’t make the test “too easy.” On the contrary, an open-book exam would encourage new lawyers to check the rules and commentary whenever they face a conduct issue. That’s a habit we want to encourage, not discourage.
  • Maintain two performance tests like the ones currently prepared by NCBE, but allow 3 hours (rather than 90 minutes) for each test. Expanding the time frame would make these tests more realistic measures of minimum competence. It might also make grading more reliable because graders would be faced with real-world products produced under realistic time constraints.
  • Create a 3-hour research exam that consists of multiple-choice or fill-in-the-blank questions. This exam would test the kind of quick research that lawyers do routinely: What is the statute of limitations for medical malpractice in Ohio? Does a will need witnesses to be valid in Texas? Give candidates access to any online tools they desire to do this research.
  • Create a 3-hour, multiple-choice exam that tests (a) basic understanding of U.S. legal processes and sources of law OR (b) a single substantive subject (such as civil procedure, contracts, business law, or family law). If the latter, consider giving candidates a choice of the area in which they wish to test.

Our research suggests that the final doctrinal component of this exam should be open-book; new lawyers simply don’t work from memory. Instead, they internalize basic principles that allow them to identify issues in a client problem and find the specific rules they need to address that problem. But if bar examiners insist on a closed-book exam component, they could make this portion of the exam closed-book. Memorizing the rules related to a single subject—especially if the subject reflects an area in which the lawyer hopes to practice—more closely parallels the work that new lawyers do than memorizing the rules that govern ten or more fields.

A bar exam with the four components listed above could be administered according to our current schedule: candidates could take the MPRE on one occasion and the other three components over two days in late July or February. Alternatively, the proposed exam could be divided into five different components (the MPRE, each of two performance tests, the research test, and the doctrinal test) and states could allow candidates to take the components at different times and in any order. Breaking the exam into components would relieve some stress and give candidates more flexibility. It might also allow candidates to determine, while still in law school, that they lack skills needed for law practice—and either choose a different career or remedy those deficiencies through more coursework.

Our report offers other options for licensing, including a rigorously structured diploma privilege. We also recommend that states complement any written exam by requiring candidates to complete law school clinics and courses in client counseling and negotiation. Those experiences assure instruction, practice, and feedback on skills that are difficult to test through a written exam.

It’s time to get serious about aligning legal education and licensing with the work that new lawyers do. Only then will we fulfill our mission of protecting the public.


Let’s Take This Period of Unprecedented Change to Consider How Grading Practices Can Affect Issues of Diversity and Inclusion in Our Law Schools

Jennifer S. Bard, Visiting Professor, University of Florida, Levin College of Law

For the last half of spring semester 2020, law schools all over the country were forced to change their method of instruction, delivery of final exams, and (in many cases) grading practices because of the demands for physical isolation following the outbreak of Covid-19. Now that the semester is over, there is a further round of disruption: many states have delayed or even cancelled their bar exams, some have granted graduates diploma privileges, and others bravely go ahead despite the possibility that they will have to cancel at the last minute because of ever-rising rates of infection.

Like the opportunities that may arise when a river is drained and a ship revealed, there may never again be such an opportunity for us to consider what role we play in the glacially slow diversification of the legal profession and how we can make our law schools more equitable, inclusive, challenging, and effective for all of our students—not just those for whom it has been particularly well suited.

With many things to choose from, my starting point for looking at things we rarely question is the marrow-deep belief that we owe it to our students to sort them for the benefit of large law firms—even when our employment profile shows that very few of our students will ever work at such a place. Since the threshold for this opportunity is a top 5 or perhaps 10 percent class rank, it may seem odd, on reflection, that we have designed a curriculum built to compare students against one another, a design with many undesirable consequences, including undermining self-esteem, discouraging learning for learning’s sake, and contributing to the lack of diversity in the legal profession.

Over the years, other justifications have been added, such as the need to motivate students or assess their progress, but never have we had such a good opportunity to see what law school is like without grades or, more to the point, comparative curves.

Here Are Some Practices We Might Question

The Primacy of First Semester Grades

One result of the decision to go pass/fail (or some variation on those words) was to “freeze” first-year, first-semester class ranks because it was impossible to produce comparative curves.

The resulting phenomenon gives us a chance to ask ourselves some tough questions:

  1. Do First Semester Grades Reflect What Students Bring to Law School Rather Than What We Bring to Them? OR Do Students Who Come in Knowing the Rules Get Better First Semester Grades?

Many students, very often first-generation students, but also some facing discrimination based on race, gender identity, or gender expression, tell us (and the many researchers who study first-generation college students) some version of “everyone knew the game but me, and by the time I figured it out, it was too late.” And while students living with disabilities might belong to any of these groups, they are also often using new adaptive equipment and certainly facing new challenges that they may have been able to mitigate in college.

Certainly many of our students do know the game from the start. The recent AALS survey “Before the JD” found that a disproportionate number of students who ended up going to law school had parents who were either lawyers or other professionals. And while students have themselves created organizations to support each other, usually with the enthusiastic support of the law school, that may not be enough.

Our challenge going forward is that history is told by the victors.  We can see the students who were not comfortable the first semester but then continued to graduate “at the top of their class” (a vague term that usually means somewhere in the top 20%), but we don’t hear from the ones who didn’t draw attention through academic distress, but also didn’t thrive.

It would be helpful to know more, and many schools do know more about their own students. But so little of this information is published.

Much is being done in supplemental programs (to name them is to leave many out), such as pre-first-semester programs, orientation programs, and excellent pre-law institutes like the Tennessee Institute for Pre-Law, and in wonderful conferences organized by the National Black Law Students Association, Lavender Law, the National Association of Law Students with Disabilities, and so many others.

But how much more effective would it be to have a curriculum that was truly equitable and inclusive – all the way through?

2. Did Pass/Fail Grading Help Learning, Hinder Learning, or None of the Above?

Across-the-board pass/fail grading that makes no effort to compare students to each other is so unusual as to make any observations worth considering. The expectation was a distressing list of bad results: students putting in less effort during class, performing worse on exams. But did that really happen?

3. Ditto Open Book Exams

As above, it would be interesting to test, in the fall, the content knowledge of students who took open-book exams. Not so much to compare them with past classes, but to see how much they learned.

4. What Will Be the Long Term Effect of the Delayed or Cancelled Bar Exams–and How Might that Change Our Curriculums?

The opportunity presented by the necessary changes to the bar exam is already in very good hands (thank you, AccessLex), but it’s still worth considering what the future will look like in states that choose provisional or full licensure. Even decisions to delay the bar exam could raise the issue of an ongoing, career-long licensing process, much as many doctors (but not all) must take requalifying exams every ten years to retain their board certification. What would that mean for law schools?

To Be Continued: Part II: What Can We Learn from the Delay of Fall On-Campus Interviewing?   

Refuting the False Trope on Clinical Courses and Bar Passage

Robert Kuehn, Washington University School of Law

It has been observed that “the fewer the facts, the stronger the opinion.” Until recently, this could be said about the possible influence of enrollment in clinical courses on a student’s likelihood of passing the bar examination. While there was a shortage of empirical studies on any possible relationship, there have been plenty of opinions on how taking those courses might be harmful, opinions often reflected in graduation restrictions on clinical courses and requirements for bar subject-matter courses.

But, there are now significantly more facts to refute those opinions. Two recent, large-scale studies have both found no relationship between the number of law clinic or externship courses or credits a law graduate took and her likelihood of passing the bar exam.

In a forthcoming article in the Journal of Legal Education, academic records of ten years of law school graduates of Washington University in St. Louis and Wayne State University were reviewed for any relationship between the number of law clinic, externship, or, more generally, experiential courses or credits and bar passage. After first accounting for the possible influence of law school grades on bar passage (the most significant predictor of bar success), the study found no correlation at either school between law clinic or externship enrollment and bar passage — no relationship between participation in a law clinic or externship and passage, none between the number of clinical courses and passage, none between the number of clinical credits and passage, and no evidence that students graduating with lower GPAs were disproportionately enrolling in those courses as a way to avoid doctrinal courses (another not uncommon trope). This lack of any relationship was in spite of increased enrollment in experiential courses at both schools over the ten-year period and decreased enrollment in courses teaching material tested on the bar (referred to as bar subject-matter courses).
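The “after first accounting for the possible influence of law school grades” step can be illustrated with a partial correlation, which measures the clinic-passage association once the linear influence of GPA is removed. This is only a sketch of the general technique on simulated data; the variable names, sample values, and simulation are hypothetical and are not the study’s actual method or numbers:

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after removing the linear
    influence of the control variable z (here, law school GPA)."""
    r_xy = np.corrcoef(x, y)[0, 1]
    r_xz = np.corrcoef(x, z)[0, 1]
    r_yz = np.corrcoef(y, z)[0, 1]
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

rng = np.random.default_rng(0)
n = 5000
gpa = rng.normal(3.0, 0.4, n)            # simulated law school GPA (control)
clinic_credits = rng.integers(0, 12, n)  # clinic credits, independent of GPA
# In this simulation, passage is driven by GPA alone, plus noise;
# no clinic effect is built in, so the partial correlation should be near 0.
passed = (gpa + rng.normal(0, 0.3, n) > 2.8).astype(float)

print(round(partial_corr(clinic_credits, passed, gpa), 3))  # near 0
```

A partial correlation near zero, despite a strong raw GPA-passage correlation, is the pattern the study describes: once grades are accounted for, clinic enrollment adds no predictive information.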

The article notes that nationwide data on experiential course enrollment and bar passage also belie any claim the two are related. That data indicate that as enrollment in experiential courses was increasing from 2006-2016, bar passage percentages were fairly steady and that the recent decline in passage coincided with decreased, not increased, enrollment in those courses.

A recent study commissioned by the California State Bar found a similar lack of relationship between law clinic and externship courses and bar exam performance. The study reviewed law school coursework and performance on three July exams for over 7,500 bar applicants from eleven California schools. It found no relationship between the number of academic credits from law clinic courses and exam performance, either across all schools or even when reviewing schools separately. Similarly, there was no relationship between the number of externship or internship credits and performance, again when examined across all schools or within schools. The broad range of entering credentials at the eleven schools, and lack of a relationship even within those schools, indicates that the results should be applicable to most law schools, including those with lower LSATs and undergraduate GPAs for entering students.

The study results from Washington University/Wayne State and the California State Bar are similar to smaller studies at Texas Tech and the University of Denver that also reported no statistically significant relationship between enrollment in a law clinic or externship course and bar passage.

The Washington University/Wayne State and California State Bar studies further revealed that opinions about the value of bar subject-matter courses should be moderated. There were small correlations at both schools between the number of bar subject courses and bar passage. But this result (explaining less than 5% of the variability in bar outcomes) held only for low-performing students, and additional courses showed no marginal benefit once students took the school’s average number of bar courses.
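The link between a correlation coefficient and “variability explained” is simple arithmetic: the share of variance explained is the square of the correlation. A minimal sketch, where r = 0.22 is a purely illustrative value chosen to match a result explaining “less than 5%” of the variability, not a figure from either study:

```python
# Variance explained is the square of the correlation coefficient (r**2).
# r = 0.22 is hypothetical, chosen only to illustrate how even a
# noticeable-looking correlation can explain under 5% of the variance.
r = 0.22
variance_explained = r ** 2
print(f"r = {r} explains {variance_explained:.1%} of the variance")
# prints "r = 0.22 explains 4.8% of the variance"
```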

The California State Bar study focused on whether taking a specific course was related to performance on the bar exam topic taught in those courses. It found that neither attendance nor performance in courses covering any of the 13 bar-related topics was related to performance on the corresponding California bar exam or Multistate Bar Exam content covering that subject.

Studies at other schools likewise fail to support broad claims about the benefit of enrolling in bar subject-related courses.

It is time to put away the misinformed trope of participation in law clinic and externship courses harming a student’s chances of passing the bar exam and let the facts do the talking. Law schools should recognize and students should be told they can obtain valuable preparation for the practice of law by enrolling in clinical courses without affecting their likelihood of passing the bar exam.

An Overview of “A Study of the Relationship Between Law School Coursework and Bar Exam Outcomes”

Robert Kuehn, Professor of Law at Washington University School of Law in St. Louis, and David Moss, Associate Clinical Professor at Wayne State University Law School, recently conducted a large-scale study looking at the relationship between clinical/experiential or bar subject-matter courses and bar passage success in a paper entitled, “A Study of the Relationship Between Law School Coursework and Bar Exam Outcomes.” As a law student currently enrolled in a clinic, this study immediately piqued my interest.

This study was in response to fear that bar passage rates were down because of rising enrollment in “experiential courses” as opposed to “bar-subject courses.” Law schools began pushing students to enroll in more of these bar-subject courses to correct this so-called issue. However, Professor Kuehn and Professor Moss observed that there was no evidence that taking more bar-subject courses was appropriate advice for all students. Their study supplied that missing evidence, examining bar-subject courses, experiential courses, and bar exam outcomes over ten years at two law schools: Washington University School of Law (WashU) and Wayne State University Law School (Wayne State). Both schools require only the designated first-year courses and the upper-level writing courses mandated by ABA accreditation standards.

Previous studies performed in states like Texas, Colorado, and California looked at the effects of coursework on bar passage rates. These studies did not support the claim that taking more bar-tested law school courses improves the chances of passing on the first attempt. Notably, a study done in Indiana concluded, “simply forcing lower-performing students to take more upper division bar-subject courses will not solve the bar examination failure problem.”

The first goal of the present study was to determine whether a graduate’s enrollment in elective experiential courses was related to first-time bar passage success. Next, it was to assess whether enrollment in elective courses that cover bar subjects was related to bar success.

Data was collected from law school graduates from 2006-2015. One table in the study (not reproduced here) outlines the number of graduates, LSAT scores, and bar passage rates at the two schools; a second correlates graduate characteristics such as undergraduate GPA, LSAT score, 1L GPA, and law school GPA with bar passage.

It wasn’t until 2005 that the ABA began requiring graduates to receive professional skills instruction with as little as one credit satisfying the requirement. In 2014, the ABA changed this to require six credits beginning with 2019 graduates. The study authors decided to track enrollment in skills courses versus bar passage over this time period.

The study’s chart (not reproduced here) shows average bar passage percentages holding steady from 2006-2013, the very period in which experiential course enrollment increased by over 50%. “Therefore, efforts to link declining nationwide bar passage rates to the rise in experiential course enrollment are not supported by national statistics.” A more likely contributing cause of the bar passage declines since 2014 is the weaker entering credentials of incoming 1Ls.

At WashU, taking at least the average number of bar courses was associated with an increased likelihood of passing the bar, but there was no statistically significant further increase for bottom-quartile LGPA graduates who took more than the school’s average. The same was true for graduates in the bottom half of their class at Wayne State. Results for both schools indicate that graduates in the bottom quartile who took fewer than the average number of bar courses at their school faced a significantly increased risk of bar failure. Further, at both schools, students entering with scores lower than 150 had pass rates significantly below the school’s average.

This study concluded that the claim that the dramatic decline in bar passage rates is due to law students taking more experiential courses or fewer bar-related courses is not supported. It characterized efforts to cap experiential credits in order to improve bar passage rates as “misguided,” warning that schools should not expect that “mere exposure” to more bar courses will significantly improve bar passage rates.

Also see the Legal Skills Prof Blog and TaxProf Blog for more posts on this study.
