Racial Inequity on the Bar Exam

By Professor Deborah Jones Merritt, The Ohio State University Moritz College of Law

Stark racial disparities mark our profession’s licensing system. Last year, 88% of White candidates passed the bar exam on their first try. For BIPOC candidates, pass rates were significantly lower: 66% for Black candidates, 76% for Latinx candidates, 78% for both Hawaiian and Native American candidates, and 80% for Asian candidates. These racial disparities have existed for decades. Why do they persist? And why do we, as a profession, tolerate them?

New research from the AccessLex Institute confirms what many scholars have suspected: candidates who have the resources to devote extensive time to bar preparation are more likely to succeed than those who lack those resources. The bar exam is a test of family and financial resources, rather than minimum competence to practice law. White candidates are more likely to benefit from those resources than BIPOC ones, and the bar exam mirrors those differences.

But the AccessLex study goes further: it shows that, even after controlling for a host of factors (including study time and LSAT score), White candidates are more likely than candidates of color to pass the bar exam. What explains that racial tilt?

In a recent op-ed, Claudia Angelos, Carol Chomsky, Joan Howarth, and I suggest that at least part of the answer lies in the nature of high-stakes testing. Research repeatedly demonstrates that stereotype threat affects performance on these exams. Test-takers who belong to groups that our culture stereotypes as low-performing on a particular test will perform less ably than they would absent that stereotype. The threat, notably, hits high-achieving individuals particularly hard. And it stretches across race and gender. White men, for example, perform less well on math tests when they are reminded that “Asians outperform Whites” in math.

Our profession, in sum, has created a licensing system that predictably and inexorably favors White candidates. At the same time, it overlooks knowledge and skills that we know are essential to protect clients: We don’t test client counseling, fact investigation, cultural competence, or negotiation. Nor do we test doctrinal knowledge in the subjects most likely to assist clients of color: immigration, juvenile law, employment law, fair housing, and other civil rights statutes. It’s time to acknowledge and rework the racist impact of our licensing system. For more, please click here.

Why Bar Examiners Should Eliminate Essay Questions and Focus on Performance Tests

The NCBE has announced that in four to five years, there will be a different bar exam.  In the interim, why not make changes to the existing exam so that it better reflects the skills needed in law practice?  One easily implementable change: eliminate the essay questions and use the time allotted to those questions for the performance test questions.

This idea was raised by  Professor Deborah Merritt in her talk at the BYU Annual Law and Leadership conference. The idea stems from her groundbreaking study on the skills new lawyers actually need to competently represent clients.  Professor Merritt and researchers at IAALS analyzed data from 50 focus groups conducted in twelve states with a diverse group of new lawyers and their supervisors across a range of practice areas. 

The study data confirmed what we intuitively knew: that the bar exam’s performance test (the MPT) most closely resembles what new lawyers do in practice.

On the other hand, bar exam essay questions require examinees to memorize significant amounts of information beyond what they must already memorize for the multiple-choice questions (the MBE).  Many of the subjects memorized for the current essay questions will be eliminated from the NCBE’s proposed new exam.

The essay questions also do not allow examinees to research or review the applicable legal rules before answering, and they have little relationship to how new lawyers approach problems in practice.  On the other hand, the performance test questions do relate to the skills new lawyers use.

As Sara Berman notes in her book, Bar Exam MPT Preparation & Experiential Learning for Law Students, mastering how to take the performance tests is an opportunity to practice skills necessary for law practice.  This book offers a logical and easy-to-follow process for studying for, and taking, the performance tests.

Given the existing bar exam format, Berman appropriately devotes a significant portion of the book to strategies that help examinees understand how to write an effective answer in the time allotted.  Her excellent advice and strategies are particularly useful given the tight time frame examinees have to answer these questions, and she provides many sample questions for practicing skills, including reading all the materials and drafting a document in ninety minutes.

Berman’s strategies for managing the MPT time constraints make sense because, as Professor William Henderson’s study confirmed, test-taking speed on tests such as the bar exam is an independent variable when it comes to assessing test performance.  Thus, it is important to teach students to answer test questions quickly.  However, is that the message we should be sending?  Should we be encouraging speed over careful reading and ensuring accuracy?

If the NCBE eliminated the essay questions and allotted that time to the MPTs, or even if it simply gave states the option to adopt this change, it could study its impact.  Currently, we don’t know whether three hours versus ninety minutes to answer an MPT question makes a difference, and if so, for whom. Does the difference affect those who come close to, but do not achieve, the arbitrarily set passing cut scores?  These questions are particularly important given the significant impact cut scores have on the profession’s racial and ethnic diversity.

This minor change, which could have a major impact, is just one of the many innovative ideas presented by Professor Merritt at the BYU conference on law licensing reform.  Her remarks, as well as the remarks of other conference speakers, are available if you scroll to the bottom of this website.

The Next Generation of the Bar Exam, NCBE Style

The Testing Task Force of the National Conference of Bar Examiners (NCBE) recently completed the three phases of its “future-focused study to ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in a changing legal profession.” Earlier this month, as tumultuous events in Washington, D.C. grabbed most of the headlines, the task force quietly released an Overview of Preliminary Recommendations for the Next Generation of the Bar Exam. The recommendations are a big deal—possibly a game changer—and deserve some discussion.

I see three positive aspects of the task force’s preliminary recommendations:

First, quoting the document directly, “Our decisions were guided by the prevailing views expressed by stakeholders . . . : that the bar exam should test fewer subjects and should test less broadly and deeply within the subjects covered, [and] that greater emphasis should be placed on assessment of lawyering skills to better reflect real-world practice and the types of activities newly licensed lawyers perform . . . .” So many critics of the current bar exam, myself included, have argued exactly these positions for so long, but to read them coming from the task force directly affiliated with the NCBE is no small matter.

Indeed, the task force recommends a significant reduction in the number of legal subjects to be tested—just Civil Procedure, Contracts, Evidence, Torts, Business Associations, Constitutional Law, Criminal Law, Criminal Procedure (limited to constitutional protections), and Real Property. These reductions, if adopted, would cut from the current Multistate Essay Examination (MEE) subjects, not the Multistate Bar Examination (MBE) ones. Many will understandably quibble with the retention of all seven MBE subjects, and indeed I will do just that below. Still, one can hope that, if the task force’s recommendations are adopted by the NCBE, the bar exam of the future will in fact cover the several MBE subjects “less broadly and deeply” than they are currently tested.

One reason to be hopeful that the next generation of the bar exam would in fact test MBE subjects less broadly and less deeply is that the task force is recommending a fundamental shift away from the model of discrete exams like the MBE, MEE, and Multistate Performance Test (MPT). It is instead calling for an integrated exam that would test “knowledge and skills holistically, using both stand-alone questions and item sets, as well as a combination of item formats (e.g., selected-response, short-answer, and extended constructed-response items).” As always, the devil will be in the details, but this preliminary recommendation sounds at least somewhat promising.

Second, the task force rightly recognizes the need to expand the scope of the foundational skills covered on the bar exam to include more than just legal analysis and writing. Among the skills to be encompassed by the projected bar exam of the future are legal research, factual investigation and evaluation (including fact gathering), client counseling and advising, client relationship and management, and negotiation and dispute resolution. It might be hard to imagine how some of these skill sets could be tested, but the task force foresees “uniform text- or video-based scenarios that require candidates to construct a written response or select the correct response.”

Third, the task force recommends that the next generation of the exam should be “a computer-based test, administered either on candidates’ laptops in jurisdiction-managed facilities and/or at computer testing centers managed by a suitable vendor.” The pandemic forced many state bar examiners to abruptly adopt computer-based testing. It makes sense for myriad reasons to move in the direction of computer-based testing and to do so as part of a deliberate and careful process.

Having said all that, there are at least two aspects of the task force’s recommendations that strike me as problematic—or that at least present the risk of inadequate or counterproductive reform to the bar exam.

First, as mentioned earlier, there is the question of why the bar exam should continue to test legal knowledge in all seven of the MBE subject areas. Testing knowledge of such a large number of subject areas is especially problematic when considering that, by all appearances, the exam would remain closed-book. As Professor Deborah Merritt has cogently explained here, if the bar exam of the future needs to remain a closed-book exam, then the doctrinal portion of the exam should cover no more than a basic understanding of U.S. legal processes and sources of law or a single substantive subject. In arguing for an open-book exam, Merritt correctly points out that “new lawyers simply don’t work from memory. Instead, they internalize basic principles that allow them to identify issues in a client problem and find the specific rules they need to address that problem.”

Moreover, the empirical data on job activities of newly licensed lawyers, gathered during Phase 2 of the task force’s study, reveals that both experienced and newly licensed lawyers rated knowledge of Torts, Constitutional Law, Criminal Law, Criminal Procedure, and Real Property Law as only moderately important (all receiving an average rating of 2.2 or lower out of 3). They rated knowledge of all those subject areas as less important than knowledge of Statutory Interpretation Principles or Statutes of Limitations (both receiving an average rating of 2.3), neither of which is included as a stand-alone subject to be tested on the task force’s vision of the bar exam. To be sure, Civil Procedure (2.6), Contract Law (2.6), and Rules of Evidence (2.5) all received noticeably higher average ratings. Perhaps only those subject areas, or some subset of them, could be tested.

Second, on a more micro level, is the peculiar fate of one specific knowledge area: Statutory Interpretation Principles. In Phase 3 of the study, the task force proposed that Statutory Interpretation Principles should not be tested as a stand-alone subject but rather could be subsumed under the Constitutional Law subject area and the skills portion of the exam. The suggestion that statutory interpretation can be subsumed under Constitutional Law is a dubious one. Rarely does a court do a deep textual dive into the precise phrasing of a constitutional provision, as courts routinely do when confronted with a question of statutory construction. Moreover, questions on substantive knowledge of constitutional law will typically call on knowledge of Supreme Court precedents, not on dissection of the text of a constitutional provision.

Statutory interpretation is plainly distinct from constitutional interpretation in critical ways. Hence, I trust that the skills components of the bar exam of the future will include not just cases as sources of law, but also statutory and regulatory provisions (cutting across many different substantive subject areas) which applicants have to dissect and interpret. Based on what I’ve heard from the task force, I am optimistic on this point. I discussed the extent to which statutory sources are used on the performance test, currently the only exclusively skills component of the bar exam, in an earlier post here.

In sum, much is uncertain and will take shape over time. But the early indications give some reason to be cautiously optimistic about the NCBE testing task force’s vision of a new bar exam.

Could We Create a New Bar Exam?

Deborah Jones Merritt, Distinguished University Professor and John Deaver Drinko/Baker & Hostetler Chair in Law, The Ohio State University.

A few weeks ago, Logan Cornett and I published a major study of the work that new lawyers do—as well as the knowledge and skills they need for that work. John Lande has already offered two thoughtful posts about the study (thanks, John!). We think the research offers important information for legal educators, bar examiners, and workplace supervisors: we need to align education, licensing, and supervision to improve the service that new lawyers provide clients.

Will it be possible to achieve that alignment? In particular, can we make the bar exam a more valid assessment of the knowledge and skills that new attorneys need? I hope so. Neither states nor the National Conference of Bar Examiners (NCBE) has ever validated the current exams; that means we have no evidence that the skills and knowledge assessed on the exams match the ones that new lawyers use in practice. NCBE’s current attempt to validate the exam—through survey evidence—does not capture sufficient detail about the ways that new lawyers use their knowledge and skills in the workplace. Our study, based on 50 focus groups held in locations across the country, provides those more detailed insights. Combining our findings with those from NCBE’s study and other research could yield a valid licensing process.

An invalid exam would be embarrassing enough for a profession that prides itself on logic and reason, but our current bar exams have another flaw: they pass white candidates significantly more often than candidates of color. An exam that has never been validated, yet discriminates against candidates of color, is unthinkable in the modern age—and yet, we have tolerated these exams for decades.

It’s time to change, but will we have the courage to do so? Our profession has a deep attachment to closed-book exams, multiple choice questions, and time-pressured tests. The research that Logan Cornett and I did convincingly shows that none of these assessment methods are appropriate for the legal profession. Entry-level lawyers work from sources, not memory; they gather information to solve open-ended problems rather than choosing one canned answer from four; and, although they often practice under time pressure, the time constraints on the bar exam are dangerously unrealistic.

We need to challenge all three of these obstacles to a valid licensing process. Our report suggests many ways to accomplish that end. Here’s just one modest proposal that would significantly improve the validity of the bar exam:

  • Maintain the Multistate Professional Responsibility Exam (MPRE), which tests knowledge of basic principles of professional conduct, but make the exam open book. No one can wind their way through the dense rules of professional conduct and commentary without previous study, so an open-book exam won’t make the test “too easy.” On the contrary, an open-book exam would encourage new lawyers to check the rules and commentary whenever they face a conduct issue. That’s a habit we want to encourage, not discourage.
  • Maintain two performance tests like the ones currently prepared by NCBE, but allow 3 hours (rather than 90 minutes) for each test. Expanding the time frame would make these tests more realistic measures of minimum competence. It might also make grading more reliable because graders would be faced with real-world products produced under realistic time constraints.
  • Create a 3-hour research exam that consists of multiple-choice or fill-in-the-blank questions. This exam would test the kind of quick research that lawyers do routinely: What is the statute of limitations for medical malpractice in Ohio? Does a will need witnesses to be valid in Texas? Give candidates access to any online tools they desire to do this research.
  • Create a 3-hour, multiple-choice exam that tests (a) basic understanding of U.S. legal processes and sources of law OR (b) a single substantive subject (such as civil procedure, contracts, business law, or family law). If the latter, consider giving candidates a choice of the area in which they wish to test.

Our research suggests that the final doctrinal component of this exam should be open-book; new lawyers simply don’t work from memory. Instead, they internalize basic principles that allow them to identify issues in a client problem and find the specific rules they need to address that problem. But if bar examiners insist on a closed-book exam component, they could make this portion of the exam closed-book. Memorizing the rules related to a single subject—especially if the subject reflects an area in which the lawyer hopes to practice—more closely parallels the work that new lawyers do than memorizing the rules that govern ten or more fields.

A bar exam with the four components listed above could be administered according to our current schedule. That is, candidates could take the MPRE on one occasion and the other three components over two days in late July or February. Alternatively, the proposed exam could be divided into five different components (the MPRE, each of two performance tests, the research test, and the doctrinal test) and states could allow candidates to take the components at different times and in any order. Breaking the exam into components would relieve some stress and give candidates more flexibility. It might also allow candidates to determine, while still in law school, that they lack skills needed for law practice—and either choose a different career or remedy those deficiencies through more coursework.

Our report offers other options for licensing, including a rigorously structured diploma privilege. We also recommend that states complement any written exam by requiring candidates to complete law school clinics and courses in client counseling and negotiation. Those experiences assure instruction, practice, and feedback on skills that are difficult to test through a written exam.

It’s time to get serious about aligning legal education and licensing with the work that new lawyers do. Only then will we fulfill our mission of protecting the public.


Menstrual Products and the Bar: Advocacy Seeks to Create Equal Bar Exam Testing Conditions for Menstruators

By: Elizabeth B. Cooper, Fordham Law School; Margaret E. Johnson, U. Baltimore Law (visiting at American); and Marcy L. Karin, UDC David A. Clarke School of Law

We can all recall the stress, fear, and worry that accompany taking the bar exam.  About half of us also were anxious we would have to manage our period in the middle of this awful two-to-three-day ordeal.  Bar examiners across the country have made this prospect far more daunting than it needs to be by not treating menstruation as the natural biological process that it is.

Without proof of any test-taker having ever cheated using a tampon or pad, some states have chosen to stigmatize and potentially penalize people who have their periods with draconian policies prohibiting bar examinees from bringing their own menstrual products with them.  Other states have failed to adopt or communicate clear policies on the subject, increasing test-takers’ anxiety: one should not have to waste time researching the Bar Examiners’ hard-to-find policies  or calling their offices for answers—which may, or may not, yield consistent information. 

The harm here is four-fold: 1. It is wrong to make test-taking conditions more challenging for people based on the fact that they menstruate; 2. It is wrong to limit test-takers to random products selected by Bar Examiners that could put test-takers’ health and menstruation management at risk; 3. It is wrong to deny test-takers access to any menstrual products simply because they do not use the women’s restroom; and 4. It is wrong to convey the harmful message that all people who menstruate are untrustworthy and do not belong in the legal profession.

Some states, including Texas and Pennsylvania, prohibited exam-takers from bringing in their own menstrual products, offering to provide a limited and unpredictable set of products in the women’s bathroom.  (After much advocacy, Texas changed its rule for the September exam, though it is unclear if this is a permanent change.)  This does not solve the problems these states created in the first place by banning test-takers from bringing in their own products.  People who menstruate need their own products because menstrual products are not “one size fits all”: menstruators require different sizes and levels of absorbency in their products to best fit their body and menstrual flow.

Use of the wrong size product can lead to everything from pain and discomfort to toxic shock syndrome (if too large) and time-consuming, uncomfortable, and disruptive leaks (if too small). Further, some individuals require hypoallergenic products to protect against allergic reactions.  If not provided, applicants may experience vaginal itching or other problems caused by using allergen-containing tampons or pads inside or adjacent to their bodies.  All of these consequences are awful enough on their own; here, they create an unconscionable risk of derailing exam performance.

In addition, by limiting test-takers from bringing in their own products and then providing products only in the women’s restrooms, Bar Examiners relegate transgender men and nonbinary persons who may menstruate, and who may use the men’s restrooms or all-gender restrooms, to having no access to menstrual products during the bar exam.

Other states allow test-takers to bring their own products, but require them to be packaged in a clear plastic bag—with some states mandating that the product be unwrapped.  This last requirement makes no sense: the wrapper both keeps the product hygienic before being inserted into or placed adjacent to one’s body and provides an efficient way to safely dispose of used products, reducing janitorial staff’s exposure to bodily fluids.  Further, removing the wrapping exposes the adhesive on the bottom of some pads, rendering them practically useless when the menstruator tries to unstick them from the clear plastic bag.

As much as we want to destigmatize menstruation and eradicate the embarrassment and taboo of being seen with a tampon or pad, it remains an invasion of privacy to require test-takers to carry their products in a clear plastic bag, revealing to a proctor (and possibly classmates, colleagues, or future opposing counsel) that one has or expects to get their period during the exam.  (One North Carolina bar exam test-taker reported that a proctor asked her if she “really needed those” while inspecting her plastic bag of menstrual products.)  Finally, this intrusion is even more painful for, and potentially outs, transgender men and non-binary law graduates who may not be public about their biological sex.  It may even set them up for bigoted harassment—during the biggest exam of their lives.

Other states allow test-takers to bring their own products and do not require them to be carried in a clear bag—but they must check them with a proctor or retrieve them outside the exam room before heading to the restroom.  This “solution” means that a menstruating person will have to take vital time away from the exam (or a break between sections of the exam) to obtain their menstrual products before using the restroom.  This “time tax” is as unacceptable as the other approaches described above.

At least some states treat people who menstruate without such bizarre suspicion, allowing them to bring in and keep their own products with them during the exam, and use them as needed during the test—without having to ask a stranger for their own personal possessions.  To date, there have been no known accusations of test-takers trying to do the impossible: write helpful information on a pad or tampon to give them an edge on the exam or smuggle in written answers inside the product’s wrapping.

The lack of uniformity of equity-based rules permitting access to one’s own menstrual products is unacceptable and must be changed. Thankfully, in the age of social media, law graduates have taken the lead on this advocacy, sharing the hurdles they are facing on social media and asking state Bar Examiners to eliminate these outrageous rules, largely under the #bloodybarpocalypse hashtag. 

Once we saw their posts, the three of us, working with fantastic former clinic students of Fordham and UDC, began advocating that all state Bar Examiners adopt better menstrual products policies.  We drafted a letter to the National Conference of Bar Examiners (NCBE)—co-signed by over 2800 law professors, law students, law school graduates, and lawyers in under 24 hours.  We also sent letters to Bar Examiners in each state that administered an in-person July bar exam and did not have a clear, acceptable policy in place.  All of these efforts led to some quick changes. 

The NCBE contacted state Bar Examiners and informed them that menstrual products were not included in the NCBE’s “prohibited paper” category and that test-takers should be able to bring in their own products.  The press started asking questions of the state Bar Examiners.  And state Bar Examiners began changing or clarifying their policies, with some confirming to examinees that they could bring personal menstrual products to the exam.  For instance, West Virginia Bar Examiners insisted that they permitted products in the exam room, even though their website said otherwise. Texas state Bar Examiners changed their policy from not permitting products to permitting them at their September exam.  (The state has issued contradictory statements, however, about whether this change is permanent.)

This positive change is not, however, uniform: even those states that have adopted equitable policies must be monitored to ensure they are adopting best practices.  In our efforts to get accurate and honest information from state Bar Examiners across the country, it has been deeply disconcerting to learn how many jurisdictions are silent on whether examinees may bring in their own menstrual products; have informal policies that contradict written statements about what items are allowed in the exam (e.g., not listing menstrual products in the list of items test-takers can bring in, but informally allowing them); or have stubbornly held onto their recalcitrant policies.  

Equally unacceptable, many Bar Examiners will not share the documentation that they say embodies their policies (e.g., generic letters to test-takers informing them what they can and cannot bring into the exam; postings on their web sites behind a security wall).  Without this proof, there is no accountability and the true practices of these states remain unknown.   

As we reach out to jurisdictions administering in-person exams in the coming months, our demands are clear: Bar Examiners must issue explicit policies permitting examinees to bring their own menstrual products into bar exams, in an opaque container or on their person, and must publish these policies on their websites.  Other bar-related policies that can have disproportionate effects also must be changed.  For instance, examinees needing to pump their breastmilk must be given ready accommodations, and bathroom access must not be limited, as that affects both pumpers and menstruators.

To learn more about all of the advocacy efforts in this area, check out Menstrual Equity and the Bar Exam: Round Up of Op-Eds and Other Media Coverage on the Feminist Law Professors blog and follow the hashtag #MPandTheBar.  If you want to get involved in this work, let us know. And no doubt other activists working on the pumping and bathroom access issues would welcome assistance too. There is, unfortunately, plenty of work to be done.

MORE NEWS ON STATES, BAR EXAM, AND DIPLOMA PRIVILEGES

For several months now, this blog has commented on Courts and States continuing to require a traditional bar exam for admission to practice.  It has also covered the call by law deans and law students to enact Diploma Privileges.

Since our last post on this subject, Oregon has adopted a Diploma Privilege, and now a New York State Senator has proposed a bill which, according to its “justification,” allows for “a modified form of diploma privilege.”  The bill is found here.

Senator Brad Hoylman’s Sponsor Memo reads as follows:

At this point in time, it is too early to tell with certainty whether it will be safe and feasible to hold an in-person bar exam in early September. State and local restrictions on venues being open and limiting the size of in-person gatherings may preclude the administration of an in-person test. Amid the uncertainty over the test’s administration, law graduates are reporting that the already stressful bar exam preparation has been compounded by personal challenges ranging from their own health and wellbeing to financial hardship to increased caregiving responsibilities. Allowing a modified form of diploma privilege, as proposed in this bill, would give law graduates a reprieve from further delays in admittance, while retaining the Court of Appeals’ prerogative to set standards for the profession. Under this bill, as long as there is an extant State of Emergency related to COVID-19, the uniform system of examination for admission to practice law in New York will consist of the New York Law Course, the New York Law Examination, and the Multistate Professional Responsibility Examination, all of which can be taken online.

Nothing in the bill precludes the State from moving forward with administering the Multistate Bar Examination, meaning it can remain an option for New York-based law graduates who wish to practice law in a state other than New York. Passing the MBE, however, would not be a required prerequisite to admission to practice in New York for the duration of the COVID-19 State of Emergency.

According to Karen Sloan’s article on Law.com, 2020 Brooklyn Law School graduate Claire Schapira, who is involved with an advocacy group called NY 4 Diploma Privilege, hopes

that the Board of Law Examiners and Court of Appeals will act on their own, because they have the power to do that,” Schapira said. “But I think that this helps push the momentum. This is not something that graduates want because we don’t want to take the bar exam. This is an issue that has a real impact across the legal community and the community more generally.”

I agree. I spent a portion of yesterday trying to problem-solve with a brilliant, hardworking, ethical, and professional law graduate and accepted bar examinee.  This student also excelled in clinical practice.  Like other examinees, this immunocompromised student, who is normally efficient, excellent at focus and time management, and extremely organized, is being distracted from bar study by

  • trying to keep up on what is safe to do as the virus surges again
  • changing bar expectations and information across the country
  • concern about friends and family who live in other states
  • rent issues while studying for the bar in a safe, appropriate place in the Capital NY Region
  • figuring out when to move to a more expensive city where a more challenging living situation but a good job is waiting
  • fear that NYS will once again punt on making a hard decision by delaying exam dates until October, which would continue unemployment for this graduate and many others.

It is time to provide certainty and provide New York law graduates with a diploma privilege tied to other indicators of professional promise as outlined previously on this blog here and here.

 

UPDATE: 7/8/20 See also https://news.bloomberglaw.com/us-law-week/insight-clinical-education-a-safe-and-sure-pathway-to-law-licensure

Update as of 7/13/20 – 

Letter to Court of Appeals Requesting Hearing Re Bar Exam – 07.13.2020   This letter was submitted by “United Diploma Privilege NY” to the Court of Appeals, requesting a hearing on the safe administration of the September bar exam.

Also this morning, the 15 New York Law deans submitted a letter to Chief Judge DiFiore advocating for graduates taking the bar exam. 

See also other relevant news here and here.

 

Let’s Take This Period of Unprecedented Change to Consider How Grading Practices Can Affect Issues of Diversity and Inclusion in Our Law Schools

Jennifer S. Bard, Visiting Professor, University of Florida, Levin College of Law

For the last half of spring semester 2020, law schools all over the country were forced to change their method of instruction, delivery of final exams, and (in many cases) grading practices because of the demands for physical isolation following the outbreak of Covid-19.  Now that the semester is over, there is a further round of disruption as many states have delayed or even cancelled their bar exams, some have granted graduates diploma privileges, while others bravely go ahead in the face of a possibility that they will have to cancel at the last minute because of ever-rising rates of infection. 

Like the opportunities that may arise when a river is drained and a ship revealed, there may never again be such an opportunity for us to consider what role we play in the glacially slow diversification of the legal profession and how we can make our law schools more equitable, inclusive, challenging, and effective for all of our students—not just those for whom it has been particularly well suited.

With many things to choose from, my starting point for looking at things we rarely question is the marrow-deep belief that we owe it to our students to sort them for the benefit of large law firms—even when our employment profile shows that very few of our students will ever work at such a place.  Since the threshold for this opportunity is a top 5 or perhaps 10 percent class rank, it may seem odd, on reflection, that we have built a curriculum designed to compare students, one that may have many undesirable consequences, including undermining self-esteem, discouraging learning for learning’s sake, and contributing to the lack of diversity in the legal profession.

Over the years, other justifications have been added, such as the need to motivate students or assess their progress, but never have we had such a good opportunity to see what law school is like without grades or, more to the point, comparative curves.

Here are some Practices We Might Question

The Primacy of First Semester Grades

One result of the decision to go pass/fail (or some variation on those words) was to “freeze” first-year, first-semester class ranks because it was impossible to produce comparative curves.

The resulting phenomenon gives us a chance to ask ourselves some tough questions:

  1. Do First Semester Grades Reflect What Students Bring to Law School Rather Than What We Bring to Them? OR Do Students Who Come in Knowing the Rules Get Better First Semester Grades?

Many students, very often First Generation Students, but also some facing discrimination based on race, gender identity, or gender expression, frequently tell us (and the many researchers who study first generation college students) some version of “everyone knew the game but me and by the time I figured it out, it was too late.” And while students living with disabilities might intersect with any of these groups, they also are often using new adaptive equipment and certainly facing new challenges that they may have been able to mitigate in college.

Certainly many of our students do know the game from the start.  The recent AALS survey “Before the JD” found that a disproportionate number of students who ended up going to law school had parents who were lawyers or other professionals. While students have, themselves, created organizations to support each other, usually with the enthusiastic support of the law school, it may not be enough.

Our challenge going forward is that history is told by the victors.  We can see the students who were not comfortable the first semester but then continued to graduate “at the top of their class” (a vague term that usually means somewhere in the top 20%), but we don’t hear from the ones who didn’t draw attention through academic distress, but also didn’t thrive.

It would be helpful to know more–and many schools do know more about their own students.  But so little of this information is published.

Much is being done in supplemental programs (to name them is to leave many out), such as pre-first-semester programs, orientation programs, and excellent pre-law institutes like the Tennessee Institute for Pre-Law, and in wonderful conferences organized by the National Black Law Students Association, Lavender Law, the National Association of Law Students with Disabilities, and so many others.

But how much more effective would it be to have a curriculum that was truly equitable and inclusive – all the way through?

2. Did Pass/Fail Grading Help Learning, Hinder Learning, or None of the Above?

Across-the-board pass/fail grading that makes no effort to compare students to each other is so unusual as to make any observations worth considering. The expectation was a distressing list of bad results: students putting in less effort during class, performing worse on exams. But did that really happen?

3. Ditto Open Book Exams

As above, it would be interesting to test, in the fall, the content knowledge of students who took open-book exams.  Not so much to compare them with past classes, but to see how much they learned.

4. What Will Be the Long Term Effect of the Delayed or Cancelled Bar Exams–and How Might that Change Our Curriculums?

The opportunity presented by the necessary changes to the bar exam is already in very good hands (thank you, AccessLex), but it’s still worth considering what the future will look like in states that choose provisional or full licensure.  Even decisions to delay the bar exam could raise issues of an ongoing, career-long licensing process, much as many doctors (but not all) must take requalifying exams every ten years to retain their board certification.  What would that mean for law schools?

To Be Continued: Part II: What Can We Learn from the Delay of Fall On-Campus Interviewing?   

New York State Bar Association Leads with Bar Exam Questions

Patricia E. Salkin*

On April 4, 2020, the New York State Bar Association once again delivered a resounding thumbs down to the Uniform Bar Exam (UBE) as a measure of competency to practice law in New York.  Five years earlier, the Association’s Committee on Legal Education and Admission to the Bar delivered a report, overwhelmingly approved by the House, expressing the sense of the profession that a move to the UBE was a bad idea.

The NYSBA Committee on Legal Education and Admission to the Bar (CLEAB), which had long studied the bar exam, hosted a session during the annual meeting on January 16, 2019, titled “A Question of New York Law: Should It Be Taught in Law Schools and Tested on the Bar Exam?”  A preview of that discussion, “The Role of State Law in Legal Education and Attorney Licensing,” was published in the New York Law Journal the week prior.  In April 2019, the NYSBA Task Force on the Bar Exam was appointed by President Michael Miller, “to investigate and report on the experience and impact of New York’s adoption of the UBE.”  Then-president-elect Hank Greenberg stated, “New York law has long been the gold standard in American jurisprudence. The bar exam should play an important role in ensuring that newly admitted lawyers appreciate the importance of New York law, and have an appropriate grounding in it.”

On March 5, 2020, following a series of statewide hearings on the bar exam, the Task Force, chaired by the Honorable Alan Scheinkman, Presiding Justice of the Appellate Division, Second Department, released its report and recommendations.  The report contains an accurate and detailed description of meetings, stakeholders and the decision-making process that ultimately led the New York Court of Appeals, under the leadership of former Chief Judge Jonathan Lippman, to abandon the New York Bar Exam (NYBE) in favor of the UBE; and the report starkly concludes that “…since the adoption of the UBE, the fundamental purpose of the bar examination has been lost.” (emphasis added)

Accompanied by analysis of findings and explanations to support their positions, the Task Force made the following recommendations to chart a path forward for the licensing of new attorneys in New York:

  • Eliminate the New York Law Exam and replace it with a rigorous exam on New York law as a prerequisite to admission to the New York bar.
  • Conduct an independent psychometric analysis of the grading and scaling of the UBE.
  • Allow those who do not wish to practice law in New York to take only the UBE, and allow those who wish to practice only in New York to take only the Multistate Bar Examination section of the UBE and the rigorous New York test.
  • Consider a New York law Certification program that would allow people to forego the bar exam entirely. Under this program, ABA-accredited law schools inside and outside of New York would offer courses that include New York law-based content.
  • Consider an experiential learning pilot program, which would allow second and third-year law students to spend time counseling clients, working with practicing attorneys and learning other practical skills so that a portfolio of work is created and assessed every semester.

These recommendations are welcome, especially the last two items, which get to the heart of what many thoughtful national experts have maintained are the more accurate measures of competency to practice law. While under normal circumstances degree-privilege-plus programs that incentivize curricular choices (in this case, more New York law) and require client-focused legal skills experiences are the better measures of basic competencies, the strange confluence of the timing of this report and the COVID-19 pandemic has created a fortuitous opportunity to test some of the recommendations in the report.

The fact that this Task Force was in existence and already working on bar exam issues led NYSBA President Hank Greenberg to ask the group to separately opine on the challenges surrounding the then-scheduled July 2020 uniform bar exam in New York. Greenberg has been a staunch advocate for the soon-to-be members of our profession, noting, “Graduating law school students are experiencing high levels of anxiety and distress as their lives and potential livelihoods have been significantly disrupted, and we are focused on making sure that their concerns are being heard and responded to by policymakers.”  The Task Force recommended postponing the July 2020 bar exam until early September and, if the exam is still impossible at that time, expanding practice orders to enable new graduates to begin supervised practice while waiting for a bar exam to be administered.

While the Court of Appeals under the leadership of Chief Judge Janet DiFiore has accepted the State Bar recommendations, much more needs to be done to clarify the status of the developing procedures for licensing lawyers from the Class of 2020.  Another blog dedicated to pragmatic discourse on how best to license new lawyers who are getting ready to take their first bar exam during the COVID-19 pandemic is documenting the thoughtful and reasoned ways in which many state licensing jurisdictions are rethinking the value of the traditional bar exam, albeit limited to the unique challenges presented during the COVID crisis.  Law deans and faculty, law students, and members of the profession, importantly including the leadership of the State Bar, are engaged in thoughtful dialogue on this topic with the Court of Appeals to arrive at a fair and just resolution for the Class of 2020.

Some may think it unfortunate that the NYSBA Bar Exam Task Force issued its critique of the UBE at the same time that we are experiencing an unprecedented disruptor in the practice of law and in the administration of justice. However, this is precisely the time that New York can lead the country in piloting alternative ways to license lawyers, with a reasoned roadmap prepared not under the pressure of the pandemic but after a year-long focused study that supports the concept that there are different and equally effective, if not better, ways to assess candidate competency for admission to the bar in New York.

*Patricia Salkin is Provost of the Graduate and Professional Divisions of Touro College. She is a legal educator and a past co-chair of the NYSBA Committee on Legal Education and Admission to the Bar.     

Disparate Impact Magnified: Holding a Bar Exam during the COVID-19 Pandemic Year of 2020

Yesterday the Harvard Law Review blog posted an excellent piece by a powerhouse group of legal educators who describe the prospect of a “licensing abyss” just when non-privileged folks and small businesses will need extra legal assistance to navigate the health, employment, housing and government benefits legal landscape.  On the same day, the ABA also urged states that cancel or delay the bar exam to  license law grads on an emergency basis “to help address the increase in legal needs for individuals and businesses caused by this pandemic.”

The Harvard blog authors note, in addition, the reluctance of bar examiners and courts to find alternatives to the “closed-book, two-day exam anchored in 200 multiple-choice questions” despite the option of so many good alternatives that may well better predict competence to practice law. The authors ask,

Why do our courts and bar examiners place so much faith in this high-stakes exam to predict who is competent to practice law?

This question has puzzled readers and contributors of this blog, particularly in light of the discriminatory nature of “speeded” exams and the economic call for practice-ready lawyers. It is also puzzling when the profession itself is so deficient in diversity and standardized tests are used in ways that preference the privileged.

For 2020, the issue of disparate impact with respect to timed, closed-book exams anchored in multiple-choice questions is further exacerbated by law students’ quarantine and sheltering conditions while studying for the bar exam; see the excellent piece in the NYT on how students returning home to attend classes removes the veneer that all are equal. Even more disturbing and heartbreaking is the information surfacing this week about the horrific disparate impact of COVID-19 deaths on Americans of color.  Pre-existing disparities in trauma, housing, employment, healthcare, opportunity, discrimination, and historical DNA exacerbate the distress and fatalities for communities of color and for those whose families and friends are populated by people of color.  Some of us – particularly our students of color – will be affected in disproportionate ways and in ways no one can predict or control over the course of the coming months.

As the authors of the Harvard Law Blog wrote, “Crises challenge assumptions and demand action. For this year, emergency licensing based on diplomas and periods of supervised practice would offer proof of competence.”  To do otherwise would demonstrate an inability of our profession to adapt and experiment, and a shocking refusal to recognize and correct disparate impacts.

NYS Law Students Urge Highest Court to Expeditiously Announce Alternatives to September Bar Exam

Throughout the country all of us are being asked to change, adapt and respond in unprecedented ways as we experience global pandemic, quarantine, loss, fear, empathy and grief.  New York’s situation seems at this moment most dramatic as the deaths due to the virus surpass those from September 11th.

Two days ago, on April 1st, law students from the 15 New York law schools eloquently and compellingly argued for the highest court to recognize this unprecedented moment and act accordingly in their Student Letter to Chief Judge DiFiore. In addition, the 15 deans of New York law schools co-wrote and submitted a similarly persuasive Letter from New York Law Deans to Chief Judge DiFiore.

Yesterday, April 2nd, the National Law Journal published Judith Wegner’s An Immodest Proposal.  Professor Wegner, the innovative co-author of the pathbreaking Carnegie report Educating Lawyers, calls for state bars and courts to:

  1. Recognize the factors that are impeding meaningful responses;
  2. Mitigate immediate problems through supervised practice rules;
  3. Consider adopting an emergency provisional licensing system; and
  4. Recognize that other options also deserve attention.

It is incumbent upon the New York Court of Appeals to act swiftly and with innovative measures to respond effectively to the informed voices of New York’s law students and law deans.

Best Practice Contributors Highlighted in Best Articles of 2019

A big congratulations to our very own bloggers, Jennifer Bard and Benjamin Madison, for being featured on the TaxProf Blog!

Jennifer Bard’s article, “Are the Students Failing the Bar Exam Today Canaries in the Coal Mine warning us of a More General Need to Change Legal Education?” and Benjamin Madison’s article, “New Rubrics Available to Help Law Schools that Have Adopted Learning Outcomes Related to Professional Identity Formation” were both listed as TaxProf Blog’s “Best Legal Education Articles of 2019.”

Refuting the False Trope on Clinical Courses and Bar Passage

Robert Kuehn, Washington University School of Law

It has been observed that “the fewer the facts, the stronger the opinion.” Until recently, this could be said about the possible influence of enrollment in clinical courses on a student’s likelihood of passing the bar examination. While there was a shortage of empirical studies on any possible relationship, there have been plenty of opinions on how taking those courses might be harmful, opinions often reflected in graduation restrictions on clinical courses and requirements for bar subject-matter courses.

But, there are now significantly more facts to refute those opinions. Two recent, large-scale studies have both found no relationship between the number of law clinic or externship courses or credits a law graduate took and her likelihood of passing the bar exam.

In a forthcoming article in the Journal of Legal Education, academic records of ten years of law school graduates of Washington University in St. Louis and Wayne State University were reviewed for any relationship between the number of law clinic, externship, or, more generally, experiential courses or credits and bar passage. After first accounting for the possible influence of law school grades on bar passage (the most significant predictor of bar success), the study found no correlation at either school between law clinic or externship enrollment and bar passage — no relationship between participation in a law clinic or externship and passage, none between the number of clinical courses and passage, none between the number of clinical credits and passage, and no evidence that students graduating with lower GPAs were disproportionately enrolling in those courses as a way to avoid doctrinal courses (another not uncommon trope). This lack of any relationship was in spite of increased enrollment in experiential courses at both schools over the ten-year period and decreased enrollment in courses teaching material tested on the bar (referred to as bar subject-matter courses).
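To make concrete what “after first accounting for the possible influence of law school grades” means in this kind of analysis, here is a minimal sketch in Python. It is not the authors’ code; the dataset, variable names, and coefficients are invented purely for illustration. The synthetic data are generated so that bar passage depends only on GPA, and the regression then shows what “no relationship beyond grades” looks like: a small, statistically insignificant coefficient on clinical credits.

# Illustrative sketch only: synthetic data, not the study's code or dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
law_gpa = rng.normal(3.2, 0.4, n)          # hypothetical law school GPAs
clinic_credits = rng.integers(0, 13, n)    # hypothetical clinical credits (0-12)

# By construction, passage depends on GPA alone; clinic credits have no effect.
p_pass = 1 / (1 + np.exp(-(-8.0 + 2.8 * law_gpa)))
passed_bar = rng.binomial(1, p_pass)

df = pd.DataFrame({"passed_bar": passed_bar,
                   "law_gpa": law_gpa,
                   "clinic_credits": clinic_credits})

# Logistic regression of bar passage on GPA plus clinical credits. A near-zero,
# insignificant coefficient on clinic_credits is the pattern the studies report.
model = smf.logit("passed_bar ~ law_gpa + clinic_credits", data=df).fit(disp=False)
print(model.summary())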

The article notes that nationwide data on experiential course enrollment and bar passage also belie any claim the two are related. That data indicate that as enrollment in experiential courses was increasing from 2006-2016, bar passage percentages were fairly steady and that the recent decline in passage coincided with decreased, not increased, enrollment in those courses.

A recent study commissioned by the California State Bar found a similar lack of relationship between law clinic and externship courses and bar exam performance. The study reviewed law school coursework and performance on three July exams for over 7,500 bar applicants from eleven California schools. It found no relationship between the number of academic credits from law clinic courses and exam performance, either across all schools or even when reviewing schools separately. Similarly, there was no relationship between the number of externship or internship credits and performance, again when examined across all schools or within schools. The broad range of entering credentials at the eleven schools, and lack of a relationship even within those schools, indicates that the results should be applicable to most law schools, including those with lower LSATs and undergraduate GPAs for entering students.

The study results from Washington University/Wayne State and the California State Bar are similar to smaller studies at Texas Tech and the University of Denver that also reported no statistically significant relationship between enrollment in a law clinic or externship course and bar passage.

The Washington University/Wayne State and California State Bar studies further revealed that opinions about the value of bar subject-matter courses should be moderated. There were small correlations at both schools between the number of bar subject courses and bar passage. But this result (explaining less than 5% of the variability in bar outcomes) held only for low-performing students, and additional courses showed no marginal benefit once students took the school’s average number of bar courses.
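As a purely arithmetic aside (not drawn from either study), the “variability explained” figure follows from the standard relationship between a correlation coefficient r and the share of variance it accounts for:

\[ r^2 < 0.05 \;\Longrightarrow\; |r| < \sqrt{0.05} \approx 0.22 \]

In other words, a correlation explaining less than 5% of the variance in bar outcomes corresponds to a correlation coefficient of roughly 0.22 or smaller, a magnitude conventionally described as weak.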

The California State Bar study focused on whether taking a specific course was related to performance on the bar exam topic taught in those courses. It found that neither attendance nor performance in courses covering any of the 13 bar-related topics was related to performance on the corresponding California bar exam or Multistate Bar Exam content covering that subject.

Studies at other schools also indicate that enrollment in bar subject-related courses does not support broad claims about the benefit of taking those courses.

It is time to put away the misinformed trope of participation in law clinic and externship courses harming a student’s chances of passing the bar exam and let the facts do the talking. Law schools should recognize and students should be told they can obtain valuable preparation for the practice of law by enrolling in clinical courses without affecting their likelihood of passing the bar exam.

Active Retrieval Practice: Known to Improve Learning but Underappreciated?

Exam time has arrived in law schools.  Students who want to excel on exams (and later, as attorneys) would do well to try out active retrieval practice.  To understand the value of retrieval practice, some brief discussion of well-established cognitive science is necessary.  Learning involves (1) taking knowledge into short-term working memory, and then (2) moving it from working memory to long-term memory by actively using the knowledge.[1]  In their excellent book, Teaching Law by Design, Dean Michael Hunter Schwartz, Professor Sophie Sparrow, and Professor Gerald Hess explain this process of storing learning in cognitive schema.[2]  They liken schemata to a “folder system[] provided for users of computer operating systems.”[3]  As they observe, however, storing knowledge “isn’t enough.  To analyze a problem, students must recall (“retrieve”) what they have learned and use that learning . . . .”[4] 

Research on cognition demonstrates that meaningful learning in any discipline requires the learner to perform some form of active retrieval exercise in order to use the knowledge in analyzing and solving problems.  Active retrieval methods are ways in which the learner recalls knowledge and uses the recalled knowledge to solve problems or answer questions.[5]  Recalling for mere “knowledge checks,” sometimes called rote learning, is not effective.[6]  In the law school arena, a student can recite a memorized rule yet be unable to apply it to fact patterns in a way that shows understanding.  Effective retrieval-based learning activities require the student to solve problems or to answer questions.  By doing so, the learner strengthens her understanding of, and ability to recall, the knowledge.[7]  In law school, mid-term exams require students to recall information at least once in mid-semester.  The problem is that neuroscience shows a marked forgetting curve: if learning is not retrieved within a few days of being stored, the knowledge is lost and must be relearned.[8]  Indeed, retrieving and using the knowledge are the critical parts of developing meaningful learning. 

Spurred by the adoption of ABA Standard 314, my colleagues and I have been giving mid-term exams and using a variety of interim assessments designed to have students recall information from previous classes.  I regularly give a mid-term that includes multiple-choice questions, essays, or both.  The exams are graded, and students receive a model answer.  I discuss with students the answers to the assessments and common mistakes (e.g., failing to state rules accurately, insufficient use of facts to support one’s analysis).  The exam and follow-up discussions achieve the goal of providing the “meaningful feedback” that ABA Standard 314 seeks.  The mid-term is also summative.  My experience is that many students do not take a practice, ungraded mid-term seriously.  Having the exam count, but not so much as to prevent a student from recovering from a poor exam, helps to ensure that students prepare for and spend time on the mid-term.

Another way that I have incorporated active recall practice is by using multiple-choice polling questions.  In the first class after we finish a course segment, we begin with multiple-choice questions that students answer by polling.  I ensure participation by recording the polling, by student, and including their responses (or lack thereof) as a class-participation component of the grade.  In answering the questions through polling, students must recall knowledge to analyze the question and reach a conclusion.  For example, after completing the study of removal jurisdiction and procedures, I use a series of multiple-choice polling questions that explore the many nuances of removal.  These sessions provide “meaningful feedback” to both the students and me.  If everyone misses a question, you can be sure I go back to discuss the area.  I also encourage anyone who missed a question in these polling sessions to meet with me after reviewing the topic the question addressed.

I urge students not to rely solely on the mid-term and the polling sessions to ensure they have learned the material well.  In addition, I highly recommend preparing answers to practice essays under timed conditions.  At times I provide a model answer after they practice an essay, and I also invite students to meet with me to go over their answers.  In these discussions, I almost always find some area in which a student has a mistaken understanding of a rule or concept.  If we did not uncover that misunderstanding, the student could repeatedly recall a flawed rule or approach.  Hence, I appreciate more than ever the wisdom of Standard 314’s emphasis on formative assessment.  After resolving any misunderstandings, I encourage the student to rewrite her answer.  That allows the student to revisit the topic and solidify her understanding.  Indeed, the act of writing itself helps students embed the rules and concepts more firmly in memory.[9] 

What is true for law school is also true for the Bar Exam.  Last summer Sara Berman and I created a podcast for the ABA on practicing tests (essays, Multistate Performance Tests [MPTs], etc.) as some of the most effective ways to prepare for a state’s Bar Exam.[10]  Ideally, a student learns everything she needs to know in law school; Bar review is just that: review.  More often, Bar applicants have only a vague recollection of rules and concepts from their time in law school.  In other words, Bar applicants often find themselves relearning rules and concepts.  They will relearn more effectively by practicing essay and MPT answers and by submitting those answers for grading to their Bar preparation company and to the faculty at their law school who help Bar applicants.  These exercises benefit applicants for at least two reasons.  First, they identify areas in which the applicant’s knowledge of rules and concepts is so weak that she cannot answer a question, so she knows where to focus her review.  Second, exercises such as practice essay answers require applicants to recall the rules they do know and apply them.  The more they do so, the more likely they are to remember those rules on the Bar. 

Repeated, active retrieval practice is one of the best ways to learn to perform on exams and in law practice.  Yet, despite the data showing its effectiveness, such practice is not the norm in higher education,[11] and it is likely not the norm in law schools.  Standard 314 ought to help increase active retrieval before the end of the law school semester.  At present, however, students are not spending their time as wisely as they could.  Instead of preparing detailed outlines and memorizing rules or flash cards, they would learn more from methods that require them to recall and apply legal rules and analysis.  Indeed, one might say this is one of the best kept secrets in law school.  Perhaps it is time to let the secret out.


[1] Michael Hunter Schwartz, Sophie Sparrow & Gerald Hess, Teaching Law by Design 4–7 (Carolina Academic Press 2009).

[2] Id.

[3] Id. at 5–6.

[4] Id. at 6.

[5] Jeffrey D. Karpicke, A Powerful Way to Improve Learning and Memory: Practicing Retrieval Enhances Long-Term, Meaningful Learning, American Psychological Association (2016), available at https://www.apa.org/science/about/psa/2016/06/learning-memory (last checked Dec. 6, 2019).

[6] Id.

[7] Id.

[8] Louis N. Schulze, Jr., Using Science to Build Better Learners: One School’s Successful Efforts to Raise its Bar Passage Rates in an Era of Decline, 68 J. Legal Educ. 230, 245 (Winter 2019).

[9] Bryan Goodwyn, The Magic of Writing Stuff Down, 75 Educational Leadership 78–79 (April 2019).

[10] Sara Berman & Ben Madison, Practice Makes Passing, Episode 6 of the American Bar Association’s Path to Law Student Well-Being podcast series, available at https://www.spreaker.com/show/path-to-law-student-well-being (June 22, 2019).

[11] According to Dr. Karpicke, college students likewise use rote learning methods more than they use active retrieval exercises.  See Karpicke, supra note 5.

An Overview of “A Study of the Relationship Between Law School Coursework and Bar Exam Outcomes”

Robert Kuehn, Professor of Law at Washington University School of Law in St. Louis, and David Moss, Associate Clinical Professor at Wayne State University Law School, recently conducted a large-scale study of the relationship between clinical/experiential or bar subject-matter courses and bar passage success in a paper entitled “A Study of the Relationship Between Law School Coursework and Bar Exam Outcomes.” As a law student currently enrolled in a clinic, I was immediately intrigued by this study.

This study responded to the fear that bar passage rates were down because of rising enrollment in “experiential courses” at the expense of “bar-subject courses.” Law schools began pushing students to enroll in more bar-subject courses to correct this supposed problem. However, Professor Kuehn and Professor Moss observed that there was no evidence to suggest that taking more bar-subject courses was appropriate advice for all students. Their study examined the missing evidence, looking at bar-subject and experiential coursework and bar exam outcomes over ten years at two law schools: Washington University School of Law (WashU) and Wayne State University Law School (Wayne State). Both schools require only the designated first-year courses and the upper-level writing courses mandated by ABA accreditation standards.

Previous studies performed in states such as Texas, Colorado, and California looked at the effects of coursework on bar passage rates. These studies did not support the claim that taking more bar-tested law school courses improves the chances of passing on the first attempt. Notably, a study done in Indiana concluded, “simply forcing lower-performing students to take more upper division bar-subject courses will not solve the bar examination failure problem.”

The first goal of the present study was to determine whether a graduate’s enrollment in elective experiential courses was related to first-time bar passage success. The second was to assess whether enrollment in elective courses that cover bar subjects was related to bar success.

Data were collected on law school graduates from 2006 to 2015. The following table outlines the number of graduates, along with LSAT scores and bar passage rates, at the two schools:

The next table examined graduate characteristics such as undergraduate GPA, LSAT score, 1L GPA, and final law school GPA and their correlation with bar passage:

It wasn’t until 2005 that the ABA began requiring graduates to receive professional skills instruction, with as little as one credit satisfying the requirement. In 2014, the ABA changed this to require six credits, beginning with 2019 graduates. The study authors decided to track enrollment in skills courses against bar passage over this period.

The study’s chart shows that average bar passage percentages held steady from 2006 to 2013, even as experiential course enrollment increased by over 50%. “Therefore, efforts to link declining nationwide bar passage rates to the rise in experiential course enrollment are not supported by national statistics.” A more likely contributing cause of the bar passage declines since 2014 is the weaker entering credentials of incoming 1Ls.

At WashU, the study found that while taking at least the average number of bar courses is associated with an increased likelihood of passing the bar, there was no statistically significant increase in bar passage for bottom-quartile LGPA graduates who took more than the school’s average. Results were similar for graduates in the bottom half of their class at Wayne State. For both schools, graduates in the bottom quartile who took fewer than the average number of bar courses at their school showed a significant increase in bar failure. Further, at both schools, students entering with LSAT scores lower than 150 had pass rates significantly below the school’s average.

This study concluded that the claim that the dramatic decline in bar passage rates is due to law students taking more experiential courses or fewer bar-related courses is not supported. It characterized efforts to cap experiential credits in order to improve bar passage rates as “misguided,” warning that schools should not expect that “mere exposure” to more bar courses will significantly improve bar passage rates.

See also the “Legal Skills Prof Blog” and “TaxProf Blog” for more posts on this study.

After All These Years: Another Bar Exam Over, Another Entering Class, but Still a Disconnect between the Licensing Exam and What We Need Lawyers to Be and Do

I was never a Journey fan, but I truly am astonished that after all these years of preparing lawyers for practice, and after two years of an unprecedented undermining of the rule of law in our nation, law schools still live with a disconnect between the profession’s licensing exam and what business, government, and society need lawyers to be and do, which includes protecting the rule of law. 

The National Law Journal recently discussed two major new studies that will analyze whether the current exam is the best measure of new lawyer competence.  The National Conference of Bar Examiners (NCBE) is in the midst of a three-year study to “ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in the 21st century” (hmm, “continues”? that’s a bit biased) and has already held 30 listening sessions.  

The second study, “Building a Better Bar: Capturing Minimum Competence,” is an initiative of the Institute for the Advancement of the American Legal System (IAALS) in partnership with Ohio State Law Professor Deborah Merritt, and it aspires to develop a “fair, evidence-based definition of minimum competence” to improve the current licensing process.  Funded by AccessLex, the researchers:

will be holding 60 focus groups in 12 locations around the country. While these focus group participants will primarily be new lawyers, we will also hold a number of specialized groups with supervisors. Additional specialized groups will include only women and only people of color, as well as groups in rural areas; traditional job analyses can mask the views of these lawyers, yet their perspectives are essential to create a more fully representative view of minimum competence and how to test for it effectively. Through these focus groups, we will be able to capture key information from a diversity of perspectives and provide concrete data on the definition of minimum competence that the profession can use to improve the bar exam and how lawyers are licensed.

 

Readers may remember that IAALS has provided helpful research in the past through its Foundations for Practice research, which identified the competencies that over 24,000 legal employers value in new hires (most of which go untested by the current licensing process), as well as its evaluation of graduates of the Daniel Webster Scholar Honors Program alternative to the bar exam in “Ahead of the Curve: Turning Law Students into Lawyers.”

I suppose I should be delighted that more studies are being launched. They address the exact issues so many of us have raised for decades. However, my reaction is uncharacteristically pessimistic.  (Readers here who have tolerated my enthusiastic use of exclamation points and emphasis will agree it is uncharacteristic.)  Perhaps it is the August humidity. Perhaps it is the sorrow surrounding our nation after a week of grief from senseless gun violence. But more likely, it is the fact that I am frustrated that we have already studied this to death! For example, working with state bar associations, the Foundations for Practice project already studied new lawyer competencies, with 24,000 lawyers from all 50 states participating, and found:

… the foundations that entry-level lawyers need to launch successful careers in the legal profession.

In a first-of-its-kind survey, we asked, “What makes a new lawyer successful?” More than 24,000 lawyers from all 50 states answered.

What we learned is that new lawyers need more than IQ and EQ to be successful. They also need CQ: Character Quotient. In fact, 76% of characteristics (things like integrity, work ethic, common sense, and resilience) were identified by a majority of respondents as necessary right out of law school.

Beyond character, new lawyers are successful when they come to the job with a broad blend of legal skills, professional competencies, and characteristics that comprise what we call the “whole lawyer.”

So why is the NCBE, which clearly has a stake in the outcome, refusing to respond to the results of that three-year-old study and instead promising only to do its own study? JEESH! We tweak here and there, we add more pro bono or experiential requirements, but no one truly influential will admit that our insistence on anchoring the gateway to the profession to a timed, written exam instead of clinical excellence is the problem.

Starting as early as 2008, this blog has discussed the problems with the bar exam and its role as an unhelpful, anxiety-producing, discriminatory, skewed, and unnecessarily speeded gate-keeping device.  For a sporadic history of posts between then and now, in roughly chronological order, click on the links below.

Did You Know That “Bar Courses” Don’t Matter? 

New Article: No Excuses Left for Failing to Reform Legal Education

Working with State Bar Associations on Best Practices

Bar Passage and Best Practices for Legal Education

One BAR to rule them all?

The Daniel Webster Scholar Honors Program

NYSBA Task Force on the Future of the Legal Profession Report

New Requirements for Bar Exam Stress Clinical Education

Existential Crisis and Bar Exams: what is really cruelest?

The Bar Exam Inhibits Curricular Reform

NEW YORK STATE BAR ASSOCIATION VIGOROUSLY OPPOSES PROPOSAL TO BRING UBE TO NY THIS JULY

Preparing Students for the Multistate Bar Exam

Musings on the Bar Exam and Legal Education’s Attitude toward it

Bar Exam Musings, Part II: Skillfully Changing the Bar Exam Narrative

Experts in the Legal Field Question the Bar Exam…

What’s going on in California? “TFARR- recommended” 15 credits of competency training

New York Proposes “Experiential Learning Requirements” as Condition of Licensure: CLEA and NYS Bar Committee Respond

Examining the Bar

Keeping an experiential identity in bar passage reform

Whither Clinical Courses and Bar Passage – by Prof. Robert Kuehn

DO LAW SCHOOLS ADEQUATELY PREPARE STUDENTS FOR PRACTICE? SURVEYS SAY . . . NO! – Robert Kuehn, Washington University School of Law

Professor Merritt’s Blog post on attorney discipline and bar exam WORTH A READ!

Studying Better Ways to Test Bar Applicants for Minimum Competence: Another Reason to Care about the California Bar Exam (Besides the Cut Score Debate)

Scholarship on Bar Exam Alternatives Needed

ABA Commission on Future of the Profession & ABA Vote on Bar Passage Proposal

Drafting Exams With Test-Taking Speed in Mind

Concrete Suggestions for Bar Exam Reform

We have to talk about the bar exam

What can Law Schools Learn about Bar Passage from Medical Schools’ Approach to Studying Students Who Struggle with Licensing Exams?

More Resources Re Teaching, Learning, and Bar Passage

A Fresh Look at the Uniform Bar Examination

Letters raise concerns about changes to the bar pass accreditation standard

Time to Remedy the Ills Afflicting ABA Council’s Standard 316 Proposal

Are the Students Failing the Bar Exam Today Canaries in the Coal Mine warning us of a More General Need to Change Legal Education?

Shifting the Focus of Legal Education Back to Just That: Education

How Practice Tests Reduce Anxiety in Bar Preparation and the Exam

Quite a listing, huh? I suspect that the IAALS and Merritt project will provide us with extraordinarily helpful insights into measuring minimum competence. But political clout is also needed. Will this BLOG simply be adding more posts for years to come on the unfairness and inappropriateness of a slightly modified, unnecessarily stressful, timed bar exam, a continued hazing tradition?  I hope the NCBE and other institutional influencers prove me wrong.
