The Next Generation of the Bar Exam, NCBE Style

The Testing Task Force of the National Conference of Bar Examiners (NCBE) recently completed the three phases of its “future-focused study to ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in a changing legal profession.” Earlier this month, as tumultuous events in Washington, D.C. grabbed most of the headlines, the task force quietly released an Overview of Preliminary Recommendations for the Next Generation of the Bar Exam. The recommendations are a big deal—possibly a game changer—and deserve some discussion.

I see three positive aspects of the task force’s preliminary recommendations:

First, quoting the document directly, “Our decisions were guided by the prevailing views expressed by stakeholders . . . : that the bar exam should test fewer subjects and should test less broadly and deeply within the subjects covered, [and] that greater emphasis should be placed on assessment of lawyering skills to better reflect real-world practice and the types of activities newly licensed lawyers perform . . . .” So many critics of the current bar exam, myself included, have argued exactly these positions for so long, but to read them coming from the task force directly affiliated with the NCBE is no small matter.

Indeed, the task force recommends a significant reduction in the number of legal subjects to be tested—just Civil Procedure, Contracts, Evidence, Torts, Business Associations, Constitutional Law, Criminal Law, Criminal Procedure (limited to constitutional protections), and Real Property. These reductions, if adopted, would cut from the current Multistate Essay Examination (MEE) subjects, not the Multistate Bar Exam (MBE) ones. Many will understandably quibble with the retention of all seven MBE subjects, and indeed I will do just that below. Still, one can hope that, if the task force’s recommendations are adopted by the NCBE, the bar exam of the future will in fact cover the MBE subjects “less broadly and deeply” than they are tested today.

One reason to be hopeful that the next generation of the bar exam would in fact test MBE subjects less broadly and less deeply is that the task force is recommending a fundamental shift away from the model of discrete exams like the MBE, MEE, and Multistate Performance Test (MPT). It is instead calling for an integrated exam that would test “knowledge and skills holistically, using both stand-alone questions and item sets, as well as a combination of item formats (e.g., selected-response, short-answer, and extended constructed-response items).” As always, the devil will be in the details, but this preliminary recommendation sounds at least somewhat promising.

Second, the task force rightly recognizes the need to expand the scope of the foundational skills covered on the bar exam to include more than just legal analysis and writing. Among the skills to be encompassed by the projected bar exam of the future are legal research, factual investigation and evaluation (including fact gathering), client counseling and advising, client relationship and management, and negotiation and dispute resolution. It might be hard to imagine how some of these skill sets could be tested, but the task force foresees “uniform text- or video-based scenarios that require candidates to construct a written response or select the correct response.”

Third, the task force recommends that the next generation of the exam should be “a computer-based test, administered either on candidates’ laptops in jurisdiction-managed facilities and/or at computer testing centers managed by a suitable vendor.” The pandemic forced many state bar examiners to abruptly adopt computer-based testing. It makes sense for myriad reasons to move in the direction of computer-based testing and to do so as part of a deliberate and careful process.

Having said all that, there are at least two aspects of the task force’s recommendations that strike me as problematic—or that at least present the risk of inadequate or counterproductive reform to the bar exam.

First, as mentioned earlier, there is the question of why the bar exam should continue to test legal knowledge in all seven of the MBE subject areas. Testing knowledge of such a large number of subject areas is especially problematic when considering that, by all appearances, the exam would remain closed-book. As Professor Deborah Merritt has cogently explained here, if the bar exam of the future needs to remain a closed-book exam, then the doctrinal portion of the exam should cover no more than a basic understanding of U.S. legal processes and sources of law or a single substantive subject. In arguing for an open-book exam, Merritt correctly points out that “new lawyers simply don’t work from memory. Instead, they internalize basic principles that allow them to identify issues in a client problem and find the specific rules they need to address that problem.”

Moreover, the empirical data on job activities of newly licensed lawyers, gathered during Phase 2 of the task force’s study, reveal that both experienced and newly licensed lawyers rated knowledge of Torts, Constitutional Law, Criminal Law, Criminal Procedure, and Real Property Law as only moderately important (all receiving an average rating of 2.2 or lower out of 3). They rated knowledge of all those subject areas as less important than knowledge of Statutory Interpretation Principles or Statutes of Limitations (both receiving an average rating of 2.3), neither of which is included as a stand-alone subject to be tested under the task force’s vision of the bar exam. To be sure, Civil Procedure (2.6), Contract Law (2.6), and Rules of Evidence (2.5) all received noticeably higher average ratings. Perhaps only those subject areas, or some subset of them, should be tested.

Second, on a more micro level, is the peculiar fate of one specific knowledge area: Statutory Interpretation Principles. In Phase 3 of the study, the task force proposed that Statutory Interpretation Principles not be tested as a stand-alone subject but instead be subsumed under the Constitutional Law subject area and the skills portion of the exam. The suggestion that statutory interpretation can be subsumed under Constitutional Law is dubious. Courts rarely do the kind of deep textual dive into the precise phrasing of a constitutional provision that they routinely do when confronted with a question of statutory construction. Moreover, questions on substantive knowledge of constitutional law will routinely call on knowledge of Supreme Court precedents, not on dissection of the text of a constitutional provision.

Statutory interpretation is plainly distinct from constitutional interpretation in critical ways. Hence, I trust that the skills components of the bar exam of the future will include not just cases as sources of law, but also statutory and regulatory provisions (cutting across many different substantive subject areas) which applicants have to dissect and interpret. Based on what I’ve heard from the task force, I am optimistic on this point. I discussed the extent to which statutory sources are used on the performance test, currently the only exclusively skills component of the bar exam, in an earlier post here.

In sum, much is uncertain and will take shape over time. But the early indications give some reason to be cautiously optimistic about the NCBE testing task force’s vision of a new bar exam.

Important New Report on Essential Lawyering Skills

Ohio State Professor Deborah Jones Merritt and Institute for the Advancement of the American Legal System Research Director Logan Cornett just published an important report, Building a Better Bar: The Twelve Building Blocks of Minimum Competence, based on insights from 50 focus groups.

They found that minimum competence consists of 12 interlocking “building blocks,” including the ability to interact effectively with clients, communicate as a lawyer, and see the “big picture” of client matters.

They propose 10 recommendations that courts, law schools, bar associations, bar examiners, and other stakeholders should consider in their efforts to move towards better, evidence-based lawyer licensing.

These include a requirement that candidates for licensure complete coursework that develops their ability to interact effectively with clients and negotiate.

In addition, they recommend that candidates should be required to complete coursework that focuses on the lawyer’s responsibility to promote and protect the quality of justice and includes closely supervised clinical and/or externship work.

They also recommend making greater use of performance tests and restricting the use of essay and multiple-choice exams.

Here’s a summary of the report.  You might share it with your deans and curriculum committee chairs in case they haven’t seen it.

This brief summary doesn’t do justice to the insights from the report.  I’m sure that faculty are working like crazy in this year of living dangerously, especially at this point in the semester.  You might find it worth your time to spend an hour reading this report when you feel like procrastinating.

The report doesn’t include all my radical ideas for a negotiation school, but it clearly recommends moving in that direction.

I will discuss these ideas with Peter Robinson and Sukhsimran Singh in an upcoming event, They Should Call It Negotiation School, Not Law School, sponsored by the Pepperdine Caruso School of Law Straus Institute for Dispute Resolution.  It will be on Tuesday, November 10, at 1 PT, 2 MT, 3 CT, 4 ET.  Come join the conversation.

Looking Ahead: The Performance Test on the Bar Exam Post Covid-19

Even amidst great uncertainty as to how the Covid-19 pandemic will impact future bar exams, we have already seen a few “when the dust settles” articles addressing how best to move forward with bar exams after the effects of the pandemic have waned.[i] While there are justifiable calls to rethink the exam entirely, especially in light of disparate outcomes based on race,[ii] multiple commentators have highlighted the need to enhance the exam’s focus on the performance test.[iii]

A telling indicator of the importance of the performance test appears in the Phase One Report of the National Conference of Bar Examiners’ (NCBE’s) Testing Task Force, which is conducting “a future-focused study to ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in a changing legal profession.”[iv] Among the points made most frequently by various bar exam stakeholders during listening sessions were the following: “The MPT [Multistate Performance Test] is the strength of the current exam” and “Lawyering Skills should be emphasized over subject matter knowledge.”[v] The performance test remains the only vehicle on the bar exam that directly evaluates lawyering skills rather than subject matter knowledge of—some would say rote memorization of—law, which is tested by both the Multistate Bar Exam (MBE) and essay questions. As such, it ranks high in testing validity and has great potential as an evaluator of competence.

The NCBE, which currently provides the testing instruments used on the vast majority of state bar exams,[vi] is an influential force in the bar exam world, and it approaches change at a glacial pace.[vii] Hence, there is good reason to think that, while the pandemic might lead to changes in the manner of exam administration (e.g., remote administration in lieu of in-person), the heavy majority of states will continue to use the NCBE’s three tests: the MBE, the Multistate Essay Exam (MEE), and the MPT. A closer look at the most important of those three, the MPT, and at ways in which performance testing can be improved is therefore in order. What follows is a review of the current status of the performance test on bar exams in the United States and some thoughts on enhancing performance testing moving forward:

What is the performance test, and what is its purpose?

A performance test assigns a specific written lawyering task (e.g., a memo, brief, or letter) to be completed by the examinee relying on a closed universe of provided factual and legal materials. Performance tests, unlike essay questions and MBE questions, require no advance memorization of doctrine, as they provide a Library, which includes all the needed legal authorities in the form of cases, statutes, rules, regulations, or some combination thereof. The facts are presented in the File, usually through assorted documents such as transcripts of witness interviews, correspondence, etc. The allotted time for completing a single performance test is 90 minutes.

The performance test is a creature of the ABA’s landmark 1992 MacCrate Report, which called on law schools to enhance training in assorted fundamental lawyering skills.[viii] The NCBE responded to the MacCrate Report by crafting the MPT as an instrument to evaluate six of the fundamental lawyering skills listed in the report: (1) Problem Solving, (2) Legal Analysis and Reasoning, (3) Factual Analysis, (4) Communication, (5) Organization and Management of a Legal Task, and (6) Recognizing and Resolving Ethical Dilemmas.[ix] The NCBE began offering MPT items to jurisdictions in 1997.[x]

How is the performance test currently being used on the bar exam?

The NCBE continues to produce the MPT, and 44 states administer it on their bar exam.[xi] Thirty-six of those states administer two MPT items, pursuant to their administration of the Uniform Bar Exam (UBE).[xii] Some of the remaining eight administer one MPT, and some administer two. In addition, Pennsylvania creates and administers its own performance test,[xiii] as does California.[xiv] In total then, 46 states administer at least one performance test on their bar exam. Only Florida, Louisiana, Michigan, and Virginia eschew performance testing.[xv]

Is the performance test evaluating all of the MacCrate skill sets adequately?

As I discussed in detail in a 2015 article, the performance test is not living up to its potential, as typical test items implicate mostly just the core skill set of legal analysis and reasoning.[xvi] It is far from clear that the performance test is encompassing all six of the MacCrate skill sets on a consistent basis. Most notable among the skill sets that remain underaddressed are (1) Problem Solving and (3) Factual Analysis.

Problem Solving

The MPT occasionally implicates problem solving through a test item requiring an evaluation of multiple competing approaches to a given client’s problem—for example, MPT2 on the July 2019 exam, in which examinees had to evaluate the merits of two different estate planning approaches for a mock client.[xvii] In the last several years, neither California’s nor Pennsylvania’s performance test has presented comparable test items.

Factual Analysis

As to factual analysis, because the source materials in the File include both relevant and irrelevant facts, the performance test does a good job of testing an examinee’s ability to identify relevant facts within a given set of materials. However, among the core competencies encompassed by the skill set of factual analysis are determining the need for further factual investigation and planning a factual investigation.[xviii] Performance tests have not encompassed these competencies. Indeed, in the past several years, no performance test has tasked examinees with identifying what additional facts would be useful and drafting a discovery plan for obtaining those facts. 

To be fair, given that the performance test currently plays a limited role on the exam, it is perhaps not possible to consistently encompass all of the MacCrate-inspired skill sets. The addition of more performance tests, or of multiple-choice questions within performance tests, discussed below, could help in this regard.

Does the performance test make adequate use of statutes and regulations as legal sources?

Of course, statutes and regulations are central to much of law practice today—so much so that numerous law schools are requiring first-year students to take a course on legislation and regulation. It would seem beyond dispute that all examinees on every bar exam should be required to display competence in reading and applying a statute or regulation without the aid of a case interpreting the statute or regulation at issue.  Neither MBE questions nor MEE questions (nor state-specific essay questions) directly require examinees to do this. In contrast, the performance test easily can. Unfortunately, however, statutes or regulations (or comparable types of non-case authorities) do not consistently appear on recent performance tests—to say nothing of the four states that do not administer a performance test at all. 

To the NCBE’s credit, at least one of the two MPT test items in each exam administration since 2010 has included at least one statute, rule, or regulation.[xix] It is not clear to what extent examinees on these various MPTs needed to reason directly from the statute, rule, or regulation (as opposed to reasoning from a provided case that interpreted it) to resolve an issue, but a cursory review suggests that in several cases they largely did. By way of example, one of the MPT items on the February 2015 exam included only a statute and regulations in the Library, and examinees were tasked with “parsing . . . HIPAA regulations” in order to answer effectively.[xx]

In contrast, however, since shifting to the 90-minute performance test format in July 2017, California has included only cases as authority on its performance tests through the February 2020 exam.[xxi] This reflects a major failing of the California bar exam.

The Pennsylvania examiners have done a better job, having deployed statutes as part of the law library on several different performance tests,[xxii] the first of which (February 2017) I highlighted in an earlier post. Still, though, given that there is only one performance test on each administration of the Pennsylvania exam, each Pennsylvania performance test with only cases (and there continue to be several of those[xxiii]) represents an exam that does not evaluate the fundamental competency of reading and reasoning from a statute, rule, or regulation. 

How can bar examinations make greater use of performance tests?

Ideally, all 50 states would administer at least a bare minimum of two performance test items on every exam. Though progress is being made toward that goal, we are not there yet. 

First, as noted earlier, four states do not administer a performance test at all. One can only speculate as to what reason, aside from inertia, leads the bar examiners in Florida, Louisiana, Michigan, and Virginia to forego a testing instrument of such value.

Second, as also noted earlier, several states that use the MPT outside the construct of the UBE, as well as Pennsylvania and California, administer only one performance test, not two. The use of two test items, as required for the UBE, affords an opportunity to evaluate a greater array of lawyering skills via the performance test. The use of only one relegates the performance test to playing a minimal role on the exam overall. By way of illustration, while the two MPT items in UBE jurisdictions count for 20% of the exam score, the one MPT item in Pennsylvania counts for only 11%, and the one MPT item in Nevada counts for only 10.5%.[xxiv]

Third is the broader question of expanding the use of the performance test beyond just two test items and the current 20% allocation on the UBE. As I noted in my 2015 article, this is a tougher challenge.[xxv] The MBE counts for such a large chunk of the score on the exam—50% in UBE and most other jurisdictions—because it is a psychometric anchor for the exam. In other words, the MBE has very strong testing reliability. As a multiple-choice exam that tests knowledge of various areas of substantive law, thereby calling for rote memorization, the MBE has weaker testing validity. In contrast, the performance test has high testing validity, but, because of its size and the subjectivity inherent in grading, has lower testing reliability. The NCBE prioritizes strong testing reliability for its products and hence allots greater percentages to the MBE and essay questions, 50% and 30% respectively.[xxvi]
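
To make the weighting arithmetic concrete, here is a minimal sketch assuming a simple weighted-average scoring model with hypothetical component scores on a 0–100 scale. Real bar exam scoring involves scaling and equating that this ignores; the UBE weights (50/30/20) are the allocations cited above, while the “Nevada-style” weights are hypothetical apart from the 10.5% MPT figure mentioned earlier.

```python
# Minimal sketch of a weighted-average scoring model, for illustration only.
# Actual bar exam scoring involves scaling and equating; the default weights
# below are the UBE allocations cited in the text (MBE 50%, MEE 30%, MPT 20%).

def composite_score(mbe, mee, mpt, weights=(0.50, 0.30, 0.20)):
    """Combine three component scores (hypothetical 0-100 scale) into a
    weighted composite."""
    w_mbe, w_mee, w_mpt = weights
    return w_mbe * mbe + w_mee * mee + w_mpt * mpt

# A candidate with strong MPT performance gains much less under a 10.5%
# MPT weight (as in Nevada) than under the UBE's 20%:
print(composite_score(70, 70, 95))                        # UBE weights -> 75.0
print(composite_score(70, 70, 95, (0.56, 0.335, 0.105)))  # ~72.6 (hypothetical split)
```

The point of the sketch is simply that the lower the allotted weight, the less a strong performance-test showing can move the overall score.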

There are many possible approaches that could allow for increased use of, and a greater scoring weight allotted to, performance tests, but one suggested by Jason Solomon in his recent article bears serious consideration. Solomon suggests the use of multiple-choice questions, which inherently afford greater testing reliability, within a performance test format.[xxvii] Instead of, or in addition to, writing an answer, as required on the traditional performance test, examinees would answer a series of multiple-choice questions on the materials provided in the test and on the most effective ways to resolve the issues presented. Unlike MBE questions, these questions would not require examinees to recall memorized legal doctrine, but rather to carefully review the provided factual or legal materials. Multiple-choice questions within a performance test format could also be an effective vehicle for adding legal research to the bar exam, as even within a closed-book format, examinees could be asked questions about the most effective research strategy to build on the provided materials.[xxviii]

Conclusion

In sum, improving the bar exam going forward still requires a focus on the performance test. The following goals are worth pursuing and achieving:

  • Performance testing that consistently encompasses more of the fundamental lawyering skills that the test was originally designed to encompass, including problem solving and factual analysis
  • More, and more consistent, use of statutory and regulatory authorities in the law library of performance tests
  • Use of performance testing in all 50 states, not just 46
  • At least two performance test items in all jurisdictions
  • Research into the use of multiple-choice questions within a performance test framework, including as a vehicle for testing legal research

[i] See, e.g., Vikram David Amar, What About the Bar Exam After the 2020 Dust Settles?, Verdict, Legal Analysis and Commentary from Justia (Sept. 17, 2020), https://verdict.justia.com/2020/09/17/what-about-the-bar-exam-after-the-2020-dust-settles; Jason Solomon, INSIGHT: Saving the Bar Exam By Focusing on Performance, United States Law Week (July 16, 2020), https://news.bloomberglaw.com/us-law-week/insight-saving-the-bar-exam-by-focusing-on-performance.

[ii] Several thoughtful pieces touching on racial inequities caused by the bar exam appear in volume 3, Issue 4 of AccessLex’s publication, Raising the Bar, 3-12 (Fall 2020), https://www.accesslex.org/resources/raising-the-bar-fall-2020.

[iii] Amar, supra note i; Solomon, supra note i.

[iv] Nat’l Conf. of Bar Exam’rs, Your Voice: Stakeholder Thoughts About the Bar Exam, Phase One Report of the Testing Task Force, 1 (Aug. 2019), https://testingtaskforce.org/wp-content/uploads/2020/03/FINAL-Listening-Session-Executive-Summary-with-Appendices-2.pdf.

[v] Id. at 3. 

[vi] See Adoption of the Uniform Bar Examination, with NCBE Tests Administered by Non-UBE Jurisdictions, Nat’l Conf. of Bar Exam’rs (Aug. 18, 2020), https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F196.

[vii] The NCBE is notorious for approaching change very cautiously and incrementally. In 2012, the then President of the NCBE invoked geological metaphors to describe how bar exams evolve, writing that any evolution of the exam will be “more glacial than volcanic.” Erica Moeser, President’s Page, B. Examiner, Dec. 2012 at 4, 4.

[viii] MPT Skills Tested, Nat’l Conf. of Bar Exam’rs https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F54 (last visited Oct. 20, 2020) (citing Am. Bar Ass’n Section of Legal Educ. and Admissions to the Bar, Legal Educ. and Prof. Dev. – An Educational Continuum, Report of the Task Force on Law Schools and the Profession: Narrowing the Gap 138-41 (1992) [MacCrate Report]).

[ix] Judith Gunderson, Happy Birthday, MPT!, B. Examiner, Nov. 2007, at 18. See also MPT Skills Tested, supra note viii.

[x] Id. at 20. 

[xi] Multistate Performance Test: Jurisdictions Administering the MPT, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/exams/mpt/ (last visited Oct. 20, 2020).

[xii] Adoption of the Uniform Bar Examination, with NCBE Tests Administered by Non-UBE Jurisdictions, supra note vi.

[xiii] Pa. Bd. of Law Exam’rs, Bar Examination, https://www.pabarexam.org/bar_exam_information/bebasics.htm (last visited Oct. 20, 2020).

[xiv] The State Bar of Cal., Examinations, California Bar Examination, http://www.calbar.ca.gov/Admissions/Examinations (last visited Oct. 20, 2020).

[xv] Multistate Performance Test: Jurisdictions Administering the MPT, supra note xi.

[xvi] Ben Bratman, Improving the Performance of the Performance Test: The Key to Meaningful Bar Exam Reform, 83 UMKC L. Rev. 565, 584-97 (2015).

[xvii] 2019 MPT Summaries, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F233 (last visited Oct. 20, 2020).

[xviii] MPT Skills Tested, supra note viii.

[xix] See Free Summaries of MPTs from Recent Administrations and Free MPTs and Point Sheets from Older Administrations, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/exams/mpt/preparing/ (last visited Oct. 20, 2020).

[xx] 2015 MPT Summaries, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F175 (last visited Oct. 20, 2020).

[xxi] The State Bar of Cal., Past Exams, Performance Tests and Selected Answers, https://www.calbar.ca.gov/Admissions/Examinations/California-Bar-Examination/Past-Exams (last visited Oct. 20, 2020).

[xxii] See, e.g., Pa. Bd. of Law Exam’rs, July 2019 Pennsylvania Bar Examination, Essay Questions and Examiners’ Analyses and Performance Test, 59-61, https://www.pabarexam.org/pdf/qa/qa719.pdf (last visited Oct. 20, 2020).

[xxiii] See, e.g., Pa. Bd. of Law Exam’rs, February 2020 Pennsylvania Bar Examination, Essay Questions and Examiners’ Analyses and Performance Test, 62-70, https://www.pabarexam.org/pdf/qa/qa220.pdf (last visited Oct. 20, 2020).

[xxiv] Nat’l Conf. of Bar Exam’rs, Comprehensive Guide to Bar Admission Requirements 36-37 (2020), available at https://www.ncbex.org/assets/BarAdmissionGuide/CompGuide2020_021820_Online_Final.pdf.

[xxv] Bratman, supra note xvi, at 605-10.

[xxvi] Id. at 608.

[xxvii] Solomon, supra note i.

[xxviii] If and how to test legal research on the bar exam has been on the NCBE’s radar screen since at least 2006. See Erica Moeser, President’s Page, B. Examiner, May 2006, at 4, 5.

Menstrual Products and the Bar: Advocacy Seeks to Create Equal Bar Exam Testing Conditions for Menstruators

By: Elizabeth B. Cooper, Fordham Law School; Margaret E. Johnson, U. Baltimore Law (visiting at American); and Marcy L. Karin, UDC David A. Clarke School of Law

We can all recall the stress, fear, and worry that accompany taking the bar exam.  About half of us also were anxious we would have to manage our period in the middle of this awful two-to-three-day ordeal.  Bar examiners across the country have made this prospect far more daunting than it needs to be by not treating menstruation as the natural biological process that it is.

Without proof that any test-taker has ever cheated using a tampon or pad, some states have chosen to stigmatize and potentially penalize people who have their periods with draconian policies prohibiting bar examinees from bringing their own menstrual products with them.  Other states have failed to adopt or communicate clear policies on the subject, increasing test-takers’ anxiety: one should not have to waste time researching the Bar Examiners’ hard-to-find policies or calling their offices for answers—which may, or may not, yield consistent information.

The harm here is four-fold: 1. It is wrong to make test-taking conditions more challenging for people based on the fact they menstruate; 2. It is wrong to limit test-takers to random products selected by Bar Examiners that could put test-takers’ health and menstruation management at risk; 3. It is wrong to exclude test-takers from any menstrual products simply because they do not use the women’s restroom; and 4. It is wrong to convey the harmful message that all people who menstruate are untrustworthy and do not belong in the legal profession.

Some states, including Texas and Pennsylvania, prohibited exam-takers from bringing in their own menstrual products, offering to provide a limited and unpredictable set of products in the women’s bathroom.  (After much advocacy, Texas changed its rule for the September exam, though it is unclear if this is a permanent change.)  This does not solve the problems these states created in the first place by banning test-takers from bringing in their own products.  People who menstruate need their own products because menstrual products are not “one size fits all”: menstruators require different sizes and levels of absorbency in their products to best fit their body and menstrual flow.

Use of the wrong size product can lead to everything from pain and discomfort to toxic shock syndrome (if too large) and time-consuming, uncomfortable, and disruptive leaks (if too small). Further, some individuals require hypoallergenic products to protect against allergic reactions.  If not provided, applicants may experience vaginal itching or other problems caused by using allergen-containing tampons or pads inside or adjacent to their bodies.  All of these consequences are awful enough on their own; here, they create an unconscionable risk of derailing exam performance.

In addition, by limiting test-takers from bringing in their own products and then providing products only in the women’s restrooms, Bar Examiners relegate transgender men and nonbinary persons who may menstruate, and who may use the men’s restrooms or all-gender restrooms, to having no access to menstrual products during the bar exam.

Other states allow test-takers to bring their own products, but require them to be packaged in a clear plastic bag—with some states mandating that the product be unwrapped.  This last requirement makes no sense: the wrapper both keeps the product hygienic before being inserted into or placed adjacent to one’s body and provides an efficient way to safely dispose of used products, reducing janitorial staff’s exposure to bodily fluids.  Further, removing the wrapping exposes the adhesive on the bottom of some pads, rendering them practically useless when the menstruator tries to unstick them from the clear plastic bag.

As much as we want to destigmatize menstruation and eradicate the embarrassment and taboo of being seen with a tampon or pad, it remains an invasion of privacy to require test-takers to carry their products in a clear plastic bag, revealing to a proctor (and possibly classmates, colleagues, or future opposing counsel) that one has or expects to get their period during the exam.  (One North Carolina bar exam test-taker reported that a proctor asked her if she “really needed those” while inspecting her plastic bag of menstrual products.)  Finally, this intrusion is even more painful for, and potentially outs, transgender men and non-binary law graduates who may not be public about their biological sex.  It may even set them up for bigoted harassment—during the biggest exam of their lives.

Other states allow test-takers to bring their own products and do not require them to be carried in a clear bag—but, they must check them with a proctor or retrieve them outside the exam room before heading to the restroom.  This “solution” means that a menstruating person will have to take vital time away from the exam (or a break between sections of the exam) to obtain their menstrual products before using the restroom.  This “time tax” is as unacceptable as the other approaches described above.

At least some states treat people who menstruate without such bizarre suspicion, allowing them to bring in and keep their own products with them during the exam, and use them as needed during the test—without having to ask a stranger for their own personal possessions.  To date, there have been no known accusations of test-takers trying to do the impossible: write helpful information on a pad or tampon to give them an edge on the exam or smuggle in written answers inside the product’s wrapping.

The lack of uniform, equity-based rules permitting access to one’s own menstrual products is unacceptable and must be changed. Thankfully, in the age of social media, law graduates have taken the lead on this advocacy, sharing the hurdles they face and asking state Bar Examiners to eliminate these outrageous rules, largely under the #bloodybarpocalypse hashtag.

Once we saw their posts, the three of us, working with fantastic former clinic students of Fordham and UDC, began advocating that all state Bar Examiners adopt better menstrual products policies.  We drafted a letter to the National Conference of Bar Examiners (NCBE)—co-signed by over 2800 law professors, law students, law school graduates, and lawyers in under 24 hours.  We also sent letters to Bar Examiners in each state that administered an in-person July bar exam and did not have a clear, acceptable policy in place.  All of these efforts led to some quick changes. 

The NCBE contacted state Bar Examiners and informed them that menstrual products were not included in the NCBE’s “prohibited paper” category and that test-takers should be able to bring in their own products.  The press started asking questions of the state Bar Examiners.  And state Bar Examiners began changing or clarifying their policies, with some confirming to examinees that they could bring personal menstrual products to the exam.  For instance, West Virginia Bar Examiners insisted that they permitted products in the exam room, even though their website said otherwise. Texas Bar Examiners changed their policy from not permitting products to permitting them at the September exam.  (The state has issued contradictory statements, however, about whether this change is permanent.)

This positive change is not, however, uniform: even those states that have adopted equitable policies must be monitored to ensure they are adopting best practices.  In our efforts to get accurate and honest information from state Bar Examiners across the country, it has been deeply disconcerting to learn how many jurisdictions are silent on whether examinees may bring in their own menstrual products; have informal policies that contradict written statements about what items are allowed in the exam (e.g., not listing menstrual products in the list of items test-takers can bring in, but informally allowing them); or have stubbornly held onto their recalcitrant policies.  

Equally unacceptable, many Bar Examiners will not share the documentation that they say embodies their policies (e.g., generic letters to test-takers informing them what they can and cannot bring into the exam; postings on their web sites behind a security wall).  Without this proof, there is no accountability and the true practices of these states remain unknown.   

As we reach out to jurisdictions administering in-person exams in the coming months, our demands are clear: Bar Examiners must issue explicit policies permitting examinees to bring their own menstrual products into bar exams, in an opaque container or on their person, and must publish these policies on their websites.  Other bar-related policies that can have disproportionate effects also must be changed.  For instance, examinees needing to pump their breastmilk must be given ready accommodations, and bathroom access must not be limited, as restrictions affect both pumpers and menstruators.

To learn more about all of the advocacy efforts in this area, check out Menstrual Equity and the Bar Exam: Round Up of Op-Eds and Other Media Coverage on the Feminist Law Professors blog and follow the hashtag #MPandTheBar.  If you want to get involved in this work, let us know. And no doubt other activists working on the pumping and bathroom access issues would welcome assistance too. There is, unfortunately, plenty of work to be done.

Let’s Take This Period of Unprecedented Change to Consider How Grading Practices Can Affect Issues of Diversity and Inclusion in Our Law Schools

Jennifer S. Bard, Visiting Professor, University of Florida, Levin College of Law

For the last half of spring semester 2020, law schools all over the country were forced to change their method of instruction, delivery of final exams, and (in many cases) grading practices because of the demands for physical isolation following the outbreak of Covid-19.  Now that the semester is over, there is a further round of disruption: many states have delayed or even cancelled their bar exams, some have granted graduates diploma privileges, and others bravely go ahead in the face of the possibility that they will have to cancel at the last minute because of ever-rising rates of infection.

Like the opportunities that may arise when a river is drained and a ship revealed, there may never again be such an opportunity for us to consider what role we play in the glacially slow diversification of the legal profession and how we can make our law schools more equitable, inclusive, challenging, and effective for all of our students—not just those for whom it has been particularly well suited.

With many things to choose from, my starting point for looking at things we rarely question is the marrow-deep belief that we owe it to our students to sort them for the benefit of large law firms—even when our employment profiles show that very few of our students will ever work at such a place.  Since the threshold for this opportunity is a top 5 or perhaps 10 percent class rank, it may seem odd, on reflection, that we have designed a curriculum built to compare students, one that may have many undesirable consequences, including undermining self-esteem, discouraging learning for learning’s sake, and contributing to the lack of diversity in the legal profession.

Over the years, other justifications have been added, such as the need to motivate students or assess their progress, but never have we had such a good opportunity to see what law school is like without grades or, more to the point, comparative curves.

Here are some Practices We Might Question

The Primacy of First Semester Grades

One result of the decision to go pass/fail (or some variation on those words) was to “freeze” first-year, first-semester class ranks because it was impossible to produce comparative curves.

The resulting phenomenon gives us a chance to ask ourselves some tough questions:

  1. Do First Semester Grades Reflect What Students Bring to Law School Rather Than What We Bring to Them? OR Do Students Who Come in Knowing the Rules Get Better First Semester Grades?

Many students, very often first-generation students, but also some facing discrimination based on race, gender identity, or gender expression, tell us (and the many researchers who study first-generation college students) some version of “everyone knew the game but me, and by the time I figured it out, it was too late.” And while students living with disabilities may belong to any of these groups, they are also often using new adaptive equipment and certainly facing new challenges that they may have been able to mitigate in college.

Certainly many of our students do know the game from the start.  The recent AALS survey “Before the JD” found that a disproportionate number of students who ended up going to law school had parents who were either lawyers or professionals. And while students have themselves created organizations to support each other, usually with the enthusiastic support of the law school, that may not be enough.

Our challenge going forward is that history is told by the victors.  We can see the students who were not comfortable the first semester but then continued on to graduate “at the top of their class” (a vague term that usually means somewhere in the top 20%), but we don’t hear from the ones who didn’t draw attention through academic distress yet also didn’t thrive.

It would be helpful to know more–and many schools do know more about their own students.  But so little of this information is published.

Much is being done in supplemental programs (to name them is to leave many out), such as pre-first-semester programs, orientation programs, and excellent pre-law institutes like the Tennessee Institute for Pre-Law, and in wonderful conferences organized by the National Black Law Students Association, Lavender Law, the National Association of Law Students with Disabilities, and so many others.

But how much more effective would it be to have a curriculum that was truly equitable and inclusive – all the way through?

2. Did Pass/Fail Grading Help Learning, Hinder Learning, or None of the Above?

Across-the-board pass/fail grading that makes no effort to compare students to each other is so unusual as to make any observations worth considering. The expectation was a distressing list of bad results: students putting in less effort during class, performing worse on exams. But did that really happen?

3. Ditto Open Book Exams

As above, it would be interesting to test, in the fall, the content knowledge of students who took open-book exams.  Not so much to compare them with past classes, but to see how much they learned.

4. What Will Be the Long Term Effect of the Delayed or Cancelled Bar Exams–and How Might that Change Our Curriculums?

The opportunity presented by the necessary changes to the bar exam is already in very good hands (thank you, AccessLex), but it’s still worth considering what the future will look like in states that choose provisional or full licensure.  Even decisions to delay the bar exam could raise issues of an ongoing, career-long licensing process, much as many doctors (but not all) must take requalifying exams every ten years to retain their “Board Certificate.” What would that mean for law schools?

To Be Continued: Part II: What Can We Learn from the Delay of Fall On-Campus Interviewing?   

Disparate Impact Magnified: Holding a Bar Exam during the COVID-19 Pandemic Year of 2020

Yesterday the Harvard Law Review blog posted an excellent piece by a powerhouse group of legal educators who describe the prospect of a “licensing abyss” just when non-privileged folks and small businesses will need extra legal assistance to navigate the health, employment, housing, and government benefits legal landscape.  On the same day, the ABA also urged states that cancel or delay the bar exam to license law grads on an emergency basis “to help address the increase in legal needs for individuals and businesses caused by this pandemic.”

The Harvard blog authors note, in addition, the reluctance of bar examiners and courts to find alternatives to the “closed-book, two-day exam anchored in 200 multiple-choice questions” despite the availability of so many good alternatives that may well better predict competence to practice law. The authors ask,

Why do our courts and bar examiners place so much faith in this high-stakes exam to predict who is competent to practice law?

This question has puzzled readers of and contributors to this blog, particularly in light of the discriminatory nature of “speeded” exams and the economic call for practice-ready lawyers. It is also puzzling when the profession itself is so deficient in diversity and standardized tests are used in ways that favor the privileged.

For 2020, the issue of disparate impact with respect to timed, closed-book exams anchored in multiple-choice questions is further exacerbated by law students’ quarantine and sheltering conditions while studying for the bar exam; see the excellent piece in the NYT on how students returning home to attend classes removes the veneer that all are equal. Even more disturbing and heartbreaking is the information surfacing this week about the horrific disparate impact of COVID-19 deaths on Americans of color.  Pre-existing disparities in trauma, housing, employment, healthcare, opportunity, discrimination, and historical DNA exacerbate the distress and fatalities for communities of color and for those whose families and friends are populated by people of color.  Some of us – particularly our students of color – will be affected in disproportionate ways and in ways no one can predict or control over the course of the coming months.

As the authors of the Harvard Law Blog wrote, “Crises challenge assumptions and demand action. For this year, emergency licensing based on diplomas and periods of supervised practice would offer proof of competence.”  To do otherwise would demonstrate an inability of our profession to adapt and experiment, and a shocking refusal to recognize and correct disparate impacts.

NYS Law Students Urge Highest Court to Expeditiously Announce Alternatives to September Bar Exam

Throughout the country all of us are being asked to change, adapt and respond in unprecedented ways as we experience global pandemic, quarantine, loss, fear, empathy and grief.  New York’s situation seems at this moment most dramatic as the deaths due to the virus surpass those from September 11th.

Two days ago, on April 1st, law students from the 15 New York law schools eloquently and compellingly argued for the highest court to recognize this unprecedented moment and act accordingly in their Student Letter to Chief Judge DiFiore. In addition, the 15 deans of New York law schools co-wrote and submitted a similarly persuasive Letter from New York Law Deans to Chief Judge DiFiore.

Yesterday, April 2nd, the National Law Journal published Judith Wegner’s An Immodest Proposal. Professor Wegner, the innovative co-author of the pathbreaking Carnegie report Educating Lawyers, calls for state bars and courts to:

  1. Recognize the factors that are impeding meaningful responses;
  2. Mitigate immediate problems through supervised practice rules;
  3. Consider adopting an emergency provisional licensing system; and
  4. Recognize that other options also deserve attention.

It is incumbent upon the New York Court of Appeals to act swiftly and with innovative measures to respond effectively to the informed voices of New York’s law students and law deans.

Is Mandatory P/F An Opportunity to More Accurately Assess Competency to Practice Law and For Bar Admission?

As our knowledge of COVID-19 and its impact becomes more extensive each day, each workplace, profession, and community is facing some common and some unique questions. Those working on the front lines in hospitals – such as several of my relatives in NYC and NJ – are experiencing the kind of trauma, shortages, emotional overload, and duress that is usually experienced in wartime. It can only be weakly imagined by the rest of us.  For those of us not experiencing people suffering and dying in front of us on a daily basis, some less horrific choices are before us: How do we modify “business as usual”?  How do we evolve and adapt with each day’s new tsunami of information and data?  How do we support our best selves and our core values in this historically momentous time on our shared planet?

Before turning to the topic of grading and assessment, I want to pause to give a shout-out to my home institution. Our multi-talented leader, Dean Alicia Ouellette, has been holding community town halls every day since Friday, March 20th. (BTW, Dean Ouellette just shared on Facebook that she had been suffering from “presumptive COVID-19” fever and symptoms but thankfully is now symptom free.) During our daily town halls, my faculty colleagues and I have expressed our wonder and gratitude for the character, resilience, and grit of our law students, who are balancing so much right now and facing so many financial, tech-related, health, and extended-family burdens. Our students’ engaged and forgiving response to “tech-curious but not necessarily tech-savvy” teachers and their community-minded empathy for those hardest hit keep the faculty motivated and inspired.

One of the COVID-19 decisions for legal educators involves whether and how we assess and sort — which in reductive vernacular means “grade and rank.”  Maintaining appropriate expectations, options, rigor, and excellence in law teaching may assume primacy for those who have long been focused on ensuring that law students receive real value for the time, talent, and treasure they expend on law school.  For others focused on fairness in law placement, transparent employer signals about how they will view Spring 2020 legal education may be most influential.  For those concerned about our profession’s reputation for lack of wellness and lack of diversity, those concerns are elevated at this moment, when those least advantaged are most hard pressed.  For those struggling with equity, there are so many permutations and consequences of COVID-19 – whichever choice a school makes – that voting faculty could become as immobilized as Chidi Anagonye on THE GOOD PLACE. (BTW, good idea for escape television for those who love philosophy or Kristen Bell.)

On the other hand, might this be a moment to look for the opportunities for reform and improvement that come only when the status quo is disturbed and rocked to its foundations, as is happening now?  Here is what I am thinking:

Might Mandatory P/F force educators and employers to admit that traditional law school grading and ranking is a misleading and reductive proxy for measuring potential success as a lawyer?

Could it force employers to use other ways to learn about the WHOLE STUDENT with all her strengths, gaps, and individual aptitudes including the situation she faced during law school?

Might it accelerate a move to a more qualitative than quantitative assessment of each law student? Or, at least, might it prioritize learning that enables a school to assemble a portfolio of student recommendations (demonstrations of knowledge, skills, aptitudes, and professionalism)?

Foundational resources include of course Educating Lawyers, Best Practices in Legal Education, and Building on Best Practices: Transforming Legal Education in a Changing World, which also provide helpful wisdom points. In addition, looking back through the dozen or so years of this blog’s existence, there are lessons from which we can pull core knowledge and core values to assist in our continued educational deliberations at this turbulent time. 

CORE KNOWLEDGE AND REFLECTIONS

Valuing Legal Education over Sorting – For example, focus on the difference between assessment and grading.  Educating Tomorrow’s Lawyers conferences have brought employers, law schools, and legal education stakeholders together to tackle the disconnect between our current sorting systems (primarily used to help elite employers looking for a simple and reductive initial screening system) and the needs of society and most employers for competent new attorneys and the needs of students and the profession for fairness.

Focus instead on formative and evaluative assessment of law students and graduates

Focus on growth mindset, on reflection and learning from mistakes or experience

Recognize the limits and problems with GPAs or LSAT scores in order to create a more competent profession with more able and diverse learners.

Acknowledge that the media and the academy are still stuck in a mindset that focuses on sorting methods rather than on better preparation and assessment of law students to serve clients and society.

Class rank does not predict who will become a competent, healthy, and ethical lawyer.

Effective education includes:

CORE LEARNING VALUES

Growth Mindset 

Inclusion and Diversity

Student-centered Learning and the Introduction to the original Best Practices – “One of our basic tenets is that law schools should become more student-centered”

Wellness  

Collaboration and Innovation

Integrity 

Character 

Justice

Situational Excellence

There is a common theme here: P/F with alternative assessment information and measures should be seen not as temporary emergency expedients to “sort and rank,” but rather as long overdue components of a better educational program and a more nuanced assessment paradigm.

I would love to hear your thoughts in the comments below.  I wish all our readers and citizens of our little blue planet moments of peace, love, safety, and compassion. May someone be kind to you today and let’s pay it forward.


After All These Years: Another Bar Exam Over, Another Entering Class, but Still a Disconnect between the Licensing Exam and What We Need Lawyers to Be and Do

I was never a Journey fan, but I truly am astonished that after all these years of preparing lawyers for practice, and after two years of an unprecedented undermining of the rule of law in our nation, law schools still live with a disconnect between the profession’s licensing exam and what business, government, and society need lawyers to be and do, which includes protecting the rule of law.

The National Law Journal recently discussed two new major studies that will analyze whether the current exam is the best measure of new lawyer competence.  The National Conference of Bar Examiners (NCBE) is in the midst of a three-year study to “ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in the 21st century” (hmm, “continues”? that’s a bit biased) and has already held 30 listening sessions.

The second study, “Building a Better Bar: Capturing Minimum Competence,” is an initiative of the Institute for the Advancement of the American Legal System in partnership with Ohio State law professor Deborah Merritt, and it aspires to develop a “fair, evidence-based definition of minimum competence” to improve the current licensing process.  Funded by AccessLex, the researchers:

will be holding 60 focus groups in 12 locations around the country. While these focus group participants will primarily be new lawyers, we will also hold a number of specialized groups with supervisors. Additional specialized groups will include only women and only people of color, as well as groups in rural areas; traditional job analyses can mask the views of these lawyers, yet their perspectives are essential to create a more fully representative view of minimum competence and how to test for it effectively. Through these focus groups, we will be able to capture key information from a diversity of perspectives and provide concrete data on the definition of minimum competence that the profession can use to improve the bar exam and how lawyers are licensed.


Readers may remember that IAALS has provided helpful research in the past through its Foundations for Practice research, which identified the competencies over 24,000 legal employers value in new hires (most of which go untested by the current licensing process), as well as its evaluation of the graduates of the Daniel Webster Scholar Honors Program alternative to the bar exam in “Ahead of the Curve: Turning Law Students into Lawyers.”

I suppose I should be delighted that more studies are being launched. They are addressing the exact issues so many of us have raised for decades. However, my reaction is uncharacteristically pessimistic.  (Readers here who have tolerated my enthusiastic use of exclamation points and emphasis will agree it is uncharacteristic.)  Perhaps it is the August humidity. Perhaps it is the sorrow surrounding our nation after a week of grief from senseless gun violence. But more likely, it is the fact that I am feeling frustrated that we have already studied this to death! For example, working with state bar associations, the Foundations for Practice project already studied new lawyer competencies, with 24,000 lawyers from all 50 states participating, and found

… the foundations that entry-level lawyers need to launch successful careers in the legal profession.

In a first-of-its-kind survey, we asked, “What makes a new lawyer successful?” More than 24,000 lawyers from all 50 states answered.

What we learned is that new lawyers need more than IQ and EQ to be successful. They also need CQ: Character Quotient. In fact, 76% of characteristics (things like integrity, work ethic, common sense, and resilience) were identified by a majority of respondents as necessary right out of law school.

Beyond character, new lawyers are successful when they come to the job with a broad blend of legal skills, professional competencies, and characteristics that comprise what we call the “whole lawyer.”

So why is the NCBE, which clearly has a stake in the outcome, refusing to respond to the findings of that three-year-old study and instead promising only to do its own? JEESH! We tweak here and there, we add more pro bono or experiential requirements, but no one truly influential will admit that our insistence on anchoring the gateway to the profession to a timed, written exam instead of clinical excellence is the problem.

Starting as early as 2008, this blog has discussed the problems with the bar exam and its role as an unhelpful, anxiety-producing, discriminatory, skewed, and unnecessarily speeded gate-keeping device. For a sporadic history of posts between then and now, in fairly chronological order, click on the links below.

Did You Know That “Bar Courses” Don’t Matter? 

New Article: No Excuses Left for Failing to Reform Legal Education

Working with State Bar Associations on Best Practices

Bar Passage and Best Practices for Legal Education

One BAR to rule them all?

The Daniel Webster Scholar Honors Program

NYSBA Task Force on the Future of the Legal Profession Report

New Requirements for Bar Exam Stress Clinical Education

Existential Crisis and Bar Exams: what is really cruelest?

The Bar Exam Inhibits Curricular Reform

NEW YORK STATE BAR ASSOCIATION VIGOROUSLY OPPOSES PROPOSAL TO BRING UBE TO NY THIS JULY

Preparing Students for the Multistate Bar Exam

Musings on the Bar Exam and Legal Education’s Attitude toward it

Bar Exam Musings, Part II: Skillfully Changing the Bar Exam Narrative

Experts in the Legal Field Question the Bar Exam…

What’s going on in California? “TFARR- recommended” 15 credits of competency training

New York Proposes “Experiential Learning Requirements” as Condition of Licensure: CLEA and NYS Bar Committee Respond

Examining the Bar

Keeping an experiential identity in bar passage reform

Whither Clinical Courses and Bar Passage – by Prof. Robert Kuehn

DO LAW SCHOOLS ADEQUATELY PREPARE STUDENTS FOR PRACTICE? SURVEYS SAY . . . NO! – Robert Kuehn, Washington University School of Law

Professor Merritt’s Blog post on attorney discipline and bar exam WORTH A READ!

Studying Better Ways to Test Bar Applicants for Minimum Competence: Another Reason to Care about the California Bar Exam (Besides the Cut Score Debate)

Scholarship on Bar Exam Alternatives Needed

ABA Commission on Future of the Profession & ABA Vote on Bar Passage Proposal

Drafting Exams With Test-Taking Speed in Mind

Concrete Suggestions for Bar Exam Reform

We have to talk about the bar exam

What can Law Schools Learn about Bar Passage from Medical Schools’ Approach to Studying Students Who Struggle with Licensing Exams?

More Resources Re Teaching, Learning, and Bar Passage

A Fresh Look at the Uniform Bar Examination

Letters raise concerns about changes to the bar pass accreditation standard

Time to Remedy the Ills Afflicting ABA Council’s Standard 316 Proposal

Are the Students Failing the Bar Exam Today Canaries in the Coal Mine warning us of a More General Need to Change Legal Education?

Shifting the Focus of Legal Education Back to Just That: Education

How Practice Tests Reduce Anxiety in Bar Preparation and the Exam

Quite a listing, huh? I suspect that the IAALS and Merritt project will provide us with extraordinarily helpful insights into measuring minimum competence. But political clout is also needed. Will this blog simply be adding more posts for years to come on the unfairness and inappropriateness of a slightly modified, unnecessarily stressful, timed bar exam — a continued hazing tradition? I hope the NCBE and other institutional influencers prove me wrong.

How Practice Tests Reduce Anxiety in Bar Preparation and the Exam

Sara Berman and I recently did a podcast in the ABA’s Path to Law Student Well-Being podcast series. See https://www.spreaker.com/show/path-to-law-student-well-being. Anyone associated with helping applicants prepare for the bar exam knows that the challenges they face can affect their well-being. In the podcast, we share our experience that applicants who practice tests regularly learn not only content and skills, but also the ability to manage anxiety as they get closer to and take the exam.

In bar preparation, students take seemingly endless sets of multiple-choice Multistate Bar Exam questions. In addition, their bar preparation companies provide opportunities to practice essays and Multistate Performance Tests (MPTs). Applicants need to follow the bar company’s suggestions and to get feedback on submitted work. They should welcome critiques and suggestions, assess their own strengths and weaknesses, and improve by building on strengths and addressing weaknesses. If allocation of time to different study methods is an issue (and it always is), applicants need to do more practice testing, not less, relative to reading and re-reading outlines, flash cards, and the like. Cognitive science indicates that people retain information better when they learn a concept by applying it in a problem-based approach. See Dani Brecher Cook & Kevin Michael Klipfel, How Do Our Students Learn? An Outline of a Cognitive Psychological Model for Information Literacy Instruction, 55 Reference & User Services Quarterly 34 (Fall 2015). In studying legal rules, applicants retain more by doing practice essays or MPTs because they are learning and applying rules in the context of the facts that trigger them.

This message may not be welcomed by an applicant who, when she writes a practice essay or MPT answer, experiences anxiety during the practice test. Applicants often do not want to face the reality that they do not know a rule. They falsely believe that they must have mastered a subject area before doing practice tests. The podcast encourages applicants to practice essays and MPTs regularly and often, even if they do not feel that they have fully mastered a subject. Using the open-book method for practicing can help someone get the process going. The key is to break through the resistance to doing practice tests.

If applicants get past the reluctance to embrace practice tests, they can experience reduced anxiety as they move forward. Again, the context of our recommendation presumes that an applicant is receiving feedback from a bar company representative, an academic support advisor, or both. When applicants respond to feedback in new practice test answers and see their work product improving, that reduces their anxiety. The anxiety does not go away but remains at a reduced level—a level at which it can motivate performance rather than interfere with it. At such a point, it is fair to say that an applicant is managing anxiety.

In the podcast, Ms. Berman implored law students who might be listening to apply these principles in law school. Practicing tests—whether essays, multiple-choice, or other formats—will benefit a student. The student of course needs to seek feedback, recognize areas in which she can improve, and work toward that goal. Students I have seen take such an approach report (1) less anxiety on graded tests and (2) a belief that they performed more effectively. Although the days of a class hinging on one grade at the end of the semester seem to be fading, the final exam still forms a major part of a student’s assessment in many courses. Of course, ABA Standard 314 encourages formative and summative assessment, so students are receiving meaningful feedback. By doing practice tests, such as writing an answer to a potential essay, the student can apply what she has learned from feedback and seek more.

An excellent article concluded that practice tests may improve student performance. See Andrea A. Curcio, Gregory Todd Jones & Tanya M. Washington, Does Practice Make Perfect? An Empirical Examination of the Impact of Practice Essays, 35 Fla. St. L. Rev. 271 (Winter 2008). The inquiry in our podcast is different: we ask whether practice tests allow students to manage anxiety. We entitled our podcast “Practice Makes Passing” to counter the view that applicants must be perfect or must have completely mastered most subjects. Applicants need to do their best. However, they will increase their chance of passing by recognizing that practice may well be what gets them to “good enough”—i.e., a passing score.

The ABA’s series on student well-being is an important look at a problem once viewed solely as an attorney well-being problem. Many now accept that law school is an environment that can diminish or enhance student well-being, depending on choices by the school and by the students. By learning to manage anxiety through practice tests, law students can choose to improve their well-being. Bar applicants can do the same. By spending their time wisely in bar preparation, including a healthy dose of practice tests, applicants will ultimately experience less anxiety and likely perform more effectively.

Are the Students Failing the Bar Exam Today Canaries in the Coal Mine warning us of a More General Need to Change Legal Education?

Thank you so much to Best Practices for Legal Education for inviting me to blog again and to Elizabeth Murad for her remarkable work in keeping contributors in touch and on track. So much is written about the very real decline in bar passage that it is easy for schools with high pass rates (or at least high in relation to other schools in their state) to ignore the need to change what goes on in the classroom and to dismiss the excellent work being done in effective law teaching as a problem for “lesser schools” in “lower tiers.”

We know, as legal educators, members of the bar, and even members of the public, that bar passage rates have been falling. And we also know that many, if not most, law schools are admitting students today with LSAT scores lower than those they admitted ten years ago. So it’s easy to see a correlation between lower scores and falling rates. After all, the bar exam is a test much like the LSAT—why wouldn’t there be a relationship? But even if students are failing the bar exam for the same reasons they are getting low LSAT scores, we still have the opportunity to intervene in ways that we know raise pass rates. This blog contains so many resources for those who want to teach more effectively. Why wouldn’t we want this for all our students?

Everyone at a school with a “bar passage problem” is well aware that we cannot continue to do the same things we always have when they are no longer working the way they used to. But we hear this less at schools satisfied with their bar passage rates. Perhaps the students who are failing are really canaries in the coal mine, warning all of legal education that all of today’s law students find it more difficult to translate their legal education into the very peculiar format required for bar passage, regardless of LSAT score. Everyone who has ever studied for the bar exam remembers it as a grueling, unpleasant, and highly intensive process—but until very recently that process started after graduation and, barring personal disaster, almost always resulted in passage. Even when it didn’t, the consequences of failure were lower. Today, students safely employed in September find themselves fired if October brings news of failure. We need to consider bar passage as an issue both for students who fail and for those who pass—after all, both groups spend the same three years in law school.

Anecdotal evidence (which we could easily replace with actual data by doing some surveys) suggests that bar passage anxiety spreads well beyond those students most at risk. All students know that the stakes are high, and many believe that their chances of passing are lower than those of students in the past. Does that affect their choices while in law school? Could they be doing more to prepare for their future careers if we could provide them more effective instruction?

Medical students and educators are expressing the same kinds of concerns about their curriculum being shaped by a test as we should be about ours. We can’t easily change the bar exam—but we can adopt more direct methods of instruction that not only support bar passage but also create time for the more complex, less exam-focused thinking that we want to be going on in class.

I hope over the week to share resources that will encourage everyone to consider how studying for a very old-fashioned test is negatively shaping the education of all of today’s law students. (And, because it always warrants reposting, here is a recently revised article by Louis Schulze on what they have done at FIU to apply the “science of learning” across the curriculum in support of higher bar passage.)


A Fresh Look at the Uniform Bar Examination

The bar exam is back in the news. Later this month the ABA House of Delegates will consider a proposal to raise the bar-passage standard. The proposal would require that 75% of an accredited law school’s graduates pass a bar exam within two years of graduation.

In most states, “bar exam” means the National Conference of Bar Examiners’ Uniform Bar Exam. The UBE has been adopted by 35 jurisdictions and is under consideration in others. It carries the endorsement of many ABA-related entities, including the Conference of Chief Justices, the Law Student and Law Practice Divisions, the Section of Legal Education and Admissions to the Bar, and the House of Delegates.

As a reminder of the features of the UBE, and an update on the pace of its adoption, I have accepted Mary Lynch’s gracious invitation to reblog my recent post “Will the Uniform Bar Exam Come to Michigan?” from the Western Michigan University-Cooley Law School blog. (Spoiler alert: no time soon.)

More Resources Re Teaching, Learning, and Bar Passage

Thank you to the Best Practices for Legal Education Blog for having me as a blogger this week. I hope the examples I’ve provided about the methods medical schools use to evaluate their curricula, test the effects of new programs, and look for factors that affect success on licensing exams have been useful. As I mentioned at the end of my last post, the most comprehensive source for research-based information about bar passage programs, as well as a source of funding for such studies, is AccessLex. There is a growing literature of articles from schools that have implemented successful bar passage programs. Here’s an article by Louis Schulze about his work at FIU.

You might also be interested in a series of articles from back in 2009-2010, when those at the front lines of legal education (first-year faculty and legal writing and research faculty) began to see significant differences in performance between the students they were teaching and those of the past. These articles provide information about how substantial changes to the K-college education system in the U.S. affect law students’ transition to law school. This article by Rebecca Flanagan is a good overview. Another by Prof. Patricia Grande is here. A literature review of law learning strategies by Profs. Jennifer M. Cooper and Regan A.R. Gurung. One more by Profs. Susan Stuart and Ruth Vance.

Here are the proceedings of a 2014 symposium entitled “Teaching the Academically Underprepared Law Student,” and I invite readers to take advantage of the comments section of this blog to share other publications—including the many more recent ones. My point here is historical, not bibliographical. And here, as a quick reminder of one of the crucial skills the bar doesn’t test (research), is an article by Caroline L. Osborne.

Finally, something I’ve mentioned elsewhere: the new collaboration between LSAC and Khan Academy, providing free, online, very high-quality LSAT preparation, may have something to offer law students. The skills underlying LSAT performance, close reading and legal reasoning, are not immutable—students can get better at them after enrolling in law school and may find some time with these materials a helpful and interesting way to brush up on them.


What can Law Schools Learn about Bar Passage from Medical Schools’ Approach to Studying Students Who Struggle with Licensing Exams?

It’s not unusual for a provost or a colleague or a relative at Thanksgiving to ask a legal academic why law students have so much trouble passing the bar exam when the pass rates for medical students are usually in the high 90s. The short answer to that question is that the two processes are completely different—and there’s no obvious trick, technique, or intervention that could convert our bar passage rates into their licensure passage rates. For one thing, it’s the wrong question. “Passing” the medical licensing exams is certainly important, but unlike the “all or nothing” process of passing the bar exam, the score achieved on Step 1 affects medical students’ entire career path. But there is a lot to learn from the methods that medical schools use in studying the very few students who have trouble, as well as from how they evaluate the effect of changes to their curricula on licensing exam scores.

Quick recap on professional licensing: future doctors take a series of three exams over the first six years of their undergraduate medical education and the start of their residency (more links in a post I wrote earlier this year here). The exams are almost entirely national, although the actual process of being licensed is conducted on a state-by-state basis. Law students take a licensing exam in the state where they intend to practice upon graduation. For purposes of this post, the closest analogy to the bar exam is the more academic Step 1, which students take during their second year of medical school. Like our NCBE, the National Board of Medical Examiners, which produces the United States Medical Licensing Examination (USMLE), works with medical licensing boards and depends on their confidence. It issues annual reports.

The focus of this post is on the methods that medical schools use to study the small number of their students who do have trouble passing the licensing exams, as well as the factors that can affect the scores students achieve. I’ve tried to focus on articles outside of paywalls, and would certainly encourage you to conduct your own searches in the various databases to which you have access. There are several journals devoted directly to studying medical education—although these articles can pop up anywhere.

Medical educators use a wide range of research techniques to learn more about students who struggle with licensure exams. Like us, medical schools would prefer that students pass the first time, and many articles like this one look for characteristics of students who fail the first time but eventually pass. Others look for characteristics of students at risk for failure (here and here) or even examine what students think of the exam. Another area of inquiry involves the role stress plays in the score students achieve. In partnership with social scientists at our schools or in our communities, we too could be conducting studies to help us learn more about students who face difficulty passing the bar exam. These studies can be part of graduate student work or may even be funded by groups like AccessLex, which is making money available to study bar passage.


The actual reason the medical school pass rates are so high, though, may not be all that helpful.

It’s not just because they are able to limit admission to students who have already demonstrated an ability to score very highly on the MCAT, a test that is much more similar to Step 1 than the bar exam is to the LSAT. Indeed, medical schools have direct input into both the MCAT and the licensing exams—so when one changes, the other can too. And it’s not clear that anything in the curriculum makes a difference at all: the industry offering study aids and licensure prep courses dwarfs the bar prep and study aid market, to the point where students often start studying for the licensing exams before the first day of medical school.

But if it is the curriculum, it’s important to remember the vast difference in time scale between medical and legal education. We have students for three years post-B.A. Medical schools in the U.S. plan their curricula based on eight-plus years of increasingly specialized medical education. They are therefore comfortable holding off on the direct teaching of practice skills for the first two years while they align their curriculum with the content of the Step 1 exam.

Even Step 1, though, is far more focused on practice than on the knowledge accumulation or deliberately confusing question formulations that characterize the bar exam. Step 2, the second round of licensing exams taken before graduation from medical school, goes past paper and pencil in that it actually tests students’ ability to conduct exams and exercise medical judgment. Another reason for the high pass rate is that most medical schools have stopped developing their own tests and instead use assessment instruments (shelf exams) provided by the same organization that produces the licensing exam. Sure, there is grumbling and criticism about the content and timing of the licensing exams, but medical schools work hard to make sure that their curricula are aligned with the content of the exams. Finally, medical education is extremely self-reflective: educators are constantly aware of the risks that come from confusing correlation and causation. How do you know that a change in one part of the curriculum is the cause of a change in test scores? You run Pearson correlations followed by stepwise linear regressions. Seeing is not believing when it comes to identifying factors that affect performance on licensure exams. Look here, here, here, and here for studies evaluating curriculum changes. They take nothing for granted: does attendance make a difference? Does flipping classrooms really work? Does reducing the number of hours spent in the anatomy lab reduce USMLE scores?
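For readers curious what that kind of analysis looks like in practice, here is a minimal sketch in Python. It uses simulated data, and every variable name and number is hypothetical; the point is only to illustrate the two-step approach just described: screen candidate curriculum variables with Pearson correlations, then run a forward stepwise linear regression to see which variables actually predict exam scores.

```python
"""A minimal sketch, using simulated data, of the two-step analysis
described above: Pearson correlations to screen candidate curriculum
variables, then a forward stepwise linear regression on exam scores.
All column names and numbers here are hypothetical."""
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(seed=0)
n = 200  # hypothetical number of students

# Simulated student records: one row per student.
df = pd.DataFrame({
    "anatomy_lab_hours": rng.normal(100, 15, n),
    "attendance_pct": rng.uniform(50, 100, n),
    "flipped_class_sessions": rng.integers(0, 30, n).astype(float),
})
# Simulated exam scores, loosely driven by two of the three variables.
df["exam_score"] = (
    200
    + 0.2 * df["anatomy_lab_hours"]
    + 0.3 * df["attendance_pct"]
    + rng.normal(0, 10, n)
)

predictors = ["anatomy_lab_hours", "attendance_pct", "flipped_class_sessions"]

# Step 1: Pearson correlation of each curriculum variable with scores.
for col in predictors:
    r, p = stats.pearsonr(df[col], df["exam_score"])
    print(f"{col}: r = {r:.2f}, p = {p:.4f}")

# Step 2: forward stepwise selection. At each round, add the candidate
# whose coefficient is most significant; stop when none clears p < .05.
selected = []
remaining = predictors.copy()
while remaining:
    pvals = {}
    for col in remaining:
        X = sm.add_constant(df[selected + [col]])
        pvals[col] = sm.OLS(df["exam_score"], X).fit().pvalues[col]
    best = min(pvals, key=pvals.get)
    if pvals[best] >= 0.05:
        break
    selected.append(best)
    remaining.remove(best)

print("Variables retained by stepwise regression:", selected)
```

A real study would of course use actual student records and worry about confounders; the sketch just shows why the regression step matters: a variable can correlate with scores yet drop out of the model once stronger predictors are included.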

Another standard practice in medical schools is curriculum mapping—an essential first step for any school that wants to understand what it is teaching, let alone make changes. Like all maps, curriculum maps are DESCRIPTIVE, not PRESCRIPTIVE. Here is Harvard’s curriculum map, but you can find examples on the home page of just about every U.S. medical school. This is an article walking through how to map a curriculum.

So what’s helpful to us isn’t so much what medical schools are doing, but how they are evaluating themselves. 

To recap: neither I nor anyone else who has ever practiced law thinks it would be a good idea to emulate medical schools by fully aligning our curriculum with the bar exam so as to turn the three years of law school into one extended bar prep course. Among other reasons, the material tested on the bar is quite static and doesn’t reflect the realities of today’s law practice. It also wouldn’t make much sense for schools whose students take the bar exam in many different jurisdictions. And the bar exam is simply not equivalent to the three rounds of USMLE exams in testing both the knowledge and the application of knowledge needed to be a successful lawyer. If it were, we wouldn’t hear so many complaints about how students who have passed the bar are nevertheless not “practice ready.”

Tomorrow—where can we get the help we need to find out this information, and who is going to pay for it? Spoiler: AccessLex has a program.

We have to talk about the bar exam

Thank you very much to the team at Best Practices for Legal Education for inviting me to blog this week. My particular thanks to Elizabeth Murad for administering the blog, to Professor Mary Lynch, Kate Stoneman Chair in Law and Democracy and Director of the Center for Excellence in Law Teaching, and to President & Dean Alicia Ouellette of Albany Law School for hosting this blog. It is an honor to join such a distinguished group of scholars and teachers.

We knew it was going to be a bad bar year when, on September 14, 2018, the NCBE announced that the national average MBE score for July 2018 had decreased about 2.2 points from the July 2017 average. And, indeed, as states have announced pass rates based on their own individual formulas of MBE plus essays plus the MPT (Multistate Performance Test) plus their own individualized questions, the results were bad. A lot of our students failed the bar exam. Pass rates were down in New York, California, Georgia, Florida, Texas, and lots of other places. Yet at the same time, individual schools saw significant success in raising pass rates in the face of grim news all around them. All of this makes for glib headlines and much handwringing, but in the context of a blog post on “Best Practices for Legal Education” it is more helpful to take a step back and assess the tools we, as legal educators, have available to us in addressing bar passage in our individual schools. I do so from my Ph.D. studies in higher education as well as from my experience as a dean, associate dean, law professor, and medical school professor.

One of my main themes this week will be to argue for individualized problem solving. If anyone comes to you with a product to solve all your bar passage problems, I hope after this week you will be able to ask some questions about the data on which they base their claims. A productive discussion of bar exam passage really rests on two questions: (1) why aren’t the students at your law school passing the bar exam at the rate they “should,” and (2) what should you do about it?

I am going to use this week to share with you some of the resources available to law schools, to individual faculty members, and even to law students who want to increase their chances of passing the bar the first time. Along the way, I hope to address some of the unhelpful myths that have arisen and to endorse a very old idea borrowed from the then-revolutionary 1960s-era child-rearing techniques of Dr. Benjamin Spock: these are your students—and you know more than you think you do. Trust your judgment. Ask questions. That doesn’t mean you can do everything yourself—it’s fine to consult with experts—but in the end, addressing bar exam passage issues is a school-wide effort, and everyone has relevant information to add and a valuable role to play.

To get started, it’s helpful to have an overview of the players. As a matter of foundational constitutional law, each state retains the power to license and regulate professionals (more detail here). As a result, every state and territory has its own process for setting criteria for bar passage. Almost every state contracts with the National Conference of Bar Examiners, which develops the exam, grades it, and spends a lot of time explaining itself. If you have any interest in this topic, a free subscription to The Bar Examiner will quickly bring you up to speed.

Tomorrow—how a test from the 1950s trips up today’s digital natives (or, “Do we need a TARDIS to match the law school curriculum to the bar exam?”).