The Latest Data on Legislation & Regulation and Administrative Law Course Requirements at US Law Schools

With the aid of my smart and industrious research assistant Rob Galloway, I have completed the annual list of Legislation and Regulation (Leg Reg) and Administrative Law course requirements at ABA-accredited law schools. The link to the list as a publicly viewable Google doc is below.

Two schools (Georgia State and Wisconsin) have added Leg Reg to their first-year curriculum, and the University of Chicago has added Legislation and Statutory Interpretation. Indiana-Indianapolis has made Administrative Law a required course. New to the list, though perhaps not new requirements: Both Inter-American and Pontifical Catholic (Puerto Rico) require Administrative Law as well. Oklahoma City Law School now requires students to take either Administrative Law or Legislation.

The list now identifies 55 schools that require their students to take a Leg Reg course or a course on legislation, statutory interpretation, or administrative law. Also of note on this year’s list: A more thorough review of the various schools’ curricula revealed several more that offer Leg Reg (or a course on legislation and/or statutory interpretation) or Administrative Law as a first-year elective. The list identifies 19 schools under this category.

I take responsibility for any errors. If your school is not represented accurately, get in touch with me at beb9@pitt.edu.

https://docs.google.com/document/d/1tbTcvxbaR11M15b2Mc9u5_UPrq4STcSClN9ntF5G1UE/edit?usp=sharing

The Next Generation of the Bar Exam, NCBE Style

The Testing Task Force of the National Conference of Bar Examiners (NCBE) recently completed the three phases of its “future-focused study to ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in a changing legal profession.” Earlier this month, as tumultuous events in Washington, D.C. grabbed most of the headlines, the task force quietly released an Overview of Preliminary Recommendations for the Next Generation of the Bar Exam. The recommendations are a big deal—possibly a game changer—and deserve some discussion.

I see three positive aspects of the task force’s preliminary recommendations:

First, quoting the document directly, “Our decisions were guided by the prevailing views expressed by stakeholders . . . : that the bar exam should test fewer subjects and should test less broadly and deeply within the subjects covered, [and] that greater emphasis should be placed on assessment of lawyering skills to better reflect real-world practice and the types of activities newly licensed lawyers perform . . . .” So many critics of the current bar exam, myself included, have argued exactly these positions for so long, but to read them coming from the task force directly affiliated with the NCBE is no small matter.

Indeed, the task force recommends a significant reduction in the number of legal subjects to be tested—just Civil Procedure, Contracts, Evidence, Torts, Business Associations, Constitutional Law, Criminal Law, Criminal Procedure (limited to constitutional protections), and Real Property. These reductions, if adopted, would cut from the current Multistate Essay Examination (MEE) subjects, not the Multistate Bar Exam (MBE) ones. Many will understandably quibble with the retention of all seven MBE subjects, and indeed I will do just that below. Still, one can hope that, if the NCBE adopts the task force’s recommendations, the bar exam of the future will in fact cover the several MBE subjects “less broadly and deeply” than they are currently tested.

One reason to be hopeful that the next generation of the bar exam would in fact test MBE subjects less broadly and less deeply is that the task force is recommending a fundamental shift away from the model of discrete exams like the MBE, MEE, and Multistate Performance Test (MPT). It is instead calling for an integrated exam that would test “knowledge and skills holistically, using both stand-alone questions and item sets, as well as a combination of item formats (e.g., selected-response, short-answer, and extended constructed-response items).” As always, the devil will be in the details, but this preliminary recommendation sounds at least somewhat promising.

Second, the task force rightly recognizes the need to expand the scope of the foundational skills covered on the bar exam to include more than just legal analysis and writing. Among the skills to be encompassed by the projected bar exam of the future are legal research, factual investigation and evaluation (including fact gathering), client counseling and advising, client relationship and management, and negotiation and dispute resolution. It might be hard to imagine how some of these skill sets could be tested, but the task force foresees “uniform text- or video-based scenarios that require candidates to construct a written response or select the correct response.”

Third, the task force recommends that the next generation of the exam should be “a computer-based test, administered either on candidates’ laptops in jurisdiction-managed facilities and/or at computer testing centers managed by a suitable vendor.” The pandemic forced many state bar examiners to abruptly adopt computer-based testing. It makes sense for myriad reasons to move in the direction of computer-based testing and to do so as part of a deliberate and careful process.

Having said all that, at least two aspects of the task force’s recommendations strike me as problematic—or at least present the risk of inadequate or counterproductive reform to the bar exam.

First, as mentioned earlier, there is the question of why the bar exam should continue to test legal knowledge in all seven of the MBE subject areas. Testing knowledge of such a large number of subject areas is especially problematic when considering that, by all appearances, the exam would remain closed-book. As Professor Deborah Merritt has cogently explained here, if the bar exam of the future needs to remain a closed-book exam, then the doctrinal portion of the exam should cover no more than a basic understanding of U.S. legal processes and sources of law or a single substantive subject. In arguing for an open-book exam, Merritt correctly points out that “new lawyers simply don’t work from memory. Instead, they internalize basic principles that allow them to identify issues in a client problem and find the specific rules they need to address that problem.”

Moreover, the empirical data on job activities of newly licensed lawyers, gathered during Phase 2 of the task force’s study, reveals that both experienced and newly licensed lawyers rated knowledge of Torts, Constitutional Law, Criminal Law, Criminal Procedure, and Real Property Law as only moderately important (all receiving an average rating of 2.2 or lower out of 3). They rated knowledge of all those subject areas as less important than knowledge of Statutory Interpretation Principles or Statutes of Limitations (both receiving an average rating of 2.3), neither of which is included as a stand-alone subject in the task force’s vision of the bar exam. To be sure, Civil Procedure (2.6), Contract Law (2.6), and Rules of Evidence (2.5) all received noticeably higher average ratings. Perhaps the doctrinal portion of the exam could be limited to one or more of those subject areas.

Second, on a more micro level, there is the peculiar fate of one specific knowledge area: Statutory Interpretation Principles. In Phase 3 of the study, the task force proposed that Statutory Interpretation Principles should not be tested as a stand-alone subject but rather could be subsumed under the Constitutional Law subject area and the skills portion of the exam. The suggestion that statutory interpretation can be subsumed under Constitutional Law is a dubious one. Rarely does a court do a deep textual dive into the precise phrasing of a constitutional provision in the way that courts routinely do when confronted with a question of statutory construction. Moreover, questions on substantive knowledge of constitutional law will routinely call on knowledge of Supreme Court precedents, not on dissection of the text of a constitutional provision.

Statutory interpretation is plainly distinct from constitutional interpretation in critical ways. Hence, I trust that the skills components of the bar exam of the future will include not just cases as sources of law, but also statutory and regulatory provisions (cutting across many different substantive subject areas) that applicants will have to dissect and interpret. Based on what I’ve heard from the task force, I am optimistic on this point. I discussed the extent to which statutory sources are used on the performance test, currently the only exclusively skills-based component of the bar exam, in an earlier post here.

In sum, much is uncertain and will take shape over time. But the early indications give some reason to be cautiously optimistic about the NCBE testing task force’s vision of a new bar exam.

Looking Ahead: The Performance Test on the Bar Exam Post Covid-19

Even amidst great uncertainty as to how the Covid-19 pandemic will impact future bar exams, we have already seen a few “when the dust settles” articles addressing how best to move forward with bar exams after the effects of the pandemic have waned.[i] While there are justifiable calls to rethink the exam entirely, especially in light of disparate outcomes based on race,[ii] multiple commentators have highlighted the need to enhance the exam’s focus on the performance test.[iii]

A telling indicator of the importance of the performance test appears in the Phase One Report of the National Conference of Bar Examiners’ (NCBE’s) Testing Task Force, which is conducting “a future-focused study to ensure that the bar examination continues to test the knowledge, skills, and abilities required for competent entry-level legal practice in a changing legal profession.”[iv] Among the points made most frequently by various bar exam stakeholders during listening sessions were the following: “The MPT [Multistate Performance Test] is the strength of the current exam” and “Lawyering Skills should be emphasized over subject matter knowledge.”[v] The performance test remains the only vehicle on the bar exam that directly evaluates lawyering skills and not subject matter knowledge of—some would say rote memorization of—law, which is tested by both the Multistate Bar Exam (MBE) and essay questions. As such, it ranks high as to testing validity and has great potential as an evaluator of competence. 

The NCBE, which currently provides the testing instruments used on the vast majority of state bar exams,[vi] is an influential force in the bar exam world, and it approaches change at a glacial pace.[vii] Hence, there is good reason to think that, while the pandemic might lead to changes in the manner of exam administration (e.g., remote administration in lieu of in-person), the heavy majority of states will continue to use the NCBE’s three tests: the MBE, the Multistate Essay Exam (MEE), and the MPT. A closer look at the most important of those three, the MPT, and at ways in which performance testing can be improved is therefore in order. What follows is a review of the current status of the performance test on bar exams in the United States and some thoughts on enhancing performance testing moving forward:

What is the performance test, and what is its purpose?

A performance test assigns a specific written lawyering task (e.g., a memo, a brief, or a letter) to be completed by the examinee relying on a closed universe of provided factual and legal materials. Performance tests, unlike essay questions and MBE questions, require no advance memorization of doctrine, as they provide a Library, which includes all the needed legal authorities in the form of cases, statutes, rules, regulations, or some combination thereof. The facts are presented in the File, usually through assorted documents such as transcripts of witness interviews, correspondence, and the like. The allotted time for completion of a single performance test is 90 minutes.

The performance test is a creature of the ABA’s landmark 1992 MacCrate Report, which called on law schools to enhance training in assorted fundamental lawyering skills.[viii] The NCBE responded to the MacCrate Report by crafting the MPT as an instrument to evaluate six of the fundamental lawyering skills listed in the report: (1) Problem Solving, (2) Legal Analysis and Reasoning, (3) Factual Analysis, (4) Communication, (5) Organization and Management of a Legal Task, and (6) Recognizing and Resolving Ethical Dilemmas.[ix] The NCBE began offering MPT items to jurisdictions in 1997.[x]

How is the performance test currently being used on the bar exam?

The NCBE continues to produce the MPT, and 44 states administer it on their bar exam.[xi] Thirty-six of those states administer two MPT items, pursuant to their administration of the Uniform Bar Exam (UBE).[xii] Some of the remaining eight administer one MPT, and some administer two. In addition, Pennsylvania creates and administers its own performance test,[xiii] as does California.[xiv] In total then, 46 states administer at least one performance test on their bar exam. Only Florida, Louisiana, Michigan, and Virginia eschew performance testing.[xv]

Is the performance test evaluating all of the MacCrate skill sets adequately?

As I discussed in detail in a 2015 article, the performance test is not living up to its potential, as typical test items implicate mostly just the core skill set of legal analysis and reasoning.[xvi] It is not at all clear that the performance test is encompassing all six of the MacCrate skill sets on a consistent basis. Most notable among the skill sets that remain underaddressed are (1) Problem Solving and (3) Factual Analysis.

Problem Solving

The MPT occasionally implicates problem solving through a test item requiring an evaluation of multiple competing approaches to a given client’s problem—for example, MPT2 on the July 2019 exam, in which examinees had to evaluate the merits of two different estate planning approaches for a mock client.[xvii] In the last several years, neither California’s nor Pennsylvania’s performance test has presented a comparable test item.

Factual Analysis

As to factual analysis, because the source materials in the File include both relevant and irrelevant facts, the performance test does a good job of testing an examinee’s ability to identify relevant facts within a given set of materials. However, among the core competencies encompassed by the skill set of factual analysis are determining the need for further factual investigation and planning a factual investigation.[xviii] Performance tests have not encompassed these competencies. Indeed, in the past several years, no performance test has tasked examinees with identifying what additional facts would be useful and drafting a discovery plan for obtaining those facts. 

To be fair, given that the performance test currently plays a limited role on the exam, it is perhaps not possible to consistently encompass all of the MacCrate-inspired skill sets. The addition of more performance tests, or of multiple-choice questions within performance tests, discussed below, could help in this regard.

Does the performance test make adequate use of statutes and regulations as legal sources?

Of course, statutes and regulations are central to much of law practice today—so much so that numerous law schools are requiring first-year students to take a course on legislation and regulation. It would seem beyond dispute that all examinees on every bar exam should be required to display competence in reading and applying a statute or regulation without the aid of a case interpreting the statute or regulation at issue.  Neither MBE questions nor MEE questions (nor state-specific essay questions) directly require examinees to do this. In contrast, the performance test easily can. Unfortunately, however, statutes or regulations (or comparable types of non-case authorities) do not consistently appear on recent performance tests—to say nothing of the four states that do not administer a performance test at all. 

To the NCBE’s credit, at least one of the two MPT items in each exam administration since 2010 has included at least one statute, rule, or regulation.[xix] It is not clear to what extent examinees on these various MPTs needed to reason directly from the statute, rule, or regulation (as opposed to reasoning from a provided case that interpreted it) to resolve an issue, but a cursory review suggests that, in several cases, they did to a considerable extent. By way of example, one of the MPT items on the February 2015 exam included only a statute and regulations in the Library, and examinees were tasked with “parsing . . . HIPAA regulations” in order to answer effectively.[xx]

 In contrast, however, since shifting to the 90-minute performance test format in July 2017, California has included only cases as authority on its performance tests through the February 2020 exam.[xxi] This reflects a major failing of the California bar exam. 

The Pennsylvania examiners have done a better job, having deployed statutes as part of the law library on several different performance tests,[xxii] the first of which (February 2017) I highlighted in an earlier post. Still, given that there is only one performance test on each administration of the Pennsylvania exam, each Pennsylvania performance test with only cases (and there continue to be several of those[xxiii]) represents an exam that does not evaluate the fundamental competency of reading and reasoning from a statute, rule, or regulation.

How can bar examinations make greater use of performance tests?

Ideally, all 50 states would administer at least a bare minimum of two performance test items on every exam. Though progress is being made toward that goal, we are not there yet. 

First, as noted earlier, four states do not administer a performance test at all. One can only speculate as to what reason, aside from inertia, leads the bar examiners in Florida, Louisiana, Michigan, and Virginia to forgo a testing instrument of such value.

Second, as also noted earlier, several states that use the MPT outside the construct of the UBE, as well as Pennsylvania and California, administer only one performance test, not two. The use of two test items, as required for the UBE, affords an opportunity to evaluate a greater array of lawyering skills via the performance test. The use of only one relegates the performance test to playing a minimal role on the exam overall. By way of illustration, while the two MPT items in UBE jurisdictions count for 20% of the exam score, the one MPT item in Pennsylvania counts for only 11%, and the one MPT item in Nevada counts for only 10.5%.[xxiv]

Third is the broader question of expanding the use of the performance test beyond just two test items and the current 20% allocation on the UBE. As I noted in my 2015 article, this is a tougher challenge.[xxv] The MBE counts for such a large chunk of the exam score—50% in UBE jurisdictions and most others—because it serves as a psychometric anchor for the exam. In other words, the MBE has very strong testing reliability. But as a multiple-choice exam that tests knowledge of various areas of substantive law, thereby calling for rote memorization, the MBE has weaker testing validity. In contrast, the performance test has high testing validity but, because of its size and the subjectivity inherent in grading, lower testing reliability. The NCBE prioritizes strong testing reliability for its products and hence allots greater percentages to the MBE and the essay questions: 50% and 30%, respectively.[xxvi]

There are many possible approaches that could allow for increased use of, and a greater scoring weight allotted to, performance tests, but one suggested by Jason Solomon in his recent article bears serious consideration. Solomon suggests the use of multiple-choice questions, which inherently afford greater testing reliability, within a performance test format.[xxvii] Instead of, or in addition to, writing an answer, as required on the traditional performance test, examinees would answer a series of multiple-choice questions on the materials provided in the test and on the most effective ways to resolve the issues presented. Unlike MBE questions, these questions would not require examinees to recall memorized legal doctrine, but rather to carefully review the provided factual and legal materials. Multiple-choice questions within a performance test format could also be an effective vehicle for adding legal research to the bar exam, as even within a closed-book format, examinees could be asked questions about the most effective research strategy to build on the provided materials.[xxviii]

Conclusion

In sum, improving the bar exam going forward still requires a focus on the performance test. The following goals are worth pursuing and achieving:

  • Performance testing that consistently encompasses more of the fundamental lawyering skills the test was originally designed to evaluate, including problem solving and factual analysis
  • More frequent and more consistent use of statutory and regulatory authorities in the law library of performance tests
  • Use of performance testing in all 50 states, not just 46
  • At least two performance test items in all jurisdictions
  • Research into the use of multiple-choice questions within a performance test framework, including as a vehicle for testing legal research

[i] See, e.g., Vikram David Amar, What About the Bar Exam After the 2020 Dust Settles?, Verdict, Legal Analysis and Commentary from Justia (Sept. 17, 2020), https://verdict.justia.com/2020/09/17/what-about-the-bar-exam-after-the-2020-dust-settles; Jason Solomon, INSIGHT: Saving the Bar Exam By Focusing on Performance, United States Law Week (July 16, 2020), https://news.bloomberglaw.com/us-law-week/insight-saving-the-bar-exam-by-focusing-on-performance.

[ii] Several thoughtful pieces touching on racial inequities caused by the bar exam appear in Volume 3, Issue 4 of AccessLex’s publication, Raising the Bar, 3-12 (Fall 2020), https://www.accesslex.org/resources/raising-the-bar-fall-2020.

[iii] Amar, supra Note i; Solomon, supra Note i. 

[iv] Nat’l Conf. of Bar Exam’rs, Your Voice: Stakeholder Thoughts About the Bar Exam, Phase One Report of the Testing Task Force, 1 (Aug. 2019), https://testingtaskforce.org/wp-content/uploads/2020/03/FINAL-Listening-Session-Executive-Summary-with-Appendices-2.pdf.

[v] Id. at 3. 

[vi] See Adoption of the Uniform Bar Examination, with NCBE Tests Administered by Non-UBE Jurisdictions, Nat’l Conf. of Bar Exam’rs (Aug. 18, 2020), https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F196.

[vii] The NCBE is notorious for approaching change very cautiously and incrementally. In 2012, the then President of the NCBE invoked geological metaphors to describe how bar exams evolve, writing that any evolution of the exam will be “more glacial than volcanic.” Erica Moeser, President’s Page, B. Examiner, Dec. 2012 at 4, 4.

[viii] MPT Skills Tested, Nat’l Conf. of Bar Exam’rs https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F54 (last visited Oct. 20, 2020) (citing Am. Bar Ass’n Section of Legal Educ. and Admissions to the Bar, Legal Educ. and Prof. Dev. – An Educational Continuum, Report of the Task Force on Law Schools and the Profession: Narrowing the Gap 138-41 (1992) [MacCrate Report]).

[ix] Judith Gunderson, Happy Birthday, MPT!, B. Examiner, Nov. 2007, at 18. See also MPT Skills Tested, supra Note viii.

[x] Id. at 20. 

[xi] Multistate Performance Test: Jurisdictions Administering the MPT, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/exams/mpt/ (last visited Oct. 20, 2020).

[xii] Adoption of the Uniform Bar Examination, with NCBE Tests Administered by Non-UBE Jurisdictions, supra Note vi.

[xiii] Pa. Bd. of Law Exam’rs, Bar Examination, https://www.pabarexam.org/bar_exam_information/bebasics.htm (last visited Oct. 20, 2020).

[xiv] The State Bar of Cal., Examinations, California Bar Examination, http://www.calbar.ca.gov/Admissions/Examinations (last visited Oct. 20, 2020).

[xv] Multistate Performance Test: Jurisdictions Administering the MPT, supra Note xi.

[xvi] Ben Bratman, Improving the Performance of the Performance Test: The Key to Meaningful Bar Exam Reform, 83 UMKC L. Rev. 565, 584-97 (2015).

[xvii] 2019 MPT Summaries, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F233 (last visited Oct. 20, 2020).

[xviii] MPT Skills Tested, supra Note viii.

[xix] See Free Summaries of MPTs from Recent Administrations and Free MPTs and Point Sheets from Older Administrations, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/exams/mpt/preparing/ (last visited Oct. 20, 2020).

[xx] 2015 MPT Summaries, Nat’l Conf. of Bar Exam’rs, https://www.ncbex.org/pdfviewer/?file=%2Fdmsdocument%2F175 (last visited Oct. 20, 2020).

[xxi] The State Bar of Cal., Past Exams, Performance Tests and Selected Answers, https://www.calbar.ca.gov/Admissions/Examinations/California-Bar-Examination/Past-Exams (last visited Oct. 20, 2020).

[xxii] See, e.g., Pa. Bd. of Law Exam’rs, July 2019 Pennsylvania Bar Examination, Essay Questions and Examiners’ Analyses and Performance Test, 59-61, https://www.pabarexam.org/pdf/qa/qa719.pdf (last visited Oct. 20, 2020).

[xxiii] See, e.g., Pa. Bd. of Law Exam’rs, February 2020 Pennsylvania Bar Examination, Essay Questions and Examiners’ Analyses and Performance Test, 62-70, https://www.pabarexam.org/pdf/qa/qa220.pdf (last visited Oct. 20, 2020).

[xxiv] Nat’l Conf. of Bar Exam’rs, Comprehensive Guide to Bar Admission Requirements 36-37 (2020), available at https://www.ncbex.org/assets/BarAdmissionGuide/CompGuide2020_021820_Online_Final.pdf.

[xxv] Bratman, supra Note xvi, at 605-10.

[xxvi] Id. at 608.

[xxvii] Solomon, supra Note i.

[xxviii] If and how to test legal research on the bar exam has been on the NCBE’s radar screen since at least 2006. See Erica Moeser, President’s Page, B. Examiner, May 2006, at 4, 5.

A Comprehensive Review of Legislation and Regulation & Administrative Law Course Requirements

In support of a few different projects, I recently asked my summer research assistant to do a comprehensive review of Legislation and Regulation and Administrative Law course requirements at ABA-accredited law schools in the United States. The completed list (please see file below) updates one most recently compiled by Professor Ed Richards at LSU Law School.

At this juncture, over 30 schools require JD students to take a Legislation and Regulation course (or a similarly titled course focused mainly on the role of statutes and regulations in contemporary law). At almost all of those schools the course is offered in the first year. A handful more require a course on just legislation, statutory interpretation, or the like. In addition, about ten schools impose an upper division requirement to take Administrative Law or a comparable course.

Now, perhaps more than ever, additional schools should seriously consider adding Legislation and Regulation or Administrative Law requirements. Each of the two big crises facing our country today provides yet another example of the centrality of the regulatory state—as opposed to the common law—in our legal system, thereby reinforcing the importance of exposing all law students to the fundamentals of legislation and regulation.

First, the varying government responses to the Covid-19 pandemic are acute illustrations of regulatory trade-offs—the kind that administrative agencies in numerous sectors of our society grapple with all the time: Benefits to the economy produce a cost in human life; benefits to human life produce a cost to the economy. Also, many law students might wonder: what authority do governors, health departments, and other relevant agencies have, and how did they get that authority?

Second, as to issues of police brutality and racial discrimination, the law’s response largely has come and will come in the form of legislation (or ordinances at the municipal level) and regulation. City councils consider fundamental changes to police departments, while state legislatures and Congress debate various other policing reforms. Police commissions and review boards, which are administrative agencies, are under scrutiny. Even the judicially-created doctrine of qualified immunity, which almost always insulates police officers from liability in civil suits, may very well endure in its present state unless Congress passes a statute modifying or eliminating it.

Fundamental concepts and processes of our regulatory state, several of them center stage in the issues of our day, are the focus of required courses at the various schools on the list. May that list grow each year moving forward.

Improv for First-Year Law Students?

Just over a year ago, in search of a mid-life growth opportunity, I began taking improv (i.e., improvisational performance) classes at a small theater in Pittsburgh. For decades, I had been a fan of improv as a comedy form but did not have the confidence to think that I could step on a stage and do it myself. Then I happened upon Alan Alda’s book, If I Understood You, Would I Have This Look on My Face? After his acting career, Alda became a communications consultant of sorts, working primarily with scientists to help them explain complex ideas in ways that a lay audience can understand. One of the central messages of the book is that improv training and exercises can help professionals of all types relate to others more empathetically and communicate with others more confidently and clearly. Alda references studies showing the benefits of improv training and describes his own experiences running improv exercises for groups of engineers and other scientists.

After reading Alda’s book, I realized that my job as a law professor is to communicate complex ideas to a lay audience too. So, I decided that I could benefit from improv classes—and have some fun at the same time.

My improv experiences over the past year, including joining a team and performing numerous short sets before a live audience, have convinced me that, in short, legal education needs improv. More specifically, to improve the learning environment throughout law school, entering first-year law students need improv! (I could write a separate post on the salutary effects that improv training has had on my teaching, but I will focus for now on how exposure to improv could benefit law students.)

There are CLE courses on improv offered for practicing lawyers (for example, in California and Florida), and there is a blog on improvisational skills for lawyers. But how about improv for law students? Based on a cursory online search, it appears that a handful of law schools offer or have offered improv courses or workshops, including Drexel and Indiana University McKinney. How much improv work has been done with 1Ls, if any, is unclear.

The benefits of improv for law students seem most apparent in the context of skills or experiential courses involving oral communication. Indeed, I have begun using some limited improv exercises during oral argument lessons in my 1L legal writing course. No doubt faculty members at various law schools—Northwestern, for example—have used improv exercises in other skills courses. Thinking more broadly though, and extrapolating from studies discussed in Alda’s book, I believe that offering improv workshops to law students early in their law school careers could very well improve in-class performance and learning throughout law school.

Improv revolves heavily around a group or team dynamic. Someone on the team must initiate a scene by stepping out on stage and doing something, or saying something, or both. One or more of her teammates then must step out in support, accepting the reality established by whatever the first person did and adding to it to help build the scene. That is the essence of “yes, and,” the fundamental premise of improv. The priority in every scene is to make one’s scene mates look good and to never hang them out to dry.

A quick example: An improv teammate and I walk onto the stage. She purposefully walks to a corner of the stage where there is a chair, sits down, puts her head in her hands, sighs deeply, and then begins to sob. Since I have walked onto the stage too, I am the team member in this scene who must accept her established reality: she is distraught about something, and we are in a location where she has taken a seat. However, I must also add to that reality. There are myriad ways in which I could do so. I could, for example, walk over, put my hand on her shoulder, and attempt to comfort her by saying, “It’s alright, honey, I never really liked our BMW anyway, and the side of the house that you hit—we really don’t use it much anymore now that the kids are gone.” She in turn accepts the reality that I have created, and on it goes from there, each of us supporting the other as the scene develops.

I see multiple potential benefits for law students (and, in turn, their professors) that could come from some basic improv training with exercises, offered perhaps within an orientation program before classes begin. For starters, law students, like lawyers, need to work collaboratively and need to relate to each other in a civil and empathetic manner. Improv’s emphasis on teamwork can help in that regard, enhancing students’ abilities to work productively and constructively with classmates in group exercises and projects.

Then there is the classroom learning environment and the sometimes strained or unproductive exchanges that take place between professors and students. Consider the ways in which exposing students to improv could mitigate the impediments to learning existing in the following classroom scenarios, each of which should be familiar to most law professors:

  • Professor poses a question or discussion topic to the class and waits for a volunteer to raise a hand and respond. Nobody does. Or, in a similar scenario, professor calls on a specific student, and the student asks to pass (even though the student might very well have done the reading).
  • Professor poses a question to a specific student, and the student asks for the question to be repeated, or answers in a manner that is not directly responsive to the question.
  • Professor poses a question to a specific student, and the student couches his or her answer in the form of a question, not a statement, suggesting uncertainty and lack of confidence. (For example, in my Legislation & Regulation course, I might ask regarding a case, “Which of the three opinions—majority, concurrence, or dissent—seems to approach the statutory interpretation question most like a textualist would?” The student somewhat meekly responds, “Is it the dissent?”)
  • Professor poses a question that is not explicitly addressed in the reading for that class but rather concerns a hypothetical scenario or a thematic issue in the course that is implicated by the reading. In response, the student struggles to answer or fully engage with the question because, as some students are wont to say, “it wasn’t in the reading.”

These are usually not scenarios where the student is incapable of responding insightfully; rather, the student is not confident enough, or is too nervous, to respond. Students who have done improv exercises involving initiating scenes and supporting teammates in scenes would naturally be less averse to speaking up in class and doing so in the form of confident and clear statements. They would also naturally be less averse to joining in a discussion after a classmate speaks up (akin to supporting one’s teammate). And they would naturally be more inclined to listen actively and carefully to the professor’s questions and their classmates’ statements. All of which is to say that each of the above scenarios might play out differently—with some robust student participation leading to more productive and constructive discussion. The final scenario in many ways gets to the heart of what improv is all about: going with the flow and accepting whatever comes your way. With exposure to improv, students would perhaps be less fazed by the question that, while technically not encompassed by the day’s reading, is still well within the scope of the course and their abilities.

In classroom discussion, students whose answers are “wrong” or whose contributions to discussion are somewhat off target tend to view the episode as an embarrassment and a reason not to ever answer a question in class again. I would expect that, with exposure to improv, students would be at least somewhat more inclined to view the episode as a learning experience, which is indeed how it should be viewed.

Posted on the wall of the green room at the theater where I take improv classes is a sign that reads “You Are Enough.” For any law student, improv can impart the message that, even though there will be struggles and mistakes along the way, you are indeed enough. I continue to get nervous before my improv team’s shows, and on more than a few occasions, I have said or done things on the improv stage that fell flat, did not effectively advance the scene, or otherwise just did not feel right. Yes, I get frustrated. But I keep confidently walking in front of the audience because that is how I will continue to learn and grow. So it should be for law students in the classroom, and ultimately in the practice of law, and in life.

A Pedagogical Twist for the 1L Appellate Brief and Oral Argument

For those who teach legal writing to first-year law students, it is the season for appellate oral argument. Yes, the long-standing tradition of requiring first-year students to complete an appellate oral argument in the legal writing course continues today at the large majority of American law schools (just under 75% of them, according to recent data). At those schools, the oral argument, which is commonly the capstone exercise near the end of the spring semester, has become something of a rite of passage for the students.

In a 2011 article, Legal Research and Writing as Proxy, I argued that assigning an appellate brief and appellate oral argument in the 1L legal writing course remains a pedagogically sound practice, even though a large majority of practicing attorneys will never engage in appellate practice, let alone complete an appellate oral argument. I still retain that view but won’t rehash my arguments here. Rather, I will focus on a pedagogical opportunity afforded by the brief/oral argument sequence of assignments that I discovered more recently.

In the last few iterations of my legal writing course, the appellate brief and oral argument assignments have proven an excellent vehicle for a bit of a pedagogical twist: A few weeks before the brief is due, not after, I teach lessons on oral argument and require the students to complete a practice oral argument round in front of my 2L teaching assistants. (The formal rounds of oral argument in front of a trio of local attorneys still occur after the briefs are submitted.) For many years, I kept brief writing and oral argument entirely separate—only after the briefs were completed and submitted would I shift the students’ attention to oral argument. (After all, that mimics the realities of the “real world” of appellate practice.) But as a pedagogical matter, just as writing the brief helps in preparing an oral argument, working on an oral argument, and thereby having to talk out and defend one’s positions, can help in preparing a brief.

A few weeks before the brief is due, most students will have a scattered and underdeveloped array of arguments. Completing a practice oral argument can help them (or, in the case of students who are spinning their wheels, force them) to organize and further develop those arguments for purposes of the brief. In pursuit of this goal, I ask my TAs to give extensive feedback to both students after each practice round. Moreover, I require every student to attend two additional practice rounds as an observer. At each round, the student representing Petitioner, the student representing Respondent, and the students attending as observers all begin to appreciate the formalities and peculiarities of oral argument, helping them prepare for the formal rounds that will occur after submission of their briefs.

This semester, shortly after the practice rounds (just over a week before the briefs were due), my students graciously agreed to provide me some feedback on the experience. One of my students volunteered to solicit comments from all of her classmates, anonymize those comments and her own, and then send them to me. Twelve out of fourteen students in my small section gave a positive review. I include two of the more thoughtful evaluations here:

  1. I found doing the practice oral arguments before my brief was fully written to be helpful. Arguing my side in the courtroom and fielding questions from the TAs helped me more precisely narrow the theme of my arguments and determine how I wanted to frame my position in the brief itself. After receiving pushback from the TAs on certain points, I was able to refine my responses to common criticisms that would come from the other side. Additionally, I now feel more comfortable going into the “official” oral arguments having completed a practice round. However, I would have liked to participate in another mandatory practice round with the TAs after my brief is written; the substance of my oral argument has substantially changed since my first practice round.
  2. Practice oral arguments were a large motivator to get my arguments organized. I found it really helpful to speak out loud about the arguments. Doing so really helped me understand what my points were and whether or not they held up against scrutiny. Speaking about the arguments also helped me understand how they related to each other. The TA’s did a good job of making us feel comfortable throughout the process. I think overall the exercise is going to be beneficial as long as the practice round is kept informal. We were all stressed about how to perform the oral arguments, so maybe there could be a concession in the formality/process of the oral argument that could make us more comfortable.

Good food for thought, as I continue the tradition of appellate oral argument again next spring.

On the Value of Gap Years and Non-Legal Experience to Legal Employers (and Law Schools)

Reviewing the results of the Foundations for Practice survey conducted by the Institute for the Advancement of the American Legal System (IAALS), I cannot help but note how the 24,000 responding attorneys ranked the helpfulness of various criteria for hiring beginning lawyers. (See here.) Just under 80 percent (78.3%, to be exact) identified “life experience between college and law school” as either a very helpful or somewhat helpful hiring criterion. Moreover, while “legal employment” (88.4% ranking as very or somewhat helpful) and “legal externship” (81.5%) unsurprisingly sat near the top of the list, “other experiential education” — meaning non-legal — was very close behind at 79.4%.

The responding attorneys, from a wide variety of practice areas and from throughout the country, ranked these two non-legal experience criteria — “life experience between college and law school” and “other experiential education” — as slightly more helpful than certain types of legal experience, including federal court clerkships, state court clerkships, and participation in law school clinics. The starker divide, however, came when considering traditional hiring criteria related to law school performance. While well over three quarters of respondents classified both personal and professional experience of a non-legal nature as helpful hiring criteria, only 62.5% said the same about law school class rank. Similarly, only 61.1% said so about law school attended, and merely 51.2% said so about law review experience.

One narrative coming out of the survey could indeed be that practical experience matters more than academic experience, and that seems to be what IAALS is highlighting. But, consistent with the results discussed above, I would suggest another as well: Non-legal experience — both personal and professional — matters almost as much, if not just as much, as legal experience.

The survey was just the first phase of IAALS’s broader project, entitled Foundations for Practice, and the second phase, which is being implemented now, directly implicates law schools. IAALS is working with four law schools to “translate the survey results into actionable learning outcomes and hiring rubrics.”

The current phase, focusing on law school outcomes, turns my mind to the “incoming” side of law school admissions. The results of the survey suggest to me that law schools should more explicitly prioritize admission of students with meaningful life experience or non-legal professional experience. In addition, if it is not doing so already, LSAC ought to be gathering and reporting to law schools pertinent data as to what percentage of law school applicants are undergraduate students who would be going directly to law school. And, as to those who are not, what are the percentages one year out from the undergraduate degree, two years, three or more, etc.? Just as law schools view national statistics on other important admissions criteria (GPA, LSAT, ethnic diversity, to mention a few) as important benchmarks, they ought to be in a position to do the same for number of years since undergraduate degree.

Having a significant percentage of students with meaningful life experience outside of the law is indisputably of great benefit to the law school learning environment. I see it every year in my classroom. More to the point of the IAALS survey results, by bringing in a significant number of students with such experience, law schools will be contributing to better outcomes — learning outcomes and employment outcomes. In a typical incoming J.D. class at my home school, the University of Pittsburgh School of Law, 20% to 30% of the students are three or more years removed from their undergraduate degree. We do not have a part-time or night program, and certainly those schools that do will have higher percentages of that demographic.

At least one-third, and in some years close to one-half, of the students in our typical incoming class come straight out of their undergraduate studies without even one gap year. This demographic exists at nearly every law school in the country (in varying percentages). Given what we know about the next generation of law students, and given the importance of life experience and non-legal experience as hiring criteria to today’s legal employers, these students would seem to face a more challenging path. What do law schools need to do, if anything? Offer or enhance existing professional development programming or curricula? Offer or enhance existing experiential opportunities that are not exclusively legal in nature and that expose students to non-lawyers and other disciplines and experiences? (Just two examples: teaching or working with high school students, or working with entrepreneurs at a tech startup. Law schools affiliated with a university can offer assorted interdisciplinary educational opportunities as well.) I will be interested to see if the second phase of the IAALS project emphasizes ideas like these or others that respond to the demonstrated need for lawyers with life experience and non-legal professional experience.

More Thoughts on the Post-Millennial Generation of Students Arriving in Law School

In two thoughtful posts from last month, here and here, Shailini Jandial George and then Andi Curcio and Sara Berman offered specific and practical suggestions of ways that we as legal educators can reach the post-millennial generation of students through our teaching. These posts bring to mind Professor Jean M. Twenge’s recently published book iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy—and Completely Unprepared for Adulthood. Twenge is probably the pre-eminent generational researcher in this country, and her empirical findings reported in the book have profound implications for legal education. What’s more, those implications are here now. Twenge defines the post-millennial “iGen” (sometimes referred to as Generation Z) as those born between 1995 and 2012, meaning the oldest among them are approaching their mid 20s—the average student age at most American law schools. In Twenge’s words, “[t]hey grew up with cell phones, had an Instagram page before they started high school, and do not remember a time before the internet. They are different from any generation that came before them.”

One concerning and challenging implication for legal education relates to the role law schools should play in inculcating basic norms of professional behavior, especially those of importance to interpersonal interaction. Given that they have spent an enormous percentage of time during their formative years on social media and elsewhere in the virtual world, most of today’s law students (those in their early to mid 20s, at least) have far less interpersonal experience than previous generations had at the same age. Speaking more broadly, as Twenge’s research reveals, they have largely avoided or deferred grown-up responsibilities that previous generations were tackling often in their teens. Much of Twenge’s research focused on high school and college students, considering such responsibilities as learning to drive, moving out of the house, and gaining financial independence. Still, as we teach and mentor law students in their early to mid 20s, we must consider what other grown-up responsibilities and behaviors that we expect of legal professionals can no longer be taken for granted.

In a recent survey conducted by the Institute for the Advancement of the American Legal System (IAALS), a wide array of legal employers ranked the legal skills and professional competencies and characteristics that they believe new lawyers most need to succeed. (The study’s results are reported here. There is also a detailed accounting of the results and an explanation of the study’s role within IAALS’s broader project in the summer 2018 edition of The Bar Examiner, pp. 17-26.) The results revealed that legal employers value foundational characteristics and competencies much more than they do foundational legal skills. Among the top 20: Arrive on time for meetings, appointments, and hearings; Treat others with courtesy and respect; Listen attentively and respectfully; Promptly respond to inquiries and requests; and Exhibit tact and diplomacy. The only specific legal skill that reached the top 20 was legal research.

If we in legal education have been presuming that our arriving 1Ls possess these basic types of competencies, or at least understand their importance, I am not at all sure that we can do so any longer. The visceral reaction for so many of us, no doubt, is that it is not the job of a law school to teach students these and other very basic norms of interpersonal relations for professionals. Imagine some variation of “they should have learned that in college or high school or from their parents” or “they’ll learn the hard way in their first summer legal job.” Given legal education’s obligation to the profession that it serves, we ought to move past those mindsets. I recognize that many in legal education have done so, and I recognize that many law schools have developed programming or courses on different aspects of developing a professional identity. But professional identity, at least as it was discussed in the Carnegie Foundation’s Educating Lawyers, relates more to appreciating one’s role as a legal professional in society more broadly. It takes on a moral component. That remains important. What I raise here, however, is more behavioral and foundational: Meet deadlines, arrive on time, respond to inquiries promptly, be tactful and diplomatic with others, etc., etc.

In my 1L Legal Analysis & Writing course, I seek to instill professional behavioral norms through various course policies, all explicitly stated in my syllabus, concerning compliance with deadlines, punctual attendance at class and scheduled meetings, civil and respectful interaction with classmates and me, timely and good faith completion of ungraded exercises, etc.  A percentage of each student’s grade depends on how well he or she meets these professional standards. Two of my students missed their first deadline for an ungraded exercise last week; neither had any kind of explanation. Consistent with the underlying professionalism theme of my course, I informed these students that such behavior, if repeated, would fail to meet my professional standards, just as it would fail to meet the professional standards of any legal employer.

It will be interesting to see if and how Twenge’s findings manifest themselves in the current and future 1L classes. I strongly recommend the book; it provides an excellent foundation for putting a variety of possible student behaviors into context.

What Do We Remember about our Teachers Decades Later?

This coming academic year will be my 20th in law school teaching. Truth be told, I don’t really need a milestone in my career to grow contemplative and introspective, as I am wont to do so under far more mundane circumstances anyhow, but this milestone is definitely doing the trick.

The grandest question that I might ask myself is what sort of difference I have made in the lives and careers of the students—well over one thousand, I expect—that I have taught or worked with since I joined the legal writing faculty at the University at Buffalo in 1999. (In 2002, I moved to the University of Pittsburgh School of Law and am still on the faculty there.) That is a rather abstract query, bordering on the metaphysical, I suppose. A related but more concrete question that I have actually been pondering is this: Decades after someone has been my student, what will he or she remember about me or my teaching? Something, I hope! And something positive, I hope too! Is there a particular thing I said, lesson I taught, teaching technique I used, kind gesture I made, or even joke I told that will stick with them, and maybe have a positive impact, even decades later?

Thinking back to my years in college and law school, I can easily identify the most impactful specific thing that one of my professors did. I can even remember the date—September 26, 1983—as it was my very first day of classes as a freshman at the University of Washington. The class was Political Science 101, and the professor was Reza Sheikholeslami.

After I and hundreds of my fellow undergraduate students (mostly freshmen) settled into our seats in the lecture hall, Professor Sheikholeslami, sharply dressed and looking the part, strode confidently to the lectern. He scanned the crowd of young faces. Then came his first words: “Would everyone please stand up.” I hadn’t the slightest idea why we needed to stand up, and I doubt that any of the other students did either. But, promptly and without any hesitation, we all complied. He paused and again briefly scanned the crowd. “Alright,” he said, “please go ahead and sit down.”

After we settled back into our seats and the lecture hall again grew quiet, Professor Sheikholeslami explained, “This is Political Science 101, and the goal of this course is to teach you why you just stood up and then sat down.” Being an impressionable 18-year-old, I was mesmerized.

Of course, standing up and then sitting down in compliance with a professor’s instruction does not begin to reach the level of gravity of what happened in the various societies and cultures that we ultimately studied in the course. Think Nazi Germany, for example. But the metaphor Professor Sheikholeslami delivered with his opening words was ever so powerful, and the course largely lived up to it.

Wherever I see demagoguery or other anti-democratic forces rearing their ugly heads, and people are blindly following an authority to their own detriment and the detriment of others, I think back to Poly Sci 101. (I’ll refrain from further comment on the current political climate in this country.) Moreover, in various contexts in the law school classroom, I have drawn from the simple but crucial lessons of that first class with Professor Sheikholeslami. Sometimes, in my Legislation & Regulation course, the substantive lesson about obedience to authority underlies a thread of classroom discussion. More often, regardless of the course, the pedagogical lesson—try to leave a lasting positive impression with one’s students—drives me to think more creatively about how to approach a class topic.

Among the numerous professors that I had across four years of undergraduate study and three years of law study, off the top of my head I could probably name one-third of them (a higher percentage from law school than from undergraduate, I expect). And among those whom I can remember off the top of my head, only a handful of them sit prominently in my memory—because of how talented they were as teachers, how funny they were in the classroom, how helpful and supportive they were in one-on-one work on a research project, etc. But only one remains prominent in my memory for one particular thing that he did in the classroom: Professor Sheikholeslami.

A few years after I benefited from his creative and thoughtful teaching, Sheikholeslami became the Masoumeh and Fereydoon Soudavar Professor of Persian Studies at the University of Oxford. I was saddened to learn recently that he died earlier this year at the age of 76. I regret that I never succeeded in reaching him to tell him of the positive impact that he had on me—as a person and as a teacher.

To be sure, there are many ways to define and measure good teaching. As I prepare to begin my third decade in legal education, what I often come back to when I ponder my own qualities as a teacher is this: What do I say or do as a professor that will stick in a student’s memory and still carry some positive influence—no matter how concrete or abstract—decades later? A fond memory of any kind would be great. A memory on the level of my memory of Reza Sheikholeslami in Poly Sci 101 would be a wonderful bonus.

Studying Better Ways to Test Bar Applicants for Minimum Competence: Another Reason to Care about the California Bar Exam (Besides the Cut Score Debate)

In addition to her post on Law School Café about alleged correlations between bar exam scores and lawyer discipline (discussed on this blog here), Professor Deborah Merritt recently offered another bar exam-related post. This one provides intriguing historical perspective on the current need to expand the range of skill sets tested on the bar exam. Following up on points made by Professor Derek Muller, Professor Merritt discusses a 1980 study by the California Committee of Bar Examiners, cosponsored by the National Conference of Bar Examiners (NCBE), on adding a clinical component to the bar exam. Several hundred applicants who had taken the July 1980 California Bar Exam volunteered to complete an additional clinical evaluation requiring them, among other things, to interview a client and examine witnesses. Professional actors played the role of clients, akin to the standardized patient role that actors perform for clinical evaluations in medicine. The applicants were scored based on an elaborate protocol.

Delving into the statistical results of the study, including comparisons between outcomes on the conventional bar exam and outcomes on the clinical evaluation, Professor Merritt illuminates how crucial it is nearly 40 years later for bar examiners to study and implement alternative assessments of skills not currently evaluated by the bar exam. She points out that, while the study’s results were by no means definitive, they at least suggest “the disturbing conclusion that a significant percentage of conventional bar passers (about two of every five) lack basic practice skills that are essential in representing clients.”

I find this discussion particularly apt in 2017, the 20th anniversary of the first administration of the Multistate Performance Test (MPT), the written skills test now a part of the bar exam in 40 states and D.C. What started the path toward written performance testing and the MPT? A study conducted by the California Committee of Bar Examiners (cosponsored by the NCBE), possibly the same one referenced by Professor Merritt. On the occasion of the MPT’s 10-year anniversary in 2007, the Bar Examiner, a magazine published by the NCBE, briefly described the California-based origins of the performance test and indicated that the MPT was ultimately based largely on “the California model.” (The piece, in the November 2007 edition of the Bar Examiner, is apparently not retrievable online.)

Written performance testing was the last meaningful innovation in bar exam testing. In thinking about who might lead an effort toward the next one—which might introduce greater clinical evaluation, possibly including oral skills—I think not of a top-down effort from the resolutely conservative NCBE. It is focused on getting as many jurisdictions as possible to adopt its Uniform Bar Exam (26 and counting as of today). Rather, I think of a bottom-up effort by individual states—perhaps with California in the lead—serving as laboratories for testing methods that could ultimately spread to other jurisdictions, thereby persuading or forcing the NCBE to join.

The history of written performance testing is illustrative of my point. Long before the NCBE went forward with the MPT in 1997, not just California but also Alaska and Colorado devised performance tests of their own and administered them on the bar exam. Indeed, those three states were administering performance tests in the early 1980s, playing an important initial role in advancing the cause of a needed bar exam reform. Here, for example, is a follow-up study of the 1983 California Bar Exam, discussing its two performance tests.

The biggest barrier to innovation at the state level is the NCBE’s influence, which increases with each state that adopts the UBE and thereby constrains itself to offering the conventional bar exam that the NCBE prescribes. Indeed, both Alaska and Colorado, two of the original performance test states from the 1980s, have adopted the UBE, meaning neither of those states will be doing any more bar exam innovation. That leaves California (and any of the other 23 states that have yet to join the UBE, none of which matches the influential profile of California).

Why the California study or studies did not lead to some form of clinical evaluation beyond written performance tests is unclear, though two obstacles that come to mind are expense and testing reliability. Indeed, the 1980 study that Professor Merritt references summarized one of its findings as follows: “[T]he relatively low reliability, administrative difficulties, and high costs associated with most (but not necessarily all) standardized oral tasks probably precludes even considering them as possible components of a general bar examination. Written tests of clinical skills, on the other hand, are relatively easy to construct, administer, and score. Further, unlike oral tasks, the score on written tasks are moderately correlated with one another.”

It seems worthwhile to revisit those conclusions, given the passage of time and possible advances in testing methods, and given that the medical profession requires clinical evaluation of its applicants. Today, 24 years after the MacCrate Report, 20 years after the advent of the MPT, and 10 years after the Carnegie Foundation Report, the legal profession needs a better bar exam. I join Professor Merritt’s call for a national task force on the bar exam, sponsored by AALS, the Conference of Chief Justices, the ABA Section of Legal Education and Admissions to the Bar, and maybe even the NCBE. As Professor Merritt writes, such a task force could “study current approaches to the bar exam, develop a more realistic definition of minimum competence, and explore best practices for measuring that competence.”

But I also come back to the states, and to California specifically. There is a vigorous debate going on about whether California should lower its bar exam cut score. That’s an important discussion to have. But I might suggest another discussion to have about the California Bar Exam: Shouldn’t California resist the UBE and instead conduct a new study of alternative methods for assessing today’s relevant lawyering skills that are not encompassed by the UBE?

Legislation & Regulation and the Bar Exam

Most readers of this blog will be familiar with the performance test (PT), a portion of the bar exam in 42 states and D.C. (Forty states use the Multistate Performance Test (MPT); examiners in Pennsylvania and California write and administer their own PT.) For states using the Uniform Bar Exam (UBE), the MPT counts for 20 percent of the overall exam score.

I wrote about the performance test previously here. I extolled its virtue as the only part of the exam that exclusively tests lawyering skills, requiring zero memorization of legal rules; and I bemoaned its status as the ugly step-child of the bar exam that gets next to no attention in conversations about exam reform.

Over time, bar examiners have concluded that certain substantive subjects have grown or lessened in importance to law practice such that they have added subjects to the MBE (e.g., Federal Civil Procedure) or dropped subjects from essays (e.g., Secured Transactions, in some jurisdictions). Why not the same with skills on the PT? Is it not fair to say, for example, that a greater percentage of beginning lawyers today work in fields dominated by regulations than did in 1997, when the MPT was first administered? Yet the vast majority of PTs to this day test the ability to reason from cases, not from statutes or regulations without the aid of cases.

The anti-regulation bent of the current administration notwithstanding, we live in a heavily regulatory state. For lawyers in numerous specialty areas (including health care law and environmental law), for lawyers working for government agencies, and for lawyers serving as in-house compliance officers, among the most important skill sets are reading, interpreting, and applying statutes and regulations. (Compliance, by the way, has been a growing field, and positions in compliance are J.D.-preferred jobs increasingly being filled by newly licensed lawyers.) Many law schools have responded to this reality by adding a 1L course on legislation and regulation to provide law students the needed foundation for practicing law in our heavily regulatory state. (A running list, accessible from here, indicates that about 30 law schools are offering a course of this nature in the first year.)

In reviewing summaries of the last 28 MPT items (covering the last 14 exams, back to February 2010), I found only one whose law library consisted solely of statutes and regulations, with no cases. Typically, PTs presenting issues of statutory application have both statutes and cases in the library, and the cases provide the statutory interpretation needed to answer the issue posed. That’s still common law reasoning—a very important skill, to be sure, but not very helpful for a lawyer when the only applicable law is a statute or a regulation.

All of the above helps to explain how pleasantly surprised I was to see a purely statutory issue on the February 2017 performance test on the Pennsylvania Bar Exam. The assigned task was to write a memorandum analyzing and supporting the client’s position on three legal issues raised by opposing counsel in a motor vehicle accident. One of the issues was whether a driver had violated the state’s law banning texting while driving. The text of the law appeared in the materials, and applicants had to dissect its language and apply it to the facts—all without the aid of cases in the materials, each of which was relevant only to other issues. This is basic stuff, but exactly the kind of basic stuff that beginning lawyers must be able to do well.

“I won’t be in class; I have a job interview”: Dealing with a Changing 1L Experience

As I regularly teach first-year students, I note with interest a proposed revision to NALP’s 1L Guidelines that would remove all date restrictions on interaction between 1Ls and career services professionals (currently disallowed before October 15) and between 1Ls and employers (currently disallowed before December 1). I also note, with disappointment, the growing number of times that 1L students are missing my Legal Writing class for a job interview or, in one recent case, even for a work shift. A colleague of mine reports her frustration with the same trend.

If adopted, the change to the NALP guidelines would further chip away at the conventional but no longer sacrosanct wisdom that 1Ls should be almost exclusively focused on their academic studies, especially in their first few months of law school. I expect that many schools clinging to that wisdom would have to loosen up existing approaches to the 1L experience, if they have not already.

By way of example, the administration at the school where I teach intentionally schedules 1L classes on all five days of the week for each section. This is consistent with our policy actively discouraging 1Ls from engaging in any employment outside of law school. But it is probably not consistent with the shifting realities affecting 1Ls—the very realities that give rise, no doubt, to the proposal to reform the NALP guidelines.

Career services offices, which coordinate many 1L job interviews, simply cannot avoid all potential class conflicts—or at least they cannot where all 1Ls have class five days a week. I am very curious whether other schools set aside a whole day of the week, or multiple entire mornings or afternoons, during which 1Ls have no classes. I expect that more and more schools will need to consider adopting such an approach.

The challenge to law schools arises not just from the growing priority for 1Ls to search and interview for summer jobs, but also, I think, from a growing priority for at least some 1Ls to actually work in jobs—law-related or otherwise—during the academic year. Indeed, I hear from more and more 1Ls that they are working outside of law school. However, I expect that there remain many schools that, like mine, actively discourage 1L employment.

Interestingly, presuming the ABA House of Delegates goes along with a pending reform to ABA Standard 305, law schools will no longer be barred from offering academic credit to students working for compensation in an approved externship. (As of now, Interpretation 305-2 prohibits law schools from granting credit to a student for “participation in a field placement program for which the student receives compensation.”) Most current externship programs or opportunities are designed for 2Ls and 3Ls. But what should schools do about 1L students who need to work during the school year to make ends meet? I imagine that, if the change to Standard 305 is adopted, some of those 1Ls might be savvy enough to ask for academic credit for their work—presuming it is law-related—and then complain when they cannot get it because the work does not fall within the prescribed curriculum.

Even if my imagination is too wild on that point, the reality is that there are 1Ls in full-time J.D. programs who are working. And, anecdotally, I sense their numbers are rising. If I am right about that trend, perhaps some 1L curricular or programmatic reform is appropriate in response. Whether that means the development of elective 1L externship programs (for unpaid work in government and public interest positions too, of course) or something else, I’m not sure. But I am sure that, for better or worse, employment, both searching for it and engaging in it, is changing the 1L experience.

The 25 Most Important Lawyering Skills?

In discussing bar exam reform in my earlier post, I referenced the results of this job analysis survey of newly licensed attorneys. The attorneys, all in practice for three years or less, were asked to rate the significance to their jobs of various skills or abilities (e.g., legal reasoning, organizational skills, written communication) and various knowledge domains (e.g., Rules of Evidence, Contract Law, Rules of Civil Procedure). Ever since I first saw the results, I have been taken with one particular statistic: The respondents rated 25 different skills or abilities as more significant to their jobs than the highest rated knowledge domain.

After the results came out, I looked more closely at these 25 skills and organized them into five broader skill categories. (My chart, which includes all 25 skills and each one’s average rating on a scale of 1 to 4, is below.) I then led a discussion on the importance of all of this to legal education at a legal writing conference last spring. Some of the colleagues in attendance offered insightful and practical comments that I’d like to share here.

One suggested that the 25 skills are a good starting point for formulating a new course to satisfy the ABA’s expanded practical skills requirement in the new Standard 303(a)(3). Others suggested that my chart, or something akin to it, could be a means for identifying and measuring learning outcomes for “other professional skills needed for competent and ethical participation as a member of the legal profession” under Standard 302(d), or additional learning outcomes under Interpretation 302-2.

I hope that many in legal education will find this chart, my colleagues’ ideas, and the overall survey results to be valuable tools. And, if anyone has feedback on how to revise the chart to make it a more useful tool, please get in touch.

Communication: Written communication (3.77); Listening (3.60); Oral communication (3.58); Interpersonal skills (3.44); Answering questions succinctly (3.30); Advocacy (3.24)

Analysis: Critical reading & comprehension (3.55); Synthesizing facts & law (3.55); Legal reasoning (3.54); Issue spotting (3.43); Information integrating (3.10)

Research: Computer skills (3.28); Electronic researching (3.26); Fact gathering & evaluation (3.22)

Project Management: Paying attention to details (3.67); Using office technologies (3.56); Knowing when to go back & ask questions (3.46); Organizational skills (3.46); Working within established time constraints (3.44); Decisiveness (3.31); Consciousness of limitations (3.15); Planning & strategizing (3.13)

Professionalism: Professionalism (3.58); Judgment (3.29); Diligence (3.26)

Bar Exam Musings, Part II: Skillfully Changing the Bar Exam Narrative

There really needs to be a paradigm shift in the way the National Conference of Bar Examiners and state bar examiners approach potential reform of the exam. It should not be so novel an idea to increase the range of skills tested on the bar exam (or at least enhance the testing of existing skills) instead of increasing the number of subjects tested on the bar exam. Adding Federal Civil Procedure as the seventh subject on the MBE, as the NCBE just did this year, is not helping. An expanded MBE exacerbates the already heavy imbalance in favor of testing for content knowledge over testing for professional skills.

Granted, some skills do not lend themselves to being tested on a standardized exam, but some very well could. Has the NCBE done a careful study of the skills coverage of the Multistate Performance Test akin to its review of the subject coverage of the MBE that led to the adding of Civil Procedure? I have seen little evidence that it has.

Consider a few skill sets as examples. The vast majority of newly licensed lawyers responding to a recent job analysis survey indicated that their job requires them to investigate and gather facts. A similarly large majority indicated that their job requires them to develop strategy for client matters. The MPT is supposed to test these skill sets, but has it? My review of the last 10 years’ worth of MPT questions suggests that it has not; rather, it has focused consistently on basic legal and factual analysis to be written in the form of a memo, brief, or client letter. (Not that there’s anything wrong with that; it’s just that there is something wrong with having only that.) Moreover, among the documents that MPT examinees are told they could be asked to produce are a discovery plan and a witness examination plan, but I have never seen either assigned.

Surely, if the MBE deserved review to determine if it needed another subject, the MPT deserves review to determine how it can expand to test more skills and more often.

In the same vein, there is the question of whether and how to test legal research, which has gotten some attention and has been studied by the NCBE. Even legal writing, though a fundamental part of completing an answer to an MPT or essay question, is not really tested on its own merits.

Musings on the Bar Exam and Legal Education’s Attitude toward It

I have been studying and writing about the bar exam of late, so I appreciate the guest blogging opportunity, graciously offered by Mary Lynch, which I shall use to share some bar exam musings. Later this week, I hope to follow up with a bit more.

I noted with interest a recent New York Times feature, Is the Bar Too Low to Get into Law School? The feature offered perspectives from five legal professionals, four of whom are law professors, on how best to respond to declining bar exam passage rates. (Scores on the MBE, the anchor of the bar exam in almost every state, have declined again this year.) Two took issue with the bar exam itself, arguing for fundamental changes or its complete abolition. But Linda Sheryl Greene of the University of Wisconsin Law School argued that law schools simply need to do the work of preparing their students for the exam.

Law schools (or at least those not in the very top tier) indeed need to help their students prepare for the bar exam, but the bar exam also has to change in a way that allows law schools to do their part without the deleterious distraction of the exam’s heavy focus on recall of memorized law. Regrettably, bar exam reform efforts over the last 20 years have not focused on the one part of the exam that actually and exclusively tests lawyer competencies, requiring zero memorization of legal rules. That sadly neglected part of the exam is the performance test, which assigns a specific written lawyering task to be completed using a closed universe of factual materials and legal authorities. About one-fifth of the states do not even administer a performance test. Among states that do, the performance test remains the smallest part of the exam, accorded the least weight in scoring. It is in a very real sense the ugly step-child of the bar exam.

The behemoth of the bar exam, the MBE, compels examinees to study and memorize a copious number of legal rules. To be fair, the MBE does not test only for knowledge of law. But every skill set evaluated by the MBE—reading comprehension and legal analysis among them—is evaluated also by the performance test. The MBE’s primary value to the overall exam is psychometric—i.e., when scores on other parts of the exam are scaled to the MBE, the overall exam achieves testing reliability. A reasonable level of testing reliability can be achieved if the MBE is weighted at 40% of the overall score. (See page 13 of this article by the National Conference of Bar Examiners’ former Director of Research.) However, the NCBE recommends 50%, a recommendation that most states follow.

What of the rest of the exam? In every state, the remaining part of the score comes mostly from answers to essay questions, which, like the MBE, require memorization and recall of legal rules. If the MBE is testing knowledge of law (and creating more than enough focus on rote memorization), what reason other than inertia is there for essay questions to retain such a significant place on bar exams? Or to remain on bar exams at all? For years, essay questions were the venue for testing knowledge of state-specific law. However, most states now use the NCBE’s Multistate Essay Examination. And, as a growing number of states adopt the Uniform Bar Examination, several are employing other means outside of the bar exam, such as a required seminar, to ensure that new lawyers are familiar with unique attributes of local law.

And that takes me back to the performance test, the most valid of the testing instruments on the bar exam. The performance test was the answer from bar examiners 20 years ago to the recommendations of the MacCrate Report, which called on law schools and bar examiners to increase their attention to lawyering skills. Since then, while the MBE and essay examinations have been expanded, the performance test has remained stagnant. That needs to change. Through careful attention to the various skills today’s beginning lawyers have to perform, examiners should be able to reinvigorate the performance test and expand its skills coverage. They should also be able to increase the inadequate weight given to the performance test in scoring.

As for legal education’s attitude and approach toward the bar, I think an exam that focuses more heavily on skills through performance testing is one that would put law schools in a better position to help their students prepare. Because performance tests do not evaluate substantive knowledge of law, bar preparation specialists in law schools can easily administer performance tests from previous bar exams to students as both formative and evaluative assessments. Legal Writing professors have been using performance test-style problems for many years, especially with first-year students. Clinical professors use them, and, yes, even some doctrinal professors have too. (I compiled a list of articles discussing the use of performance test-based problems by law professors in footnote 269 of my recent article.)