Implementation of the ABA’s New Experiential Training Requirement: More Whimper Than Bang

By: Robert Kuehn, Washington University School of Law


When the ABA adopted a new experiential training requirement in 2014, there was hope it would spur law schools to significantly change the way they prepared students for legal practice. The new six-credit requirement in ABA Standard 303(a)(3) was less than the fifteen credits proposed by some educators and did not include a mandate for a law clinic or externship experience. Nonetheless, the six credits were an improvement over the ABA’s previous “substantial instruction” in professional skills requirement.[1] But data from the initial implementation of the new experiential requirement suggest its effect has been more of a whimper than the bang some hoped for, with little evidence that it has spurred law schools to expand students’ opportunities for hands-on training in professional skills.

            Law schools are required to report annually to the ABA on the number of seats simply “available” to students in law clinic and simulation courses and the number of field placement/externship positions actually “filled.”[2] Data from the first two years of the new six-credit requirement in 2019 and 2020 show no increase in the positions available to students in clinics or simulations and even a decrease in actual enrollment in field placement courses, when normalized to address fluctuations in nationwide law school enrollment. While some law schools have made important changes to their curriculum, the graph below indicates that, on average, schools have not reported positive changes in law clinic, field placement, or simulation data since the ABA’s adoption of the new experiential standard in 2014. The number of clinic seats available per J.D. student in 2014 was 0.27 and still only 0.28 in 2020; field placements decreased from 0.26 in 2014 to 0.24 in 2020; and seats available in simulations likewise decreased over the six-year period from 1.22 to 1.12 per student.


  Source: ABA 509 Required Disclosures at http://www.abarequireddisclosures.org/Disclosure509.aspx
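
The per-student figures above are simple ratios of reported seats to total J.D. enrollment. As a minimal sketch of that normalization (the counts below are illustrative placeholders, not actual 509 data), the calculation looks like this:

```python
# Minimal sketch (not the author's analysis): normalizing ABA 509 seat counts
# by total J.D. enrollment to get "seats per J.D. student," as in the graph above.
# All numbers here are hypothetical placeholders, not actual 509 data.

seats_reported = {
    "law clinic seats available": 31_000,
    "field placements filled": 27_000,
    "simulation seats available": 125_000,
}
total_jd_enrollment = 112_000  # hypothetical nationwide J.D. enrollment

for category, seats in seats_reported.items():
    per_student = seats / total_jd_enrollment
    print(f"{category}: {per_student:.2f} per J.D. student")
```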

            The New York Court of Appeals followed the ABA in 2015 with its own new skills competency standard for bar candidates, proclaiming that “the goal of ensuring effective, ethical and responsible legal services in New York requires more than what the new ABA Standards provide.”[3] Commentators on the proposed New York standard argued it simply mirrored the ABA’s requirement, with some additional paperwork, and would not improve the skills training of students. The graph below shows that the New York competency standard indeed does not appear to have spurred New York’s law schools to noticeably enhance their professional skills training or to provide more training than schools in states following only the ABA requirement. Although students at New York schools were offered more opportunities to enroll in simulation courses, which lack the supervised experience of handling the complexities of real-life clients, opportunities to participate in a law clinic were unchanged and field placements decreased.


Source: ABA 509 Required Disclosures for 15 New York law schools

            Data from the Center for the Study of Applied Legal Education’s (CSALE) 2019-20 Survey of Applied Legal Education, completed by 95% of law schools, also show little measurable effect from the new experiential training standard.[4] Schools reported no increase in the median number of law clinic courses offered to their students since the prior 2016-17 survey and no increase in the percentage of students who graduate with a law clinic experience. Similarly, there was no reported increase in the percentage of students who graduate with an externship experience; student demand for externship courses in 2019-20 was up slightly from the last survey yet significantly less than externship demand in 2014, when the new standard was adopted. And the percentage of schools requiring each J.D. student to participate in a law clinic or externship course as a condition of graduation increased only marginally, from 20% in 2016-17 to 23% in the 2019-20 survey.

            Two-thirds of schools in the CSALE survey did report making some changes to their curriculum in response to the ABA’s new experiential requirement, with almost half reporting the addition of a new law clinic, field placement, or simulation course(s), and one-quarter of schools reporting increased slots available in an existing experiential course(s). A 2018 survey by Allison Korn and Laila Hlass also found that about two-thirds of schools reported an expansion or enhancement of their course offerings in light of the ABA’s new experiential course requirement.[5]

            In both surveys, however, significant numbers of schools simply restructured existing courses to meet the experiential training definition, including merely relabeling parts of the first-year required legal writing course as “experiential” or offering a one-credit simulation component to a doctrinal course. Because the survey questions did not ask separately about law clinic and externship courses but grouped them with non-clinical simulation courses, the data do not reveal if legal education has increased live-client clinic or externship opportunities for students or simply adjusted to the new requirement in other ways. In the 2019-20 CSALE survey, there was a slight increase of approximately 5% in the reported percentage of students that participated in a law clinic or externship prior to graduation. But fewer than 20% of schools attributed any increase in clinic or externship demand to the new ABA requirement.

            To the extent the ABA’s new six-credit experiential requirement was intended to provide law students with more meaningful hands-on training in important professional skills, its own data do not show that intended result. In addition, surveys of schools on their implementation of the new training requirement do not show significant gains in skills training as a result of the new accreditation standard.

            It is time for the ABA to address these deficiencies by, at a minimum, requiring schools to report actual enrollments in law clinic and simulation courses, so that the ABA can truly judge the effect of its requirement and prospective applicants to law schools will not continue to be potentially deceived by reports of ethereal “available” law clinic opportunities.[6]

            Yet students, and the clients they will soon represent in practice, deserve more than just enhanced reporting requirements. The ABA’s six-credit experiential requirement remains far below the skills training other professional schools require of their students.[7] Two recent studies on legal education have highlighted the need for greatly enhanced skills training, including mandatory clinical training prior to bar licensing.[8] The ABA should heed these calls for reform and revisit the proposals for fifteen credits of experiential coursework and a mandatory, live-client clinical experience for all J.D. students.


[1] An ABA memorandum explained that “substantial instruction” equaled only one credit of lawyering skills instruction, which could be in a simulation course. Peter A. Joy, The Uneasy History of Experiential Education in U.S. Law Schools, 122 Dick. L. Rev. 551, 574 (2018), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3129111.

[2] Prior to 2017, the ABA also required schools to report the actual number of students enrolled in law clinic and simulation courses, not just seats available. However, the ABA determined that asking schools to report actual enrollment, when the accreditation standard only requires “substantial opportunities,” was unnecessarily burdensome and now only requires schools to report the number of clinic and simulation opportunities that are potentially available to students.

[3] New York Court of Appeals, New Skills Competency Requirement for Admission to the Bar (Dec. 16, 2015), at http://www.courts.state.ny.us/ctapps/news/nottobar/nottobar121615.pdf; Task Force on Experiential Learning and Admission to the Bar: Report to Chief Judge Lippman and the New York Court of Appeals 3 (Nov. 2015), at http://ww2.nycourts.gov/sites/default/files/document/files/2018-05/Experiential-Learning-Admiss2Bar-Report122015.pdf.

[4] Center for the Study of Applied Legal Education, 2019-20 Survey of Applied Legal Education (2020), available at https://www.csale.org/#results.

[5] Allison Korn & Laila L. Hlass, Assessing the Experiential (R)Evolution, 65 Villanova L. Rev. 713, 731-33 (2020), available at https://digitalcommons.law.villanova.edu/vlr/vol65/iss4/1/.

[6] One school with approximately 300 students per class claimed in its 2018 509 Required Disclosure to prospective applicants that over 1,500 seats were available to students in its law clinics. Another school with a class of 100 reported over 300 clinic positions available, yet only 50 students actually enrolled in those purportedly available positions.

[7] See Robert R. Kuehn, Pricing Clinical Legal Education, 92 Denv. U. L. Rev. 1, App. A (2014) (documenting one-quarter to one-third required credits in skills training for other professional schools), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2318042.

[8] Deborah Jones Merritt & Logan Cornett, Building a Better Bar 75-76 (2020), available at  https://iaals.du.edu/sites/default/files/documents/publications/building_a_better_bar.pdf; Joan W. Howarth & Judith Welch Wegner, Ringing Changes: Systems Thinking About Legal Licensing, 13 Fla. Int’l L. Rev. 383, 430-31 (2019), available at https://scholars.law.unlv.edu/cgi/viewcontent.cgi?article=2220&context=facpub.


The Imperative and Power of Empirical Research

By Anahid Gharakhanian, Southwestern Law School

Allison Korn and Laila L. Hlass’s Assessing the Experiential (R)evolution, recently published in the Villanova Law Review, should be celebrated as a much-needed example of empirical investigation and analysis in legal education, specifically experiential education. As aptly noted in the Experiential Education section of Building on Best Practices, “[l]egal education urgently needs empirical research on what methods will best promote deep learning that transfers to practice.”

For many years, the experiential teaching community has had the benefit of the triennial CSALE Study, providing extensive data about the infrastructure of clinics and externships.  Now Korn & Hlass’s empirical work provides data about the proliferation of deans/directors of experiential education and growth in experiential curricula.  These data set the stage for the important questions they raise about what law schools are doing: whether they are “working to uplift experiential programming as an essential part of the institution” and as “core to the law school curriculum”; whether they are “taking steps to identify, recruit, and support clinicians of color”; and whether they are ensuring security of position and voice in law school governance.  Korn & Hlass’s work, along with CSALE’s compilation of data since 2007 about applied legal education, serves as an essential foundation for posing these important questions. It also joins the clarion call of others that rigorous empirical research is critical in every aspect of our assessment and advancement of experiential education: the students’ learning, the role of experiential curricula, and diversity of and equity for experiential faculty.

I think about the critical importance of empirical work from the vantage point of externships, or field placement courses, which provide a unique bridge to practice and where so much of the student’s experience occurs outside of the classroom and the externship professor’s direct observation.  Anecdotally, we know that these real-world experiences are very important to a student’s professional development and practice readiness as a new attorney.  At the same time, the ABA and some in legal education have worried about educational rigor outside of the law school setting.  What’s needed is exploration of our impressions and perceptions through rigorous empirical work.  In the world of externships, this translated into research questions that Carolyn Young Larmore of Chapman University’s Fowler School of Law and I took up in a year-long, multi-school study assessing students’ success at externships and the factors contributing to it (involving three law schools in the same geographic area, with very different externship program designs and widely different incoming credentials, with 2019 median LSATs of 153, 158, and 168).  The study yielded helpful information about the importance of externships to practice readiness.  A notable finding of our study, related to access, was that students from all three surveyed schools achieved very similar levels of externship success (measured in terms of first-year practice readiness), regardless of widely different entering credentials as well as the academic component of the externship programs.  Similarly, the study found that law school GPA plays a very limited role in predicting externship success.  These data could be a powerful tool in creating access for law students, from many diverse academic backgrounds and schools, to career-shaping professional experiences while in law school and beyond.

As we tackle empirical questions in experiential education, it’s helpful to think about backward design.  In the case of the experiential programming that we offer to our students, a couple of recent national studies are enormously helpful: IAALS’s Foundations for Practice, a relatively recent national study about what foundations entry-level attorneys need to begin a successful legal career (which is the study that Carolyn and I used to define externship success in our own study – i.e., how close are externs by the end of their externship to first-year practice readiness); and the very recent study by IAALS and Professor Deborah Jones Merritt, Building a Better Bar: Capturing Minimum Competence, with one of its two objectives to “[d]evelop and promote understanding of the minimum competence needed to practice law” (and the second one to “[a]lign the bar exam with research-based concepts of minimum competence”). 

To borrow from IAALS and Professor Merritt, the key here is being guided by research-based concepts.  Whether assessing our students’ learning (as Carolyn and I tackled in our externship study), or raising questions about the role of experiential curricula, and diversity of and equity for experiential faculty – as Korn & Hlass have done – we need to engage in more empirical research and use this powerful tool to inform and advance the critical work of experiential education and educators.

Virtual Practice/Virtual Externships: Past, Present and Future

By: Dena Bauman, UC Davis; Gillian Dutton, Seattle University; Kendall Kerew, Georgia State; Chipo C. Nyambuya, Loyola Chicago; and Amy Sankaran, University of Michigan

When we proposed a concurrent session focused on remote externships in November 2019 for the March 2020 Externships 10 Conference, we expected to be exploring a novel field placement arrangement, one in which students work in a different location than their supervisors and/or remote from the physical placement site.  A few months later, at the very time the conference was originally scheduled, we found ourselves in a global pandemic. We were living in the environment we had planned to explore.

Overnight, our “novel” work arrangement became reality: students were working remotely from both the law school and from their field placements and supervisors. We had planned to “model” a Zoom supervision meeting for panel attendees; instead, virtually all law school faculty, law students, and field supervisors were suddenly relying almost exclusively on Zoom and other video conferencing platforms for classes, work at field placements, and site visits.

Survey on Working Remotely and “Best Practices” Template:

Per our November 2019 proposal, we planned to survey the externship community about their current practices and rationales regarding remote placements. We also planned to create and distribute a template to use in evaluating and making decisions about virtual externship sites and advising students. We are linking to the template.

We distributed a revised survey through the national externship listserv in September 2020 that took a “snapshot” of practices before and during COVID. It also asked externship clinicians to consider what they might do post-COVID.[1] We used the 81 responses to create a template for assessing potential virtual placements and distributed the template at our rescheduled panel on October 23, 2020, along with a brief summary of the survey.  We used these three definitions:

  1. Traditional remote placement: Student and site supervisor physically present at the site, but remote from the faculty member.
  2. Remote supervisor placement: Student physically at the site but separate from the supervisor.
  3. Virtual remote placement: Student physically separate from both the site and supervisor.

We are including several tables summarizing the primary survey findings. The complete data report is linked here.

Pre-Pandemic: Few Schools Allowed Virtual Remote Placements but Many Allowed Traditional Remote Placements

Type of Placement Allowed | Percentage | Number
Virtual remote | 10.78% | 11
Traditional remote | 60.78% | 62
Remote supervisor | 11.76% | 12
None of the above | 16.67% | 17

Reason for not permitting remote placements | Percentage of Schools | Number of Schools
Students would not have the same immersive experience | 30.59% | 52
Students would not get the same or complete learning opportunities | 36.47% | 45
Supervisors would not give adequate feedback | 17.06% | 29

Reasons for allowing remote placements | Percentage of Schools | Number of Schools
Benefit part-time/evening students | 19.44% | 7
Supervisors work in a different location | 16.67% | 6
Evolution of law practice | 16.67% | 6
Students want to work in virtual environments | 13.89% | 5

During the Pandemic (as of the date of the survey): No Schools are Barring Remote and/or Virtual Placements

In response to the pandemic, we asked schools whether they were permitting a traditional remote placement and/or a virtual remote placement. Of the 81 respondents, 80 schools permit virtual remote placements, and 69 schools are permitting students to work apart from the faculty member while on site with the supervisor present.

Another question asked why schools were permitting these new arrangements. The top three reasons were:

Reason | Percentage of Schools | Number of Schools
Externship sites are closed or ask students to work remotely | 45.65% | 63
Students asking to work remotely for caregiving or other reasons | 31.16% | 43
Law school requirements | 10.14% | 14

Law school comments included the need for students to complete graduation requirements, confidence that students are getting adequate supervision and experience in a new work environment, distance and traffic, and public health and safety concerns.

Nearly three-quarters of the law schools said the changes have been both beneficial and problematic. As the pandemic continues through the fall, likely into the spring, and perhaps beyond, we will continue to learn much more about those changes.

Post-Pandemic: Training will be a Priority in Permitting Virtual Placements

Schools predict that training requirements for law students and field supervisors will be an important condition for considering virtual placements after the pandemic has passed.

Requirement | Percentage of Schools | Number of Schools
Externship program will train students in remote work | 22.73% | 50
Sites must demonstrate training for attorneys and students | 12.27% | 27
Supervisors must agree on training from externship programs | 11.36% | 25

Faculty Anticipate a Case-by-Case Approach

Factors that schools would consider in assessing virtual and remote placements include family, health, financial, and geographical circumstances, as well as the availability of the experience.  Schools also noted that it is too early to know if and how law practice will change, but that this consideration is important in preparing our students for their careers. We also realized that we need to scrutinize and analyze this nearly overnight reliance on technology in practice. Our presentation discussed these issues, such as access to technology and roles at home.  We will discuss those issues, and how they will affect our preparation of students for their post-graduate careers, in another posting.

Summary: We believe our survey is a good starting point for future assessments, even as the pandemic continues to upend legal education and law practice.  Our template, which was favorably received at our panel, will help externship clinicians make decisions about placements and students, gather reasonably consistent information, and share it with the community to build a record of useful information.  Post-pandemic, we will depend on each other to share information about what we are contemplating and implementing regarding virtual placements.


[1] Thanks to Inga Laurent, Gonzaga University School of Law, Theodora Pina, Santa Clara University School of Law, Sue Schechter, UC Berkeley School of Law, and June Tai, University of Iowa College of Law, for their review and suggestions regarding the survey instrument.

Refuting the False Trope on Clinical Courses and Bar Passage

Robert Kuehn, Washington University School of Law

It has been observed that “the fewer the facts, the stronger the opinion.” Until recently, this could be said about the possible influence of enrollment in clinical courses on a student’s likelihood of passing the bar examination. While there has been a shortage of empirical studies on any possible relationship, there has been no shortage of opinions on how taking those courses might be harmful, opinions often reflected in graduation restrictions on clinical courses and in requirements for bar subject-matter courses.

But there are now significantly more facts to refute those opinions. Two recent, large-scale studies have both found no relationship between the number of law clinic or externship courses or credits a law graduate took and her likelihood of passing the bar exam.

In a forthcoming article in the Journal of Legal Education, academic records of ten years of law school graduates of Washington University in St. Louis and Wayne State University were reviewed for any relationship between the number of law clinic, externship, or, more generally, experiential courses or credits and bar passage. After first accounting for the possible influence of law school grades on bar passage (the most significant predictor of bar success), the study found no correlation at either school between law clinic or externship enrollment and bar passage — no relationship between participation in a law clinic or externship and passage, none between the number of clinical courses and passage, none between the number of clinical credits and passage, and no evidence that students graduating with lower GPAs were disproportionately enrolling in those courses as a way to avoid doctrinal courses (another not uncommon trope). This lack of any relationship was in spite of increased enrollment in experiential courses at both schools over the ten-year period and decreased enrollment in courses teaching material tested on the bar (referred to as bar subject-matter courses).
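
The article describes this analysis only in general terms. As a rough illustration of what “accounting for the possible influence of law school grades” can look like, the sketch below fits a logistic regression of bar passage on GPA and clinical credits; the dataset, variable names, and model choice are assumptions for illustration only, not the study’s actual methodology or records.

```python
# Illustrative sketch only: a logistic regression testing whether clinical
# credits relate to bar passage once law school GPA is controlled for.
# The dataset and column names are hypothetical, not the study's data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "passed_bar":     [1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1],
    "law_gpa":        [3.6, 3.1, 2.9, 3.8, 3.2, 3.0, 3.5, 2.7, 3.4, 3.3, 2.8, 3.2],
    "clinic_credits": [6, 0, 3, 9, 0, 6, 3, 6, 0, 9, 3, 6],
})

# If the clinic_credits coefficient is indistinguishable from zero once GPA
# is in the model, the data show no independent relationship between
# clinical coursework and bar passage.
model = smf.logit("passed_bar ~ law_gpa + clinic_credits", data=df).fit(disp=0)
print(model.summary())
```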

The article notes that nationwide data on experiential course enrollment and bar passage also belie any claim the two are related. That data indicate that as enrollment in experiential courses was increasing from 2006-2016, bar passage percentages were fairly steady and that the recent decline in passage coincided with decreased, not increased, enrollment in those courses.

A recent study commissioned by the California State Bar found a similar lack of relationship between law clinic and externship courses and bar exam performance. The study reviewed law school coursework and performance on three July exams for over 7,500 bar applicants from eleven California schools. It found no relationship between the number of academic credits from law clinic courses and exam performance, either across all schools or even when reviewing schools separately. Similarly, there was no relationship between the number of externship or internship credits and performance, again when examined across all schools or within schools. The broad range of entering credentials at the eleven schools, and lack of a relationship even within those schools, indicates that the results should be applicable to most law schools, including those with lower LSATs and undergraduate GPAs for entering students.

The study results from Washington University/Wayne State and the California State Bar are similar to smaller studies at Texas Tech and the University of Denver that also reported no statistically significant relationship between enrollment in a law clinic or externship course and bar passage.

            The Washington University/Wayne State and California State Bar studies further revealed that opinions about the value of bar subject-matter courses should be moderated. There were small correlations at both Washington University and Wayne State between the number of bar subject courses taken and bar passage. But this result (explaining less than 5% of the variability in bar outcomes) held only for low-performing students, and additional courses showed no marginal benefit once students took the school’s average number of bar courses.

The California State Bar study focused on whether taking a specific course was related to performance on the bar exam topic taught in those courses. It found that neither attendance nor performance in courses covering any of the 13 bar-related topics was related to performance on the corresponding California bar exam or Multistate Bar Exam content covering that subject.

            Studies at other schools likewise indicate that enrollment in bar subject-related courses does not support broad claims about the benefit of taking those courses.

It is time to put away the misinformed trope of participation in law clinic and externship courses harming a student’s chances of passing the bar exam and let the facts do the talking. Law schools should recognize and students should be told they can obtain valuable preparation for the practice of law by enrolling in clinical courses without affecting their likelihood of passing the bar exam.
