Let’s Take This Period of Unprecedented Change to Consider How Grading Practices Can Affect Issues of Diversity and Inclusion in Our Law Schools

Jennifer S. Bard, Visiting Professor, University of Florida, Levin College of Law

For the last half of the spring 2020 semester, law schools all over the country were forced to change their methods of instruction, delivery of final exams, and (in many cases) grading practices because of the demands for physical isolation following the outbreak of Covid-19. Now that the semester is over, there is a further round of disruption: many states have delayed or even cancelled their bar exams, some have granted graduates diploma privileges, and others bravely go ahead in the face of the possibility that they will have to cancel at the last minute because of ever-rising rates of infection.

Like the opportunities that may arise when a river is drained and a ship revealed, there may never again be such an opportunity for us to consider what role we play in the glacially slow diversification of the legal profession and how we can make our law schools more equitable, inclusive, challenging, and effective for all of our students—not just those for whom it has been particularly well suited.

With many things to choose from, my starting point for looking at things we rarely question is the marrow-deep belief that we owe it to our students to sort them for the benefit of large law firms, even when our employment profiles show that very few of our students will ever work at such a place. Since the threshold for this opportunity is a top 5 or perhaps 10 percent class rank, it may seem odd, on reflection, that we have built a curriculum designed to compare students, a design that may have many undesirable consequences, including undermining self-esteem, discouraging learning for learning's sake, and contributing to the lack of diversity in the legal profession.

Over the years, other justifications have been added, such as the need to motivate students or assess their progress, but never have we had such a good opportunity to see what law school is like without grades or, more to the point, comparative curves.

Here Are Some Practices We Might Question

The Primacy of First Semester Grades

One result of the decision to go pass/fail (or some variation of those words) was to “freeze” first-year, first-semester class ranks because it was impossible to produce comparative curves.

The resulting phenomenon gives us a chance to ask ourselves some tough questions:

1. Do First Semester Grades Reflect What Students Bring to Law School Rather Than What We Bring to Them? OR Do Students Who Come in Knowing the Rules Get Better First Semester Grades?

Many students, very often first-generation students, but also some facing discrimination based on race, gender identity, or gender expression, tell us (and the many researchers who study first-generation college students) some version of “everyone knew the game but me, and by the time I figured it out, it was too late.” And while students living with disabilities may belong to any of these groups, they are also often using new adaptive equipment and certainly facing new challenges that they may have been able to mitigate in college.

Certainly many of our students do know the game from the start. The recent AALS survey “Before the JD” found that a disproportionate number of students who ended up going to law school had parents who were either lawyers or other professionals. And while students have themselves created organizations to support each other, usually with the enthusiastic support of the law school, that may not be enough.

Our challenge going forward is that history is told by the victors. We can see the students who were not comfortable the first semester but then went on to graduate “at the top of their class” (a vague term that usually means somewhere in the top 20%), but we don’t hear from the ones who didn’t draw attention through academic distress but also didn’t thrive.

It would be helpful to know more, and many schools do know more about their own students. But very little of this information is published.

Much is being done in supplemental programs (to name them is to leave many out), such as pre-first-semester programs, orientation programs, and excellent pre-law institutes like the Tennessee Institute for Pre-Law, and in wonderful conferences organized by the National Black Law Students Association, Lavender Law, the National Association of Law Students with Disabilities, and so many others.

But how much more effective would it be to have a curriculum that was truly equitable and inclusive, all the way through?

2. Did Pass/Fail Grading Help Learning, Hinder Learning, or None of the Above?

Across-the-board pass/fail grading that makes no effort to compare students to each other is so unusual that any observations are worth considering. The expectation was a distressing list of bad results: students putting in less effort during class, performing worse on exams. But did that really happen?

3. Ditto Open Book Exams

As above, it would be interesting to test, in the fall, the content knowledge of students who took open-book exams. Not so much to compare them with past classes, but to see how much they learned.

4. What Will Be the Long Term Effect of the Delayed or Cancelled Bar Exams–and How Might that Change Our Curriculums?

The opportunity presented by the necessary changes to the bar exam is already in very good hands (thank you, AccessLex), but it’s still worth considering what the future will look like in states that choose provisional or full licensure. Even decisions to delay the bar exam could raise the issue of an ongoing, career-long licensing process, much as many doctors (but not all) must take requalifying exams every ten years to retain their board certification. What would that mean for law schools?

To Be Continued: Part II: What Can We Learn from the Delay of Fall On-Campus Interviewing?   

Refuting the False Trope on Clinical Courses and Bar Passage

Robert Kuehn, Washington University School of Law

It has been observed that “the fewer the facts, the stronger the opinion.” Until recently, this could be said about the possible influence of enrollment in clinical courses on a student’s likelihood of passing the bar examination. While there was a shortage of empirical studies on any possible relationship, there have been plenty of opinions on how taking those courses might be harmful, opinions often reflected in graduation restrictions on clinical courses and requirements for bar subject-matter courses.

But, there are now significantly more facts to refute those opinions. Two recent, large-scale studies have both found no relationship between the number of law clinic or externship courses or credits a law graduate took and her likelihood of passing the bar exam.

In a forthcoming article in the Journal of Legal Education, academic records of ten years of law school graduates of Washington University in St. Louis and Wayne State University were reviewed for any relationship between the number of law clinic, externship, or, more generally, experiential courses or credits and bar passage. After first accounting for the possible influence of law school grades on bar passage (the most significant predictor of bar success), the study found no correlation at either school between law clinic or externship enrollment and bar passage — no relationship between participation in a law clinic or externship and passage, none between the number of clinical courses and passage, none between the number of clinical credits and passage, and no evidence that students graduating with lower GPAs were disproportionately enrolling in those courses as a way to avoid doctrinal courses (another not uncommon trope). This lack of any relationship was in spite of increased enrollment in experiential courses at both schools over the ten-year period and decreased enrollment in courses teaching material tested on the bar (referred to as bar subject-matter courses).

The article notes that nationwide data on experiential course enrollment and bar passage also belie any claim the two are related. That data indicate that as enrollment in experiential courses was increasing from 2006-2016, bar passage percentages were fairly steady and that the recent decline in passage coincided with decreased, not increased, enrollment in those courses.

A recent study commissioned by the California State Bar found a similar lack of relationship between law clinic and externship courses and bar exam performance. The study reviewed law school coursework and performance on three July exams for over 7,500 bar applicants from eleven California schools. It found no relationship between the number of academic credits from law clinic courses and exam performance, either across all schools or even when reviewing schools separately. Similarly, there was no relationship between the number of externship or internship credits and performance, again when examined across all schools or within schools. The broad range of entering credentials at the eleven schools, and lack of a relationship even within those schools, indicates that the results should be applicable to most law schools, including those with lower LSATs and undergraduate GPAs for entering students.

The study results from Washington University/Wayne State and the California State Bar are similar to smaller studies at Texas Tech and the University of Denver that also reported no statistically significant relationship between enrollment in a law clinic or externship course and bar passage.

The Washington University/Wayne State and California State Bar studies further revealed that opinions about the value of bar subject-matter courses should be moderated. There were small correlations at both schools between the number of bar subject courses and bar passage. But this result (explaining less than 5% of the variability in bar outcomes) was only for low performing students and additional courses showed no marginal benefit once students took the school’s average number of bar courses.

The California State Bar study focused on whether taking a specific course was related to performance on the bar exam topic taught in those courses. It found that neither attendance nor performance in courses covering any of the 13 bar-related topics was related to performance on the corresponding California bar exam or Multistate Bar Exam content covering that subject.

Studies at other schools also indicate that enrollment in bar subject-related courses does not support broad claims about the benefit of taking those courses.

It is time to put away the misinformed trope of participation in law clinic and externship courses harming a student’s chances of passing the bar exam and let the facts do the talking. Law schools should recognize and students should be told they can obtain valuable preparation for the practice of law by enrolling in clinical courses without affecting their likelihood of passing the bar exam.

An Overview of “A Study of the Relationship Between Law School Coursework and Bar Exam Outcomes”

Robert Kuehn, Professor of Law at Washington University School of Law in St. Louis, and David Moss, Associate Clinical Professor at Wayne State University Law School, recently conducted a large-scale study looking at the relationship between clinical/experiential or bar subject-matter courses and bar passage success in a paper entitled, “A Study of the Relationship Between Law School Coursework and Bar Exam Outcomes.” As a law student currently enrolled in a clinic, this study immediately piqued my interest.

This study was a response to the fear that bar passage rates were down because of rising enrollment in “experiential courses” as opposed to “bar-subject courses.” Law schools began pushing students to enroll in more bar-subject courses to correct this so-called issue. However, Professor Kuehn and Professor Moss observed that there was no evidence to suggest that taking more bar-subject courses was appropriate advice for all students. Their study examined this missing evidence on the relationship between bar-subject courses, experiential learning, and bar exam outcomes over ten years at two law schools: Washington University School of Law (WashU) and Wayne State University Law School (Wayne State). Both schools require only the designated first-year courses and the upper-level writing courses mandated by ABA accreditation standards.

Previous studies performed in states like Texas, Colorado, and California looked at the effects of coursework on bar passage rates. These studies did not support the claim that taking more bar-tested law school courses improves the chances of passing on the first attempt. Notably, a study done in Indiana concluded, “simply forcing lower-performing students to take more upper division bar-subject courses will not solve the bar examination failure problem.”

The first goal of the present study was to determine whether a graduate’s enrollment in elective experiential courses was related to first-time bar passage success. Next, it was to assess whether enrollment in elective courses that cover bar subjects was related to bar success.

Data was collected from law school graduates from 2006-2015. The following table outlines the number of graduates with LSAT scores and bar passage rates between the two schools:

The next table looked at graduate characteristics such as undergraduate GPA, LSAT score, 1L GPA, and law GPA and their correlation with bar passage:

It wasn’t until 2005 that the ABA began requiring graduates to receive professional skills instruction with as little as one credit satisfying the requirement. In 2014, the ABA changed this to require six credits beginning with 2019 graduates. The study authors decided to track enrollment in skills courses versus bar passage over this time period.

The table above shows, in the solid line, that average bar passage percentages were steady from 2006-2013, the period in which experiential course enrollment increased by over 50%. “Therefore, efforts to link declining nationwide bar passage rates to the rise in experiential course enrollment are not supported by national statistics.” A more likely contributing cause of bar passage declines since 2014 is the weaker credentials of incoming 1Ls.

At WashU, the study found that while taking at least the average number of bar courses is associated with an increased likelihood of passing the bar, there was no statistically significant increase in bar passage for bottom-quartile LGPA graduates who took more than the school’s average. This was similar for graduates in the bottom half of their class at Wayne State. Results for both schools indicate that graduates in the bottom quartile who took fewer than the average number of bar courses at their school faced a significantly increased rate of bar failure. Further, at both schools, students entering with LSAT scores lower than 150 had pass rates significantly below the school’s average.

This study concluded that the claim that the dramatic decline in bar passage rates is due to law students taking more experiential courses or fewer bar-related courses is not supported. It characterized efforts to cap experiential credits in order to improve bar passage rates as “misguided,” warning that schools should not expect that “mere exposure” to more bar courses will significantly improve bar passage rates.

Also see the “Legal Skills Prof Blog” and “TaxProf Blog” for more posts on this study.
