Leading Edge Conference: Facing and Forming Legal Education’s Future with Insights, Data and Inclusive Thinking

Last week, I was fortunate to attend the 6th annual Leading Edge Conference hosted by Wolters Kluwer (WK) in Riverwoods, Illinois. It was my first experience with this particular conference. Using an unconference format and with a balance of old-timers and new attendees, WK brought together approximately 30 “thought leaders” for two-plus days of intense discussion. Participants included professors and deans from a wide variety of law schools; representatives from law-related entities such as LSAC, NITA, and IAALS; education- or pro bono-related entrepreneurs; and digitalization pioneers.

In addition to the conference, WK hosts the Leading Edge Webinar Series and just announced its 2nd annual Leading Edge Prize: $10,000 will be awarded to two winning teams “to help implement their visions of improving student outcomes or expanding educational opportunities for law students.” Proposals are due August 15th.

I left the conference with many “take-aways” that I am only beginning to digest, and with a better sense of the continuing challenges facing legal education and our profession. Bernard A. Burk, Jerome M. Organ, and Emma B. Rasiel recently published “Competitive Coping Strategies in the American Legal Academy: An Empirical Study” in the Nevada Law Review. Their research examined the response of law schools “to the substantial fall off in both the number and the conventional qualifications of applicants to law school that began after 2010.”

The “Competitive Coping Strategies” research also explains why more law schools have not closed and emphasizes the “widened distance” between current students’ needs and current school resources. The study found that in the face of plunging applications to law school, “Reputationally stronger schools” generally chose to preserve their entering Class Profile. This meant “thousands of viable candidates remained available to other law schools, effectively preventing the closing of as many as twenty Reputationally Weaker schools.”

Second, the study points out the implications of shrinking Class Size and discounting Tuition to preserve entering class profile. “As a practical matter, then, law schools ‘invested’ in Profile rather than in expanding their faculties, facilities or their access to clinical and experiential education. We encourage discussion of the implications of this investment choice.”

Third, the study noted that “some Reputationally Weaker law schools perversely were able to maintain or raise their average Net Tuition” and “the students with the least promising prospects for obtaining or making any economically sustainable use of their law degrees are paying the highest prices to obtain them. These inequalities expanded significantly after 2010.”

Fourth, the study highlights the millions of dollars in forgone Tuition Revenue “unavailable to meet the needs of students who at many law schools are significantly less prepared” than their predecessors and suggests this widening gap underlies the declining Bar Exam pass rate.

We seem to have reached a plateau in declining admissions to law school. But that plateau is not a place for us to settle in and rest. There are too many hard questions about where we are now.

How do we address the inequalities that have expanded in law schools since 2010? What value do we provide to those with the “least promising prospects”? Is it immoral that those least likely to make “any economically sustainable use of their law degrees” are “paying the highest prices to obtain them,” or that they may be taking on crippling debt to obtain a law degree?

On the other hand, if we narrow the pathway into law school even further, rejecting anyone who comes to us less credentialed or less prepared, will we be rejecting the dreams and hopes of those who desire a professional pathway? Will we be rejecting many who would find an economically sustainable and good life for themselves? Will we be playing God with students from less advantaged backgrounds just because we don’t know who will make it and who won’t? Will we be eliminating first-generation students in larger numbers? Will we be amplifying the lack of diversity in our profession?

And what about the role of law schools in the community at large at this moment in our nation’s fledgling history? Shouldn’t we continue to exist as community laboratories which encourage civil discussion, uphold the rule of law, critique unjust legal systems and decisions, work to sustain democratic institutions and constitutional checks and balances, and produce new ideas about the role of law and legal systems in society?

Finally, if we espouse the “public good” values of my last two paragraphs as arguments for the continued existence of the legal academy and law schools, then do we prioritize these values in our faculty hiring, our strategic plans, and our allocation of resources?

So, I leave you as I left the conference, with more questions than answers, but with a firm sense that we must continue to ask these important questions.

(Note: the author had her lodging, food and flights paid. She was not paid to write or post anything about the conference. Besides, she is pretty opinionated and not easily swayed.)

Dean Gerken’s Vision Versus Malcolm Gladwell’s Experience

“When we decide who is smart enough to be a lawyer, we use a stopwatch.” – Malcolm Gladwell

“Law school should be a time to luxuriate in ideas, to test their principles, and to think critically about the law and the profession.” – Dean Heather Gerken

On the same day I listened to Malcolm Gladwell’s fascinating podcast about the LSAT and test-taking speed, I also read Yale Dean Heather Gerken’s insightful commentary, “Resisting the Theory/Practice Divide: Why the ‘Theory School’ Is Ambitious About Practice.” Both are wonderful. Together, they shine light on a dialectical tension within legal education.

Dean Gerken’s article inspires us to think about legal education in its biggest and broadest sense. She posits that “[a]t its best, a J.D. is a thinking degree, a problem-solving degree, a leadership degree,” and she notes that for students, “law school should be a time to luxuriate in ideas, to test their principles, and to think critically about the law and the profession.”

She envisions law school as a place where students engage in deep critical thinking about the law and the profession, both in the classroom and in clinics, and she discusses the interdependent relationship between the deep learning that should occur in both settings.

Dean Gerken eschews a mechanistic approach to both classroom and clinical teaching. She points out that, as doctrinal and clinical faculty, our collective and symbiotic goal should be to train our students to read closely and to think deeply, skeptically, and critically. She notes that we should help our students learn to question legal rules and principles in the context of messy facts, to challenge existing rules, and to develop new rules or new applications of them, or, as Dean Gerken puts it, to spend as much time thinking about “the ought as the is.”

Contrast Dean Gerken’s understanding of legal education with Gladwell’s podcast about his experience taking the LSAT. In it, he posits: “when we decide who is smart enough to be a lawyer, we use a stopwatch.” He notes that who gets into law school, and which law school they get into, rests largely on LSAT score differences – differences that may depend in part upon one’s ability to answer questions quickly rather than thoughtfully.

Gladwell recounts his experience with an LSAT test prep coach who urged him not to read the passages closely because he had no time to do that.  Amazed, Gladwell reflects on how, to get the best score, he must not spend time truly thinking about the issues raised by the problems he must answer.

In the podcast, Gladwell talks to Professor Bill Henderson, the author of a seminal article providing empirical evidence that test-taking speed is an independent variable in both the LSAT and timed law school exams.  Henderson, a former firefighter, talks about the times in his life he felt most time pressured.  As Gladwell remarks, Professor Henderson’s most time-pressured performances were not when responding to life-threatening emergencies.  Instead, they were when he took the LSAT and law school exams.

Gladwell’s podcast meanders into the world of championship chess. Gladwell analogizes how the chess world decided not to value speed, and how that decision changed who becomes a top-ranked international chess champion. He notes that the arbitrary value placed upon speed when it comes to the LSAT and law school exams defines who we consider smart. He wonders what would happen if the ability to answer questions quickly were not in the mix. The podcast then returns to Professor Henderson, who explains how allowing law students more time to take law school exams can change who earns the best grades in a law school class, and hence who comes to think of themselves as a smart person, who gets hired by top law firms, and so on.

As I listened to Gladwell’s podcast, I thought about the bar exam. In an article Professors Chomsky, Kaufman, and I wrote, it took us nearly 500 words to deconstruct the analytical process one must go through to answer a single torts multiple-choice bar exam question. That analytical process begins only after examinees read the question. Bar examinees have approximately 1.8 minutes to read and answer each of the exam’s 200 multiple-choice questions.

While quickly identifying the correct response is perhaps a necessary skill for some litigators, speedy answers to legal problems are not the cornerstone of most good lawyering. Yet, starting with the LSAT, continuing in law school, and ending with the bar exam, as Gladwell observes, we reward the hare instead of the tortoise. He asks, “Why?”

Dean Gerken’s vision speaks to why I became a law professor. Gladwell’s observations speak to the experience of my students. I am not sure how to reconcile the two beyond noting that we must first acknowledge the dialectic. Only then can we decide whether we want to judge future lawyers’ potential and abilities based upon Gerken’s vision or Gladwell’s experience.

Today’s Law Student Wellness Programs

There was a time in the not-so-distant past when wellness programs in U.S. law schools focused primarily on preventing substance abuse and suicide. This programming often involved a presentation from the state’s Lawyer Assistance Program. The speaker warned about depression, substance abuse, and suicide among members of the legal profession and described the help available. While this was important programming, it was more depressing than inspiring for most of our students.

Today, law schools, law students, and lawyers take a broader view of attorney wellness and well-being. We now recognize that students and lawyers benefit from education and opportunities to develop into happy, thriving lawyers. Law school wellness education today is not just about prevention of negative outcomes. It centers around the proactive steps law students and lawyers can take to improve every dimension of their lives.

The ABA Commission on Lawyer Assistance Programs (CoLAP), the National Organization of Bar Counsel (NOBC), and the Association of Professional Responsibility Lawyers (APRL) formed the National Task Force on Lawyer Well-Being to address attorney and law student wellness. In its report, The Path to Lawyer Well-Being: Practical Recommendations for Positive Change (“Task Force Report”), the Task Force explains that there are six different aspects of lawyer well-being:

  • Social. Attorneys should work to develop “a sense of connection, belonging, and a well-developed support network while also contributing to our groups and communities.”
  • Occupational. The Task Force Report references the need to cultivate personal satisfaction and growth in our work. It also notes the importance of financial stability.
  • Physical. In the area of physical health, the Task Force recommends “regular physical activity, proper diet and nutrition, sufficient sleep, and recovery; minimizing the use of addictive substances. Seeking help for physical health when needed.”
  • Emotional. The Task Force Report encourages lawyers to seek support from professionals when they are struggling emotionally. It also emphasizes the importance of “developing the ability to identify and manage our own emotions to support mental health, achieve goals, and inform decision-making.”
  • Intellectual. An attorney focused on the intellectual dimension of wellness engages in “continuous learning and the pursuit of creative or intellectually challenging activities that foster ongoing development.”
  • Spiritual. The Task Force Report explains that the goal in this area is “developing a sense of meaningfulness and purpose in all aspects of life.”

The Task Force makes a number of specific recommendations for law schools. The suggestions include the following:

  • Create Best Practices for Detecting and Assisting Students Experiencing Psychological Distress.
  • Assess Law School Practices and Offer Faculty Education on Promoting Well-Being in the Classroom.
  • Empower Students to Help Fellow Students in Need.
  • Include Well-Being Topics in Courses on Professional Responsibility.
  • Commit Resources for Onsite Professional Counselors.
  • Facilitate a Confidential Recovery Network.
  • Provide Education Opportunities on Well-Being Related Topics.
  • Discourage Alcohol-Centered Social Events.
  • Conduct Anonymous Surveys Relating to Student Well-Being.

In subsequent posts, this blog will consider some examples of the wellness programs, classes, and initiatives that law schools have instituted to address these issues and other areas of need.


Improv for First-Year Law Students?

Just over a year ago, in search of a mid-life growth opportunity, I began taking improv (i.e., improvisational performance) classes at a small theater in Pittsburgh. For decades, I had been a fan of improv as a comedy form but did not have the confidence to think that I could step on a stage and do it myself. Then I happened upon Alan Alda’s book, If I Understood You, Would I Have This Look on My Face? After his acting career, Alda became a communications consultant of sorts, working primarily with scientists to help them explain complex ideas in ways that a lay audience can understand. One of the central messages of the book is that improv training and exercises can help professionals of all types relate to others more empathetically and communicate more confidently and clearly. Alda references studies showing the benefits of improv training and describes his own experiences running improv exercises for groups of engineers and other scientists.

After reading Alda’s book, I realized that my job as a law professor is to communicate complex ideas to a lay audience too. So, I decided that I could benefit from improv classes—and have some fun at the same time.

My improv experiences over the past year, including joining a team and performing numerous short sets before a live audience, have convinced me that, in short, legal education needs improv. More specifically, to improve the learning environment throughout law school, entering first-year law students need improv! (I could write a separate post on the salutary effects that improv training has had on my teaching, but I will focus for now on how exposure to improv could benefit law students.)

There are CLE courses on improv offered for practicing lawyers (for example, in California and Florida), and there is a blog on improvisational skills for lawyers. But how about improv for law students? Based on a cursory online search, it appears that a handful of law schools offer or have offered improv courses or workshops, including Drexel and Indiana University McKinney. How much improv work has been done with 1Ls, if any, is unclear.

The benefits of improv for law students seem most apparent in the context of skills or experiential courses involving oral communication. Indeed, I have begun using some limited improv exercises during oral argument lessons in my 1L legal writing course. No doubt faculty members at various law schools—Northwestern, for example—have used improv exercises in other skills courses. Thinking more broadly though, and extrapolating from studies discussed in Alda’s book, I believe that offering improv workshops to law students early in their law school careers could very well improve in-class performance and learning throughout law school.

Improv revolves heavily around a group or team dynamic. Someone on the team must initiate a scene by stepping out on stage and doing something, or saying something, or both. One or more of her teammates then must step out in support, accepting the reality established by whatever the first person did and adding to it to help build the scene. That is the essence of “yes, and,” the fundamental premise of improv. The priority in every scene is to make one’s scene mates look good and to never hang them out to dry.

A quick example: An improv teammate and I walk onto the stage. She purposefully walks to a corner of the stage where there is a chair, sits down, puts her head in her hands, sighs deeply, and then begins to sob. Since I have walked onto the stage too, I am the team member in this scene who must accept her established reality: she is distraught about something, and we are in a location where she has taken a seat. However, I must also add to that reality. There are myriad ways in which I could do so. I could, for example, walk over, put my hand on her shoulder, and attempt to comfort her by saying, “It’s alright, honey, I never really liked our BMW anyway, and the side of the house that you hit—we really don’t use it much anymore now that the kids are gone.” She in turn accepts the reality that I have created, and on it goes from there, each of us supporting the other as the scene develops.

I see multiple potential benefits for law students (and, in turn, their professors) that could come from some basic improv training with exercises, offered perhaps within an orientation program before classes begin. For starters, law students, like lawyers, need to work collaboratively and need to relate to each other in a civil and empathetic manner. Improv’s emphasis on teamwork can help in that regard, enhancing students’ abilities to work productively and constructively with classmates in group exercises and projects.

Then there is the classroom learning environment and the sometimes strained or unproductive exchanges that take place between professors and students. Consider the ways in which exposing students to improv could mitigate the impediments to learning in the following classroom scenarios, each of which should be familiar to most law professors:

  • Professor poses a question or discussion topic to the class and waits for a volunteer to raise a hand and respond. Nobody does. Or, in a similar scenario, professor calls on a specific student, and the student asks to pass (even though the student might very well have done the reading).
  • Professor poses a question to a specific student, and the student asks for the question to be repeated, or answers in a manner that is not directly responsive to the question.
  • Professor poses a question to a specific student, and the student couches his or her answer in the form of a question, not a statement, suggesting uncertainty and lack of confidence. (For example, in my Legislation & Regulation course, I might ask regarding a case, “Which of the three opinions—majority, concurrence, or dissent—seems to approach the statutory interpretation question most like a textualist would?” The student somewhat meekly responds, “Is it the dissent?”)
  • Professor poses a question that is not explicitly addressed in the reading for that class but rather concerns a hypothetical scenario or a thematic issue in the course that is implicated by the reading. In response, the student struggles to answer or fully engage with the question because, as some students are wont to say, “it wasn’t in the reading.”

These are usually not scenarios where the student is incapable of responding insightfully; rather, the student is not confident enough to respond or is too nervous to respond. Students who have done improv exercises involving initiating scenes and supporting teammates in scenes would naturally be less averse to speaking up in class and doing so in the form of confident and clear statements. They would also naturally be less averse to joining in a discussion after a classmate speaks up (akin to supporting one’s teammate). And they would naturally be more inclined to listen actively and carefully to the professor’s questions and their classmates’ statements. All of which is to say that each of the above scenarios might play out differently – with some robust student participation leading to more productive and constructive discussion. The final scenario in many ways gets to the heart of what improv is all about: going with the flow and accepting whatever comes your way. With exposure to improv, students would perchance be less fazed by the question that, while technically not encompassed by the day’s reading, is still well within the scope of the course and their abilities.

In classroom discussion, students whose answers are “wrong” or whose contributions to discussion are somewhat off target tend to view the episode as an embarrassment and a reason not to ever answer a question in class again. I would expect that, with exposure to improv, students would be at least somewhat more inclined to view the episode as a learning experience, which is indeed how it should be viewed.

Posted on the wall of the green room at the theater where I take improv classes is a sign that reads “You Are Enough.” For any law student, improv can impart the message that, even though there will be struggles and mistakes along the way, you are indeed enough. I continue to get nervous before my improv team’s shows, and on more than a few occasions, I have said or done things on the improv stage that fell flat, did not effectively advance the scene, or otherwise just did not feel right. Yes, I get frustrated. But I keep confidently walking in front of the audience because that is how I will continue to learn and grow. So it should be for law students in the classroom, and ultimately in the practice of law, and in life.

Shifting the Focus of Legal Education Back to Just That: Education

I read an article a while ago that has stayed with me titled “Law Schools Are Bad for Democracy.” Given the extraordinary times in which we find ourselves today, I want to be able to disagree. Law schools may be one of the few means to save democracy – but not if we keep being distracted[i] by counting published-article “hits” for rankings or by the minutiae to which assessments have descended. In this short piece, I’d like to suggest that we reconsider the broader purpose of legal education and its uniqueness, and then rethink some of our priorities for legal education in that context. The perspective we operate from today is the decade-old emphasis on the Big Law gold-star standard.[ii] Isn’t it time to consider what we’ve learned, rethink, and make some changes?

We should pay more attention to the bigger picture of legal education: its place in a student’s continuum of education, as well as where a legal education may best serve both students and our legal system.[iii] Students come to us with much narrower educational backgrounds than in the past, degreed in any number of concentrations from food or media studies to business, management, or marketing. There are undergraduate concentrations in sustainability. While all these concentrations or majors may help college graduates enter the job market more quickly, the narrowness of the specializations makes our applicants’ education incomplete for law school. A job-specific specialization in undergraduate school likely means our students lack skills and perspective beyond what was necessary for their specialty. We have applicants who have never written a full-length paper and have never read – whether on paper or electronically – any of the classics. We have students who have never taken a class in civics or government. When they come to us now, our students need to learn critical thinking not just as a “skill” but as an integral part of how they think; they need help learning how perspective informs understanding; and they need help understanding the foundations of law, along with the ability to read and apply precedent as well as rules. Our students need to learn context. Because of limited collegiate exposure and the inward-looking, somewhat impersonal experiences of the social-media generation, our students need to develop an understanding of the people whose disputes were addressed either privately through judicial decisions or publicly through legislation. Further still, our students need to learn to evaluate that foundation for its consistency with democratic values, to critique it by not taking words at face value, and to consider the longer-term consequences to our legal system and society.

It seems to me that, given our social times and the state of undergraduate education generally, revisiting our focus is imperative. Here are some more specific reasons.

Remember the 2007 Carnegie report and its criticism of law schools for failing to provide law school graduates with enough “skills” for practice? The beginning of the push by Big Law to make sure its incoming associates could do more practical work and thus require less practical firm training? The year right before the 2008 financial crisis? That is the year – 2007 – that one of the most recently confirmed appointees to a United States Court of Appeals graduated from law school. Not quite twelve years ago.[iv] Right around then, law schools were admonished to keep pace with “trends” in the profession that focused on being employable – already having practice skills and being technology-ready.[v][vi] Part of what makes such judicial inexperience significant to law schools is that in the last two years over 100 new federal judges have been confirmed by the Senate, more than under any previous president. Perhaps even more concerning, these nominees have received more “unqualified” ratings than the nominees of the previous four presidents combined.[vii] Given the sheer number of new appointees alone, though, the role of lawyers in our adversarial system has never been more important.

But, encouraged by the LSAC,[viii] many law schools have carved out niches in narrow areas, encouraging students to acquire a sort of parallel “expertise” akin to the one too many under-prepared undergraduate students bring.[ix] Perhaps, like college grads, law school graduates will be able to enter the market more quickly. However, law schools are still criticized for not making students “practice-ready” enough for some law firms, for costing too much, and for not doing enough to help students pass the bar exam.[x] And, to meet skills and bar-score demands, law schools have spent countless hours on “assessments,” as though what we really want students to learn is easily quantifiable and measurable. Maybe all of these criticisms are on point, but they obscure two larger problems: our students need more perspective, and we may be overly narrow in what we have chosen to “assess.” Along with undergraduate schools, we have leaned pretty far into assessments.

Even undergraduate assessment experts have begun to recognize the inadequacies of “assessments.” In “Harsh Take on Assessment from Assessment Pros,” Doug Lederman[xi] reports that a group of assessment experts from undergraduate institutions found little value in current assessments. The article suggests that efforts to help “‘measure’ how much students learn are being used to judge the performance of colleges and universities rather than to help students learn.” It also suggests that the kinds of information that could be used to help individual students learn and to improve the quality of instruction at the university level are not as easily quantified for measurement. For law schools, the problem of using “assessments” to judge institutional performance is likely even greater given the multiple layers of accreditation and ranking.[xii] Lederman cites several essays critical of assessment, especially of “assessment as bureaucratic machine.”[xiii] I really appreciate that phrase: “assessment as bureaucratic machine.”

Drawing from the conclusion that assessments designed to measure student learning in a “quantifiable way have been counterproductive,” the article suggests “backward design” focused on “where do I want them to end up and how do I help them get there.” That suggestion, though, is followed by repeated criticism of the “inane” focus on student learning outcomes[xiv] – a reminder that reverse engineering ought not be used to generate yet more narrow outcomes, but rather to look for evidence of student development and progress, progress that should be seen in light of a school’s identity and purpose.

In “Law Schools Are Bad for Democracy,” Samuel Moyn, professor of law and history at Yale University,[xv] examined how law schools – in his case, elite law schools – struggle to define their own identity and serve the aspirations of their students. He briefly traced criticisms of elite law schools as serving to entrench social hierarchy. He noted that current criticism suggests law schools remain “factories for mass conversion of pliant subjects into large-firm lawyers,”[xvi] while at the same time participation in “strategic legal activism”[xvii] tends to be more of a trend among students. Prof. Moyn encouraged law schools to re-examine the first-year curriculum, which requires learning tenets of judge-made private law, and instead to expose students to more public-law courses and help them develop a stronger ability and criteria to critique what they read.[xviii]

As part of re-examining law school missions – a re-examination that could be the start of backward design – Prof. Moyn asks: Are law schools supposed to be mostly about educating lawyers to solve other people’s private problems? About advancing social justice? And these big-picture questions are being raised at the same time that sharper criticism is being leveled at the schema used to quantify and “measure” whatever success students have toward whatever those goals are.[xix]

So, if we reverse-engineer from where we want our students – future lawyers and judges – to end up, should we not think more broadly than large law firms as the top priority? Given changing times, should we not reconsider what we prioritize in our students’ education? To reverse-engineer, we should evaluate what lawyers bring to our legal system that non-lawyers do not. At the very least, our students should have knowledge of the law and a broad understanding of all that is our legal system, along with their part in that system beyond their experiences as lay people.

What a re-imagined focus might yield for course coverage is complicated and will vary, and everyone may disagree, but I would start with whatever makes a legal education most valuable – whatever takes advantage of its uniqueness. I would imagine a deeper dive into each legal subject, saving greater breadth of coverage for upper-level or additional courses. To allow for deeper examination, we might reduce the number of “practice” requirements that could otherwise be learned on the job, such as client counseling, while still providing some of the more unique, lawyer-only experiences, such as trials, for context, as well as integrated experiences, such as in-house clinics, that combine depth of knowledge with experience and context. Maybe legal writing would focus less on litigation-driven document drafting, omit the objective memorandum, or spend less time on citations, since courts use different citation manuals and one does not need to be a lawyer to prepare citations. Perhaps we should reconsider the school-wide emphasis on bar passage on the theory that the law school experience should provide context and depth as education; bar review courses can, and do, perform the task of skimming the surface of a wide swath of topics in direct preparation for that test. Academic support could focus more on context for understanding law and our legal system than on teaching limited skills like case briefing. Given where we are today, I hope re-designing legal education prompts us to focus less on detailed, quantifiably measurable skills meant to ensure assessments are met, and more on student learning that goes beyond what non-lawyers attain.

It seems to me our society would benefit now from law schools reconsidering how we prioritize what our students need to be lawyers, future judges and leaders in the context of the broader picture of our legal system and society. With one hundred new judges and more on the way, the need for well-rounded lawyers to participate in our legal system has never been greater.



[i] Ask who benefits most from those distractions. You cannot possibly think that our justice system benefits from the number of times a law school professor’s article receives a “hit.” There are bots that can generate those hits and downloads. Johann Sebastian Bach; Henry David Thoreau; Emily Dickinson; Herman Melville; Vincent Van Gogh: these are people who contributed to Western society in incalculable ways but were never famous – they did not receive the “hits” until after they died. Counting cites or hits is a distraction.

[ii] https://www.americanbar.org/groups/law_practice/publications/law_practice_magazine/

[iii] There is no doubt that a lawyer’s ability to represent clients out of the gate is important. There is also no doubt that the cost of law school has put many students in the position of searching for the highest paying job.  These are important concerns, just not the subject of this short piece.

[iv] Judge Rushing obtained a clerkship with then-Tenth Circuit Judge Neil Gorsuch.

[v] Five years and one financial crisis later, “Failing Law Schools” hit the market, criticizing schools and faculty for prioritizing research and scholarship over teaching practical skills.

[vi] The book also slammed U.S. News & World Report’s ranking system, in part as leading to corrupt practices to improve law schools’ rankings, such as inflating LSAT scores. Imagine what the author would say about a ranking system that counts what are in essence “hits” on someone’s publications.

[vii] https://news.bloomberglaw.com/us-law-week/trump-picks-more-not-qualified-judges-1

[viii] https://www.lsac.org/discover-law/pathways-legal-career

[ix] Rebecca Flannigan, “The Kids Aren’t Alright: Rethinking the Law School Skills Deficit,” 2015 B.Y.U. Educ. & L.J. 135 (2015).

[x] When bar pass emphasis starts in the first year, I would say the hyper-focus on passing the bar comes close to missing the point of a legal education. The bar exam has problems with its questions, problems as a standardized test, and problems with its emphasis on topics that may or may not be important to future practice. Moreover, as many people have said, the bar exam is a mile wide and an inch deep. Our students should have knowledge beyond that surface and the ability to consider and factor in much more than rules or rule outlines as they become lawyers.

[xi] Inside Higher Ed.

[xii] ABA accreditation for the law school directly; Middle States or the regional equivalent for law schools that are part of broader graduate schools; and evaluation by the larger university.

[xiii] Doug Lederman, “Harsh Take on Assessment from Assessment Pros,” Inside Higher Ed, April 17, 2019. https://www.insidehighered.com/news/2019/04/17/advocates-student-learning-assessment-say-its-time-different-approach at 3-4.

[xiv] Lederman at 5.

[xv] “Law Schools Are Bad for Democracy,” Samuel Moyn, The Chronicle of Higher Education December 16, 2018.  https://www.chronicle.com/article/Law-Schools-Are-Bad-for/245334

[xvi] Id. at 5.

[xvii] Id. at 6.  Other aspects of Prof. Moyn’s article focus more directly on obligations of elite law schools. Because there are more non-elite than elite law schools I have chosen to focus on and adapt concerns to law schools more generally.

[xviii] Interestingly, Prof. Moyn also questions the almost pedestal on which judges are placed. Such criticism seems apt considering the relative inexperience of many new federal judges. Such criticism, though, is important regardless of judicial experience if only because judges are human, too.

[xix] Lederman at 6. This article summarizes events from the Academic Resource Conference held by colleges and universities – not law schools.

Law School Specialty Program Rankings: Is the Tail Wagging the Dog?

By:  Robert R. Kuehn, Associate Dean for Clinical Education, Professor of Law

In addition to its annual ranking of law schools, U.S. News ranks nine law school specialty programs. Although the school rankings rely on a number of factors to compute a final score (peer assessment, lawyers/judges assessment, LSAT, undergraduate GPA, etc.), specialty rankings are based solely on reputation. A faculty member teaching in the specialty area at each school is asked to assess the specialty programs at the almost 200 other ABA accredited schools. As this column explains, there are serious problems with this type of ranking.

In previous years, the person rating other schools was directed to choose the top 15 programs in that specialty area. U.S. News would then list the top 12 to 20 programs (and even more behind a paywall) based on how often the school was listed by raters. This year, U.S. News asked survey participants to rate specialty programs on a 5-point scale of Outstanding (5) to Marginal (1), mirroring the much-criticized method used to assess the academic reputation of schools among peers and lawyers/judges.

The response rates among the faculty who received this year’s specialty program ballot varied from 44% (International Law) to 65% (Clinical Training). With the new scale, U.S. News chose to rank in order every program that received at least 10 ratings. This resulted in publicized rankings from 1st to 186th, with individual school scores ranging from 4.9 to 1.0 and average program scores from 2.1 (Intellectual Property and International Law) to 2.9 (Dispute Resolution).

A glance at the new rankings shows noticeable bunching of schools ranked highest by U.S. News also at the top of most specialty programs and an overpopulation of lowest ranked schools at the bottom. Is this because the highest ranked schools also happen to have the highest quality specialty programs and vice-versa for the lowest ranked schools? Or, might the over-representation at the two ends be due in part to a halo effect where the academic reputation of a school is influencing the rating of that school’s specialty program? This effect could be especially pronounced this year, when raters were asked to not just opine on the quality of 15 specialty programs but to express an informed opinion on 200.

To examine the strength of the relationship between a school’s peer assessment and specialty program reputation, scores for each school in every specialty were collected and analyzed. The Pearson’s correlation coefficients between the two scores are set out below:



Where a coefficient of 1.0 indicates a perfect relationship between two variables, coefficients of .7 or greater are often characterized as “very strong.” The ratings of some specialty programs (e.g., International and Tax Law) stand out as very strongly associated with the school’s academic reputation, while the ratings of two programs (Dispute Resolution and Legal Writing) showed negligible, statistically insignificant associations with their schools’ reputations. Law school reputation scores explain 80% of the variability in International and Tax Law specialty rankings and over 50% of the variance in Environmental and Intellectual Property Law scores. The large correlations suggest that some raters may be cueing off a school’s academic reputation and not independently judging the quality of its specialty program. But because correlation does not prove causation, it could also be that schools with better academic reputations largely have better specialty programs.
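For readers who want to check the variance-explained figures, they follow arithmetically from the reported coefficients: the share of variance explained is simply the square of Pearson’s r. A minimal sketch in Python (the helper function names are mine; only the Clinical Training figure of r = .66 comes from this column):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def variance_explained(r):
    """Proportion of variance in one variable accounted for by the other (r squared)."""
    return r * r

# Clinical Training: r = .66 as reported above, so reputation "explains"
# about 44% of the variation in clinical scores.
print(round(variance_explained(0.66) * 100))  # 44
```

In practice one would likely use a statistics package (e.g., `scipy.stats.pearsonr`), which also returns the p-value used to judge the statistical significance mentioned above.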

A closer look at the distribution of Clinical Training scores suggests that some haloing is going on, at least when raters are asked about schools with the highest and lowest academic reputations. Clinical scores were strongly correlated with a school’s academic reputation (r=.66), with 44% of the variation in scores explained by school reputations. The graph below plots a school’s peer assessment score against its Clinical Training score with a regression line added (β=.51, p < .01) to illustrate the positive relationship.




Notice that every school with a peer assessment score of at least 4.0 also had a Clinical Training score above 3.0. Conversely, only one out of thirty-two schools with a peer assessment score of 1.5 or less garnered a clinical rating above 3.0. Clinical scores for schools in the peer assessment mid-range appear to be more dispersed. Correlation analysis supports these impressions — correlation coefficients for the 50 top-rated and bottom-rated schools were .69 and .65 respectively, while the coefficient for middle-ranked schools was only .37.

Data from the ABA and surveys of clinical programs by the Center for the Study of Applied Legal Education (CSALE) provide additional variables potentially associated with clinical program rankings. There is a moderately strong association between the size of a school’s J.D. enrollment and its Clinical Training score, with a statistically significant correlation of .49 (explaining 24% of the variance in clinical ranking). There is a similar association between the number of “slots available” in law clinics and program scores (r=.48), but beware that schools’ reporting of slots available is largely unregulated and varies widely. The number of full-time law clinic and externship faculty reported in CSALE surveys is also related to program ranking, although less strongly (r=.31). Do these correlations indicate that larger schools tend to have higher-quality clinical programs, or are their programs and numbers of clinical faculty just larger and more noticeable?

An increasing number of schools guarantee or require a law clinic or externship experience for their J.D. students before graduation. This potential signal of clinical experience availability or quality, now present at 75 schools, is only weakly associated with program ranking scores (r=.17).

U.S. News explained that its goal in moving to the new scale was to rank more schools (five to six times more) and provide better understanding of the scope of the differences between schools. Yet when there is strong evidence that many raters may cue off a school’s overall reputation and not independently assess the quality of its specialty program, U.S. News’ rank ordering of school specialty programs down to 192 is of dubious validity and should also call into question the use of this method in its annual law school ranking.


Time to Remedy the Ills Afflicting ABA Council’s Standard 316 Proposal

By: Professor Judith Welch Wegner, Burton Craige Professor of Law, Emerita


         Many readers of this blog are well aware of ongoing efforts by the Council of the ABA Section of Legal Education and Admissions to the Bar to revise law school accreditation standards in troubling ways.  <https://bestpracticeslegaled.albanylawblogs.org/?s=standard+316>


         The Council’s proposal is to amend Standard 316, which currently requires law schools to demonstrate a 75% bar passage rate of graduates within 5 years of graduation, but allows law schools to provide evidence justifying an extended period in which to come into compliance based on a variety of factors (including trends, academic support efforts, transfer patterns, and more). https://www.americanbar.org/content/dam/aba/publications/misc/legal_education/Standards/2018-2019ABAStandardsforApprovalofLawSchools/2018-2019-aba-standards-chapter3.pdf


         The Council’s proposed revised standard would instead provide that “At least 75 percent of a law school’s graduates who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation.”  That’s it. A shortened period for demonstrating compliance, and a bright-line standard that does not consider mission, near compliance, remedial efforts, or anything else.  As a result, a significant number of law schools would have their accreditation put at risk, particularly in jurisdictions such as California where bar passage rates are routinely much lower than the proposed 75% rate, and particularly for schools whose missions reflect a commitment to enrolling and increasing the opportunities for non-traditional students and people of color.  For more information on the proposal’s history and the views of supporters and critics, see <https://www.americanbar.org/groups/legal_education/about_us/leadership/council_meetings/>


         But, wait, there’s more!  The ABA Council has brought this proposal forward to the House of Delegates twice before (in 2016 and in 2019) and, significantly, twice, it has been roundly defeated as a result of concerns about the impact of the revision on diversity within the profession, its uneven application given the widely varying rates of bar passage in recent times, the Council’s failure to engage in meaningful research about disparate impact, and a lack of transparency in the Council’s proceedings.  The ABA House of Delegates defeated the proposed revision most recently at its mid-year February 2019 meeting in Las Vegas when the vote was 88 in favor and 334 opposed.  < https://taxprof.typepad.com/taxprof_blog/2019/01/aba-house-of-delegates-again-rejects-75-bar-passage-within-2-years-accreditation-standard-79-to-21-f.html>


         Nonetheless, the ABA Council has put the proposed revision on its agenda for its May 16-18 meeting once again, since it is only obliged to give the House of Delegates a chance to express its views, not to follow those views.  The ABA Council has also recently published data on law school bar passage rates, including data from 2016 and preliminary data from 2018.  <https://www.americanbar.org/news/abanews/aba-news-archives/2019/04/aba-section-of-legal-education-releases-comprehensive-report-on-/>  The ABA press release puts a favorable spin on these data, stating that “First-time takers in 2018 achieved a 74.82 percent pass rate, which compares to a 77.34 percent pass rate for 2017.”  But interested readers should look beyond these generalized numbers to see the impact of the proposed new standard demonstrated in the associated tables (which show schools at risk by state and state bar passage rates).


         I appreciate the ABA Council’s desire to simplify and impose bright-line rules as a matter of bureaucratic convenience.  I also understand that the Council is likely responding to pressure from those who believe that, when students invest in legal education, they should not be left high and dry without anticipated careers should they fail to pass licensing examinations.  But these views are excessively simplistic.


         Over-simplification should not win the day particularly in the absence of much-needed, in-depth analysis.


  1. The ABA Council is proposing an action that will have significant disparate impact on minority candidates seeking admission to the bar, but has not undertaken meaningful research on this topic.
  2. If the theoretical basis of the ABA action is tied to a requirement that law school graduates demonstrate a basic level of competence, the Council cannot reasonably claim that state passing percentages are tied to such a determination, given states’ widely disparate passing rates. Able scholars such as Professor Joan Howarth have demonstrated that other professions employing a national licensing examination use a single standard “cut score” tied to competence, while American jurisdictions – even those using the Uniform Bar Exam – continue to have widely different cut scores, suggesting that a national standard cut score needs to be developed sooner rather than later.  https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3010168
  3. Significant questions should be raised regarding the authority of the ABA Council in effect to prohibit the continuation of law schools in jurisdictions such as North and South Dakota or institutions serving minority populations in jurisdictions where state legislatures have failed to provide adequate funding for public K-12 education in areas with substantial minority populations. If state legislatures and state courts believe that law schools in these contexts are serving a worthwhile purpose, why should the insular and insulated ABA Council be allowed to shut them down?
  4. Recent developments in federal antitrust law should also be reckoned with. The United States Supreme Court’s decision in North Carolina Dental Board v. FTC, 574 U.S. ___ (2015), provided an important gloss on the “state action” exemption.  States with licensing boards populated by significant numbers of professionals they regulate need to consider carefully whether there is adequate “active supervision” to assure that licensing boards (such as bar examiners) are not limiting access to protect their own economic interests.
  5. The ABA Council’s proposal assumes that existing bar examinations actually document basic competence for lawyers. However, the ABA Council has not justified this assumption, and more notably has not demonstrated that a 75% bar passage rate that varies widely from state to state reflects a shared judgment on competence.  Nor has the ABA Council addressed the importance of wide swings in performance across all jurisdictions (whether using the Uniform Bar Examination or not).  The National Conference of Bar Examiners has recently established a task force on testing that is responsible for examining potential changes in bar examinations moving forward.  <http://www.ncbex.org/statistics-and-research/testingtaskforce/>  A good case can therefore be made that the ABA Council should not adopt changes in accreditation standards before the NCBE engages in further analysis and provides better documentation about the validity of the bar exam nationally.


         Happily, a better way through this morass is apparent, thanks to the work of Professor Bill Patton (previously at Whittier, now at USC/UCLA).  He has developed a compelling analysis of the implications of the ABA Council’s plan to revise Standard 316 and has effectively demonstrated the error of the ABA Council’s approach.  His work demonstrates the significant adverse effect on minority bar applicants in California (and on California law schools) if the ABA Council’s proposal is adopted. Patton rejects the ABA Council’s focus on requiring a 75% bar pass rate in individual jurisdictions and instead urges consideration of a standard that would focus on a national standard of competence benchmarked against performance on the UBE/multi-state multiple-choice exam.  He shows that graduates of many California schools at risk under the proposed revised standard have actually outperformed their projections/indicators and have demonstrated competence that exceeds that of graduates of law schools in New York, Connecticut, Georgia and elsewhere.  In effect, Professor Patton’s study demolishes the ABA Council’s superficial analysis and offers an alternative approach that would have a less significant disparate impact on law schools that have shown they can educate minority lawyers who out-perform their admissions indicators.  At the very least, the ABA Council should step back and consider his analysis before taking further steps to adopt its initial poorly-conceived and poorly-justified proposal.


         The stakes are high for the integrity of legal education.  In this difficult time, it is more important than ever that diverse candidates enter the legal profession and represent those who have traditionally been at the margins of society and social justice.  Readers of this blog should also recognize that membership on the ABA Council on Legal Education is determined by the vote of members, including law school faculty who are members by virtue of their faculty positions.  The ABA Council’s nominating committee has just revealed its nominees for Council membership <https://www.americanbar.org/content/dam/aba/administrative/legal_education_and_admissions_to_the_bar/19-20-legal-ed-council-slate-announcement.pdf> and it remains striking to me how poorly the Council reflects the insights of broad swaths of the legal profession and legal education community, and how little meaningful research ABA Council staff have done to support arguments that would significantly curtail minority participation in the legal profession or to respond to the clearly-articulated concerns of the ABA House of Delegates.


          Make your views known.  It is important for all law faculty members and concerned observers to convince the ABA Council that, if they wish to change the existing standard,  they need to adopt the Patton proposal that benchmarks law student bar performance against national bar examination cut scores rather than individual state percentage pass rates.  Please write to the ABA’s Consultant on Legal Education, Barry Currier, and insist that he share your views with the Council before they take steps to decimate minority admission in states such as California.  Currier’s email is: Barry.Currier@americanbar.org.  Please also take care to consider whether you believe the ABA Council represents your views.  Its membership is listed at https://www.americanbar.org/groups/legal_education/about_us/leadership/


          Doing nothing allows the ABA Council to undercut the diversity of the legal profession without engaging with core principles.  Is that an acceptable practice in your view?  The ABA now charges law schools more than $20,000 per year to fund its activities, while acting on simplistic bureaucratic principles.  Is that an acceptable use of limited law school funds in your view?


Step up.  Express your views.  The future of the legal profession depends on you.




April 27, 2019

American Bar Association
Council on Legal Education and Admission to the Bar
321 N. Clark Street, 19th Floor
Chicago, IL 60654




          Conflicting reports indicate that either nineteen (19) or sixteen (16) ABA law schools’ 2015 graduates failed to meet the proposed 75% in 2-year bar passage standard.[1]  The number of ABA law schools whose 2016 graduates failed to meet that standard increased to twenty-five (25)[2], and nine out of the schools that failed that standard for 2015 graduates performed even worse for 2016 graduates.  


          But more problematic are the predictions from the National Conference of Bar Examiners and from the 2018 bar passage results that the number of ABA law schools that may fail to meet that proposed standard will dramatically increase for 2018 and later graduates.  The ABA’s recent publication on the bar passage rates of ABA law schools in 2018 demonstrates that forty-nine (49) ABA schools had 2018 cumulative first-time bar passage rates below 65% and thus are predicted to have great difficulty in meeting the proposed 75% in 2-year bar standard for 2018 graduates in 2020.[3]


          According to the ABA 509 disclosures regarding the demographic composition of 2018 first-year classes, these 49 at-risk schools have a cumulative total of 1162 Hispanic and 1137 Black entering students.[4]  According to the ABA report on 2018 enrollment by race, these 49 schools account for 37% of all Black first-year students (1,137/3,033) and 35% of Hispanic first-year students (1,692/4,808).[5]  If even half of those 49 schools that are  predicted to fail the proposed 75% in 2-year standard actually fail, diversity in the bar will be alarmingly reduced.[6]


          The following analysis demonstrates that the Council’s 2015 and 2016 graduates’ studies on the effects of the proposed 75% in 2-year bar passage standard substantially underestimate the future number of ABA approved law schools that will fail that test for 2018 and later graduates.  For example, in 2018 according to the Council’s own predictive model [first-time bar passage rates below 65%], the following nine (9) California ABA approved law schools may be de-accredited:  (1) California Western; (2) Golden Gate; (3) La Verne; (4) Southwestern; (5) Thomas Jefferson; (6) USF; (7) Western State; (8) U. C. Hastings; and (9) McGeorge.    All nine of these California ABA law schools have mean bar examination passage rates for July 2017 through July 2018 well below 65%, and their 2018 overall passage rates are even lower:  [7]





          The potential de-accreditation of  those nine at-risk California ABA schools will result in a loss of 59% of Hispanic first-year California ABA law students (497 out of 842) [see Table 4 infra.], and a loss of 51% of Black first-year California ABA law students (105/205).[8]  Those 9 California ABA law schools’ Hispanic (H) and Black (B) 2018 first-year enrollments were:  (1)  Cal Western:  41 H + 12 B; (2) U.C. Hastings:  59 H + 6 B; (3) Golden Gate:  63 H + 33 B; (4) McGeorge:  37 H + 8 B; (5) Santa Clara:  61 H + 7 B; (6) Southwestern:  75 H + 12 B; (7) Thomas Jefferson:  17 H + 9 B; (8) La Verne:  38 H + 5 B; and (9) Western State:  70 H +6 B.   These alarming diversity statistics, alone, warrant the Council abandoning or radically altering its proposed 75% bar passage rate within 2-years of students’ graduation.


          Making matters worse, in order to meet the proposed 75% in 2-year standard schools must have decent “repeater” bar examination passage rates.  However, just like the California first-time passage rates, the “repeater rates” have fallen dramatically since 2014, making it much more difficult for schools to meet the proposed standard.  For instance, consider the significant reduction in California repeater passage rates between 2014 and 2018 at the nine at-risk California ABA law schools:[9]




          It is also alarming that the NCBE has predicted that “repeater” bar passage rates will continue to fall for the next few years.  The NCBE has demonstrated that the number and percentage of repeat takers is dramatically increasing as schools’ first-time bar passage rates and mean MBE scores continue to fall to historic lows.  The gist of those NCBE analyses is that:  (1) the slight MBE mean increase in July 2016 and July 2017 was followed by a precipitous decline to the lowest July MBE mean since 1984[11]; (2) because of the significantly lower first-time passage rates on the bar, a greater number of February bar takers are “repeaters”[12]; (3) the February 2018 mean MBE score is the lowest score in MBE history; and (4) the “mean MBE score in February has been declining every year since 2013”.[13]  These “repeater” trends (significantly more repeaters with lower passage scores) must be factored in by the Council in predicting which schools will fail to meet the 75% in 2-year standard in 2018 and beyond.


          The NCBE data predict even lower California annual bar passage rates in the next few years for three reasons.  First, the February 2018 and July 2018 national MBE mean scores are at historic lows.  Second, California ABA law schools’ annual repeater bar passage rates fell by between 9% and 21% in 2018.  And, finally, the number and percentage of California repeat test takers on the February bar exam have steadily increased since 2015:  (1) in Feb. 2015 repeaters were 76% of test takers; (2) in Feb. 2016 repeaters were 79%; (3) in Feb. 2017 repeaters were 81%; and (4) in Feb. 2018 repeaters were 82% of all CA ABA test takers.[14]  As California passage rates continue to decrease, the number of repeat test takers will also continue to increase substantially.


          It is interesting that one of the strongest proponents of promulgating the proposed 75% in 2-year standard, Law School Transparency, has submitted an analysis to the Council that supports my prediction that at least 9 CA ABA schools will fail that standard for 2018 graduates.[15]  LST’s optimistic analysis states that if a law school has a first-time bar passage rate of at least 60% [compared to the Council’s predictive standard of 65%] and “repeater” passage rates decline 50% for each of the next 3 bar administrations [30%/15%/7.5%], a school should be able to meet the 75% in 2-year standard.[16]  However, the 9 ABA law schools that I am predicting will fail the proposed standard for 2018 graduates have first-time and repeater passage rates well below LST’s thresholds for success.  For instance, in 2018 eight out of the nine California at-risk law schools had first-time passage rates below 60%, and eight of the nine had repeater passage rates below 30%.


          For example, Golden Gate had a 34% July 2018 first-time passage rate, and its February/July 2018 repeater passage rate was only 19%.[17]  In order to meet Law School Transparency’s prediction of reaching a 75% cumulative passage rate in 2 years, Golden Gate will need to more than double its 19% repeater passage rate.  However, the NCBE has predicted that “repeater passage rates” will continue to fall for several years.[18]  Also consider McGeorge, which had a July 2018 first-time passage rate of 50% but a February/July 2018 repeater passage rate of only 23%.  Again, under LST’s and the Council’s predictive models, McGeorge will need to substantially increase its repeater passage rate in order to meet the proposed standard.  Further, even California Western currently fails LST’s predictive model since its July 2018 first-time passage rate is only 52% and its February/July 2018 repeater passage rate is only 30%.


  1. Using Law Schools’ Mean MBE Scores in Relation to the National Mean MBE Score Will Protect Student Consumers and Will Have a Substantially Smaller Effect on Diversity Than Sole Reliance on Bar Exam Passage Rates.


          In my article, A Blueprint for A Fairer ABA Standard for Judging Law Graduates’ Competence:  How A Standard Based on Students’ Scores in Relation to the National Mean MBE Score Properly Balances Consumer Safety with Increased Diversity in the Bar, 24 Wash. & Lee J. of Civ. Rts & Soc. Justice 3-62 (2017), I analyze the benefit of including schools’ mean MBE scores as another variable in determining the quality of schools’ educational programs.  I am not proposing the elimination of bar passage rates as a relevant accreditation variable, but rather, the following proposed accreditation standard consists of two alternative means of showing compliance with ABA Standards 301, 316, and 501:  (1) bar passage rates; and/or (2) mean MBE scores.  I am proposing the following amendment to the Council’s proposed changes to Standard 316:


          At least 75 percent of a law school’s graduates who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation, or a law school must be within 10 percent of the national mean MBE score on two out of the four bar examinations administered since the date of the students’ graduation.
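To make the two-pronged test concrete, a hypothetical sketch of how compliance under such an amended standard might be computed is below. The function and all numbers are illustrative assumptions only (none come from the proposal itself), and the “within 10 percent” prong is read here as the school mean falling no more than 10 percent below the national mean:

```python
def meets_amended_316(two_year_pass_rate, school_mbe_means, national_mbe_means):
    """Hypothetical compliance check for the amendment proposed above:
    prong 1 - at least 75% of graduates who sat for a bar exam passed
              one within two years of graduation; or
    prong 2 - the school's mean MBE score was within 10 percent of the
              national mean on at least two of the four administrations
              since graduation (read here as no more than 10% below it)."""
    if two_year_pass_rate >= 0.75:
        return True
    within = sum(
        1 for school, national in zip(school_mbe_means, national_mbe_means)
        if school >= 0.90 * national
    )
    return within >= 2

# Hypothetical school: 68% two-year pass rate, but MBE means that track
# the national mean closely on all four administrations since graduation.
print(meets_amended_316(
    0.68,
    school_mbe_means=[131.0, 128.5, 133.0, 126.0],    # hypothetical
    national_mbe_means=[139.5, 132.8, 139.5, 132.8],  # hypothetical
))  # True
```

The point of the sketch is simply that the second prong gives a school that narrowly misses the pass-rate threshold an alternative, nationally benchmarked way to demonstrate graduate competence.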

          The advantages of my proposed “mean MBE score” standard over a standard like 316 based solely on bar percentage passage rates are[19]:


  1. This standard brings the ABA into conformity with other U.S. accreditation organizations such as those in medicine and nursing by using a “national mean standard” that the DOE has historically approved and that is not affected by states’ specific licensing standards, such as bar examination cut scores;
  2. It provides a uniform standard for every law student attending almost every ABA approved law school;[20]
  3. It avoids disparate diversity impact among the states that is inherent in the current and in the proposed amendments to the Standard 316 bar passage rule; and,
  4. It provides prospective law students with much more information about schools’ academic performance outcomes and permits comparisons among ABA schools regarding their “mean MBE scores” in relation to their students’ LSATs, GPAs, and diversity [value-added instruction].


          It is important to note that at the Council’s February 2019 Open Session several members addressed whether an accreditation standard based on the MBE would be workable and effective.  For instance, Barry Currier asked Judith Gundersen, NCBE President, whether using the MBE as an accreditation metric was possible.  Gundersen responded that MBE scores are a representative model for total bar examination performance, clarifying that the MBE score is an accurate proxy for students’ performance on the essay portion of bar examinations.  Gregory Murphy then added that the MBE is a more reliable testing instrument than the various essay alternatives offered on the states’ different bar exams.[21]  The Open Session comments therefore substantiate my proposal to use the national mean MBE score as an alternative accreditation metric to demonstrate whether, under Standard 301(a), law schools are offering a “program of legal education that prepares its students, upon graduation, for admission to the bar…” and, under Standard 501(b), that “[a] law school shall only admit applicants who appear capable of satisfactorily completing its program of legal education and being admitted to the bar.”[22]


          Although I am proposing a standard that requires law schools to have mean MBE scores within 10% of the national mean for two out of the four bar examination administrations since students’ graduation, that standard may need slight “tweaking” since not all schools’ mean MBE data are publicly available at this time.  On April 25, 2019, I contacted Judith Gundersen, NCBE President, and requested the mean MBE scores for HBCU law schools from February 2014 through July 2018 in order to determine the impact of my proposed “10% within the national MBE score” standard on those schools.[23]  Regrettably, the NCBE does not maintain a list of specific law schools’ mean MBE scores.[24]  However, there are two simple methods for schools to obtain their MBE data.  First, like California law schools, each school can obtain its MBE mean from the state bar association or bar examination administrator, which receives directly from the NCBE a list of those students who took the MBE.[25]  Second, if a state bar association refuses to provide schools with their students’ MBE scores, those schools can submit a list of their students to the NCBE, which can provide that data directly to each law school so that it can calculate its mean MBE score for each bar examination.  The administrative cost to law schools of determining mean MBE scores is minimal in relation to other data, such as employment statistics, that the ABA currently requires to be reported.


A)      Examples of The Different Results Between the Council’s Proposed 75% in 2-Year Amendment to Standard 316 and My Proposed National Mean MBE Standard.


          The Council’s recent study of 2015 graduates’ 2-year bar passage rates found that four California law schools during 2016-2017 had bar passage rates below 65% and might be at risk under the proposed 75% in 2-year bar passage standard.[26]  The following analysis of Southwestern’s and California Western’s  California bar exam passage rates in relation to their mean MBE scores from July 2014 to July 2018 demonstrates the benefits and fairness of my national mean MBE standard over only using the Council’s proposed 75% in 2-year bar passage standard.


          Table 1 demonstrates that from February 2017 to July 2018 Southwestern law school never achieved a 65% passage rate on the California bar examination; however, during those four bar administrations its students exceeded the national mean MBE score on 4 out of 4 (100%) of the examinations.  Because Southwestern failed to reach a 65% passage rate on each of those bar exams, under the Council’s predictive model it may be at risk of failing to meet the Council’s proposed 75% in 2-year bar passage standard.


          Rather than being placed at risk of de-accreditation, schools like Southwestern that enroll a diverse class, that provide those students with rigorous academic support, and that achieve mean MBE scores well beyond the expectations set by students’ entering LSAT/GPA’s should be lauded, not de-accredited.  Southwestern Law School has provided access to the profession for a very diverse student body for more than 100 years.[27]  In 2018 it enrolled 75 Hispanic and 12 Black law students.[28]  Southwestern students’ mean MBE scores are higher than those at higher ranked schools with higher entering student LSAT/GPA’s, even though Southwestern’s student body is substantially more diverse.[29]


          Consider another example, California Western, whose 2017-2018 bar passage rates and mean MBE scores were:


          As Table 2 clearly demonstrates, from February 2017 to July 2018 California Western reached a 65% passage rate on the California bar exam on only 1 of those 4 exams, and on the most recent exam, July 2018, its first-time passage rate fell to 52%.  Yet even though California Western’s passage rate fell below 65% on 3 of those 4 tests, its students far exceeded the national mean MBE score on all 4 exams:  Feb. 2017:  +4.40; July 2017:  +5.00; Feb. 2018:  +10.80; July 2018:  +3.30.  California Western’s mean MBE score from February 2017 to July 2018 was so high, 142.9, that its students would have been admitted to the bar in almost every other state except Delaware (cut score 145).  It is beyond absurd that schools such as California Western, whose students demonstrate great proficiency on the bar examination’s national component (the MBE) in relation to all other students nationwide taking the same examination, should be at risk of failing an ABA bar passage standard or of being de-accredited.
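The cut-score point above can be made concrete in a short sketch. The cut scores below are the ones discussed in this comment (Georgia’s 135, California’s 144, Delaware’s 145); the function name is hypothetical, and the comparison of a school’s mean MBE directly against states’ scaled cut scores follows this article’s own simplification.

```python
# Hypothetical sketch: which states' cut scores would a given mean
# MBE score clear?  Cut scores are those discussed in the text.

STATE_CUT_SCORES = {"Georgia": 135, "California": 144, "Delaware": 145}

def states_cleared(mean_mbe, cut_scores):
    """Return the states whose cut score the given mean MBE meets."""
    return [state for state, cut in cut_scores.items() if mean_mbe >= cut]

# California Western's February 2017 - July 2018 mean MBE of 142.9
# clears Georgia's cut score but falls short of California's:
print(states_cleared(142.9, STATE_CUT_SCORES))  # ['Georgia']
```

The same student performance thus produces opposite accreditation outcomes depending solely on which state’s cut score applies.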


B)     The Council’s Report on the 75% in 2-Year Standard Does Not Adequately Discuss the Law Schools That Are Close to and Trending Toward Failing That Proposed Bar Passage Standard.

          The Council’s report, SECTION OF LEGAL EDUCATION AND ADMISSION TO THE BAR, AMERICAN BAR ASSOCIATION:  REVISIONS TO STANDARD 316:  BAR PASSAGE, fails to sufficiently discuss the number of law schools nationally that are very close to failing the proposed 75% in 2-year bar passage standard, and it does not discuss schools whose bar passage rates are trending toward failing that proposed standard.  A list of the law schools close to failing the 75% in 2-year standard is critical to the discussion of whether to approve the proposed bar examination passage standard.  In addition, the Council’s analysis does not include the many more ABA schools projected to fail the proposed standard as a result of very low 2018 bar passage rates.


          The Council’s analysis included only these cryptic statements regarding schools with bar passage rates between 60% and 69%:  (1) for 2015, “seven more schools had rates between 60 and 69%”; (2) for 2016, the Council did not report the number of schools between 60 and 69%; and (3) for 2017, the Council again failed to report on schools between 60 and 69%.  The Council has produced no evidence so far about how many schools currently meet the 75% in 2-year standard but are trending toward failure.  That data is essential to predicting how many more schools might be adversely impacted by the proposed rule and how those de-accreditations might affect diversity and access to justice.

          For example, consider two California ABA law schools, Chapman and Pepperdine, that easily met the 75% in 2-year standard in 2015, but which are trending quickly toward a failure to meet that standard in the near future.


  1. Chapman’s Plunging Bar Passage Rates in 2016-2018.


          As demonstrated in Table 3, Chapman’s first-time bar passage rate on the July 2014 California exam was an excellent 75%; however, since July 2016 it has been trending toward failure of the proposed 75% in 2-year standard.  Its mean July 2016 to July 2018 bar passage rate is only 60.3%, well below the 65% rate the Council’s predictive model associates with successfully meeting the proposed 75% in 2-year bar standard.



          The discrepancy between Chapman’s low 2016-2018 bar passage rates, which all fell below the Council’s predicted success rate of 65%, and Chapman students’ performance on the MBE is astonishing.  In July 2016, with a 57% passage rate, the students exceeded the national MBE mean by 6.6 points.  In July 2017 the students had a 64% passage rate but exceeded the MBE national mean by 6.3 points.  And in July 2018 the students with a 60% passage rate exceeded the MBE national mean by 4.9 points.


          Chapman’s trend predicts that its 2018 graduates will have great difficulty meeting the 75% in 2-year standard under the Council’s predictive index.  First, its July 2018 first-time passage rate fell to 60% (15 points lower than in July 2014).  But equally troubling is the trend among Chapman’s “repeat” test takers.  Consider the following comparison of the number and passage rates of Chapman’s repeat takers in Feb./July 2014 and Feb./July 2018:


                                  Feb/July 2014 Repeaters     Feb/July 2018 Repeaters
Number Taking                              81                       116 (+43%)
Mean Passage Rate                        56.0%                    39.0% (-17 points)


This dramatic increase in the number of Chapman “repeaters” and their substantially lower passage rates will make it more difficult for Chapman in 2018 to meet the proposed 75% in 2-year standard.


  2. Pepperdine Law School’s Falling Bar Passage Rates.


          From July 2014 to July 2018 Pepperdine’s first-time California bar exam passage rate steadily declined from a high of 78% in 2014 to 65% in 2017, and its trending passage rate (July 2017 and July 2018) is only 65.6%, barely meeting the Council’s predictive index for successfully meeting the proposed 75% in 2-year standard.[30]  In addition, Pepperdine’s “repeater” bar passage rate is trending much lower (20% in July 2018 versus 56% in July 2014), making it even more difficult for the school to meet the 75% in 2-year standard in 2018.[31]


          The discrepancy between Pepperdine’s falling first-time and repeater bar passage rates and its students’ mean MBE scores illustrates the unfairness of the Council’s proposed 75% in 2-year bar passage standard.  For instance, even though Pepperdine’s first-time bar passage rate in July 2017 was only 65% (the minimum predictive Council score for meeting the proposed standard), its mean MBE score was 145.1 [3.4 points above the national MBE mean].  Similarly, even though Pepperdine’s passage rate was only 66% in July 2018, its MBE mean of 148.1 was 8.6 points higher than the 139.5 national MBE mean.  This data demonstrates not only that Pepperdine is providing its students with an excellent education, but also that its students are performing well enough to pass every other bar exam in the country.


          None of the Council’s reports provide data on how many ABA law schools, like Chapman and Pepperdine, were trending in 2018 toward failure of the proposed standard in the near future.  Before the Council votes, it owes the public a full report on the number of law schools likely to fail the proposed standard soon based on passage rate trends, as well as the impact of the loss of those trending schools on diversity.[32]


2)     Student Consumers’ Limited Access to Law Schools’ Mean MBE Data Distorts Schools’ Student Outcome Measures and Undervalues California Schools’ “Value-Added” Educational Programs.


          It is unclear why some states publish law schools’ mean MBE scores while others keep that information secret.  Although I disclosed California law schools’ mean MBE scores in my Blueprint article, supra, the following discussion publishes the mean MBE scores for a few out-of-state law schools and compares those schools’ mean MBE scores and bar passage rates with those of California ABA law schools.  This data also demonstrates that these California law schools are achieving higher mean MBE scores even though their students are much more diverse than those at the comparison schools located in low cut score states.  That result is known as “value-added” education.

          Tables 5 and 6, infra, for the first time present clear evidence that permits us to test whether the Council’s proposed 75% in 2-year bar passage standard adequately balances the gist of ABA Standard 301 (student entering statistics), Standard 503 (student outcome measures), and Standard 206 (diversity in the profession).  This data also challenges the Council’s declared purpose, student consumer protection, for its proposed amendment to Standard 316.  The following comparisons between California ABA law schools and out-of-state ABA law schools demonstrate that the proposed 75% in 2-year standard rewards out-of-state schools with much worse student outcomes on the MBE (demonstrated knowledge of substantive law after graduation).  In addition, the Council’s proposed standard will keep better qualified diversity candidates (at schools with high diversity, high MBE scores, and low bar passage rates due to high bar exam cut scores) from practicing law.


        For example, compare Southwestern with Mercer.  Even though both schools’ entering students’ LSAT’s and GPA’s are very similar, Southwestern students’ mean MBE score was 1.7 points higher than Mercer’s.  That higher mean would not be remarkable were it not for the great disparity in the diversity of the two schools’ students.  Mercer’s 2015 entering class included only 23 Hispanic and Black law students combined (18.4% of first-year students), while Southwestern’s entering class included 97 Hispanic and Black students (30.6% of first-year students).  Something exciting is obviously occurring at Southwestern, since its highly diverse students are outperforming their MBE predictive index and outperforming similar out-of-state schools.  Now compare Mercer’s mean MBE score and July 2017 bar passage rate with those of Southwestern’s students:


                        Mean MBE Score          First-Time Bar Passage
Mercer                      140.2                       84.2%
Southwestern                141.9                       57.0%


        Applying the Council’s proposed 75% in 2-year passage standard leaves us with the following results:  Southwestern is at risk of de-accreditation (substantially below a 65% bar passage rate) even though its students outperformed Mercer’s and exceeded the national mean MBE score.  The problem, of course, is that Mercer is judged by a 135 cut score and Southwestern by a 144 cut score.  In essence, this comparison of Mercer and Southwestern makes it as clear as it can be that the Council has not drafted a national accreditation standard, but rather has ceded its authority to the vagaries of state politics and cut scores.  Although the Council is free to make such a decision, it must also take full responsibility for needlessly reducing diversity at law schools that are providing a value-added education to students who meet or exceed the national mean MBE score.


        In addition, when Mercer’s statistics are compared with those of Golden Gate and USF, the result of applying the proposed 75% in 2-year standard is equally troubling.  Although Golden Gate’s LSAT/GPA metrics are far below Mercer’s, Golden Gate’s mean MBE of 139.2 was only 1 point lower than Mercer’s; yet because of the difference in cut scores, Mercer had an 82.4% bar passage rate compared to Golden Gate’s 51.0% rate, a passage rate predicted to fail the Council’s proposed standard.  The comparison between Mercer and USF is similar.  Even though USF had a mean MBE score only .7 points lower than Mercer’s, its bar passage rate was only 54.0% compared to Mercer’s 82.4%.  Although USF provided its students with an equivalent education, it is at risk of de-accreditation.  Further, USF students performed similarly to Mercer’s on the bar even though USF had a 29.9% diversity rate for Hispanic and Black students compared to Mercer’s 18.4%.


        Table 6 demonstrates that the Council’s proposed 75% in 2-year bar passage standard has substantially undervalued Standard 206 (diversity), has applied a very narrow and constricted definition of Standard 503 (outcome measures), and provides virtually no weight to schools’ “value-added” educational outcomes (MBE scores much greater than student LSAT/GPA’s predictive index). [33]




        The “mean MBE score/cut score relationship” on the July 2018 bar examination further demonstrates the unfairness of the Council’s proposed 75% in 2-year standard.  Mercer students and Southwestern students who took that bar exam had almost identical LSAT’s and GPA’s according to the ABA 509 data [slightly higher GPA’s for Mercer and slightly higher LSAT’s for Southwestern], and the difference in diversity continued [Mercer 18.4% Hispanic and Black students versus Southwestern 30.3%].  Even so, Southwestern students substantially outperformed Mercer students on the MBE [Southwestern’s mean MBE 140.3 versus Mercer’s 137.1].  Yet again, Mercer’s students, who performed substantially worse than Southwestern’s, earned a 73.3% bar passage rate versus Southwestern’s 53.0%.  This data predicts that Southwestern may be de-accredited under the 75% in 2-year standard while Mercer will be saved based solely on its state’s political decision to adopt a much easier cut score.


        But more startling is the comparison between Connecticut students’ “mean MBE/bar passage percentage” and Southwestern students’.[34]  Connecticut students have substantially higher LSAT’s (158/156/154) and GPA’s (3.62/3.31/3.08) than Southwestern students (LSAT’s 155/152/149; GPA’s 3.46/3.18/3.00).  One would therefore predict that Connecticut’s mean MBE score would be much higher than Southwestern’s.  However, on the July 2018 bar exam Southwestern students outscored Connecticut students by 3.10 points (140.9 vs. 137.8) on the MBE.  What is more remarkable is that Connecticut’s entering class included only 27 Hispanic and Black students compared to Southwestern’s 84.  How does the Council’s proposed 75% in 2-year standard reward Southwestern for substantially exceeding its predicted mean MBE score and for performing significantly better than an out-of-state ABA law school?  It predicts that, based on Southwestern’s 53.0% passage rate, Southwestern will be de-accredited unless it significantly increases its bar passage rates.  As this example demonstrates, the 75% in 2-year standard fails to adequately inform prospective student consumers of the real “value-added” legal education at schools like Southwestern, and it provides an over-inflated metric of Connecticut students’ outcome measures (bar passage rate versus actual MBE performance in relation to the national MBE mean).  My proposal to add mean MBE performance to the 509 disclosures helps to cure this disparity and will provide prospective students with much more comparative data on the quality of schools’ educational program outcomes.


3)     Even Though California Law Schools Consistently Outperform Out-of-State Law Schools on the MBE, They Are Penalized by ABA Standards Based on California’s Politically Set High 144 Cut Score.


        Professor Derek Muller has charted California law schools’ mean MBE scores against the combined national mean MBE scores for all other law schools and demonstrated that California schools consistently perform better on that exam[35]:



        There is something terribly wrong with an ABA accreditation standard that would de-accredit a law school that routinely meets or substantially exceeds the national mean MBE score on the bar examination, but which condones other schools with lower mean MBE scores just because they are situated in states with easy bar exam cut scores.  No other U.S. national accreditation agency, including those in medicine and nursing, has ever de-accredited an educational institution that routinely met or exceeded the national mean on its accreditation examination.  The ABA should not be the one outlier accreditation agency in this country to de-accredit schools whose students perform at the national mean, especially since such a standard will substantially decrease diversity in the profession.


        The following scatter charts visually demonstrate the dramatic effects of state bar exam cut scores on ABA accreditation.  They demonstrate how California’s 144 cut score penalizes California law schools that have much higher diversity rates and higher mean MBE scores than out-of-state law schools with much lower diversity and lower mean MBE outcome scores under the Council’s proposed 75% in 2-year standard:






        California is currently suffering a crisis in the diversity of its bar, not only because its attorney/minority ratio is one of the lowest in the United States,[36] but also because several of its current ABA approved law schools may soon close or lose ABA accreditation:  (1) Whittier; (2) Thomas Jefferson; and (3) Western State.  First, because of Standard 316, Whittier was not able to enroll a sufficient number of qualified students to sustain its program.  The closure of Whittier will result in the annual loss of between 60 and 70 Hispanic students and 10 to 15 Black students in California law schools.[37]  Thomas Jefferson is currently on ABA probation and is suffering a fate similar to Whittier’s.  It has had to sell its law building, and its 2018 entering class is only 57 students (26 Hispanic and Black students combined) compared to its 2014 class of 357 students (98 combined Hispanic and Black students).[38]  Finally, in January 2019 Western State was placed into receivership because its parent, Dream Center Education Holdings, is apparently not economically viable.  As a result, Western State has already started to shed some of its administrative staff.[39]  If Western State closes, its usual minority admissions spots (approximately 60-70 Hispanic and 9-15 Black students) will also be lost in California.[40]


        Because of the new diversity crisis in California ABA law school admissions, and because of the disparate impact of California’s 144 cut score on California schools’ ability to achieve high bar passage rates, this is the worst of times for the Council to increase the difficulty of meeting Standard 316 by adopting the proposed 75% passage rate within two years of students’ graduation.  However, by adding my proposed “mean MBE score” alternative to Standard 316, some of the disparate racial impact of the current or amended Standard 316 will be mitigated.


4)     Council Members’ Arguments at The Open Session Based on a Medical School Accreditation Analogy Are Misplaced for Several Reasons.


        In the February 2019 Council Open Session several Council members drew an analogy to medical school accreditation.  One argument characterized law school accreditation and medical school accreditation as similar processes within the larger field of “regulated industries”.  The regulated industries reference was used to argue that the ABA simply has no control over the bar examination cut scores that state supreme courts set, and that states like California with outlier cut scores therefore should not form the basis for rejecting the proposed 75% in 2-year bar passage standard.  That regulated industries theory will lead to an extreme reduction of diversity in the bar, and its comparison of medical school accreditation to law school accreditation is misplaced.


    A.     The Regulated Industries Analogy to Medical School Accreditation Is Seriously Flawed.


        The regulated industries model analogy to medical school accreditation is seriously flawed for the following reasons:


  1. Unlike ABA Standards, Medical Accreditation Standards Do Not Include A Specific Minimally Acceptable Medical Board Passing Rate.


        The Liaison Committee on Medical Education (LCME) accredits medical education programs in the United States.[41]  Unlike the ABA bar passage standard, which prescribes a specific acceptable passage rate, the medical accreditation standards contain no specific metric for students’ medical board examination scores.  The LCME accreditation standard for student medical school board scores merely states that a medical school will be compared with “national norms”:


8.4  Program Evaluation:  A medical school collects and uses a variety of outcome data, including national norms of accomplishment, to demonstrate the extent to which medical students are achieving medical education program objectives and to enhance medical education program quality.


        In actual implementation, the LCME considers how close a medical school’s students score to the national mean on the medical boards [similar to my proposal to compare law schools’ mean MBE scores to the national MBE mean].  For instance, when the LCME threatened to place the University of Louisville School of Medicine on probation for reasons other than low student board scores, it noted that the school’s USMLE Step 1 and Step 2 student board examination scores were “above” or near “the national means” and were therefore acceptable.[42]


  2. Medical Schools Are Not De-Accredited Based on Students’ Board Exam Scores Because, Unlike Bar Exams, Passage Rates on the Boards Are So High.


        The reason no medical school in the United States or Canada has ever been sanctioned based solely on low student board test scores is that the passage rate on the medical boards is so high.  For example, the mean first-time passage rate on the Step 1 medical board exam (taken after two years of medical school) for students who attended U.S./Canadian medical schools was 96% in both 2016 and 2017.[43]  The passage rates on the next set of medical board exams, the Step 2 CK, were 97% in 2016 and 96% in 2017.[44]


        The use of the medical school accreditation model to form law school accreditation standards is misplaced because:  (1) medical school accreditation does not impose a specific minimum exam passage rate, but merely compares schools’ scores with the national mean board test score; (2) mean medical board scores are so high that no medical school is likely to have a significant percentage of its students fail the exam; and (3) by using a national mean score rather than a specific percentage passing score, racial disparity among states based on the medical board examination is less significant in medical school accreditation.  Therefore, the Council should reject the analogy to the medical school accreditation model as a justification for passing the proposed 75% in 2-year standard.  The Council should also reject the argument that the ABA should not consider states’ outlier high bar examination cut scores in balancing consumer protection, law school accountability, diversity in the profession, and access to justice in promulgating law school standards.


    B.     A Regulated Industry Model That Disregards Outlier States’ Cut Scores Will Have A Devastating Effect on Diversity.


        In Table 4 [see Appendix], I demonstrate how a constricted regulated industries model that does not account for state specific cut scores will have a serious and unconscionable effect on Hispanic law student enrollment in California.  California ABA Hispanic matriculants represent 17% of all Hispanic students admitted to ABA law schools in 2018.  According to the regulated industries theory, even though a significant percentage of all Hispanic law students are enrolled in California, a national accreditor like the ABA is not required to draft a bar passage standard that takes into consideration the disparate racial impact of its standard on those students, because California has an outlier bar exam cut score [144].  I call on the Council to reject the regulated industries model as defined in the Open Session and instead to continue using a model that reduces, as much as possible, the racial disparities in the ABA accreditation process created by state specific cut score decisions.


        Because adoption of the proposed 75% in 2-year standard will dramatically reduce diversity in the bar, especially in California, the Council should reject the medical model and continue studying other accreditation methods such as my proposed national mean MBE criteria.  In addition, before the Council votes on the proposed amendments to Standard 316, it should publish for public comment a complete analysis demonstrating not only which law schools are currently predicted to fail the 75% in 2-year standard, but also which schools in 2018 are trending toward failure of that standard in the near future.  Without a complete report on the probable effects of the new rule on all ABA approved law schools in 2018 and beyond, the Council cannot possibly balance all three of the ABA’s goals:  (1) student consumer protection; (2) diversity; and (3) access to justice.  Even if the Council, as a private accreditation association,[45] arguably has the legal ability to promulgate standards with known serious racial disparities and effects, that power does not mean the Council should promulgate such a standard.




Respectfully submitted,


William Wesley Patton
Lecturer in Law, USC Gould School of Law; Volunteer Assistant Clinical Professor, UCLA David Geffen School of Medicine, Department of Psychiatry
wpatton@law.usc.edu






[1]   The ABA’s official report, updated on April 12, 2018, lists 19 failing schools. (https://www.americanbar.org/groups/legal_education/resources/statistics/).  However, because the media have reported that three schools have corrected their 2-year bar passage rates, the correct number of failing schools could now be 16. (https://www.azcentral.com/story/news/local/arizona-investigations/2019/03/04/american-bar-association-council-to-take-up-tougher-law-school-sanctions-in-may/2702306002/).

[2]   Appalachian 69.8%; Arizona Summit 50.0%; New England 71.1%; Pontifical 54.5%; Inter American 57.2%; U.D.C. 71.4%; Atlanta John Marshall 59.8%; Valparaiso 66.0%; Florida Coastal 64.5%; Golden Gate 74.0%; South Dakota 67.9%; Howard 54.6%; Barry 71.1%; Whittier 57.3%; Ave Maria 73.3%; Thomas Jefferson 63.8%; Dayton 69.2%; Puerto Rico 69.9%; La Verne 74.5%; USF 67.4%; McGeorge 73.5%; Elon 66.3%; Vermont 74.3%; and Western Michigan 69.0%.

[3]  The Council’s report, SECTION OF LEGAL EDUCATION AND ADMISSION TO THE BAR, AMERICAN BAR ASSOCIATION, Revisions to Standard 316:  Bar Passage, at 6, states that schools with first-time bar passage rates below 65% will have difficulty meeting the 75% in 2-year standard.  The media have reported that three schools have corrected their 2-year bar passage rates.  

[4]   (http://abarequireddisclosures.org/Disclosure509.aspx).

[5]   (https://www.americanbar.org/groups/legal_education/resources/statistics/).

[6]   Unfortunately, there is a discrepancy between two of the ABA’s reports on schools’ 2018 first-time bar passage rates.  One report, First Time Bar Passage Calendar Year 2018 (Last Updated April 24th 2019), includes more than 49 schools with rates below 65%.  However, another report, First Time Bar Passage by School by Jurisdiction 2018 (Last Updated April 22nd 2019), includes only 49 schools with cumulative 2018 passage rates for all jurisdictions.  For this study I have selected the more conservative report that 49 schools had 2018 bar passage rates below 65% in order to avoid overstating the effects of the proposed 75% in 2-year standard on diversity.

[7]   See ABA Section on Legal Education and Admission to the Bar, First-Time Bar Passage by Jurisdiction 2018.

[8]   See ABA 509 data (http://abarequireddisclosures.org/).

[9]   These passage rates are based on the cumulative number of students taking the February and July 2014 exams versus the number passing those exams, based on information supplied by the California State Bar.  (http://www.calbar.ca.gov/Admissions/Law-School-Regulation/Exam-Statistics).

[10]   Since La Verne had fewer than 11 repeat takers for the February 2018 exam, the California State Bar did not publish its statistics.

[11]   Mark A. Albanese, The Testing Column:  July 2018 MBE: The Storm Surge, Again, The Bar Examiner, at 30 (Fall 2018).

[12]   “[T]he descent of performances on the bar examination that began in July 2014 has created an accumulating group of repeaters.”  Erica Moeser, President’s Page, at 5, The Bar Examiner, June 2017.  Repeaters comprised 67% of all examinees on the February 2018 MBE.  Mark A. Albanese, The Testing Column:  February 2018:  The MBE Storm Surge Continues, at 28, The Bar Examiner, Summer 2018.

[13]   Id., at 27.

[14]   See California February 2016-2018 bar examination statistics at (http://www.calbar.ca.gov/Admissions/Law-School-Regulation/Exam-Statistics). 

[15]   Letter from Kyle McEntee to Council (https://lawschooltransparency.com/documents/2016-07-27_LST_to…).

[16]   Id., at 4.

[17]   Obviously, since we do not yet have the repeater passage rates for July 2018 graduates, my predictions are based on the last several years of repeater bar passage rates for these 9 CA ABA schools.  Further, since the NCBE is predicting that repeater rates will continue to fall for the next few years, it is unlikely that the CA ABA law schools will be able to dramatically increase their repeater rates for the February 2019, July 2019, and February 2020 California bar exams.

[18]   Supra notes 6-7.

[19]   I am well aware of the debate over why Hispanic and Black students often score lower than White students on the MBE.  Although the NCBE has published several articles defending the MBE against claims of racial bias, others continue to argue that the MBE is structurally biased.  See, e.g., Christina Shu Jien Chong, Battling Biases:  How Can Diverse Students Overcome Test Bias on the Multistate Bar Examination, 18 University of Maryland L. J. of Race, Religion, Gender and Class 31 (2018); Susan M. Case, Persistence on the Bar Exam, The Bar Examiner 20 (Dec. 2012); Susan M. Case, Urban Legends About the Bar Exam, The Bar Examiner 44 (May 2008).  However, in the context of the proposed 75% passage rate within 2 years of Standard 316, even if the MBE is racially biased, its use is warranted at this time as an alternative accreditation standard because it mitigates the impossibility law schools face in maintaining and/or increasing diversity in states with extremely high bar exam cut scores.

[20]   Since I am proposing a dual standard [bar passage and/or mean MBE score], even if a state does not use the MBE on its bar exam, the alternative of bar passage rates is still operative in this “dual standard”.

[21]   The NCBE has consistently stated that “The MBE total score is highly reliable…[however] scores from written tests have much lower reliability.”  Susan M. Case, The Testing Column:  Best Practices With Weighting Examination Components, The Bar Examiner, February 2008, at 43-44.  In fact, the MBE is used to “scale” states’ essay questions so that the comparable difficulty of essay questions on different bar exams remains similar.  The essays are scaled to the MBE because it is a more reliable and stable testing instrument:  “This process allows the written portion of the exam, which changes with every administration but cannot be equated like the MBE, to capitalize on the stability of the MBE, thereby accounting for any differences in question difficulty or grader stringency.”  Judith A. Gundersen, President’s Page, The Bar Examiner, Winter 2017-2018, at 5.

[22]   As my law review article, A Blueprint…, supra, page 2, demonstrates, California law schools with bar passage rates that would fail the proposed 75% in 2-year rule are graduating students whose mean MBE scores would admit them into almost every other bar in the United States.

[23]   April 25, 2019 email from William Wesley Patton to jgundersen@ncbex.org regarding “ABA Council and Proposed Bar Exam Standard”.

[24]   Id. 

[25]   April 26, 2019 Email response from Christina Doell, California State Bar Association, to William Wesley Patton:  “NCBE provides the State Bar with the MBE score of every applicant that took the California Bar Examination. The State Bar attaches this score to each applicant’s exam record. These scores are later used to compile the [schools’ MBE mean] statistics.” 

[26]   SECTION OF LEGAL EDUCATION AND ADMISSIONS TO THE BAR, AMERICAN BAR ASSOCIATION, REVISIONS TO STANDARD 316:  BAR PASSAGE, at 6.  I have demonstrated that the Council’s prediction on the number of schools that will fail the 75% in 2-year standard is an underestimation.

[27]   (https://www.swlaw.edu/admissions-financial-aid/catalogs-brochures).

[28]   (http://abarequireddisclosures.org/Disclosure509.aspx).

[29]   For a comparison between Southwestern entering students’ LSAT/GPAs and median MBE score success and those of other higher-rated out-of-state law schools, see Patton, A Blueprint…, supra, 24 Wash. & Lee J. of Civ. Rts. & Soc. Justice 3, at 43-45 (2017).  Hispanic and Black graduates consistently underperform White candidates on the California Bar Exam.  For instance, here are the July 2018 CA bar exam first-time passage rates:  (1) Whites:  69.5%; (2) Hispanics:  56.3%; and (3) Blacks:  45.1%. (http://www.calbar.ca.gov/Portals/0/documents/admissions/JULY2018_CBX_Statistics.pdf).  It is remarkable that Southwestern, with a combined Hispanic and Black student population of 30%, was able to meet or exceed the national mean MBE score consistently for the last decade.

[30]   See (http://www.calbar.ca.gov/Admissions/Law-School-Regulation/Exam-Statistics), and SUPPLEMENTAL STATISTICS REPORT FOR THE JULY 2016, JULY 2017, AND JULY 2018 CALIFORNIA BAR EXAMINATION.

[31]   Id.

[32]   Santa Clara is another California ABA law school that is “trending” toward future failure of the 75% in 2-year standard.  From February 2014 through July 2018, Santa Clara had a mean bar passage rate of 66.4%.  But on the most recent exam, July 2018, Santa Clara had only a 58% first-time passage rate and a 20% “repeater” passage rate.  However, even on the July 2018 exam, its lowest passage rate in a decade, its students still scored a mean MBE of 144.6, 5.1 points higher than the national mean MBE (139.5).  In 2018 Santa Clara’s entering first-year class had 61 Hispanic and 7 Black students.

[33]   This mean MBE data and these passage rates were obtained from (https://www.gabaradmissions.org/georgia-bar-examination-statistics), (https://www.jud.ct.gov/cbec/stats.htm), and (http://abarequireddisclosures.org/Disclosure509.aspx).


[34]   As Table 6 also demonstrates, even though entering students’ LSATs and GPAs are almost identical at Southwestern and Quinnipiac, Southwestern outscored Quinnipiac on the MBE by 3.40 points (140.9 vs. 137.5) even though Southwestern had more than four times as many diversity students.  However, based on cut scores, Quinnipiac had an 82.0% bar passage rate vs. a 53.0% bar passage rate for Southwestern, thus placing Southwestern within the at-risk range under the Council’s proposed 75% in 2-year standard.

[35]   Derek Muller, California bar exam takers are far more able than others nationwide but fail at much higher rates, Excess of Democracy, Nov. 21, 2015 (http://excessofdemocracy.com/blog/2015/11/california-bar-exam-takers-are-far-more-able-than-others-nationwide-but-fail-at-much-higher-rates). 

[36]   For instance, Hispanics comprise approximately 38% of California’s population, but there are fewer than 8,000 Hispanic licensed attorneys practicing in California.  Patton, supra., at 6.

[37]   See 509 reports (file:///C:/Users/bpatt/Downloads/Std509InfoReport%20(2).pdf).

[38]   Id; (https://www.americanbar.org/groups/legal_education/resources/aba_approved_law_schools/).

[39]   Stacy Zaretsky, Law Students Unable to Pay Rent Without Student Loans After School Goes Into Receivership, Above the Law, Feb. 8, 2019.   A March 4, 2019 letter from Barry Currier to Western State students informed them that the ABA has only two remedies available: (1) a review process that could result in the school’s loss of accreditation, or (2) requiring the school to file a “teach out” plan for orderly school closure.  (https://www.insidehighered.com/sites/default/server_files/media/Western%20State.Memo_.March%204%202019.pdf).

[40]   (http://abarequireddisclosures.org/Disclosure509.aspx); (https://www.wsulaw.edu/prospective-students/aba-required-disclosures).

[41]   Hunt et al., The Variables That Lead to Severe Action Decisions by the Liaison Committee on Medical Education, 91 Academic Medicine 87 (2017), at 87.

[42]   MEMORANDUM:  Liaison Committee on Medical Education, Secretary of the ad hoc Survey Team That Visited University of Louisville School of Medicine on April 14-17, 2013, at 21 and 49; University of Louisville School of Medicine:  Request for LCME Reconsideration of Accreditation Findings and Actions, January 28, 2014 arguing that the school’s board scores are “at or above the national pass rate….”

[43]   2017 PERFORMANCE DATA:  STEP 1  (https://www.usmle.org/performance-data/default.aspx#2017_step-1).

[44]   2017 PERFORMANCE DATA:  STEP 2 CK (https://www.usmle.org/performance-data/default.aspx#2017_step-2-ck).   The passage rates for the two remaining medical boards, the Step 2 CS and the Step 3, are also between 96%-98%.  (Id).

[45]   Professional Massage Training Center, Inc. v. Accreditation…, 781 F.3d 161, 169 (2015).
