Law School Specialty Program Rankings: Is the Tail Wagging the Dog?

By:  Robert R. Kuehn, Associate Dean for Clinical Education, Professor of Law

In addition to its annual ranking of law schools, U.S. News ranks nine law school specialty programs. Although the school rankings rely on a number of factors to compute a final score (peer assessment, lawyers/judges assessment, LSAT, undergraduate GPA, etc.), specialty rankings are based solely on reputation. A faculty member teaching in the specialty area at each school is asked to assess the specialty programs at the almost 200 other ABA-accredited schools. As this column explains, there are serious problems with this type of ranking.

In previous years, the person rating other schools was directed to choose the top 15 programs in that specialty area. U.S. News would then list the top 12 to 20 programs (and even more behind a paywall) based on how often the school was listed by raters. This year, U.S. News asked survey participants to rate specialty programs on a 5-point scale of Outstanding (5) to Marginal (1), mirroring the much-criticized method used to assess the academic reputation of schools among peers and lawyers/judges.

The response rates among the faculty who received this year’s specialty program ballot varied from 44% (International Law) to 65% (Clinical Training). With the new scale, U.S. News chose to rank in order every program that received at least 10 ratings. This resulted in publicized rankings from 1st to 186th, with individual school scores ranging from 4.9 to 1.0 and average program scores from 2.1 (Intellectual Property and International Law) to 2.9 (Dispute Resolution).

A glance at the new rankings shows noticeable bunching of schools ranked highest by U.S. News also at the top of most specialty programs and an overpopulation of lowest ranked schools at the bottom. Is this because the highest ranked schools also happen to have the highest quality specialty programs and vice-versa for the lowest ranked schools? Or, might the over-representation at the two ends be due in part to a halo effect where the academic reputation of a school is influencing the rating of that school’s specialty program? This effect could be especially pronounced this year, when raters were asked to not just opine on the quality of 15 specialty programs but to express an informed opinion on 200.

To examine the strength of the relationship between a school’s peer assessment and specialty program reputation, scores for each school in every specialty were collected and analyzed. The Pearson’s correlation coefficients between the two scores are set out below:



A coefficient of 1.0 indicates a perfect relationship between two variables, and coefficients of .7 or greater are often characterized as “very strong.” The ratings of some specialty programs (e.g., International and Tax Law) stand out as very strongly associated with the school’s academic reputation, while the correlations for two programs (Dispute Resolution and Legal Writing) were negligible and not statistically significant. Law school reputation scores explain 80% of the variability in International and Tax Law specialty rankings and over 50% of the variance in Environmental and Intellectual Property Law scores. The large correlations suggest that some raters may be cueing off a school’s academic reputation rather than independently judging the quality of its specialty program. But because correlation does not prove causation, it could also be that schools with better academic reputations largely have better specialty programs.
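For readers who want to replicate this kind of analysis, the coefficient and the share of variance it “explains” can be computed directly from paired score lists. A minimal Python sketch (the school scores below are made-up illustrative numbers, not the actual U.S. News survey data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical peer-assessment and specialty scores for six schools
peer = [4.8, 4.2, 3.5, 2.9, 2.1, 1.4]
specialty = [4.5, 4.4, 3.1, 3.0, 2.2, 1.6]

r = pearson_r(peer, specialty)
print(round(r, 2))       # correlation coefficient
print(round(r * r, 2))   # r-squared: share of variance explained
```

Squaring r is how statements like “reputation explains 80% of the variability” are derived: an r of about .89 yields an r-squared of about .80.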

A closer look at the distribution of Clinical Training scores suggests that some haloing is going on, at least when raters are asked about schools with the highest and lowest academic reputations. Clinical scores were strongly correlated with a school’s academic reputation (r=.66), with 44% of the variation in scores explained by school reputations. The graph below plots a school’s peer assessment score against its Clinical Training score with a regression line added (β=.51, p < .01) to illustrate the positive relationship.




Notice that every school with a peer assessment score of at least 4.0 also had a Clinical Training score above 3.0. Conversely, only one out of thirty-two schools with a peer assessment score of 1.5 or less garnered a clinical rating above 3.0. Clinical scores for schools in the peer assessment mid-range appear to be more dispersed. Correlation analysis supports these impressions — correlation coefficients for the 50 top-rated and bottom-rated schools were .69 and .65 respectively, while the coefficient for middle-ranked schools was only .37.
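The banded comparison described above (separate correlations for the top-rated, middle, and bottom-rated schools) can be sketched the same way. This is an illustrative helper, with names of my own choosing; it assumes the input is a list of (peer score, clinical score) pairs:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def banded_correlations(schools, k=50):
    """Sort (peer, clinical) pairs by peer score, then compute Pearson r
    separately for the top k, middle band, and bottom k schools,
    mirroring the column's top-50 / middle / bottom-50 comparison."""
    ranked = sorted(schools, key=lambda s: s[0], reverse=True)
    bands = {"top": ranked[:k], "middle": ranked[k:-k], "bottom": ranked[-k:]}
    return {name: pearson_r([p for p, _ in group], [c for _, c in group])
            for name, group in bands.items()}
```

Coefficients near .7 at the extremes but only .37 in the middle band are what suggest haloing: raters seem to track overall reputation most closely where a school's general prestige (or lack of it) is most salient.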

Data from the ABA and surveys of clinical programs by the Center for the Study of Applied Legal Education (CSALE) provide additional variables potentially associated with clinical program rankings. There is a moderately strong association between the size of a school’s J.D. enrollment and its Clinical Training score, with a statistically significant correlation of .49 (explaining 24% of the variance in clinical ranking). There is a similar association between the number of “slots available” in law clinics and program scores (r=.48), but beware that slots-available reporting by schools is largely unregulated and varies widely. The number of full-time law clinic and externship faculty reported in CSALE surveys is also related to program ranking, although less strongly (r=.31). Do these correlations indicate that larger schools tend to have higher-quality clinical programs, or are their programs and number of clinical faculty just larger and more noticeable?

An increasing number of schools guarantee their J.D. students, or require them to complete, a law clinic or externship experience before graduation. This potential signal of clinical experience availability or quality, now present at 75 schools, is only weakly associated with program ranking scores (r=.17).

U.S. News explained that its goal in moving to the new scale was to rank more schools (five to six times more) and provide better understanding of the scope of the differences between schools. Yet when there is strong evidence that many raters may cue off a school’s overall reputation and not independently assess the quality of its specialty program, U.S. News’ rank ordering of school specialty programs down to 192 is of dubious validity and should also call into question the use of this method in its annual law school ranking.


Time to Remedy the Ills Afflicting ABA Council’s Standard 316 Proposal

By: Professor Judith Welch Wegner, Burton Craige Professor of Law, Emerita


         Many readers of this blog are well aware of ongoing efforts by the ABA’s Council of the Section on Legal Education and Admission to the Bar to revise law school accreditation standards in troubling ways.  <>


         The Council’s proposal is to amend Standard 316, which currently requires law schools to demonstrate a 75% bar passage rate of graduates within 5 years of graduation, but allows law schools to provide evidence justifying an extended period in which to come into compliance based on a variety of factors (including trends, academic support efforts, transfer patterns, and more).


         The Council’s proposed revised standard would instead provide that “At least 75 percent of a law school’s graduates who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation.”  That’s it. A shortened period for demonstrating compliance, and a bright-line standard that does not consider mission, near compliance, remedial efforts, or anything else.  As a result of this proposed revised standard, a significant number of law schools would have their accreditation put at risk, particularly in jurisdictions such as California where bar passage rates are routinely much lower than the proposed 75% threshold, and particularly for schools whose missions reflect a commitment to enrolling and increasing the opportunities for non-traditional students and people of color.  For more information on the proposal’s history and the views of supporters and critics, see <>


         But, wait, there’s more!  The ABA Council has brought this proposal to the House of Delegates twice before (in 2016 and in 2019), and, significantly, it has twice been roundly defeated as a result of concerns about the impact of the revision on diversity within the profession, its uneven application given the widely varying rates of bar passage in recent times, the Council’s failure to engage in meaningful research about disparate impact, and a lack of transparency in the Council’s proceedings.  The ABA House of Delegates most recently defeated the proposed revision at its mid-year February 2019 meeting in Las Vegas, by a vote of 88 in favor and 334 opposed.  <>


         Nonetheless, the ABA Council has put the proposed revision on its agenda for its May 16-18 meeting once again, since it is only obliged to give the House of Delegates a chance to express its views, not to follow those views.  The ABA Council has also recently published data on law school bar passage rates, including data from 2016 and preliminary data from 2018.  <>  The ABA press release puts a favorable spin on these data, stating that “First-time takers in 2018 achieved a 74.82 percent pass rate, which compares to a 77.34 percent pass rate for 2017.”  But interested readers should look beyond these generalized numbers to see the demonstrated impact of the proposed new standard based on the data in the associated tables (which show schools at risk by state and state bar passage rates).


         I appreciate the ABA Council’s desire to simplify and impose bright-line rules as a matter of bureaucratic convenience.  I also understand that the Council likely is responding to pressure from those who believe that, when students invest in legal education, they should not be left high and dry without anticipated careers should they fail to pass licensing examinations.  But these views are excessively simplistic.


         Over-simplification should not win the day, particularly in the absence of much-needed, in-depth analysis:


  1. The ABA Council is proposing an action that will have significant disparate impact on minority candidates seeking admission to the bar, but has not undertaken meaningful research on this topic.
  2. If the theoretical basis of the ABA action is tied to a requirement that law school graduates demonstrate a basic level of competence, the Council cannot reasonably claim that state passing percentages are tied to such a determination given their widely disparate passing rates. Able scholars such as Professor Joan Howarth have shown that other professions employing a national licensing examination use a single standard “cut score” tied to competence, while American jurisdictions (even those using the Uniform Bar Exam) continue to set widely different cut scores, suggesting that a national standard cut score needs to be developed sooner rather than later.
  3. Significant questions should be raised regarding the authority of the ABA Council in effect to prohibit the continuation of law schools in jurisdictions such as North and South Dakota or institutions serving minority populations in jurisdictions where state legislatures have failed to provide adequate funding for public K-12 education in areas with substantial minority populations. If state legislatures and state courts believe that law schools in these contexts are serving a worthwhile purpose, why should the insular and insulated ABA Council be allowed to shut them down?
  4. Recent developments in federal antitrust law should also be reckoned with. The United States Supreme Court’s decision in North Carolina State Board of Dental Examiners v. FTC, 574 U.S. ___ (2015), provided an important gloss on the “state action” exemption.  States with licensing boards populated by significant numbers of the professionals they regulate need to consider carefully whether there is adequate “active supervision” to assure that licensing boards (such as bar examiners) are not limiting access to protect their own economic interests.
  5. The ABA Council’s proposal assumes that existing bar examinations actually document basic competence for lawyers. However, the ABA Council has not justified this assumption and, more notably, has not demonstrated that a 75% bar passage rate that varies widely from state to state reflects a shared judgment on competence.  Nor has the ABA Council addressed the importance of wide swings in performance across all jurisdictions (whether using the Uniform Bar Examination or not).  The National Conference of Bar Examiners has recently established a task force on testing that is responsible for examining potential changes in bar examinations moving forward.  <>  A good case can therefore be made that the ABA Council should not adopt changes in accreditation standards before the NCBE engages in further analysis and provides better documentation about the validity of the bar exam nationally.


         Happily, a better way through this morass is apparent, thanks to the work of <Professor Bill Patton> (previously at Whittier, now at USC/UCLA).  He has developed a compelling analysis of the implications of the ABA Council’s plan to revise Standard 316 and has effectively demonstrated the error of the ABA Council’s approach.  His work demonstrates the significant adverse effect on minority bar applicants in California (and on California law schools) if the ABA Council’s proposal is adopted. Patton rejects the ABA Council’s focus on requiring a 75% bar pass rate in individual jurisdictions and instead urges a national standard of competence benchmarked against performance on the MBE, the multistate multiple-choice exam.  He shows that graduates of many California schools at risk under the proposed revised standard have actually outperformed projections/indicators and have demonstrated competence that exceeds that of graduates of law schools in New York, Connecticut, Georgia, and elsewhere.  In effect, Professor Patton’s study demolishes the ABA Council’s superficial analysis and offers an alternative approach that would have a less significant disparate impact on law schools that have shown they can educate minority lawyers who out-perform their admissions indicators.  At the very least, the ABA Council should step back and consider his analysis before taking further steps to adopt its initial poorly-conceived and poorly-justified proposal.


         The stakes are high for the integrity of legal education.  In this difficult time, it is more important than ever that diverse candidates enter the legal profession and represent those who have traditionally been at the margins of society and social justice.  Readers of this blog should also recognize that the membership of the ABA Council on Legal Education is determined by the vote of members, including those who have membership by virtue of law school faculty status.  The ABA Council’s nominating committee has just revealed its nominees for Council membership <>.  It remains striking to me how poorly the Council reflects the insights of broad swaths of the legal profession and legal education community, and how little meaningful research has been done by ABA Council staff to support arguments that would significantly curtail minority participation in the legal profession, or to respond to the clearly-articulated concerns of the ABA House of Delegates.


          Make your views known.  It is important for all law faculty members and concerned observers to convince the ABA Council that, if it wishes to change the existing standard, it needs to adopt the Patton proposal, which benchmarks law student bar performance against national bar examination cut scores rather than individual state percentage pass rates.  Please write to the ABA’s Consultant on Legal Education, Barry Currier, and insist that he share your views with the Council before it takes steps to decimate minority admission in states such as California.  Currier’s email is:  Please also take care to consider whether you believe the ABA Council represents your views.  Its membership is listed at


          Doing nothing allows the ABA Council to undercut the diversity of the legal profession without engaging with core principles.  Is that an acceptable practice in your view?  The ABA now charges law schools more than $20,000 per year to fund its activities, while acting on simplistic bureaucratic principles.  Is that an acceptable use of limited law school funds in your view?


Step up.  Express your views.  The future of the legal profession depends on you.




American Bar Association
Council on Legal Education and Admission to the Bar
321 N. Clark Street, 19th Floor
Chicago, IL 60654

April 27, 2019




          Conflicting reports indicate that either nineteen (19) or sixteen (16) ABA law schools’ 2015 graduates failed to meet the proposed 75% in 2-year bar passage standard.[1]  The number of ABA law schools whose 2016 graduates failed to meet that standard increased to twenty-five (25)[2], and nine out of the schools that failed that standard for 2015 graduates performed even worse for 2016 graduates.  


          But more problematic are the predictions from the National Conference of Bar Examiners and from the 2018 bar passage results that the number of ABA law schools that may fail to meet that proposed standard will dramatically increase for 2018 and later graduates.  The ABA’s recent publication on the bar passage rates of ABA law schools in 2018 demonstrates that forty-nine (49) ABA schools had 2018 cumulative first-time bar passage rates below 65% and thus are predicted to have great difficulty in meeting the proposed 75% in 2-year bar standard for 2018 graduates in 2020.[3]


          According to the ABA 509 disclosures regarding the demographic composition of 2018 first-year classes, these 49 at-risk schools have a cumulative total of 1162 Hispanic and 1137 Black entering students.[4]  According to the ABA report on 2018 enrollment by race, these 49 schools account for 37% of all Black first-year students (1,137/3,033) and 35% of Hispanic first-year students (1,692/4,808).[5]  If even half of those 49 schools that are  predicted to fail the proposed 75% in 2-year standard actually fail, diversity in the bar will be alarmingly reduced.[6]


          The following analysis demonstrates that the Council’s 2015 and 2016 graduates’ studies on the effects of the proposed 75% in 2-year bar passage standard substantially underestimate the future number of ABA approved law schools that will fail that test for 2018 and later graduates.  For example, in 2018 according to the Council’s own predictive model [first-time bar passage rates below 65%], the following nine (9) California ABA approved law schools may be de-accredited:  (1) California Western; (2) Golden Gate; (3) La Verne; (4) Southwestern; (5) Thomas Jefferson; (6) USF; (7) Western State; (8) U. C. Hastings; and (9) McGeorge.    All nine of these California ABA law schools have mean bar examination passage rates for July 2017 through July 2018 well below 65%, and their 2018 overall passage rates are even lower:  [7]





          The potential de-accreditation of  those nine at-risk California ABA schools will result in a loss of 59% of Hispanic first-year California ABA law students (497 out of 842) [see Table 4 infra.], and a loss of 51% of Black first-year California ABA law students (105/205).[8]  Those 9 California ABA law schools’ Hispanic (H) and Black (B) 2018 first-year enrollments were:  (1)  Cal Western:  41 H + 12 B; (2) U.C. Hastings:  59 H + 6 B; (3) Golden Gate:  63 H + 33 B; (4) McGeorge:  37 H + 8 B; (5) Santa Clara:  61 H + 7 B; (6) Southwestern:  75 H + 12 B; (7) Thomas Jefferson:  17 H + 9 B; (8) La Verne:  38 H + 5 B; and (9) Western State:  70 H +6 B.   These alarming diversity statistics, alone, warrant the Council abandoning or radically altering its proposed 75% bar passage rate within 2-years of students’ graduation.


          Making matters worse, in order to meet the proposed 75% in 2-year standard schools must have decent “repeater” bar examination passage rates.  However, just like the California first-time passage rates, the “repeater” rates have fallen dramatically since 2014, making it much more difficult for schools to meet the proposed standard.  For instance, consider the significant reduction in California repeater passage rates between 2014 and 2018 at the nine at-risk California ABA law schools:[9]




          It is also alarming that the NCBE has predicted that “repeater” bar passage rates will continue to fall for the next few years.  The NCBE has demonstrated that the number and percentage of repeat takers is dramatically increasing as schools’ first-time bar passage rates and mean MBE scores continue to fall to historic lows.   The gist of those NCBE analyses is that:  (1) the slight MBE mean increase in July 2016 and July 2017 was followed by a precipitous decline to the lowest July MBE mean since 1984[11]; (2) because of the significantly lower first-time passage rates on the bar, a greater number of February bar takers are “repeaters”[12]; (3) the February 2018 mean MBE score is the lowest in MBE history; and (4) the “mean MBE score in February has been declining every year since 2013”.[13]    These “repeater” trends (significantly more repeaters with lower passage scores) must be factored in by the Council in predicting which schools will fail to meet the 75% in 2-year standard in 2018 and beyond.


          The NCBE data predict even lower California annual bar passage rates in the next few years, for three reasons.  First, the February 2018 and July 2018 national MBE mean scores are at historic lows.  Second, California ABA law schools’ annual repeater bar passage rates fell by between 9% and 21% in 2018.  And, finally, the number and percentage of California repeat test takers on the February bar exam have steadily increased since 2015:  (1) in Feb. 2015 repeaters were 76% of test takers; (2) in Feb. 2016, 79%; (3) in Feb. 2017, 81%; and (4) in Feb. 2018, 82% of all CA ABA test takers.[14]  As California passage rates continue to decrease, the number of repeat test takers will also continue to substantially increase.


          It is interesting that one of the strongest proponents of the proposed 75% in 2-year standard, Law School Transparency, has submitted an analysis to the Council that supports my prediction that at least 9 CA ABA schools will fail that standard for 2018 graduates.[15]  LST’s optimistic analysis states that if a law school has a first-time bar passage rate of at least 60% [compared to the Council’s predictive standard of 65%] and its “repeater” passage rates decline by 50% on each of the next 3 bar administrations [30%/15%/7.5%], the school should be able to meet the 75% in 2-year standard.[16]  However, the 9 ABA law schools that I am predicting will fail the proposed standard for 2018 graduates have first-time and repeater passage rates well below LST’s thresholds for success.  For instance, in 2018 eight out of the nine California at-risk law schools had first-time passage rates below 60%, and eight of the nine had repeater passage rates below 30%.


          For example, Golden Gate had a 34% July 2018 first-time passage rate and its February/July 2018 repeater passage rate was only 19%[17].  In order to meet Law School Transparency’s prediction of reaching a 75% cumulative passage rate in 2-years Golden Gate will need to more than double its 19% repeater passage rate.  However, the NCBE has predicted that “repeater passage rates” will continue to fall for several years[18].  Also consider McGeorge which had a July 2018 first-time passage rate of 50%, but a February/July 2018 repeater passage rate of only 23%.  Again, under LST’s and the Council’s predictive models McGeorge will need to substantially increase its repeater passage rate in order to meet the proposed standard.  Further, even California Western currently fails LST’s predictive model since its July 2018 first-time passage rate is only 52% and its February/July 2018 repeater passage rate is only 30%.
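LST’s arithmetic can be made concrete with a small simulation. This is a sketch under simplifying assumptions (every unsuccessful taker retakes at each subsequent administration; the function name is my own), not LST’s actual model:

```python
def cumulative_pass_rate(first_time_rate, repeater_rates):
    """Fraction of a taker cohort that has passed after the first attempt
    plus the given sequence of repeat administrations, assuming every
    failer retakes each time (a simplification of LST's scenario)."""
    passed = first_time_rate
    remaining = 1.0 - first_time_rate
    for rate in repeater_rates:
        passed += remaining * rate          # repeaters who pass this round
        remaining *= (1.0 - rate)           # failers who remain in the pool
    return passed

# LST's optimistic scenario: 60% first-time rate, then 30%/15%/7.5% repeater rates
print(cumulative_pass_rate(0.60, [0.30, 0.15, 0.075]))
```

Under these assumptions the scenario lands just below 78%, which is why LST concludes a 60% first-time school can clear the 75% bar; a school like Golden Gate, starting from 34% with repeater rates near 19%, falls far short even before the predicted further declines in repeater performance.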


  1. Using Law Schools’ Mean MBE Scores in Relation to the National Mean MBE Score Will Protect Student Consumers and Will Have a Substantially Smaller Effect on Diversity Than Sole Reliance on Bar Exam Passage Rates.


          In my article, A Blueprint for A Fairer ABA Standard for Judging Law Graduates’ Competence:  How A Standard Based on Students’ Scores in Relation to the National Mean MBE Score Properly Balances Consumer Safety with Increased Diversity in the Bar, 24 Wash. & Lee J. of Civ. Rts & Soc. Justice 3-62 (2017), I analyze the benefit of including schools’ mean MBE scores as another variable in determining the quality of schools’ educational programs.  I am not proposing the elimination of bar passage rates as a relevant accreditation variable, but rather, the following proposed accreditation standard consists of two alternative means of showing compliance with ABA Standards 301, 316, and 501:  (1) bar passage rates; and/or (2) mean MBE scores.  I am proposing the following amendment to the Council’s proposed changes to Standard 316:


At least 75 percent of a law school’s graduates who sat for a bar examination must have passed a bar examination administered within two years of their date of graduation, or a law school must be within 10 percent of the national mean MBE score on two out of the four bar examinations administered since the date of the students’ graduation.
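The two-pronged test in this proposed language can be expressed as a short compliance check. This is an illustrative sketch, not ABA text: the function name is my own, and reading “within 10 percent” as scoring no more than 10% below the national mean (schools at or above the mean trivially qualify) is my assumption:

```python
def meets_amended_standard(two_year_pass_rate, school_mbe_means, national_mbe_means):
    """Sketch of the two-pronged standard proposed above.

    A school complies if either:
      (1) at least 75% of graduates who sat for a bar exam passed one
          within two years of graduation; or
      (2) its mean MBE score was within 10% of the national mean on at
          least two of the four administrations since graduation.
    """
    prong_one = two_year_pass_rate >= 0.75
    # Assumption: "within 10 percent" means no more than 10% below the national mean
    within_10_pct = sum(
        1 for school, national in zip(school_mbe_means, national_mbe_means)
        if school >= 0.9 * national
    )
    prong_two = within_10_pct >= 2
    return prong_one or prong_two
```

On this reading, a school like Southwestern (low California passage rates but MBE means above the national mean on all four administrations) would comply via the second prong even while failing the first.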

          The advantages of my proposed “mean MBE score” standard over a standard like 316 based solely on bar percentage passage rates are[19]:


  1. This standard brings the ABA into conformity with other U.S. accreditation organizations such as those in medicine and nursing by using a “national mean standard” that the DOE has historically approved and that is not affected by states’ specific licensing standards, such as bar examination cut scores;
  2. It provides a uniform standard for every law student attending almost every ABA approved law school;[20]
  3. It avoids disparate diversity impact among the states that is inherent in the current and in the proposed amendments to the Standard 316 bar passage rule; and,
  4. It provides prospective law students with much more information about schools’ academic performance outcomes and permits comparisons among ABA schools regarding their “mean MBE scores” in relation to their students’ LSAT’s, GPA’s, and diversity [value-added instruction].


          It is important to note that at the Council’s February 2019 Open Session several members addressed whether an accreditation standard based on the MBE would be workable and effective.  For instance, Barry Currier asked Judith Gundersen, NCBE President, whether using the MBE as an accreditation metric was possible.  Gundersen responded that MBE scores are a representative model of total bar examination performance, and clarified that the MBE score is an accurate proxy for students’ performance on the essay portion of bar examinations.  Gregory Murphy then responded that the MBE is a more reliable testing instrument than the various essay alternatives offered on the states’ different bar exams.[21]  The Open Session comments thus provide substantiation for my proposal to use the national mean MBE score as an alternative accreditation metric to demonstrate whether, under Standard 301(a), law schools are offering a “program of legal education that prepares its students, upon graduation, for admission to the bar…” and whether, under Standard 501(b), “[a] law school shall only admit applicants who appear capable of satisfactorily completing its program of legal education and being admitted to the bar.”[22]


          Although I am proposing a standard that requires law schools to have mean MBE scores within 10% of the national mean for two out of the four bar examination administrations since students’ graduation, that standard could need a slight “tweaking” since not all schools’ mean MBE data are publicly available at this time.   On April 25, 2019 I contacted Judith Gundersen, NCBE President, and requested the mean MBE scores for HBCU law schools from February 2014 through July 2018 in order to determine the impact of my proposed “within 10% of the national mean MBE score” standard on those schools.[23]  Regrettably, the NCBE does not maintain a list of specific law schools’ mean MBE scores.[24]  However, there are two simple methods for schools to obtain their MBE data.  First, like California law schools, each school can obtain its MBE mean from the state bar association or bar examination administrator that receives directly from the NCBE a list of those students who took the MBE.[25]  Second, if a state bar association refuses to provide schools with their students’ MBE scores, those schools can submit a list of their students to the NCBE, which can provide that data directly to each law school so it can calculate its mean MBE score for each bar examination.  The administrative cost to law schools of determining mean MBE scores is minimal in relation to other data, such as employment statistics, that the ABA currently requires to be reported.


A)      Examples of the Different Results Between the Council’s Proposed 75% in 2-Year Amendment to Standard 316 and My Proposed National Mean MBE Standard.


          The Council’s recent study of 2015 graduates’ 2-year bar passage rates found that four California law schools during 2016-2017 had bar passage rates below 65% and might be at risk under the proposed 75% in 2-year bar passage standard.[26]  The following analysis of Southwestern’s and California Western’s  California bar exam passage rates in relation to their mean MBE scores from July 2014 to July 2018 demonstrates the benefits and fairness of my national mean MBE standard over only using the Council’s proposed 75% in 2-year bar passage standard.


          Table 1 demonstrates that from February 2017 to July 2018 Southwestern law school did not achieve a 65% passage rate on the California bar examination; however, during those four bar administrations it exceeded the national mean MBE score on 4 out of 4 (100%) of the examinations.  Because Southwestern failed to achieve a 65% passage rate on all of those bar exams, under the Council’s predictive model it may be at risk of failing to meet the Council’s proposed 75% in 2-year bar passage standard.


          Rather than being placed at risk of de-accreditation, schools like Southwestern that enroll a diverse class, provide those students with rigorous academic support, and achieve mean MBE scores well beyond the expectations of their students’ entering LSATs/GPAs should be lauded, not de-accredited.  Southwestern Law School has for more than 100 years provided access to the profession for a very diverse student body.[27]  In 2018 it enrolled 75 Hispanic and 12 Black law students.[28]   Southwestern students’ mean MBE scores are higher than those at other, higher-ranked schools with higher entering-student LSATs/GPAs, even though Southwestern’s student body is substantially more diverse.[29]


          Consider another example, California Western.  Its 2017-2018 bar passage rates and mean MBE scores were:


          As Table 2 demonstrates, from February 2017 to July 2018 California Western achieved a 65% passage rate on the California bar exam on only 1 of those 4 exams.  And on the most recent exam, July 2018, its first-time passage rate fell to 52%.  Yet even though California Western’s passage rates were below 65% on 3 of those 4 tests, its students far exceeded the national mean MBE score on all 4 exams:  Feb. 2017: +4.40; July 2017: +5.00; Feb. 2018: +10.80; July 2018: +3.30.  California Western’s mean MBE score from February 2017 to July 2018 was so high, 142.9, that its students would have been admitted to the bar in every other state except Delaware (145).  It is absurd that schools such as California Western, whose students demonstrate proficiency on the bar examination (MBE) relative to all other students nationwide taking the same national examination, should be at risk of failing an ABA bar passage standard or of being de-accredited.


B)     The Council’s Report on the 75% in 2-Year Standard Does Not Adequately Discuss the Law Schools That Are Close to and Trending Toward Failing That Proposed Bar Passage Standard.

          In its report, SECTION OF LEGAL EDUCATION AND ADMISSIONS TO THE BAR, AMERICAN BAR ASSOCIATION:  REVISIONS TO STANDARD 316:  BAR PASSAGE, the Council failed to sufficiently discuss the number of law schools nationally that are very close to failing the proposed 75% in 2-year bar passage standard and did not discuss schools whose bar passage rates are trending toward failing that proposed standard.  Data on the law schools close to failing the 75% in 2-year standard is critical to the discussion of whether to approve the proposed bar examination passage standard.  In addition, the Council’s analysis did not include the many more ABA schools projected to fail the proposed standard as a result of very low 2018 bar passage rates.


          The Council’s analysis included only these cryptic statements regarding schools with bar passage rates between 60% and 69%:  (1) for 2015, “seven more schools had rates between 60 and 69 %”; (2) for 2016, the Council did not report on the number of schools between 60 and 69%; and (3) for 2017, the Council again failed to report on those schools.  The Council has produced no evidence so far about how many schools currently meet the 75% in 2-year standard but are trending toward failure.  That data is essential to predict how many more schools might be adversely impacted by the proposed rule and how those de-accreditations might affect diversity and access to justice.

          For example, consider two California ABA law schools, Chapman and Pepperdine, that easily met the 75% in 2-year standard in 2015, but which are trending quickly toward a failure to meet that standard in the near future.


  1. Chapman’s Plunging Bar Passage Rates in 2016-2018.


          As demonstrated in Table 3, Chapman’s first-time passage rate on the July 2014 California exam was an excellent 75%; since July 2016, however, it has been trending toward failure of the proposed 75% in 2-year standard.  Its mean July 2016 to July 2018 bar passage rate is only 60.3%, well below the 65% rate the Council treats as predictive of meeting the proposed 75% in 2-year bar standard.



          The discrepancy between Chapman’s 2016-2018 bar passage rates, which all fell below the Council’s predictive 65% success rate, and Chapman students’ performance on the MBE is astonishing.  In July 2016, with a 57% passage rate, the students exceeded the national MBE mean by 6.6 points.  In July 2017 the students had a 64% passage rate but exceeded the MBE national mean by 6.3 points.  And in July 2018 the class with a 60% passage rate exceeded the MBE national mean by 4.9 points.


          Chapman’s trend predicts that its 2018 graduates will have great difficulty meeting the 75% in 2-year standard under the Council’s predictive index.  First, its July 2018 first-time passage rate fell to 60% (15 points lower than in July 2014).  But equally troubling is the trend in Chapman’s “repeat” test takers.  Consider the following comparison of Chapman’s number of repeat test takers and their passage rates in Feb./July 2014 versus Feb./July 2018:


                                        Feb/July 2014 Repeaters        Feb/July 2018 Repeaters
          Number Taking                           81                        116  (+43%)
          Mean Passage Rate                       56.0%                     39.0%  (-17 points)


This dramatic increase in the number of Chapman “repeaters” and their substantially lower passage rates will make it more difficult for Chapman in 2018 to meet the proposed 75% in 2-year standard.
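The repeater figures above mix two different measures: the growth in the number of repeat takers is a relative (percentage) change, while the drop in their passage rate is a percentage-point change. A minimal Python sketch of the two calculations, using the 81/116 taker counts and 56.0%/39.0% rates reported above (note that (116 − 81)/81 works out to roughly a 43% increase):

```python
# Chapman repeat-taker figures reported above (Feb/July 2014 vs. Feb/July 2018).
takers_2014, takers_2018 = 81, 116
pass_rate_2014, pass_rate_2018 = 56.0, 39.0  # mean passage rates, in percent

# Growth in the number of repeaters is a relative (percentage) change.
taker_growth_pct = (takers_2018 - takers_2014) / takers_2014 * 100

# The decline in the passage rate is a percentage-POINT change, not a relative one.
rate_drop_points = pass_rate_2014 - pass_rate_2018

print(f"Repeat takers grew by {taker_growth_pct:.0f}%")       # ~43%
print(f"Passage rate fell by {rate_drop_points:.0f} points")  # 17 points
```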


  2. Pepperdine Law School’s Falling Bar Passage Rates.


          From July 2014 to July 2018 Pepperdine’s first-time California bar exam passage rates steadily declined, from a high of 78% in 2014 to 65% in 2017, and its trending passage rate (July 2017 and July 2018) is only 65.6%, barely meeting the Council’s predictive index for successfully meeting the proposed 75% in 2-year standard.[30]  In addition, Pepperdine’s “repeater” bar passage rate is trending much lower (20% in July 2018 versus 56% in July 2014), making it even more difficult for the school to meet the 75% in 2-year standard in 2018.[31]


          The discrepancy between Pepperdine’s falling first-time and repeater bar passage rates and its students’ mean MBE scores is illustrative of the unfairness of the Council’s proposed 75% in 2-year bar passage standard.  For instance, even though Pepperdine’s first-time passage rate in July 2017 was only 65% (the Council’s minimum predictive score for meeting the proposed standard), its mean MBE score was 145.1 [3.4 points above the national MBE mean].  Similarly, even though Pepperdine’s passage rate was only 66% in July 2018, its MBE mean of 148.1 was 8.6 points higher than the 139.5 national MBE mean.  This data demonstrates that Pepperdine is not only providing its students with an excellent education, but also that its students are performing well enough to pass every other bar exam in the country.


          None of the Council’s reports provides data on how many ABA law schools, like Chapman and Pepperdine, are trending in 2018 toward failure of the proposed standard in the near future.  Before the Council votes, it owes the public a full report on the number of law schools likely to fail the proposed standard soon based on passage rate trends, as well as the impact of the loss of those trending schools on diversity.[32]


2)     Student Consumers’ Limited Access to Law Schools’ Mean MBE Data Distorts Schools’ Student Outcome Measures and Undervalues California Schools’ “Value-Added” Educational Programs.


          It is unclear why some states publish law schools’ mean MBE scores while others keep that information secret.  Although I disclosed California law schools’ mean MBE scores in my Blueprint article, supra, the following discussion publishes the mean MBE scores for a few out-of-state law schools and compares those schools’ mean MBE scores and bar passage rates with those of California ABA law schools.  This data demonstrates that these California law schools are achieving higher mean MBE scores even though their students are much more diverse than those at the comparison schools located in low cut score states.  That result is known as “value-added” education.

          Tables 5 and 6, infra, present for the first time clear evidence that permits us to test whether the Council’s proposed 75% in 2-year bar passage standard adequately balances the gist of ABA Standard 301 (student entering statistics), Standard 503 (student outcome measures), and Standard 206 (diversity in the profession).  This data also challenges the Council’s declared purpose, student consumer protection, for its proposed amendment to Standard 316.  The following comparisons among California ABA law schools and out-of-state ABA law schools demonstrate that the proposed 75% in 2-year standard rewards out-of-state schools with much worse student outcomes on the MBE (demonstrated knowledge of substantive law after graduation).  In addition, the Council’s proposed standard will keep better qualified diversity candidates (at schools with high diversity, high MBE scores, and low bar passage rates due to high bar exam cut scores) from practicing law.


        For example, compare Southwestern with Mercer.  Even though both schools’ entering student LSATs and GPAs are very similar, Southwestern students’ mean MBE score was 1.7 points higher than Mercer’s.  That higher mean MBE score would not be remarkable were it not for the great disparity in the diversity of the two schools’ students.  Mercer’s 2015 entering class included only 23 Hispanic and Black law students combined (18.4% of first-year students).  Southwestern’s entering class, by contrast, included 97 Hispanic and Black students (30.6% of first-year students).  Something exciting is obviously occurring at Southwestern, since its high-diversity students are outperforming their MBE predictive index and outperforming similar out-of-state schools.  Now compare Mercer’s mean MBE score and July 2017 bar passage rate with those of Southwestern’s students:


     School              Mean MBE Score      First-Time Bar Passage
     Mercer              140.2               84.2%
     Southwestern        141.9               57.0%


        Applying the Council’s proposed 75% in 2-year passage standard leaves us with the following result:  Southwestern is at risk of de-accreditation (substantially below a 65% bar passage rate) even though its students outperformed Mercer’s and exceeded the national mean MBE score.  The problem, of course, is that Mercer is judged by a 135 cut score and Southwestern by a 144 cut score.  In essence, this comparison of Mercer and Southwestern makes clear that the Council has not drafted a national accreditation standard, but rather has ceded its authority to the vagaries of state politics and cut scores.  Although the Council is free to make such a decision, it must also take full responsibility for needlessly reducing diversity in law schools that are providing a value-added education to students who meet or exceed the national mean MBE score.
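The mechanical effect of the cut-score difference can be illustrated with a back-of-the-envelope normal approximation. This sketch is purely illustrative: it assumes a roughly normal score distribution with a standard deviation of 15 (a ballpark figure, not taken from this comment) and treats the MBE score alone as determining passage, which real bar exams do not. Even so, it shows how the identical student performance passes at very different rates under a 135 versus a 144 cut score:

```python
import math

def normal_cdf(x: float, mean: float, sd: float) -> float:
    """Normal CDF via the error function (no external libraries needed)."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

def approx_pass_rate(school_mean_mbe: float, cut_score: float, sd: float = 15.0) -> float:
    """Share of a school's takers at or above the cut score, assuming
    normally distributed scores. Illustrative only: actual bar results
    also depend on the written portion and scaling."""
    return 1.0 - normal_cdf(cut_score, school_mean_mbe, sd)

# Identical student performance (mean MBE 141.9, Southwestern's figure above),
# judged against Georgia's 135 cut score versus California's 144 cut score.
same_students = 141.9
print(f"135 cut score: {approx_pass_rate(same_students, 135):.0%} pass")
print(f"144 cut score: {approx_pass_rate(same_students, 144):.0%} pass")
```

The same cohort of students "passes" at a far lower rate solely because of the state's cut-score choice, which is the comparison the Mercer/Southwestern figures make empirically.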


        In addition, when Mercer’s statistics are compared with those of Golden Gate and USF, the result of applying the proposed 75% in 2-year standard is equally troubling.  Although Golden Gate’s LSAT/GPA metrics are far below Mercer’s, Golden Gate’s mean MBE of 139.2 was only 1 point lower than Mercer’s; yet because of the difference in cut scores, Mercer had an 82.4% bar passage rate compared to Golden Gate’s 51.0% rate.  That 51.0% passage rate is predicted to fail the Council’s proposed standard.  The comparison between Mercer and USF is similar.  Even though USF had a mean MBE score only 0.7 points lower than Mercer’s, its bar passage rate was only 54.0% compared to Mercer’s 82.4%.  Although USF provided its students with an equivalent education, it is at risk of de-accreditation.  Further, USF students performed similarly to Mercer’s on the bar even though USF had a 29.9% Hispanic and Black diversity rate compared to Mercer’s 18.4%.


        Table 6 demonstrates that the Council’s proposed 75% in 2-year bar passage standard substantially undervalues Standard 206 (diversity), applies a very narrow and constricted definition of Standard 503 (outcome measures), and gives virtually no weight to schools’ “value-added” educational outcomes (MBE scores much greater than students’ LSAT/GPA predictive index).[33]




        The “mean MBE score/cut score relationship” on the July 2018 bar examination further demonstrates the unfairness of the Council’s proposed 75% in 2-year standard.  Mercer students and Southwestern students who took that bar exam had almost identical LSATs and GPAs according to the ABA 509 data [slightly higher GPAs at Mercer and slightly higher LSATs at Southwestern].  The differences in diversity continued [Mercer 18.4% Hispanic and Black students versus Southwestern 30.3%].  Even so, Southwestern students substantially outperformed Mercer students on the MBE [Southwestern’s mean MBE 140.3 versus Mercer’s 137.1].  Yet again, Mercer’s students, who performed substantially worse than Southwestern’s, earned a 73.3% bar passage rate versus Southwestern’s 53.0%.  This data predicts that Southwestern may be de-accredited under the 75% in 2-year standard, while Mercer will be saved based solely on its state’s political decision to adopt a much easier cut score.


        But more startling is the comparison between Connecticut students’ “mean MBE/bar passage percentage” and Southwestern students’.[34]  Connecticut students have substantially higher LSATs (158/156/154) and GPAs (3.62/3.31/3.08) than Southwestern students (LSATs 155/152/149; GPAs 3.46/3.18/3.00).  One would therefore predict that Connecticut’s mean MBE score would be much higher than Southwestern’s.  However, on the July 2018 bar exam Southwestern students outscored Connecticut students by 3.1 points (140.9 vs. 137.8) on the MBE.  What is more remarkable is that Connecticut’s entering class included only 27 Hispanic and Black students compared to Southwestern’s 84.  How does the Council’s proposed 75% in 2-year standard reward Southwestern for substantially exceeding its predicted mean MBE score and for performing significantly better than another out-of-state ABA law school?  It predicts, based on Southwestern’s 53.0% passage rate, that Southwestern will be de-accredited unless it significantly increases its bar passage rates.  As this example demonstrates, the 75% in 2-year standard fails to adequately inform prospective student consumers of the real “value-added” legal education at schools like Southwestern, and provides an over-inflated metric of Connecticut Law School students’ outcome measure (bar passage rate versus actual MBE performance relative to the national mean).  My proposal to add mean MBE performance to the 509 disclosures helps cure this disparity and will provide prospective students with much more comparative data on the quality of schools’ educational outcomes.
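The proposed 509 disclosure reduces to a single comparative metric: a school's mean MBE minus the national mean. A minimal sketch of that computation, using the July 2018 figures quoted above (the function name and structure are illustrative, not part of the proposal itself):

```python
# July 2018 figures quoted in the text; the national mean MBE was 139.5.
NATIONAL_MEAN_MBE = 139.5

schools = {
    "Southwestern": 140.9,
    "Connecticut": 137.8,
}

def mbe_differential(school_mean: float, national_mean: float = NATIONAL_MEAN_MBE) -> float:
    """Proposed disclosure metric: school mean MBE minus national mean
    (positive = outperforming the national average)."""
    return round(school_mean - national_mean, 1)

for school, mean in schools.items():
    print(f"{school}: {mbe_differential(mean):+.1f} vs. national mean")
```

Unlike raw passage rates, this differential is directly comparable across states because the MBE is the same scaled national exam everywhere.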


3)     Even Though California Law Schools Consistently Outperform Out-of-State Law Schools on the MBE, They Are Penalized by ABA Standards Based on California’s Politically Set High 144 Cut Score.


        Professor Derek Muller has charted California law schools’ mean MBE scores against the combined national mean MBE scores for all other law schools, demonstrating that California schools consistently perform better on that exam[35]:



        There is something terribly wrong with an ABA accreditation standard that would de-accredit a law school that routinely meets or substantially exceeds the national mean MBE score on the bar examination, but condones schools with lower mean MBE scores simply because they are situated in a state with an easy bar exam cut score.  No other U.S. national accreditation agency, including those in medicine and nursing, has ever de-accredited an educational institution that routinely met or exceeded the national mean on its accreditation examination.  The ABA should not be the one outlier accreditation agency in this country to de-accredit schools whose students are performing at the national mean, especially since such a standard will substantially decrease diversity in the profession.


        The following scatter charts visually demonstrate the dramatic effects of state bar exam cut scores on ABA accreditation.  They show how California’s 144 cut score penalizes California law schools that have much higher diversity rates and higher mean MBE scores than out-of-state law schools with much lower diversity and lower mean MBE outcome scores under the Council’s proposed 75% in 2-year standard:






        California is currently suffering a crisis in the diversity of its bar, not only because its attorney/minority ratio is one of the lowest in the United States[36], but also because several of its current ABA approved law schools may soon close or lose ABA accreditation:  (1) Whittier; (2) Thomas Jefferson; and (3) Western State.  First, because of Standard 316, Whittier was not able to enroll a sufficient number of qualified students to sustain its program.  The closure of Whittier will result in the annual loss of between 60 to 70 Hispanic students and 10-15 Black students in California law schools.[37]  Thomas Jefferson is currently on ABA probation and is suffering a fate similar to Whittier’s.  It has had to sell its law building, and its 2018 entering class is only 57 students (26 Hispanic and Black students combined) compared to its 2014 class of 357 students (98 combined Hispanic and Black students).[38]  Finally, in January 2019 Western State was placed into receivership because its parent, Dream Center Education Holdings, is apparently not economically viable.  As a result, Western State has already started to shed some of its administrative staff.[39]  If Western State closes, its usual minority admissions spots (approximately 60-70 Hispanic and 9-15 Black students) will also be lost in California.[40]


        Because of the new diversity crisis in California ABA law school admissions, and because of the disparate impact of California’s 144 cut score on California schools’ ability to achieve high bar passage rates, it is the worst of times for the Council to increase the difficulty of meeting Standard 316 by adopting the proposed 75% passage rate within 2 years of students’ graduation.  However, adding my proposed “mean MBE score” alternative to Standard 316 will mitigate some of the disparate racial impact of the current or amended Standard 316.


4)     Council Members’ Arguments at the Open Session Based on a Medical School Accreditation Analogy Are Misplaced for Several Reasons.


        At the February 2019 Council Open Session, several Council members drew an analogy to medical school accreditation.  One argument characterized law school accreditation and medical school accreditation as similar processes within the larger field of “regulated industries”.  That reference was used to argue that the ABA simply has no control over the passing cut scores state supreme courts set, and that states with outlier cut scores, like California, therefore should not form the basis for rejecting the proposed 75% in 2-year bar passage standard.  That regulated industries theory will lead to an extreme reduction of diversity in the bar, and its application to the comparison between medical school accreditation and law school accreditation is misplaced.


    A.     The Regulated Industries Analogy to Medical School Accreditation Is Seriously Flawed.


        The regulated industries model analogy to medical school accreditation is seriously flawed for the following reasons:


  1. Unlike ABA Standards, Medical Accreditation Standards Do Not Include A Specific Minimally Acceptable Medical Board Passing Rate.


        The Liaison Committee on Medical Education (LCME) accredits medical education programs in the United States.[41]  Unlike the ABA bar passage standard, which prescribes a specific acceptable passage rate, the medical accreditation standards contain no specific metric for students’ medical board examination scores.  The LCME accreditation standard for student board scores merely states that a medical school will be compared with “national norms”:


8.4           Program Evaluation:  A medical school collects and uses a variety of outcome data, including national norms of accomplishment, to demonstrate the extent to which medical students are achieving medical education program objectives and to enhance medical education program quality.


        In actual implementation, the LCME considers how close a medical school’s students score to the national mean on the medical boards [similar to my proposal to compare law schools’ mean MBE scores to the national MBE mean].  For instance, when the LCME threatened to place the University of Louisville School of Medicine on probation for reasons other than low student board scores, the LCME noted that the school’s USMLE Step 1 and Step 2 student board examination scores were “above” or near “the national means” and were therefore acceptable.[42]


  2. Medical Schools Are Not De-Accredited Based on Students’ Board Exam Scores Because, Unlike Bar Exams, Passage Rates on the Boards Are So High.


        The reason no medical school in the United States or Canada has ever been sanctioned based solely on low student board test scores is that the mean passage rate on the medical boards is so high.  For example, the first-time passage rate on the Step 1 medical board exam (taken after two years of medical school) for students attending U.S./Canadian medical schools was 96% in both 2016 and 2017.[43]  And the passage rates on the next set of medical board exams, the Step 2 CK, were 97% in 2016 and 96% in 2017.[44]


        The use of the medical school accreditation model to shape law school accreditation standards is misplaced because:  (1) medical school accreditation does not impose a specific minimum exam passage rate, but merely compares schools’ scores with the national mean board score; (2) mean medical board passage rates are so high that no medical school is likely to have a significant percentage of its students failing the exam; and (3) by using a national mean score rather than a specific passing percentage, racial disparity among states based on the medical board examination is less significant in medical school accreditation.  Therefore, the Council should reject the analogy to the medical school accreditation model as a justification for adopting the proposed 75% in 2-year standard.  The Council should also reject the argument that the ABA need not consider states’ outlier high bar examination cut scores in balancing consumer protection, law school accountability, diversity in the profession, and access to justice when promulgating law school standards.


    B.     A Regulated Industry Model That Disregards Outlier States’ Cut Scores Will Have a Devastating Effect on Diversity.


        In Table 4 [see Appendix], I demonstrate how a constricted regulated industries model that does not account for state-specific cut scores will have a serious and unconscionable effect on Hispanic law student enrollment in California.  California ABA Hispanic matriculants represent 17% of all Hispanic students admitted to ABA law schools in 2018.  Under the regulated industries theory, even though a significant percentage of all Hispanic law students are enrolled in California, a national accreditor like the ABA need not draft a bar passage standard that considers the disparate racial impact of its standard on those students because California has an outlier bar exam cut score [144].  I call on the Council to reject the regulated industries model as defined in the Open Session and instead to continue using a model that reduces, as far as possible, the racial disparities in the ABA accreditation process created by state-specific cut score decisions.


        Because adoption of the proposed 75% in 2-year standard will dramatically reduce diversity in the bar, especially in California, the Council should reject the medical model and continue studying other accreditation methods, such as my proposed national mean MBE criterion.  In addition, before the Council votes on the proposed amendments to Standard 316, it should publish for public comment a complete analysis demonstrating not only which law schools would currently be predicted to fail the 75% in 2-year standard, but also which schools in 2018 are trending toward failure of that bar standard in the near future.  Without a complete report on the probable effects of the new rule on all ABA approved law schools in 2018 and beyond, the Council cannot possibly balance all three of the ABA’s goals:  (1) student consumer protection; (2) diversity; and (3) access to justice.  Even if the Council, as a private accreditation association,[45] arguably has the legal ability to promulgate standards with known serious racial disparities and effects, that power does not mean that the Council should promulgate such a standard.




Respectfully submitted,


William Wesley Patton
Lecturer in Law, USC Gould School of Law; Assistant Clinical Volunteer Professor, UCLA David Geffen School of Medicine, Department of Psychiatry






[1]   The ABA’s official report that was updated on April 12, 2018 lists 19 failing schools.  However, the correct number of failing schools could now be 16.  The Council’s report, SECTION OF LEGAL EDUCATION AND ADMISSIONS TO THE BAR, AMERICAN BAR ASSOCIATION, Revisions to Standard 316:  Bar Passage, at 6, states that schools with first-time bar passage rates below 65% will have difficulty meeting the 75% in 2-year standard.  The media have reported that three schools have corrected their 2-year bar passage rates.

[2]   Appalachian 69.8%; Arizona Summit 50.0%; New England 71.1%; Pontifical 54.5%; Inter American 57.2%; U.D.C. 71.4%; Atlanta John Marshall 59.8%; Valparaiso 66.0%; Florida Coastal 64.5%; Golden Gate 74.0%; South Dakota 67.9%; Howard 54.6%; Barry 71.1%; Whittier 57.3%; Ave Maria 73.3%; Thomas Jefferson 63.8%; Dayton 69.2%; Puerto Rico 69.9%; La Verne 74.5%; USF 67.4%; McGeorge 73.5%; Elon 66.3%; Vermont 74.3%; and, Western Michigan 69.0%.

[3]  The Council’s report, SECTION OF LEGAL EDUCATION AND ADMISSION TO THE BAR, AMERICAN BAR ASSOCIATION, Revisions to Standard 316:  Bar Passage, at 6, states that schools with first-time bar passage rates below 65% will have difficulty meeting the 75% in 2-year standard.  The media have reported that three schools have corrected their 2-year bar passage rates.  

[4]   (

[5]   (

[6]   Unfortunately, there is a discrepancy between two of the ABA’s reports on schools’ 2018 first-time bar passage rates.  One report, First Time Bar Passage Calendar Year 2018 (Last Updated April 24th 2019), includes more than 49 schools with rates below 65%.  However, another report, First Time Bar Passage by School by Jurisdiction 2018 (Last Updated April 22nd 2019), includes only 49 schools with cumulative 2018 passage rates for all jurisdictions.  For this study I have selected the more conservative report that 49 schools had 2018 bar passage rates below 65% in order to avoid overstating the effects of the proposed 75% in 2-year standard on diversity.

[7]   See ABA Section on Legal Education and Admission to the Bar, First-Time Bar Passage by Jurisdiction 2018.

[8]   See ABA 509 data (

[9]   These passage rates are based on the cumulative number of students taking the February and July 2014 exams versus the number passing those exams, based on information supplied by the California State Bar.  (

[10]   Since La Verne had fewer than 11 repeat takers for the February 2018 exam the California State Bar did not publish their statistics.

[11]   Mark A. Albanese, The Testing Column:  July 2018 MBE: The Storm Surge, Again, The Bar Examiner, at 30 (Fall 2018).

[12]   “[T]he descent of performances on the bar examination that began with the July 2014 results has created an accumulating group of repeaters.”  Erica Moeser, President’s Page, The Bar Examiner, June 2017, at 5.  Repeaters comprised 67% of all examinees on the February 2018 MBE.  Mark A. Albanese, The Testing Column:  February 2018:  The MBE Storm Surge Continues, The Bar Examiner, Summer 2018, at 28.

[13]   Id., at 27.

[14]   See California February 2016-2018 bar examination statistics at ( 

[15]   Letter from Kyle McEntee to Council (…).

[16]   Id., at 4.

[17]   Obviously, since we do not yet have the repeater passage rates for July 2018 graduates, my predictions are based on the last several years of repeater bar passage rates for these 9 CA ABA schools.  Further, since the NCBE is predicting that repeater rates will continue to fall for the next few years, it is unlikely that the CA ABA law schools will be able to dramatically increase their repeater rates for the February 2019, July 2019, and February 2020 California bar exams.

[18]  Supra., notes 6-7.

[19]   I am well aware of the debate over why Hispanic and Black students often score lower than White students on the MBE.  Although the NCBE has published several articles defending the MBE against claims of racial bias, others continue to argue that the MBE is structurally biased.  See, e.g., Christina Shu Jien Chong, Battling Biases:  How Can Diverse Students Overcome Test Bias on the Multistate Bar Examination, 18 University of Maryland L. J. of Race, Religion, Gender and Class 31 (2018); Susan M. Case, Persistence on the Bar Exam, The Bar Examiner 20 (Dec. 2012); Susan M. Case, Urban Legends About the Bar Exam, The Bar Examiner 44 (May 2008).  However, in the context of the proposed 75% passage rate within 2 years of Standard 316, even if the MBE is racially biased, its use is warranted at this time as an alternative accreditation standard because it mitigates the near impossibility of law schools’ maintaining and/or increasing diversity in states with extremely high bar exam cut scores.

[20]   Since I am proposing a dual standard [bar passage and/or mean MBE score], even if a state does not use the MBE on its bar exam, the bar passage rate alternative is still operative in this “dual standard”.

[21]   The NCBE has consistently stated that “The MBE total score is highly reliable…[however] scores from written tests have much lower reliability.”  Susan M. Case, The Testing Column:  Best Practices With Weighting Examination Components, The Bar Examiner, February 2008, at 43-44.  In fact, the MBE is used to “scale” states’ essay questions so that the comparable difficulty of essay questions on different bar exams remains similar.  The essays are scaled to the MBE because it is a more reliable and stable testing instrument:  “This process allows the written portion of the exam, which changes with every administration but cannot be equated like the MBE, to capitalize on the stability of the MBE, thereby accounting for any differences in question difficulty or grader stringency.”  Judith A. Gundersen, President’s Page, The Bar Examiner, Winter 2017-2018, at 5.

[22]   As my law review article, A Blueprint…, supra, page 2, demonstrates, California law schools with bar passage rates that would fail the proposed 75% in 2-year rule are graduating students whose mean MBE scores would admit them into almost every other bar in the United States.

[23]   April 25, 2019 email from William Wesley Patton to regarding “ABA Council and Proposed Bar Exam Standard”.

[24]   Id. 

[25]   April 26, 2019 Email response from Christina Doell, California State Bar Association, to William Wesley Patton:  “NCBE provides the State Bar with the MBE score of every applicant that took the California Bar Examination. The State Bar attaches this score to each applicant’s exam record. These scores are later used to compile the [schools’ MBE mean] statistics.” 

[26]   SECTION OF LEGAL EDUCATION AND ADMISSIONS TO THE BAR, AMERICAN BAR ASSOCIATION, REVISIONS TO STANDARD 316:  BAR PASSAGE, at 6.  I have demonstrated that the Council's prediction of the number of schools that will fail the 75% in 2-year standard is an underestimate.

[27]

[28]

[29]   For a comparison between Southwestern entering students' LSAT/GPAs and median MBE score success and those of other higher-rated out-of-state law schools, see Patton, A Blueprint…, supra, 24 Wash. & Lee J. of Civ. Rts. & Soc. Justice 3, at 43-45 (2017).  Hispanic and Black graduates consistently underperform White candidates on the California Bar Exam.  For instance, here are the July 2018 CA bar exam first-time passage rates:  (1) Whites: 69.5%; (2) Hispanic: 56.3%; and (3) Black: 45.1%.  It is remarkable that Southwestern, with a combined Hispanic and Black student population of 30%, was able to meet or exceed the national mean MBE score consistently for the last decade.


[31]   Id.

[32]   Santa Clara is another California ABA law school that is "trending" toward future failure of the 75% in 2-year standard.  From February 2014 through July 2018, Santa Clara had a mean bar passage rate of 66.4%.  But on the most recent exam, July 2018, Santa Clara had only a 58% first-time passage rate and a 20% "repeater" passage rate.  However, even on the July 2018 exam, its lowest passage rate in a decade, its students still scored a mean MBE of 144.6, 5.1 points higher than the national mean MBE (139.5).  In 2018, Santa Clara's entering first-year class had 61 Hispanic and 7 Black students.

[33]   This mean MBE data and these passage rates were obtained from [citations omitted].


[34]  As Table 6 also demonstrates, even though entering students' LSATs and GPAs are almost identical at Southwestern and Quinnipiac, Southwestern outscored Quinnipiac on the MBE by 3.4 points (140.9 vs. 137.5) even though Southwestern had more than four times as many diversity students.  However, based on cut scores, Quinnipiac had an 82.0% bar passage rate vs. a 53.0% rate for Southwestern, thus placing Southwestern within the at-risk range under the Council's proposed 75% in 2-year standard.

[35]   Derek Muller, California bar exam takers are far more able than others nationwide but fail at much higher rates, Excess of Democracy, Nov. 21, 2015.

[36]   For instance, Hispanics comprise approximately 38% of California’s population, but there are fewer than 8,000 Hispanic licensed attorneys practicing in California.  Patton, supra., at 6.

[37]   See 509 reports (file:///C:/Users/bpatt/Downloads/Std509InfoReport%20(2).pdf).

[38]   Id.

[39]   Staci Zaretsky, Law Students Unable to Pay Rent Without Student Loans After School Goes Into Receivership, Above the Law, Feb. 8, 2019.  A March 4, 2019 letter from Barry Currier to Western State students informed them that the ABA has only two remedies available: (1) a review process that could result in the school's loss of accreditation, or (2) requiring the school to file a "teach-out" plan for orderly school closure.

[40]

[41]   Hunt et al., The Variables That Lead to Severe Action Decisions by the Liaison Committee on Medical Education, 91 Academic Medicine 87, 87 (2017).

[42]   MEMORANDUM:  Liaison Committee on Medical Education, Secretary of the ad hoc Survey Team That Visited University of Louisville School of Medicine on April 14-17, 2013, at 21 and 49; University of Louisville School of Medicine:  Request for LCME Reconsideration of Accreditation Findings and Actions, January 28, 2014 (arguing that the school's board scores are "at or above the national pass rate….").

[43]   2017 PERFORMANCE DATA:  STEP 1.

[44]   2017 PERFORMANCE DATA:  STEP 2 CK.  The passage rates for the two remaining medical boards, the Step 2 CS and the Step 3, are also between 96% and 98%.  Id.

[45]   Professional Massage Training Center, Inc. v. Accreditation…, 781 F.3d 161, 169 (2015).

Bylaws and business meetings: a 1L experiential module

Ruth Anne Robbins, Distinguished Clinical Professor of Law, Rutgers Law School

The first year of law school rightly has been criticized for overly prioritizing the litigation model and making it the central focus of our teaching. This emphasis lulls students into believing that the judicial audience is the primary consumer of legal communications. To counteract that skewing, those of us teaching in the 1L curriculum are often exhorted to find ways to discuss transactional forms of legal writing. But contract drafting is not easily built into a curriculum already bursting at the seams with the must-haves that we cram into the lower-credited experiential classes of the 1L year.

Enter the idea of dedicating part of two or three classes to small-organization bylaws and business meetings. The bylaws of a small organization function as its constitution, so this type of teaching module fits in nicely with what students are learning in other introductory courses. And while some students may know a little bit about bylaws and business meetings from previous experiences in college, religious groups, or other volunteer activities, most students probably won't have a great deal of knowledge. Learning about these ideas will appeal to them because of the immediate applicability to the very student-run organizations in which, as rising 2Ls, they are poised to assume leadership positions.

I begin by asking those students with a little bit of knowledge to help me outline, on the board, the setup and order of a business meeting. Typically, at least one or two students in a group of 20 will be able to walk others through it with a little bit of prompting. We talk about why a roll call must happen right after the call to order and opening ceremonies. Ask your own students how many of them know something about quorum—you may be startled to learn how few students do. Teaching them what quorum is and how it relates to business-agenda items engages the students and almost immediately makes them realize just how practical this module is.

Discussions about business meetings naturally lead to a conversation about the rudiments of Robert's Rules of Order and how voting happens on an agenda item.[1] I have sometimes run a class or two in a business-meeting format, inviting students to make formal motions about some of the softer deadlines in the course. As part of that, students must calculate quorum to hold class at all. I always ask them the lowest number of votes it would take to carry a vote, assuming we had exactly quorum present. Students wake up to the fact that in a class of 20 students, 6 students might be able to bind the other 14. (That is: quorum for a group of 20 students is 11, and if only 11 are present, a simple majority to carry a vote is 6.) "It's important to show up and have your vote counted," I have remarked. The message isn't lost on them.

Students also have the opportunity to step into role for actual representation work. A few years ago, knowing about this module, our Women's Law Caucus president approached me and asked if the 1Ls in my class might provide some advice about issues her executive board had identified in their bylaws. Naturally, I immediately agreed. To prepare students for their client, they first looked at a larger set of bylaws I had worked on for a local high school boosters organization. I changed a few items to take the bylaws out of compliance with the New Jersey statutes governing non-profit organizations (a relatively easy statutory scheme). Fifteen questions later, they knew enough to issue-spot in the much simpler student-organization bylaws. Then, in small groups, they looked at the Women's Law Caucus bylaws and a week later offered their recommendations to the officers, who adopted almost all of the advice.

This was such a feel-good moment for all involved that I have made it an annual module. Depending on the year, I have had students conclude with a client letter written by the small groups together, or I have simplified it even further and simply had the 1L students meet with the organization's officer in class to offer their verbal recommendations (I act as scribe for the officer in those circumstances). Each year I walk away impressed with the speed of absorption my 1L students have for this material. They take the representation seriously, and I think that they also enjoy it. I am likewise impressed with the 2L and 3L students' willingness to serve as the client for my 1Ls even though it will net them extra work down the road as they work through the bylaws-amending process. I think they also feel that they learn valuable lessons by being the client. Having just completed this year's project, I already have received a request from an organization's new president to have my next year's 1L students put her organization's bylaws under their microscope.

This assignment is win-win for all involved. It is low-stakes for the 1L students, but it engages them in professional identity development, statutory analysis, problem-solving, and client-counseling skills. The module provides a pragmatic experience—who among us hasn’t been part of a business meeting or bylaws consultation?—and it offers a different perspective on legal practice. To put it simply: it’s relatively easy, it’s fun, and it’s real-world. I highly recommend it to others.

[1] The essentials of Robert's Rules can be found online, although the 11th edition must still be purchased.


I’ve been a devotee of Parker Palmer ever since I read The Courage to Teach.[1]  I often think of his statement: “We Teach Who We Are.”[2]  In January, David Brooks, Op-Ed columnist for the New York Times, expressed a similar sentiment.  Brooks’ piece, entitled Students Learn from People They Love,[3] told of how a class he taught at Yale softened around him after he had to cancel office hours a few years ago, having shared with his students that he “was dealing with some personal issues and a friend was coming up to help me sort through them.”  Recognizing something that many of us have long known, Brooks drew the connection between emotional relationships and learning.  Thus his Palmer-like statement: “[W]hat teachers really teach is themselves—their contagious passion for their subjects and students.”[4]

But we teachers are much more than our passion for our subjects and our students. We are human beings who bring into our teaching the accumulation of all the innate and environmental influences and experiences of our entire lives.  While these influences implicitly affect how we teach, each of us strikes a balance of how much of the personal we explicitly bring into our interactions with students, both in the classroom and outside.  My tendency is to share a great deal about myself, to share my personal stories.

I am an advocate of holistic lawyering, of the essentiality of understanding that a client comes into a lawyer’s office with a host of needs, only some of which are legal.  The lawyers I regard as most effective, the ones I most admire, are those who recognize their clients’ multiplicity of extra-legal challenges and, where appropriate, address them, if only by suggesting or referring to other professionals.

In writing this blog entry, I came to realize that I might appropriately call myself a holistic teacher. I’m not only committed to teaching my students the knowledge, skills, and values of the profession they are studying to enter.  I care about how they will bring their entire beings into their careers: all of their signature strengths as well as their challenges.  If they are struggling in any part of their lives, it will likely bleed into their performance as students and, ultimately, left unaddressed, into their careers as lawyers.

I teach a general civil externship seminar.  I have the luxury of inviting my students to focus on key aspects of successful lawyering generally taught, if at all, only in clinics and externships.  These include, above all, the people skills so essential to effective lawyering:  communication; cultural competence; emotional intelligence; self-care; and finding realistic and healthy balance among work, family, friends, and self.

A joke I tell about myself, and often share with students in distress, is that if a student comes into my office complaining of a hangnail, I will recommend talk therapy, because it has been so helpful in my own life.  I have long been open about my history of episodic clinical depression, and have shared it with students and others struggling with their own mental demons.[5]  Sometimes, however, I wonder if I risk crossing the line between teacher and therapist.

Here’s a recent example.  I have been working on a pro bono case with a student now in her last semester.  We’ll call her Susan (not her real name).  Several times Susan had promised to get me a draft of a letter to a Congressperson for my review by a certain date and had not done so.  I had told her that I understood she had a lot else on her plate, and just to send me an email if she wasn’t able to get it to me by the date she had promised.  A week or so ago, she assured me that wouldn’t be necessary; she would absolutely get it to me by the end of the following weekend.  That didn’t happen.  Last week Susan and I were talking after a lunch hour program about her upcoming interview for post-bar exam employment at a firm with which she was currently externing.  It was an encouraging and upbeat conversation.  At the end, I gently mentioned that I had received neither the draft letter nor an email explaining she was unable to get to it.  She confessed that she had begun to draft the email, but was just too anxious to finish it.  Apparently, this was not an isolated instance; she has long been plagued by anxiety.  A deeper conversation ensued about the importance of being in communication for the career she was entering.  I mentioned something I often say to students: that they wouldn’t have gotten as far as they had if they didn’t have an awful lot going for them, but that if something wasn’t working and they were unable to fix it on their own, there was no shame in seeking professional help.  Susan shared that she knew this, and also that she had stopped going to therapy several years ago when her therapist had suggested she might benefit from anti-anxiety medication.  Susan was and is adamantly opposed to psychiatric medication.  I stressed the importance of not burying her head in the sand: the choice of whether to take medication would be her own, but not doing anything to solve a seemingly intractable problem was not a rational choice.
I reminded her of my own mental health history.  She later wrote, thanking me for the advice.

I write all of this with admitted ambivalence.  I even question whether it belongs in the “Best Practices” Blog.  For I struggle with my tendency to so readily recommend therapy to my students.

There are times when, crafting my journal prompts, I have to remind myself to relate them to my students’ lives as externs and to the lawyers they are becoming.  An example, from our exploration of Emotional Intelligence:

  1. Reflect on how well you think you know yourself, your MO (modus operandi). For example, are you aware of what your immediate response is to an upsetting or difficult situation?  Are there automatic responses you have that you would like to change?  Specifically, do any of these responses tend to create problems for you professionally, to keep you from being the law student or lawyer you most want to be? If so, what steps can you take to change them?

and another based on the chapter I contributed to Learning from Practice,[6] on Work and Well-being:

  1. Considering chapter 25 and our class discussion on Monday, reflect on what, if any, habits or practices you have developed in law school that detract from your sense of well-being. What, if any, habits or practices contribute positively to your sense of well-being?  What if anything might you change to improve your well-being, now and going forward?

Is it good practice to probe so personally into my students’ inner lives?  Is it sufficient that I offer them the option of writing about something less personal?  Although I may have doubts, I find that these prompts often elicit some of the most thoughtful reflections my students write.  Self-awareness, like awareness of others’ emotional contexts, is critically important to being an effective lawyer.  Where in the curriculum can we explore this if not in experiential courses and, specifically, in journals?  Here’s a recent example from one of my students’ journals, responding to this prompt:

One habit developed during law school that detracts from my well-being is that I have stopped going to the gym and eating healthy. I was always very into fitness and living a healthy lifestyle….  The time constraints of law school and working fulltime have forced me to essentially eliminate this from my life. . . .  My physical health and body image definitely play an important role in my mental health.  In the future, I think it will be important for me to carve out time to keep this part of my life….

I value the importance of introducing my students to positive psychology[7] and mindfulness practices, both empirically demonstrated to provide a plethora of benefits.[8]  For more than ten years, I have had a regular morning meditation practice which has been hugely beneficial in my own life and work.  For many years I have introduced my students to mindfulness meditation in the first class of the semester.  I usually show a wonderfully accessible twelve-minute video of Anderson Cooper’s introduction to meditation at a weekend silent retreat with Jon Kabat-Zinn.  For even longer than that, I have begun each of my classes with two minutes of what I have come to call “settling in,” accompanied by an introduction, and invitation, to mindful breathing.  Some students find it to be an invaluable tool for settling their minds and reducing their anxiety, in and out of class.  Many others are agreeable to practicing it in class, but not inspired to try it elsewhere.  Still others find it to be a hippy-dippy waste of time.  I know it alienates some students, but that’s a cost I deem worth it for the possibly life-long benefits it provides for others.

I consciously model vulnerability, fallibility, and taking responsibility for messing up.  I admit my MO—being a scold.  I am naturally impatient with students who haven’t lived up to their responsibilities, who haven’t exhibited the professionalism becoming a lawyer requires.  I work hard at not acting out of my “default position.”  Too often, I fail.  Even if I say nothing, it shows on my non-poker face.  To put it mildly, this does not improve the climate in the classroom.   Here’s an example from this semester of failing and recovering:

I have the smallest seminar I have ever had: only five students.  We meet on Mondays, late afternoon.  On a Tuesday that was a “legislative” Monday[10] following Presidents’ Day weekend, only three students showed up.  The absent students hadn’t notified me. The following Monday, three students showed up.  One of the absent students had let me know that she had a stomach flu; I heard nothing from the other.

The assignment for that particular class was to prepare for partnered simulations based on the ethical dilemma hypotheticals in chapters 10 and 11 of Learning from Practice.[11]  The instructions I sent with the assignment right after the prior class, in bolded text, instructed the students to do two things:  1) coordinate with their simulation partner in advance and 2) notify me if they weren’t going to be in class so that I could make alternative assignments.  Of the three students who showed up, only one had read all of the assigned pages, and none had communicated with their partners about the simulations.

I did my best not to blow up but I was practically ready to end class then and there.  Instead, I took a deep breath, gave them a few minutes to read the hypotheticals, and left the classroom for a few minutes to cool down.  When I returned, we discussed the scenarios, rather than acting them out.  It was the best I could think to do at the time, and the discussion was sufficient to get us through the remainder of class.  It probably goes without saying that it was not a great class.

My true recovery actually occurred the following day when, having sufficient distance in time and place, I drafted, edited and emailed the class a missive I titled “IMPORTANT.”  After reciting the concerning events of the previous two classes, I added:

I appreciate that you are all juggling multiple responsibilities and substantial workloads.  This is training for careers as lawyers.  You are in the process of developing your professional identities.  I am committed to supporting you in that process.

Your professional training to become lawyers requires you to be accountable and in communication.  If you need to miss a class for which you have been assigned a particular role or task, you should inform your professor and any affected classmates in advance, or, if not possible, as soon as you can.

Law school generally, and the externship and clinical programs in particular, serve as a laboratory for developing the professionalism habits you will need for your future careers.  Towards that end, I … have attached … a Professionalism rubric[12]… .  I ask all of you, as you prepare your mid-semester self-evaluation to rate yourself on this rubric and see where you need and want to improve.  I will ask you to do the same at the end of the semester.  It’s up to you whether you want to share your rubric with me.

We are a very small group.  That has advantages and disadvantages, the latter having been evident for the past two classes.  We all need to work hard to live up to our obligations in order to maximize this learning experience for you.

I am posting the above on the Discussion forum and invite replies.  Or contact me privately and/or anonymously.

I’m committed to your success and know that you are, too.

No one took me up on the offer to post on the Discussion forum, nor to contact me otherwise.  Nonetheless, at the following class, all five students were present, thoroughly prepared for the simulations, and completely engaged.  It was a terrific class thanks to the work they put into it.

* * *

I have never been a trial lawyer or practiced law in a private firm.  I don’t have much in the way of war stories relevant to my externs’ placement experiences.  But I do have stories gleaned from seven decades of lived experience.  I have wisdom gained from pursuing my three major life passions:  One, to write and speak about more healing, relational and non-adversarial methods of achieving justice, resolving conflict, and ordering legal affairs.  Two, to decrease the shame and stigma around mental illness, having suffered six episodes of major clinical depression over the past forty-four years.  And three, to help my students envision and strive for careers that will make them excited and happy to get out of bed every morning.  I have been blessed with a career that has enabled me to pursue all three.

When I look back on my more than 36 years of teaching, I see that I have lived them holistically.  My work has been almost seamlessly integrated into the rest of my life, not separate and apart.  In both I have experienced the full gamut of emotions: joy, sadness, frustration, contentment—but never boredom.

We teach who we are.



[1] Parker Palmer, The Courage to Teach: Exploring the Inner Landscape of a Teacher’s Life (1st ed., 1998).

[2] Id. at 1.

[3] David Brooks, Students Learn from People They Love, N.Y. Times, Jan. 17, 2019 (Op-Ed).

[4] Id.

[5] See, e.g., Marjorie A. Silver, Healing Classrooms, in Transforming Justice, Lawyers, and the Practice of Law 264-65 (Marjorie A. Silver, ed., 2017); Marjorie A. Silver, A Transformational Melancholy: One Law Professor’s Journey Through Depression (2011).

[6] Marjorie A. Silver, Chap. 25: Work and Well-Being 699-724 in Learning from Practice (Wortham, et al. eds., 3rd ed. 2016).

[7] Id. at 700-05.

[8] See, e.g., Shailini Jandial George, The Cure for the Distracted Mind: Why Law Schools Should Teach Mindfulness, 53 Duq. L. Rev. 215 (2015).

[9] omitted.

[10] This is Touro’s term for following a particular day’s schedule on a different day of the week.

[11] Lisa G. Lerman & Lisa V. Martin, Ch.10: Ethical Issues in Externships: An Introduction 261-78; Alexis Anderson, Ch. 11: Ethical Issues in Confidentiality  279-93 in Learning from Practice (Wortham, et al. eds., 3rd ed. 2016).

[12] See

A Pedagogical Twist for the 1L Appellate Brief and Oral Argument

For those who teach legal writing to first-year law students, it is the season for appellate oral argument. Yes, the long-standing tradition of requiring first-year students to complete an appellate oral argument in the legal writing course continues today at the large majority of American law schools–at just under 75% of them, according to recent data. At those schools, the oral argument, which is commonly the capstone exercise near the end of the spring semester, has become something of a rite of passage for the students.

In a 2011 article, Legal Research and Writing as Proxy, I argued that assigning an appellate brief and appellate oral argument in the 1L legal writing course remains a pedagogically sound practice, even though a large majority of practicing attorneys will never engage in appellate practice, let alone complete an appellate oral argument. I still retain that view but won’t rehash my arguments here. Rather, I will focus on a pedagogical opportunity afforded by the brief/oral argument sequence of assignments that I discovered more recently.

In the last few iterations of my legal writing course, the appellate brief and oral argument assignments have proven an excellent vehicle for a bit of a pedagogical twist: A few weeks before the brief is due, not after, I teach lessons on oral argument and require the students to complete a practice oral argument round in front of my 2L teaching assistants. (The formal rounds of oral argument in front of a trio of local attorneys still occur after the briefs are submitted.) For many years, I kept brief writing and oral argument entirely separate—only after the briefs were completed and submitted would I shift the students’ attention to oral argument. (After all, that mimics the realities of the “real world” of appellate practice.) But as a pedagogical matter, just as writing the brief helps in preparing an oral argument, working on an oral argument–and thereby having to talk out and defend one’s positions–can help in preparing a brief.

A few weeks before the brief is due, most students will have a scattered and underdeveloped array of arguments. Completing a practice oral argument can help them–or, in the case of those students who are spinning their wheels, force them–to organize and further develop those arguments for the purposes of the brief. In pursuit of this goal, I ask my TAs to give extensive feedback to both students after each practice round. Moreover, I require every student to attend two additional practice rounds as observers. At each round, the student representing Petitioner, the student representing Respondent, and the students attending as observers also begin to appreciate the formalities and peculiarities of oral argument, thus helping them to prepare for the formal rounds that will occur after submission of their briefs.

This semester, shortly after the practice rounds (just over a week before the briefs were due), my students graciously agreed to provide me some feedback on the experience. One of my students volunteered to solicit comments from all of her classmates, anonymize those comments and her own, and then send them to me. Twelve out of fourteen students in my small section gave a positive review. I include two of the more thoughtful evaluations here:

  1. I found doing the practice oral arguments before my brief was fully written to be helpful. Arguing my side in the courtroom and fielding questions from the TAs helped me more precisely narrow the theme of my arguments and determine how I wanted to frame my position in the brief itself. After receiving pushback from the TAs on certain points, I was able to refine my responses to common criticisms that would come from the other side. Additionally, I now feel more comfortable going into the “official” oral arguments having completed a practice round. However, I would have liked to participate in another mandatory practice round with the TAs after my brief is written; the substance of my oral argument has substantially changed since my first practice round.
  2. Practice oral arguments were a large motivator to get my arguments organized. I found it really helpful to speak out loud about the arguments. Doing so really helped me understand what my points were and whether or not they held up against scrutiny. Speaking about the arguments also helped me understand how they related to each other. The TA’s did a good job of making us feel comfortable throughout the process. I think overall the exercise is going to be beneficial as long as the practice round is kept informal. We were all stressed about how to perform the oral arguments, so maybe there could be a concession in the formality/process of the oral argument that could make us more comfortable.

Good food for thought, as I continue the tradition of appellate oral argument again next spring.

Are the Students Failing the Bar Exam Today Canaries in the Coal Mine, Warning Us of a More General Need to Change Legal Education?

Thank you so much to Best Practices for Legal Education for inviting me to blog again, and to Elizabeth Murad for her remarkable work in keeping contributors in touch and on track.  So much is written about the very real decline in bar passage that it is easy for schools with high pass rates–or at least rates high relative to other schools in their state–to ignore the need to change what goes on in the classroom and to dismiss the excellent work being done on effective law teaching as relevant only to “lesser schools” in “lower tiers.”

We know, as legal educators, members of the bar, and even members of the public, that bar passage rates have been falling.  And we also know that many, if not most, law schools are admitting students today with LSAT scores lower than those they admitted ten years ago.  So it’s easy to see a correlation between lower scores and falling rates.  After all, the bar exam is a test much like the LSAT–why wouldn’t there be a relationship?  But even if students are failing the bar exam for the same reasons they are getting low LSAT scores, we still have the opportunity to intervene in ways that we know raise pass rates.  This blog contains so many resources for those who want to teach more effectively.  Why wouldn’t we want this for all our students?

Everyone at a school with a “bar passage problem” is well aware that we cannot continue to do the same things we always have when they are no longer working the way they used to.  But we hear this less at schools satisfied with their bar passage rates.  Perhaps the students who are failing are really canaries in the coal mine: a warning to all of legal education that all of today’s law students find it more difficult to translate their legal education into the very peculiar format required for bar passage, regardless of LSAT score.  Everyone who has ever studied for the bar exam remembers it as a grueling, unpleasant, and highly intensive process–but until very recently that process started after graduation and, barring personal disaster, almost always resulted in passage.  Even when it didn’t, the consequences of failure were lower.  Today, students safely employed in September find themselves fired if October brings news of failure.  We need to consider bar passage as an issue both for students who fail and for those who pass–after all, both groups spend the same three years in law school.

Anecdotal evidence (which we could easily replace with actual data by doing some surveys) suggests that bar passage anxiety spreads well beyond those students most at risk.  All students know that the stakes are high, and many believe that their chances of passing are lower than those of students in the past.  Does that affect their choices while in law school?  Could they be doing more to prepare for their future careers if we could provide them more effective instruction?

Medical students and educators are expressing the same kinds of concerns about their curriculum being shaped by a test as we should be about ours.   We can’t easily change the bar exam–but we can adopt more direct methods of instruction that support not just bar passage but also create time for the more complex and less exam-focused thinking that we want to be going on in class.

I hope over the week to share resources that will encourage everyone to consider how studying for a very old-fashioned test is negatively shaping the education of all of today’s law students.  (And because it always warrants reposting, here is a recently revised article by Louis Schulze on what they have done at FIU to apply the “science of learning” across the curriculum in support of higher bar passage.)


New Rubrics Available to Help Law Schools that Have Adopted Learning Outcomes Related to Professional Identity Formation

By: Professor Benjamin V. Madison, III


A recent blog by Andi Curcio and Dean Alexis Martinez addressed the manner in which well-developed rubrics help law schools in program assessment. As newcomers to assessment of program learning outcomes (see Article), law schools need guidance on best practices for program assessment.

Rubrics are clearly a key part of assessing whether law students, by the time they leave law school, have attained the skills, competencies, and traits embodied in a given school’s program learning outcomes. The Holloran Center for Ethical Leadership in the Professions created a database of program learning outcomes adopted by law schools. See Database. The program learning outcomes that many of us find most intriguing are those under ABA Standard 302(c) (exercise of professional and ethical responsibilities to clients and the legal system) and Standard 302(d) (professional skills needed for competent and ethical participation as a member of the legal profession). The competencies and skills in learning outcomes adopted by law schools under these categories include: Cultural Competency (46 schools), Integrity (27 schools), Professionalism (31 schools), Self-Directedness (41 schools), and Teamwork/Collaboration (52 schools).

Associated with St. Thomas School of Law, the Holloran Center brought together two leaders in the professional formation movement, Professor Neil Hamilton and Professor Jerry Organ of St. Thomas Law, with faculty and staff from other law schools that have committed to pursuing professional identity formation as part of their effort to produce complete lawyers. Like Professors Hamilton and Organ and St. Thomas itself, these faculty, administrators, and staff–and their law schools–have demonstrated a commitment to the professional identity formation movement, a movement inspired by the 2007 publication of the Carnegie Report and of Best Practices in Legal Education. Recently, rubrics developed over the past year by working groups assigned to specific competencies were added to the Holloran Center web site; see Holloran Competency Milestones.

The Holloran Competency Milestones offer an assessment tool to any law school that has published a program learning outcome in the competencies listed above—competencies that some educators may consider too challenging to assess. If anyone believes these competencies are impossible to assess, however, the Holloran Competency Milestone rubrics show otherwise. A law school must still decide in which courses, or in which contexts (possibly clinical settings), it will use the rubrics to assess attainment of a given competency. Whatever the setting, the Milestones are a valuable tool for assessing these competencies.

The work of the Holloran Center, and of those of us on the working groups that developed these first rubrics, will continue. (The persons and schools who have participated in this project to date are identified on the site with the Milestones.) Law schools that have not previously been involved in the development of rubrics have recently committed to developing further ones. Continuing the progress that has begun will provide rubrics for program assessment of competencies for which assessment tools have not yet been developed. For instance, these schools are likely to address competencies such as Reflection/Self-Evaluation (36 schools include it in published learning outcomes), Active Listening (31 schools), and Judgment (18 schools).

Anyone who considers the competencies discussed here too abstract to include in a law school’s program of instruction ought to review the impressive survey by Educating Tomorrow’s Lawyers (ETL), called the Foundations of Practice Survey. There, ETL’s survey of more than 24,000 lawyers nationwide demonstrated that the very competencies discussed above (1) were among the most important factors in employers’ decisions whether to hire law students, and (2) determined whether a student is likely to succeed in law practice. See Foundations of Practice Report (The Whole Lawyer and the Character Quotient).

In short, the law schools that adopted learning outcomes designed to produce lawyers who are not only legal technicians but whole persons are on the right track. By adopting competencies that go beyond the traditional ones (analytical skill, writing, etc.), these schools showed they believed a complete lawyer needs more than technical proficiency. The efforts described here validate that decision. The hope, of course, is that law schools will now use these rubrics to do program assessment of competencies such as cultural competency, integrity, professionalism, self-directedness, and teamwork/collaboration.

May these efforts ultimately produce more lawyers that embody these competencies.
