Leading Edge Conference: Facing and Forming Legal Education’s Future with Insights, Data and Inclusive Thinking

Last week, I was fortunate to attend the 6th annual Leading Edge Conference hosted by Wolters Kluwer (WK) in Riverwoods, Illinois. It was my first experience with this particular conference. Using an unconference format and with a balance of old-timers and new attendees, WK brought together approximately 30 “thought leaders” for two-plus days of intense discussion. Participants included professors and deans from a wide variety of law schools, representatives from law-related entities such as LSAC, NITA, and IAALS, education- or pro bono-related entrepreneurs, and digitalization pioneers.

In addition to the conference, WK hosts the Leading Edge Webinar Series and just announced its 2nd annual Leading Edge prize. Ten Thousand Dollars ($10,000) will be awarded to two winning teams “to help implement their visions of improving student outcomes or expanding educational opportunities for law students.” Proposals are due August 15th.

I left the conference with many “take-aways” that I am only beginning to digest fully, and with a better sense of the continuing challenges facing legal education and our profession. Bernard A. Burk, Jerome M. Organ, and Emma B. Rasiel recently published “Competitive Coping Strategies in the American Legal Academy: An Empirical Study” in the Nevada Law Review. Their research examined the response of law schools “to the substantial fall off in both the number and the conventional qualifications of applicants to law school that began after 2010.”

The “Competitive Coping Strategies” research also explains why more law schools have not closed and emphasizes the “widened distance” between current students’ needs and current school resources. The study found that in the face of plunging applications to law school, “Reputationally stronger schools” generally chose to preserve their entering Class Profile. This meant “thousands of viable candidates remained available to other law schools, effectively preventing the closing of as many as twenty Reputationally Weaker schools.”

Second, the study points out the implications of shrinking Class Size and discounting Tuition to preserve entering class profile. “As a practical matter, then, law schools ‘invested’ in Profile rather than in expanding their faculties, facilities or their access to clinical and experiential education. We encourage discussion of the implications of this investment choice.”

Third, the study noted that “some Reputationally Weaker law schools perversely were able to maintain or raise their average Net Tuition” and “the students with the least promising prospects for obtaining or making any economically sustainable use of their law degrees are paying the highest prices to obtain them. These inequalities expanded significantly after 2010.”

Fourth, the study highlights the millions of dollars in forgone Tuition Revenue “unavailable to meet the needs of students who at many law schools are significantly less prepared” than their predecessors and suggests this widening gap underlies the declining Bar Exam pass rate.

We seem to have reached a plateau in declining admissions to law school. But that plateau is not a place for us to settle in and rest. There are too many hard questions about where we are now.

How do we address the inequalities which have expanded since 2010 in law schools? What is the value we provide to those with the “least promising prospects?” Is it immoral that those least likely to make “any economically sustainable use of their law degrees are paying the highest prices to obtain them?” or that they may be undertaking crippling debt to obtain a law degree?

On the other hand, if we narrow the pathway into law schools even further, rejecting any who come to law school less credentialed or less prepared, will we be rejecting the dreams and hopes of those who desire a professional pathway? Will we be rejecting many who will find an economically sustainable and good life for themselves? Will we be playing God with students from less advantaged backgrounds just because we don’t know who will make it and who won’t? Will we be eliminating first generation students in larger numbers? Will we be amplifying the lack of diversity in our profession?

And what about the role of law schools in the community at large at this moment in our nation’s fledgling history? Shouldn’t we continue to exist as community laboratories which encourage civil discussion, uphold the rule of law, critique unjust legal systems and decisions, work to sustain democratic institutions and constitutional checks and balances, and produce new ideas about the role of law and legal systems in society?

Finally, if we espouse the “public good” values of my last two paragraphs as arguments for the continued existence of the legal academy and law schools, then do we prioritize these values in our faculty hiring, our strategic plans, and our prioritization of resources?

So, I leave you as I left the conference, with more questions than answers, but with a firm sense that we must continue to ask these important questions.

(Note: the author had her lodging, food and flights paid. She was not paid to write or post anything about the conference. Besides, she is pretty opinionated and not easily swayed.)

More Resources Re Teaching, Learning, and Bar Passage

Thank you to the Best Practices for Legal Education Blog for having me as a blogger this week.  I hope the examples I’ve provided about the methods medical schools use to evaluate their curriculum, test the effect of new programs, and look for factors that affect success on licensing exams have been useful.  As I mentioned at the end of my last post, the most comprehensive source for research-based information about bar passage programs, as well as a source of funding for such studies, is AccessLex.  There is a growing literature of articles from schools that have implemented successful bar passage programs.  Here’s an article by Louis Schulze about his work at FIU.

You might also be interested in a series of articles from back in 2009–2010, when those at the front lines of legal education, first-year faculty and legal writing and research faculty, began to see significant differences in performance between the students they were teaching and those of the past.  These articles explain how substantial changes to the K-through-college education system in the U.S. affect law students’ transition to law school.  This article by Rebecca Flanagan is a good overview.  Prof. Patricia Grande’s article is here.  A literature review of law learning strategies by Profs. Jennifer M. Cooper and Regan A.R. Gurung.  One more by Profs. Susan Stuart and Ruth Vance.

Here are the proceedings of a 2014 Symposium entitled “Teaching the Academically Underprepared Law Student,” and I invite readers to take advantage of the comments section of this blog to share other publications, including the many more recent ones.  My point here is historical, not bibliographical.  And here, as a quick reminder of one of the crucial skills the bar doesn’t test (research), is a piece by Caroline L. Osborne.

Finally, something I’ve mentioned elsewhere: the new collaboration between LSAC and Khan Academy, which provides free, high-quality, online LSAT preparation, may have something to offer law students.  The skills underlying LSAT performance, close reading and legal reasoning, are not immutable. Students can get better at them after enrolling in law school, and some time with these materials may be a helpful and interesting way to brush up on those skills.

What can Law Schools Learn about Bar Passage from Medical Schools’ Approach to Studying Students Who Struggle with Licensing Exams?

It’s not unusual for a provost or a colleague or a relative at Thanksgiving to ask a legal academic why law students have so much trouble passing the bar exam when the pass rates for medical students are usually in the high 90s.  The short answer to that question is that the two processes are completely different, and there’s no obvious trick, technique, or intervention that could convert our bar passage rates into their licensure passage rates.  For one thing, it’s the wrong question.  “Passing” the medical licensing exams is certainly important, but unlike the “all or nothing” process of passing the bar exam, the score achieved on Step 1 affects medical students’ entire career path.  Still, there is a lot to learn from the methods that medical schools use to study the very few students who have trouble, as well as from how they evaluate the effect of changes to their curriculums on licensing exam scores.

A quick recap on professional licensing: future doctors take a series of three exams over the first six years of their undergraduate medical education and the start of their residency (more links in a post I wrote earlier this year here).  The exams are almost entirely national, although the actual process of being licensed is conducted on a state-by-state basis.  Law students take a licensing exam in the state where they intend to practice upon graduation.  For purposes of this post, the closest analogy to the bar exam is the more academic Step 1, which students take during their second year of medical school.  Like our NCBE, the National Board of Medical Examiners, which produces the United States Medical Licensing Examination (USMLE), works with medical licensing boards and depends on their confidence.  It issues annual reports.

The focus of this post is on the methods that medical schools use to study the small number of their students who do have trouble passing the licensing exams, as well as the factors that can affect the scores students achieve.  I’ve tried to focus on articles outside of paywalls, and would certainly encourage you to conduct your own searches in the various databases to which you have access.  There are several journals devoted directly to studying medical education, although these articles can pop up anywhere.

Medical educators use a wide range of research techniques to learn more about students who struggle with licensure exams.  Like us, medical schools would prefer that students pass the first time, and many articles like this one look for characteristics of students who fail the first time but eventually pass.  Others look for characteristics of students at risk for failure here and here, or even ask what students think of the exam.  Another area of inquiry involves the role stress plays in the scores students achieve.  In partnership with social scientists at our schools or in our communities, we too could be conducting studies to help us learn more about students who face difficulty passing the bar exam.  These studies can be part of graduate student work or may even be funded by groups like AccessLex, which is making money available to study bar passage.

The actual reason the medical school pass rates are so high, though, may not be all that helpful.

It’s not just because they are able to limit admission to students who have already demonstrated an ability to score very highly on the MCAT, a test that is much more similar to Step 1 than the bar exam is to the LSAT.  Indeed, medical schools have direct input into both the MCAT and the licensing exams, so when one changes, the other can too. And it’s not clear that anything in the curriculum makes a difference at all: the industry offering study aids and licensure prep courses dwarfs the bar prep and study aid market, to the point where students often start studying for the licensing exams before the first day of medical school.

But if it is the curriculum, it’s important to remember the vast difference in time scale between medical and legal education.  We have students for three years post-B.A.; medical schools in the U.S. plan their curriculum around eight-plus years of increasingly specialized medical education.  They are therefore comfortable holding off on the direct teaching of practice skills for the first two years while they align their curriculum with the content of the Step 1 exam.

Even Step 1, though, is far more focused on practice than on the knowledge accumulation and deliberately confusing question formulations that characterize the bar exam. Step 2, the second round of licensing exams taken before graduation from medical school, goes past paper and pencil in that it actually tests students’ ability to conduct examinations and exercise medical judgment.  Another reason for the high pass rate is that most medical schools have stopped developing their own tests and instead use assessment instruments (shelf exams) provided by the same organization that produces the licensing exam.  Sure, there is grumbling and criticism about the content and timing of the licensing exams, but medical schools work hard to make sure that their curriculums are aligned with the content of the exams.  Finally, medical education is extremely self-reflective; medical educators are constantly aware of the risks that come from confusing correlation and causation.  How do you know that a change in one part of the curriculum is the cause of a change in test scores?  You run Pearson correlations followed by stepwise linear regressions.  Seeing is not believing when it comes to identifying factors that affect performance on licensure exams.  Look here, here, here, and here for studies evaluating curriculum changes.  They take nothing for granted: does attendance make a difference, does flipping classrooms really work, does reducing the number of hours spent in the anatomy lab reduce USMLE scores?
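For readers curious what that kind of analysis looks like in practice, here is a minimal sketch in Python. The data, variable names, and effect sizes are entirely hypothetical, and the single regression with a control variable stands in for the fuller stepwise procedures used in the medical education literature; treat it as an illustration of the logic, not a recipe.

```python
# Hypothetical illustration: is a curriculum change associated with
# licensing-exam scores once entering credentials are accounted for?
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # simulated students

df = pd.DataFrame({
    "admit_score": rng.normal(508, 6, n),         # an MCAT-like admissions score
    "new_course_hours": rng.integers(20, 60, n),  # exposure to the revised course
})
# Simulated exam score loosely related to both predictors.
df["exam_score"] = (
    0.9 * df["admit_score"] + 0.15 * df["new_course_hours"] + rng.normal(0, 5, n)
)

# Step 1: Pearson correlation between the curriculum variable and exam scores.
r, p = pearsonr(df["new_course_hours"], df["exam_score"])
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# Step 2: regression that controls for entering credentials, so an apparent
# curriculum effect is not just a proxy for who was admitted.
X = sm.add_constant(df[["admit_score", "new_course_hours"]])
model = sm.OLS(df["exam_score"], X).fit()
print(model.summary().tables[1])
```

A law school could point the same skeleton at its own de-identified records (with appropriate review-board approval) to ask the analogous question about bar results, which is exactly the kind of partnership with social scientists suggested earlier.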

Another standard practice in medical schools is curriculum mapping, an essential first step for any school that wants to understand what it is teaching, let alone make changes.  Like all maps, curriculum maps are DESCRIPTIVE, not PRESCRIPTIVE.  Here is Harvard’s curriculum map, but you can find examples on the home page of just about every U.S. medical school. This is an article walking through how to map a curriculum.
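Curriculum mapping is, at bottom, bookkeeping: which required courses claim to cover which competencies, and which competencies are claimed by no course at all. Here is a deliberately tiny sketch in Python with made-up courses and outcomes, just to show the two views (course-to-outcome and outcome-to-course) that a real map provides at much greater granularity.

```python
# Hypothetical mini curriculum map: courses -> competencies they claim to cover.
curriculum_map = {
    "Contracts":       {"legal analysis", "doctrinal knowledge"},
    "Legal Writing I": {"legal analysis", "written communication", "research"},
    "Evidence":        {"doctrinal knowledge", "legal analysis"},
    "Clinic":          {"client interviewing", "professional values",
                        "written communication"},
}

# Competencies the (hypothetical) school has published as required outcomes.
required_outcomes = {
    "legal analysis", "written communication", "research",
    "client interviewing", "professional values", "negotiation",
}

# Outcome-to-course view: where is each required outcome actually taught?
for outcome in sorted(required_outcomes):
    courses = [c for c, covered in curriculum_map.items() if outcome in covered]
    print(f"{outcome}: {', '.join(courses) if courses else 'NOT COVERED'}")

# Gap check: outcomes no course claims to address.
gaps = required_outcomes - set().union(*curriculum_map.values())
print("Coverage gaps:", gaps if gaps else "none")
```

The output is descriptive, as the paragraph above stresses: it shows what is being taught now, which is the prerequisite for any sensible conversation about what should change.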

So what’s helpful to us isn’t so much what medical schools are doing, but how they are evaluating themselves. 

To recap: neither I nor anyone else who has ever practiced law thinks it would be a good idea to emulate medical schools by fully aligning our curriculum with the bar exam so as to turn the three years of law school into one extended bar prep course.  Among other reasons, the material tested on the bar is quite static and doesn’t reflect the realities of today’s law practice.  It also wouldn’t make much sense for schools whose students take the bar exam in many different jurisdictions.  And the bar exam is simply not equivalent to the three rounds of USMLE exams in actually testing both the knowledge and the application of knowledge needed to be a successful lawyer.  If it were, we wouldn’t hear so many complaints about how students who have passed the bar are nevertheless not “practice ready.”

Tomorrow: where can we get the help we need to find out this information, and who is going to pay for it?  Spoiler: AccessLex has a program.

We have to talk about the bar exam

Thank you very much to the team at Best Practices for Legal Education for inviting me to blog this week.  My particular thanks to Elizabeth Murad for administering the blog, Professor Mary Lynch, Kate Stoneman Chair in Law and Democracy & Director, Center for Excellence in Law and President & Dean Alicia Ouellette of Albany Law School for hosting this blog.  It is an honor to join such a distinguished group of scholars and teachers.

We knew it was going to be a bad bar year when, on September 14, 2018, the NCBE announced that the national average MBE score for July 2018 had decreased about 2.2 points from the July 2017 average.  And, indeed, as states announced pass rates based on their own individual formulas of MBE plus essays plus the MPT (Multistate Performance Test) plus their own individualized questions, the results were bad.  A lot of our students failed the bar exam.  Pass rates were down in New York, California, Georgia, Florida, Texas, and lots of other places.  Yet at the same time, individual schools saw significant success in raising pass rates in the face of grim news all around them.  All of this makes for glib headlines and much handwringing, but in the context of a blog post on “Best Practices for Legal Education” it is more helpful to take a step back and assess the tools we, as legal educators, have available to us in addressing bar passage in our individual schools.  I do so from my Ph.D. studies in higher education as well as from my experience as a dean, associate dean, law professor, and medical school professor.

One of my main themes this week will be to argue for individualized problem solving.  If anyone comes to you with a product to solve all your bar passage problems, I hope after this week you will be able to ask some questions about the data on which they base their claims.  A productive discussion of bar exam passage really rests on two questions: (1) why aren’t the students at your law school passing the bar exam at the rate they “should,” and (2) what should you do about it?

I am going to use this week to share with you some of the resources available to law schools, to individual faculty members, and even to law students who want to increase their chances of passing the bar the first time.  Along the way, I hope to address some of the unhelpful myths that have arisen and to endorse a very old idea borrowed from the then-revolutionary 1960s-era child-rearing techniques of Dr. Benjamin Spock: these are your students, and you know more than you think you do.  Trust your judgment.  Ask questions.  That doesn’t mean that you can do everything yourself; it’s fine to consult with experts, but in the end addressing bar exam passage issues is a school-wide effort, and everyone has relevant information to add and a valuable role to play.

To get started, it’s helpful to have an overview of the players.  As a matter of foundational constitutional law, each state retains the power to license and regulate professionals (more detail here).  As a result, every state and territory has its own process for setting criteria for bar passage.  Almost every state contracts with the National Conference of Bar Examiners, which develops the annual exam, grades it, and spends a lot of time explaining itself.  If you have any interest in this topic, a free subscription to The Bar Examiner will quickly bring you up to speed.

Tomorrow: how a test from the 1950s trips up today’s digital natives (or “Do we need a Tardis to match the law school curriculum to the bar exam?”).

New York Proposes “Experiential Learning Requirements” as Condition of Licensure: CLEA and NYS Bar Committee Respond

Readers of this blog and followers of the NCBE’s expansion remember that this past spring New York became the 16th state to adopt the Uniform Bar Examination (UBE), changing its longstanding bar admission requirements.  Many voices opposed adoption, including the New York State Bar Association (NYSBA) (see the Committee on Legal Education and Admission to the Bar (CLEAB) report 10-29-2014 and the vote of the House of Delegates), the Clinical Legal Education Association (CLEA), and the Society for American Law Teachers (SALT).  Despite these and other opposing voices, the proposal was adopted, with the new changes going into effect for the July 2016 bar examination.

During discussion of the adoption of the UBE, the Court was encouraged to include clinical or experiential requirements for licensing so that lawyers admitted to the New York Bar would be ahead of the curve, a position I firmly support.  On the opposite coast, California had been engaged in a multi-year process examining licensure and readiness for the profession, which resulted in a proposal requiring 15 credits of experiential learning before admission.  In response to the movement to incorporate experiential learning into bar admission, the New York State Court of Appeals formed a Task Force on Experiential Learning and Admission to the Bar.  Just last month, that Task Force requested comments on its proposal that

New York adopt a new mechanism for ensuring that all applicants for admission to the bar possess the requisite skills and are familiar with the professional values for effective, ethical and responsible practice. In light of New York’s diverse applicant pool, and in an effort to accommodate the varying educational backgrounds of applicants, the Task Force suggests five separate paths by which applicants for admission can demonstrate that they have satisfied the skills competency requirement.

The New York Law Journal examined the proposal in an article found here.  In addition, the Honorable Jenny Rivera, chair of the Task Force, attended a meeting of NYSBA’s Committee on Legal Education and Admission to the Bar (CLEAB) to explain the proposal and answer questions.

It is heartening that the Court is concerned about, and wants to require, the development of essential lawyering skills and the acquisition of professional values. However, without more, Pathway 1 of the current proposal will not actually ensure that applicants to the bar experience the kind of skill development and value formation that the Task Force desires.  Pathway 1, referencing new ABA standards, requires schools to confirm that they have published their “plan for incorporating into their curriculum the skills and professional values that, in the school’s judgment, are required for its graduates’ basic competence and ethical participation in the legal profession.” It also requires law schools to certify that law graduate applicants for admission “have sufficient competency in those skills and sufficient familiarity with those values,” which are publicly available on the law school’s website.  Although Judge Rivera believes that the certification process described in Pathway 1 can have some real bite, as pointed out in comments submitted by the Clinical Legal Education Association (11.9.15 CLEA SUBMISSION ON EXPERIENTIAL REQUIREMENT), Pathway 1 simply mirrors the experiential training requirements already mandated by the American Bar Association.

New York’s law school deans, not unexpectedly, submitted comments supporting the “flexibility” of Pathway 1.  The CLEAB report to the Experiential Task Force expressed concern that, without additional content in Pathway 1, “little will be accomplished” by the proposal.  And as one member of the NYS bar committee argued, “what law school is going to admit that one of its graduates did not acquire the skills or values promised on its website?”

In my opinion, the most important concern is whether applicants to the bar have ever represented or interacted with a client, or operated as a lawyer, in a live setting under guided, experienced supervision before admission.  In its comment to the Task Force, CLEA urges that a “three-credit clinical training requirement” be added for all J.D. applicants to the New York Bar.  This makes sense.  Law school clinics and faculty-supervised externships are designed to create the very kind of skill development and value acquisition with which the Court is concerned.  And clinical faculty have developed the formative assessment tools to maximize skill and professional identity formation.

I am hopeful that, in its next iteration of the proposal, the Task Force will heed CLEA’s and CLEAB’s comments and come back with recommendations that will ensure applicants for the bar are ready to engage in competent, ethical, and professional representation of New York’s citizenry, corporations, and not-for-profits.


Musings on the Bar Exam and Legal Education’s Attitude toward it

I have been studying and writing about the bar exam of late, so I appreciate the guest blogging opportunity, graciously offered by Mary Lynch, which I shall use to share some bar exam musings. Later this week, I hope to follow up with a bit more.

I noted with interest a recent New York Times feature, Is the Bar Too Low to Get into Law School? The feature offered perspectives from five legal professionals, four of whom are law professors, on how best to respond to declining bar exam passage rates. (Scores on the MBE, the anchor of the bar exam in almost every state, have declined again this year.) Two took issue with the bar exam itself, arguing for fundamental changes or its complete abolition. But Linda Sheryl Greene of the University of Wisconsin Law School argued that law schools simply need to do the work of preparing their students for the exam.

Law schools (or at least those not in the very top tier) indeed need to help their students prepare for the bar exam, but the bar exam also has to change in a way that allows law schools to do their part without the deleterious distraction of the exam’s heavy focus on recall of memorized law. Regrettably, bar exam reform efforts over the last 20 years have not focused on the one part of the exam that actually and exclusively tests lawyer competencies, requiring zero memorization of legal rules. That sadly neglected part of the exam is the performance test, which assigns a specific written lawyering task to be completed using a closed universe of factual materials and legal authorities. About one-fifth of the states do not even administer a performance test. Among states that do, the performance test remains the smallest part of the exam, accorded the least weight in scoring. It is in a very real sense the ugly step-child of the bar exam.

The behemoth of the bar exam, the MBE, compels examinees to study and memorize a copious number of legal rules. To be fair, the MBE does not test only for knowledge of law. But every skill set evaluated by the MBE—reading comprehension and legal analysis among them—is evaluated also by the performance test. The MBE’s primary value to the overall exam is psychometric—i.e., when scores on other parts of the exam are scaled to the MBE, the overall exam achieves testing reliability. A reasonable level of testing reliability can be achieved if the MBE is weighted at 40% of the overall score. (See page 13 of this article by the National Conference of Bar Examiners’ former Director of Research.) However, the NCBE recommends 50%, a recommendation that most states follow.
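For readers who want to see the arithmetic behind that weighting debate, here is a small illustrative sketch in Python. The score distributions, the cut score, and the scaling method (simple linear scaling of the written component to the MBE’s mean and standard deviation) are hypothetical simplifications of what jurisdictions actually do; the point is only to show how scaling plus a weight produces a combined score.

```python
# Hypothetical illustration of scaling written scores to the MBE and
# combining them at different weights. Not any jurisdiction's real formula.
import numpy as np

rng = np.random.default_rng(1)
mbe = rng.normal(140, 15, 500)          # simulated scaled MBE scores
written_raw = rng.normal(60, 10, 500)   # simulated raw essay/MPT points

# Linear scaling: give the written component the MBE's mean and spread,
# which is what allows the two parts to be combined on one scale.
written_scaled = (
    (written_raw - written_raw.mean()) / written_raw.std() * mbe.std() + mbe.mean()
)

def combined(mbe_scores, written_scores, mbe_weight):
    """Weighted total: mbe_weight=0.5 mirrors the weighting most states use,
    0.4 the lower weight discussed in the text."""
    return mbe_weight * mbe_scores + (1 - mbe_weight) * written_scores

for w in (0.5, 0.4):
    total = combined(mbe, written_scaled, w)
    passed = (total >= 133).mean()      # 133 is a hypothetical cut score
    print(f"MBE weight {w:.0%}: mean total {total.mean():.1f}, pass rate {passed:.1%}")
```

Because the written scores inherit the MBE’s distribution before weighting, the weight determines how much each examinee’s relative standing on each component matters, which is why the 40% versus 50% question is a genuine policy choice rather than a rounding detail.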

What of the rest of the exam? In every state, the remaining part of the score comes mostly from answers to essay questions, which, like the MBE, require memorization and recall of legal rules. If the MBE is testing knowledge of law (and creating more than enough focus on rote memorization), what reason other than inertia is there for essay questions to retain such a significant place on bar exams? Or to remain on bar exams at all? For years, essay questions were the venue for testing knowledge of state-specific law. However, most states now use the NCBE’s Multistate Essay Examination. And, as a growing number of states adopt the Uniform Bar Examination, several are employing other means outside of the bar exam, such as a required seminar, to ensure that new lawyers are familiar with unique attributes of local law.

And that takes me back to the performance test, the most valid of the testing instruments on the bar exam. The performance test was the answer from bar examiners 20 years ago to the recommendations of the MacCrate Report, which called on law schools and bar examiners to increase their attention to lawyering skills. Since then, while the MBE and essay examinations have been expanded, the performance test has remained stagnant. That needs to change. Through careful attention to the various skills today’s beginning lawyers have to perform, examiners should be able to reinvigorate the performance test and expand its skills coverage. They should also be able to increase the inadequate weight given to the performance test in scoring.

As for legal education’s attitude and approach toward the bar, I think an exam that focuses more heavily on skills through performance testing is one that would put law schools in a better position to help their students prepare. Because performance tests do not evaluate substantive knowledge of law, bar preparation specialists in law schools can easily administer performance tests from previous bar exams to students as both formative and evaluative assessments. Legal Writing professors have been using performance test-style problems for many years, especially with first-year students. Clinical professors use them, and, yes, even some doctrinal professors have too.  (I compiled a list of articles discussing the use of performance test-based problems by law professors in footnote 269 of my recent article.)

Why More States Should Not Jump on the Uniform Bar Exam Bandwagon

In May, New York became the 16th state—and by far the most prominent—to adopt the Uniform Bar Exam, the standardized licensing test for lawyers created and promoted by the National Conference of Bar Examiners.

With such an influential state on board, the UBE is now all the rage. New York Court of Appeals Chief Judge Jonathan Lippman proclaimed in his announcement of the Court’s decision that it “will reverberate among the other states,” ultimately leading to adoption  of the UBE throughout the country. The president of the Florida Bar responded to the decision by predicting that the northeastern states will soon follow New York’s lead and also acknowledging that Florida will give accelerated consideration to the UBE. And UC Irvine Law School Dean Erwin Chemerinsky argued in an L.A. Times op-ed that California should join New York in adopting the UBE.

Before other states react to the reverberations by crowning the UBE as the predominant or sole bar exam in the nation, the entire uniform exam initiative merits closer scrutiny. To be sure, adoption of the UBE throughout the country would make law licenses much more portable for beginning lawyers recently out of law school. That is a worthy goal, but the UBE is a dubious and potentially damaging means for achieving the desirable end of greater interstate license portability, especially because there is a different and more benign way to achieve that end.

A comprehensive consideration of the UBE suggests to me five reasons why additional states should step back from the precipice and be very cautious about joining the UBE bandwagon.

I discuss my five reasons at greater length here and, without annotations, here. In sum, they are as follows:

  1. The UBE perpetuates a flawed bar exam and is fundamentally inconsistent with recent trends in legal education and the legal profession.
  2. The UBE would represent a regressive change to the current bar exam in several states.
  3. The advantage of having the National Conference set a singular scoring methodology for all bar exams is overstated.
  4. National implementation of the UBE would decrease the chance of meaningful bar exam reform by taking power away from states and giving it to the National Conference.
  5. There is another means for achieving license portability that has far fewer drawbacks than the UBE.

The Multistate Bar Exam (MBE) is administered as roughly half of the bar exam in 49 states and is a very reliable test from a psychometric perspective. Why not allow applicants who achieve a passing score on the MBE on any jurisdiction’s bar exam to apply for admission to any other state without the necessity of taking the bar exam again? Minnesota, North Dakota, and the District of Columbia already permit this, and other states need only follow their example.

A state adopting this approach in lieu of the UBE would be giving full faith and credit to whatever the written portion of the exam is in the original jurisdiction instead of forcing applicants to take its written portion. Since most states scale the scores from their written tests to the MBE, the policy is psychometrically sound. In fact, in many cases, the written tests of the two states at issue would be very similar, and the only variable would be awareness of local law, which could be covered through a CLE program.

Instead of putting its energies into national adoption of the UBE, which would decrease the quality of the bar exam in many states and drastically reduce state autonomy and flexibility, the National Conference could simply push for more states to adopt the policy already maintained by Minnesota, North Dakota, and D.C. We already have a uniform bar exam, and the National Conference already has enough power.

(Some of my article expands on the astute observations made by Professor Dennis Honabach in a piece he wrote in the ABA journal Professional Lawyer last year. ABA members can retrieve full text here, or on Westlaw, it’s at 22 No. 2 Prof. Law. 43.)

NEW YORK STATE BAR ASSOCIATION VIGOROUSLY OPPOSES PROPOSAL TO BRING UBE TO NY THIS JULY

Less than a month ago, the New York State Courts circulated a proposal to change the New York State (NYS) Bar Exam by adopting the Uniform Bar Exam (UBE) along with a second, separately graded “New York Law Exam” segment consisting of 50 multiple-choice questions, tested for one hour on the second day of the exam.  The proposal would make the changes effective for all current graduating law students who face the bar exam in July 2015.  This past weekend, the New York State Bar Association House of Delegates unanimously opposed the proposed immediate changes, sending a message to the NYS Board of Law Examiners and to the New York Court of Appeals: do not bring the Uniform Bar Exam and a yet-to-be-formulated, unstudied New York exam to NYS in July 2015.  Even more significantly, the House directed the State Bar President, based on an amendment from the floor, to do everything possible to prevent immediate implementation of a new bar exam in New York.

So, how did NYS get to the point where the Courts and the Bar are in such conflict over proposed changes to the bar exam?

For several years, the NYS Legal Education and Admission to the Bar Committee (LEAB), on which I formerly served as an active member, has been studying how to improve the bar exam to make it fairer for all groups of test takers and more relevant to what graduates need to know, value, and do in the early years of practice. See the NYSBA Legal Education September 2013 Journal, particularly page 31.  The Committee, through its chairs, has reached out to the NYS Board of Law Examiners and the Chief Judge about these matters without success.  The UBE was not one of the reform measures which LEAB proposed for further study or pilot projects.

Suddenly, and without notice to the NYSBA LEAB Committee, co-chaired by well-respected practitioner Eileen Millett and equally well-respected Touro Law Center Dean Patricia Salkin, the courts circulated and posted the following:

1) UNIFORM BAR EXAMINATION (UBE)
 POSTED OCTOBER 7, 2014

The New York State Board of Law Examiners has recommended to the New York Court of Appeals that the current bar examination be replaced with the Uniform Bar Examination (UBE). The Court of Appeals is considering adopting the UBE for the administration of the July 2015 bar exam. On October 6, 2014, the Court of Appeals issued a Request for Public Comment on the proposal. Submissions will be accepted until November 7, 2014. A copy of the Request for Public Comment is available by clicking this link:   New York Court of Appeals Request for Comment  http://www.nybarexam.org

The proposal and request-for-comment document asserts that “The UBE is prepared by the National Conference of Bar Examiners (NCBE) and passage of the test would produce a portable score that can be used to gain admission in other states that accept the UBE, provided the applicant satisfies any other jurisdiction-specific admission requirements. As the UBE is accepted by more states, the portable score will facilitate lawyer mobility across state lines, resulting in expanded employment opportunities for lawyers throughout the nation and facilitating multi-state law practices.”

Given the surprise announcement from the Court on October 6, 2014 of a 30-day comment period (open until this Friday, November 7th), the LEAB and its co-chairs had only a matter of weeks to research, discuss, and prepare a report for the State Bar Association about the implications of the proposed changes. The LEAB report 10-29-2014 (2) argues that it is simply too soon to assess the merits of the Uniform Bar Exam and its potential impact on test takers in New York, because of the surprise nature of the announcement along with the absence of any study or report discussing a need, a cost-benefit analysis, or whether there could be disparate impacts on minority test-takers.  LEAB is concerned about a potential increase in costs for test-takers, the impact on barriers to entry to the profession in New York, and the impact on the New York job market.  LEAB discussions emphasized that the practicing bar has been pressuring law schools to meet the demands of a changing marketplace, including, among other things, producing more “practice ready” lawyers, which would presumably require a richer knowledge of New York law.  Impacts on foreign lawyers and other important issues for consideration were also raised.

This past weekend, co-chairs Millett and Salkin presented their findings to the NYSBA House of Delegates.  The presentation to the State Bar can be viewed here (click on the Nov. 1 House of Delegates Meeting and then click on the Report of the Committee on Legal Education and Admission to the Bar).  Co-chair Millett challenged the notion that the proposed reforms as outlined would actually result in portability. Co-chair Salkin pointed out that the notion of “uniformity” seems misleading, given that in NY many uniform rules are not used and that current law school courses focus on statutes different from those used by the UBE.  Significantly, three past presidents of the NYSBA testified against the proposed immediate changes, including Steve Younger, who emphasized the issues raised by New York’s special connection with international lawyers from around the globe admitted to practice in New York State.  Many expressed concern for current students facing the July bar, including Albany Law School Professor Michael Hutter, who asked “Why the rush to judgment?”  Dean Patricia Salkin and Betty Lugo (president-elect of the Puerto Rican Bar Association, founded in 1957) expressed particular concern that minority bar associations were not consulted and that questions on the proposed brand-new “New York Law Exam” component have never been tested on previous exams, pretesting being a “best practice” for all standardized tests that are given as points of entry to higher education and the professions.

Why does this matter?

The contents, pass rates, and disparate impacts of the bar exam matter tremendously.  This is our profession’s gatekeeping device.  It announces what we value and what we do not value. It will be a make-or-break change for many law students starting in July who have prepared their course of study under a different set of expectations. For many schools and many students, bar exam subjects and testing methods determine the curriculum, rather than what students need in order to meet learning outcomes or prepare for practice. This proposed change deserves further scrutiny and evaluation.  New Yorkers also deserve to have the Court evaluate the success of licensure practices that include clinical evaluation during professional school, as opposed to sole reliance on standardized testing.

See attached SALT Letter-NY Bar opposing the proposed changes.

My Reaction to the Proposed Changes:

POTENTIAL ADVANTAGES:

  • Should the proposed changes result in a decrease in the number of doctrinal subjects tested on the NY bar exam, that will be an advantageous change, both for making the bar exam more relevant and for allowing law schools and students to craft better curricular choices to prepare them for the jobs and careers of today and tomorrow.  (See an earlier blog post on this issue here.)

POTENTIAL DISADVANTAGES:

  • The process for adopting the proposed change is too hasty and is unfair to current third-year students and to second-year students who have already planned three semesters around the exam.
  • The proposed changes have not been studied appropriately. For example, no one knows whether the new format, particularly the 50-question NYS multiple-choice section, will exacerbate the already disparate impact on graduates of color, or whether it will create a separate barrier to admission for those who will make great lawyers but are not particularly good standardized test-takers, given the speededness factor: 50 multiple-choice questions in one hour will make or break you on the NY part!
  • The proposed format fails to address the critical need for bar licensure to include evaluation of actual, supervised, and limited practice of law while in law school or immediately thereafter.  As a gateway to a client-centered, civic profession, evaluation of the limited, supervised practice of law could and should replace at least some part of the current standardized testing.

NEW YORK LAWYERS, LAW STUDENTS AND LAW PROFESSORS: ACT NOW!  Comments are due by this Friday, November 7th.

Address comments to:

UniformBarExam@nycourts.gov
