Teaching Students the Art of Giving (and Receiving) Feedback and Sharing IDEASS


By Kaci Bishop, UNC School of Law

Law students may be accustomed to receiving feedback, but as lawyers, they will also be called on to give feedback.  They may review a colleague’s brief or contract, adapt samples, help moot a case, or supervise a summer associate or new associate.  Learning how to give feedback effectively can also help them develop the critical eye they need to assess and revise their own work.  Giving feedback effectively is thus another skill worth teaching.[1]

In my classes, students have opportunities to exchange feedback with peers on written assignments through structured peer reviews and through moots and case rounds.  More informally, they exchange feedback regularly with their partner as they work on their clients’ cases, and as we debrief together in team meetings.

To be sure we have a shared vocabulary and framework for giving feedback, I include a lesson on giving feedback at the beginning of the semester.  The lesson also sets a tone for receiving feedback.  My stated objectives are for students to (1) reflect on how they have received and given feedback in the past; (2) explore what it means to have a growth mindset; (3) learn a framework for giving effective feedback; and (4) practice using that framework.

I begin by having them answer polls about what goals or concerns they have had when giving feedback in the past and about how they like to receive feedback.  Often, the polls reveal that most students want to help someone improve their work but are concerned they will hurt the receiver’s feelings—while they themselves prefer direct and honest (which students often frame to me as “harsh”) feedback.  We discuss these tensions and circle back to them throughout the class and the semester.  We also explore the differences between direct and directive feedback, and I share how I usually give feedback (e.g., asking questions to help students puzzle out what would make the product more effective, or identifying an issue and modeling one correction while letting them find the other places it appears).

Figure 1: Sample Poll Question Assessing How Students Like to Receive Feedback

Discussing the polls segues into talking about what it means to have a growth mindset, because the polls usually demonstrate that the students’ best experiences giving and receiving feedback came when they were open and ready to learn.  I introduce (or re-introduce) Carol Dweck’s mindset research, highlighting the differences between the fixed mindset (intelligence is static) and the growth mindset (intelligence is malleable).  I emphasize that we all occupy both mindsets at different times, even though we may gravitate to one over the other, and that we can learn to recognize when we are occupying a fixed mindset and then work to reorient ourselves toward growth.[2]  For example, as a 1L, I struggled to grasp Contracts.  It would have been easy to give up, conclude that I was just not a Contracts person, and write off the class (and my ability to understand it).  That’s a fixed mindset.  To succeed, I had to shift to a growth mindset: I did not understand Contracts yet, but I was capable of understanding it and needed to put in more time and effort to do so.

Figure 2: Slide with a summary of Carol Dweck’s mindset attributes.

One’s mindset is important for feedback—both receiving and giving. When a person occupies a fixed mindset while receiving feedback, she enters the exchange with the goal of receiving validation and approval.  She will be more resistant to criticism, no matter how constructive.  Similarly, a person giving feedback from a fixed mindset may aim more at demonstrating how smart he is, or at performing if in front of other people, than at responding to the goals of the person seeking the feedback.  Alternatively, someone giving feedback while occupying a fixed mindset may not think he has anything of value to add and thus offer little in the exchange.  Getting into a growth mindset—for both the receiver and giver of feedback—and seeing the exchange as an opportunity for both to learn and improve is essential for sharing feedback effectively.  If both the giver and the receiver occupy a growth mindset when exchanging feedback, they will learn and draw inspiration from each other and propel each other to higher levels of achievement.

In addition to having a growth mindset, I advise students that when giving feedback, they should focus on the skills or product, not on the person, personality, or identity.  Their feedback should also be constructive: based on observations rather than opinions, concrete and achievable, and limited in scope.  The giver of feedback should not overwhelm the receiver with a long list of corrections and should always include at least one thing that should be preserved because it is already effective.

I then share the following framework for giving feedback, complete with the (possibly silly) mnemonic: IDEASS.

Figure 3: IDEASS Framework (Identify priorities, Diagnose issues, share Effective aspects, note Areas for improvement, Share observations, Suggest solutions)

The first objective when someone is asked to give feedback to another is to identify the priorities or goals of the person seeking the feedback.  What would the receiver most like to get out of the peer review, moot, or rounds?  What feedback would be most helpful?  How do they prefer to receive feedback?  Do they have particular questions they want answered?  When is the product due, and how much time do they have to revise?  These questions help set expectations to guide the exchange.

The student then needs to diagnose the issues.  This may be difficult; it is also crucial because it focuses the feedback and helps train the analytical skills and critical eye of both the giver and the receiver.  To diagnose the issues, the giver of feedback needs to understand and articulate the underlying norms or rules of the skill or product.  For example, if giving feedback on headings in a brief, the underlying rule might be that effective headings are framed as conclusions that blend law and fact, allowing the writer’s arguments to appear as an exoskeleton of the brief.  For a direct exam, the underlying rule might be that the questions should be open-ended rather than leading.  These underlying issues or rules might mirror what the receiver of feedback identified as their priorities; they might have asked for help making their direct exam more open-ended, for instance.  If the underlying norms or rules for the product are not clear, the giver of feedback should ask the person seeking feedback what they intended or how they chose their approach; then the giver can share observations about the product or skill.

The student giving feedback should share one or two effective aspects and then one or two areas of focus for improvement.  Often “feedback” seems to encompass only the latter, but sharing what worked well helps the receiver know what to keep or replicate going forward.  Feedback about both effective aspects and areas that could be improved should be shared as what the giver observed.

Sharing observations, not opinions, helps both receiver and giver continue to occupy a growth mindset and maintain the goal that both are learning through the exchange.  The giver should focus on what they noticed about the skill or product and reflect or even replay what the person seeking the feedback said or did.  For example, if the student seeking feedback on a direct exam asked a leading question, the student giving the feedback might note: “You asked your client: ‘Were you trying to leave your partner when you went to stay at your grandmother’s?’ That is a leading question.”

After reflecting back what she noticed, the student giving the feedback can then suggest next steps or solutions.  How might someone do it differently next time?  The student may also model a solution.  She might, for example, say: “Instead, you could ask: ‘Why did you go to stay at your grandmother’s?’”  Alternatively, the student giving the feedback might ask the student who did the direct exam to arrive at a solution by saying something like: “How might you ask an open-ended question to get the same point?”  At this phase, if possible, the person seeking the feedback could try again or revise the product, incorporating the feedback.

Putting it all together, a student’s feedback on the direct exam hypothetical may look like this:

  • You wanted me to assess your direct exam.
  • Your questions have a good rhythm and build upon each other in a way that allows your client’s story to come out persuasively.
  • Some of your questions were not yet open-ended. For example, at one point, you asked your client: “Were you trying to leave your partner when you went to stay at your grandmother’s?” That is a leading question.  Instead, you could ask: “Why did you go to stay at your grandmother’s?”

Beyond sharing IDEASS with their peers, I encourage students to also use growth language[3] in giving feedback—such as the words yet, and, and opportunity—and to express gratitude by thanking each other for the time, the feedback, and the opportunity to help.  Then, to finish the lesson, I have my students practice using the framework in a simulation.  I share a video of a simulated client interview (e.g., one from the Legal Interviewing and Language Access Film Project, created by Lindsay M. Harris and Laila L. Hlass, which, as one participant in the lightning session at the AALS Clinical Conference in the spring of 2021 noted, is the gift that keeps on giving!) and have the students share their feedback on the student interviews in the video.  The students thus get to practice using the IDEASS framework in a low-stakes way.  We can then revisit this shared vocabulary and framework as needed throughout the semester when they are called upon to give feedback to a peer—and continue to build this skill along with many others.


[1] This blog post summarizes the lightning session at the AALS Clinical Conference 2021 by the same name.

[2] In addition to exploring Carol Dweck’s work, here are some other resources for incorporating her mindset theory into legal education: Corie Rosen, The Method and The Message; Heidi K. Brown, The Emotionally Intelligent Law Professor; Paula J. Manning, Word to the Wise; and Megan Bess, Grit, Growth Mindset, and the Path to Successful Lawyering.

[3] I explore growth language in more depth in my article on Framing Failure in the Legal Classroom.

Round and Round We Go: The Stages of Rounds Applied to an AALS Clinical 2021 Lightning Session

By Cori Alonso-Yoder, Sherley Cruz, and Vanessa F. Hernandez

Stage 1: Description of the Issue

“In almost every small group of clinicians at clinical conferences, someone raises the subject of ‘improving rounds.’”  – Elliott Milstein & Sue Bryant, Rounds: A “Signature Pedagogy” for Clinical Education?   

While this year’s virtual AALS Clinical Conference differed from prior years, clinicians’ desire to maximize rounds remains a constant. These facilitated classroom discussions are what Professors Elliott Milstein and Sue Bryant called a “signature pedagogy” for clinical legal education. Indeed, rounds figure as a meta exercise of clinical education. Because they require giving up total control, rounds may feel challenging or unpredictable. Yet, year after year, rounds remain a principal teaching tool because, as Milstein and Bryant observed, “when the conversations go well, they are precious sources of learning.”

In 2019, a group of us compared notes on our use of rounds.[1] We learned that we had similar approaches, but that there was also great flexibility in our practices of rounds. At the Rounds on Rounds session, we hoped that by sharing our experiences we could learn from one another while also amplifying different models of rounds. To our delight, 136 conference participants also wanted to learn new ways of teaching rounds.

Stages 2 & 3:  Questions to Clarify and Problems Identified

Our initial goals for the AALS lightning session were to 1) introduce the concept of rounds; and 2) share different approaches to rounds. To seek feedback from our participants, we used a Google Forms survey to learn more about our audience’s familiarity with rounds (especially the “traditional Milstein/Bryant five stages”), while also soliciting ideas on different approaches. In particular, we sought to introduce rounds both as a tool for student learning and as a tool for clinicians to use in their own development as supervisors and educators.

From our survey, we learned that over 90% of our 69 respondents currently used rounds. Of these, the vast majority (more than 85%) reported using rounds in discussions with students about client fieldwork. Only about 30% of respondents mentioned using rounds with colleagues to discuss supervision of students. The remaining 7.2% of respondents said that they did not use rounds or were unsure whether they did.

Figure 1. Responses about rounds modes used by participants.

The survey also asked respondents to identify their priorities for learning within the session. Most of the respondents, 63.8%, indicated that they were most interested in learning about maximizing their use of rounds to discuss lawyering skills.

Figure 2. Responses about priorities for learning in the lightning session.

These responses helped us focus our discussion for the remaining time and to clarify our goals for the session.

Stage 4: Goals Clarified

Based on the responses from participants, we returned to our dual goals of 1) introducing rounds as a teaching tool; and 2) sharing practices for rounds.

Stage 5: Lawyering Strategies Exchanged/Proposed Solutions

To set the stage for the nearly 10% of respondents not presently using rounds, we presented some of the foundational concepts related to use of rounds.

Figure 3. A slide with the “Milstein/Bryant” rounds structure from the lightning session.

Participants took part in a How Do You Use Rounds Google Doc “quick write” to share their perspectives on what is working in rounds and where they have experienced challenges. Having 150 participants try to access the Google Doc at once “crashed” the shared document. Despite the technical difficulties, the quick write sparked a rich discussion. We took a fresh look at the first few stages through the lens of cultural, racial, gender, and other differences, which provided an opportunity for conversations about bias, stereotypes, and their impact on third parties.[2]

Stage 6: Lessons Learned

Perhaps it is poetic justice that a 30-minute lightning discussion on rounds with nearly 150 participants would feel rushed and incomplete.

In the quick write exercise, many clinicians said they struggle to find time for rounds or to develop the conversation fully. We, the presenters, faced similar struggles in covering all that we hoped to share in our session, but were encouraged by the enthusiastic responses from our colleagues.

Among the helpful conversations that developed with participants after the formal conclusion of the Zoom session, we identified the need to develop materials that foster student-led discussion and participation. One participant asked for readings to provide to students in advance of rounds. Another wanted to learn more about one of our practices of requesting that students prepare a pre-rounds memo.

This session confirmed that challenges and opportunities with structuring rounds will likely remain a topic to which we continue to circle back. We look forward to the next “go round” on this topic.


[1] This group included Cori Alonso-Yoder, Sherley Cruz, Vanessa F. Hernandez, Nadiyah Humber, and Katie Ladewski Jarosz.

[2] This innovation was credited to Professor Alexander Scherr at the University of Georgia. The exercise was particularly resonant as an intervention given the conference’s larger theme of “Recognizing Our Past and Building for Our Future.”

Implementation of the ABA’s New Experiential Training Requirement: More Whimper Than Bang

By: Robert Kuehn, Washington University School of Law


When the ABA adopted a new experiential training requirement in 2014, there was hope it would spur law schools to significantly change the way they prepared students for legal practice. The new six-credit requirement in ABA Standard 303(a)(3) was less than the fifteen credits proposed by some educators and did not include a mandate for a law clinic or externship experience. Nonetheless, the six credits were an improvement over the ABA’s previous “substantial instruction” in professional skills requirement.[1] But data from the initial implementation of the new experiential requirement suggest its effect has been more of a whimper than the bang some hoped for, with little evidence it has spurred legal education to enhance the ability of students to get hands-on training in professional skills.

Law schools are required to report annually to the ABA on the number of seats simply “available” to students in law clinic and simulation courses and the number of field placement/externship positions actually “filled.”[2] Data from the first two years of the new six-credit requirement in 2019 and 2020 show no increase in the positions available to students in clinics or simulations and even a decrease in actual enrollment in field placement courses, when normalized to address fluctuations in nationwide law school enrollment. While some law schools have made important changes to their curriculum, the graph below indicates that, on average, schools have not reported positive changes in law clinic, field placement, or simulation data since the ABA’s adoption of the new experiential standard in 2014. The number of clinic seats available per J.D. student in 2014 was 0.27 and still only 0.28 in 2020; field placements decreased from 0.26 in 2014 to 0.24 in 2020; and seats available in simulations likewise decreased over the six-year period from 1.22 to 1.12 per student.


Source: ABA 509 Required Disclosures at http://www.abarequireddisclosures.org/Disclosure509.aspx

The New York Court of Appeals followed the ABA in 2015 with its own new skills competency standard for bar candidates, proclaiming that “the goal of ensuring effective, ethical and responsible legal services in New York requires more than what the new ABA Standards provide.”[3] Commentators on the proposed New York standard argued that it simply mirrored the ABA’s requirement, with some additional paperwork, and would not improve the skills training of students. The graph below shows that the New York competency standard, indeed, does not appear to have spurred New York’s law schools to noticeably enhance their professional skills training or to provide more training than schools in states following only the ABA requirement. Although students at New York schools were offered more opportunities to enroll in simulation courses, which lack the supervised experience of handling the complexities of real-life clients, opportunities to participate in a law clinic were unchanged and field placements decreased.


Source: ABA 509 Required Disclosures for 15 New York law schools

Data from the recent Center for the Study of Applied Legal Education (CSALE) 2019-20 Survey of Applied Legal Education, completed by 95% of law schools, also show little measurable effect from the new experiential training standard.[4] Schools reported no increase in the median number of law clinic courses offered to their students since the prior 2016-17 survey and no increase in the percentage of students who graduate with a law clinic experience. Similarly, there was no reported increase in the percentage of students who graduate with an externship experience; student demand for externship courses in 2019-20 was up slightly from the last survey yet significantly less than externship demand in 2014, when the new standard was adopted. And the percentage of schools requiring each J.D. student to participate in a law clinic or externship course as a condition of graduation increased only marginally, from 20% in 2016-17 to 23% in the 2019-20 survey.

Two-thirds of schools in the CSALE survey did report making some changes to their curriculum in response to the ABA’s new experiential requirement, with almost half reporting the addition of a new law clinic, field placement, or simulation course(s), and one quarter of schools reporting increased slots available in an existing experiential course(s). A 2018 survey by Allison Korn and Laila Hlass also found that about two-thirds of schools reported an expansion or enhancement of their course offerings in light of the ABA’s new experiential course requirement.[5]

In both surveys, however, significant numbers of schools simply restructured existing courses to meet the experiential training definition, including merely relabeling parts of the first-year required legal writing course as “experiential” or offering a one-credit simulation component to a doctrinal course. Because the survey questions did not ask separately about law clinic and externship courses but grouped them with non-clinical simulation courses, the data do not reveal whether legal education has increased live-client clinic or externship opportunities for students or simply adjusted to the new requirement in other ways. In the 2019-20 CSALE survey, there was a slight increase of approximately 5% in the reported percentage of students who participated in a law clinic or externship prior to graduation. But fewer than 20% of schools attributed any increase in clinic or externship demand to the new ABA requirement.

To the extent the ABA’s new six-credit experiential requirement was intended to provide law students with more meaningful hands-on training in important professional skills, its own data do not show that intended result. In addition, surveys of schools on their implementation of the new training requirement do not show significant gains in skills training as a result of the new accreditation standard.

It is time for the ABA to address these deficiencies by, at a minimum, requiring schools to report actual enrollments in law clinic and simulation courses, so that the ABA can truly judge the effect of its requirement and prospective applicants to law schools will not continue to be potentially deceived by reports of ethereal “available” law clinic opportunities.[6]

Yet students, and the clients they will soon represent in practice, deserve more than just enhanced reporting requirements. The ABA’s six-credit experiential requirement remains far below the skills training other professional schools require of their students.[7] Two recent studies on legal education have highlighted the need for greatly enhanced skills training, including mandatory clinical training prior to bar licensing.[8] The ABA should heed these calls for reform and revisit the proposals for fifteen credits of experiential coursework and a mandatory, live-client clinical experience for all J.D. students.


[1] An ABA memorandum explained that “substantial instruction” equaled only one credit of lawyering skills instruction, which could be in a simulation course. Peter A. Joy, The Uneasy History of Experiential Education in U.S. Law Schools, 122 Dick. L. Rev. 551, 574 (2018), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3129111.

[2] Prior to 2017, the ABA also required schools to report the actual number of students enrolled in law clinic and simulation courses, not just seats available. However, the ABA determined that asking schools to report actual enrollment, when the accreditation standard only requires “substantial opportunities,” was unnecessarily burdensome and now only requires schools to report the number of clinic and simulation opportunities that are potentially available to students.

[3] New York Court of Appeals, New Skills Competency Requirement for Admission to the Bar (Dec. 16, 2015), at http://www.courts.state.ny.us/ctapps/news/nottobar/nottobar121615.pdf; Task Force on Experiential Learning and Admission to the Bar: Report to Chief Judge Lippman and the New York Court of Appeals 3 (Nov. 2015), at http://ww2.nycourts.gov/sites/default/files/document/files/2018-05/Experiential-Learning-Admiss2Bar-Report122015.pdf.

[4] Center for the Study of Applied Legal Education, 2019-20 Survey of Applied Legal Education (2020), available at https://www.csale.org/#results.

[5] Allison Korn & Laila L. Hlass, Assessing the Experiential (R)Evolution, 65 Villanova L. Rev. 713, 731-33 (2020), available at https://digitalcommons.law.villanova.edu/vlr/vol65/iss4/1/.

[6] One school with enrollments of approximately 300 students per class claimed in its 2018 509 Required Disclosure to prospective applicants over 1,500 seats available to students in its law clinics. Another school with a class of 100 reported over 300 clinic positions available, yet only 50 students actually enrolled in those purported available positions.

[7] See Robert R. Kuehn, Pricing Clinical Legal Education, 92 Denv. U. L. Rev. 1, App. A (2014) (documenting one-quarter to one-third required credits in skills training for other professional schools), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2318042.

[8] Deborah Jones Merritt & Logan Cornett, Building a Better Bar 75-76 (2020), available at  https://iaals.du.edu/sites/default/files/documents/publications/building_a_better_bar.pdf; Joan W. Howarth & Judith Welch Wegner, Ringing Changes: Systems Thinking About Legal Licensing, 13 Fla. Int’l L. Rev. 383, 430-31 (2019), available at https://scholars.law.unlv.edu/cgi/viewcontent.cgi?article=2220&context=facpub.


The Imperative and Power of Empirical Research

By Anahid Gharakhanian, Southwestern Law School

Allison Korn and Laila L. Hlass’s Assessing the Experiential (R)evolution, recently published in the Villanova Law Review, should be celebrated as a much-needed example of empirical investigation and analysis in legal education, specifically experiential education.  As aptly noted in the Experiential Education section of Building on Best Practices, “[l]egal education urgently needs empirical research on what methods will best promote deep learning that transfers to practice.”

For many years, the experiential teaching community has had the benefit of the triennial CSALE study, which provides extensive data about the infrastructure of clinics and externships.  Now Korn & Hlass’s empirical work provides data about the proliferation of deans and directors of experiential education and the growth in experiential curricula.  These data set the stage for the important questions they raise about what law schools are doing: “working to uplift experiential programming as an essential part of the institution” and as “core to the law school curriculum”; “taking steps to identify, recruit, and support clinicians of color”; and ensuring security of position and voice in law school governance.  Korn & Hlass’s work, along with CSALE’s compilation of data about applied legal education since 2007, serves as an essential foundation for posing these important questions and joins the clarion call of others that rigorous empirical research is critical to every aspect of our assessment and advancement of experiential education – the students’ learning, the role of experiential curricula, and diversity of and equity for experiential faculty.

I think about the critical importance of empirical work from the vantage point of externships, or field placement courses, which provide a singular bridge to practice and in which so much of the student’s experience occurs outside the classroom and the externship professor’s direct observation.  Anecdotally, we know that these real-world experiences are very important to a student’s professional development and practice readiness as a new attorney.  At the same time, the ABA and some in legal education have worried about educational rigor outside the law school setting.  What’s needed is exploration of our impressions and perceptions through rigorous empirical work.  In the world of externships, this translated into research questions that Carolyn Young Larmore, of Chapman University Fowler School of Law, and I took up in a year-long, multi-school study assessing students’ success at externships and the factors contributing to it (involving three law schools in the same geographic area, with very different externship program designs and widely different incoming credentials – with 2019 median LSATs of 153, 158, and 168).  The study yielded helpful information about the importance of externships to practice readiness.  Also, a notable finding of our study – related to access – was that students from all three surveyed schools achieved very similar levels of externship success (measured in terms of first-year practice readiness), regardless of widely different entering credentials and the academic components of their externship programs.  Similarly, the study found that law school GPA plays a very limited role in predicting externship success.  You can see how this data could be a powerful tool in creating access for law students, from many diverse academic backgrounds and schools, to career-shaping professional experiences while in law school and beyond.

As we tackle empirical questions in experiential education, it’s helpful to think about backward design.  For the experiential programming we offer our students, two recent national studies are enormously helpful: IAALS’s Foundations for Practice, a national study of the foundations entry-level attorneys need to begin a successful legal career (the study Carolyn and I used to define externship success in our own study – i.e., how close externs are by the end of their externship to first-year practice readiness); and the very recent study by IAALS and Professor Deborah Jones Merritt, Building a Better Bar: Capturing Minimum Competence, one of whose two objectives is to “[d]evelop and promote understanding of the minimum competence needed to practice law” (the other being to “[a]lign the bar exam with research-based concepts of minimum competence”).

To borrow from IAALS and Professor Merritt, the key here is being guided by research-based concepts.  Whether assessing our students’ learning (as Carolyn and I tackled in our externship study), or raising questions about the role of experiential curricula, and diversity of and equity for experiential faculty – as Korn & Hlass have done – we need to engage in more empirical research and use this powerful tool to inform and advance the critical work of experiential education and educators.

Clinical Law Review seeks applications for five vacancies on Board of Editors

The Clinical Law Review seeks applications for five vacancies on the Board of Editors. The Board urges you to think about whether you would be interested, and to think about others whom you would encourage to apply. 

Members of the Board of Editors serve for a term of 6 years. The term of the new Board members will commence in January 2022. The primary role of the Board members is to edit articles for the Review. Because this is a peer-edited journal, the editing process is collaborative. Board members also serve as small group facilitators in the annual Clinical Law Review Workshop. There is at least one meeting per year of the Board, usually held at the annual Workshop. 

Applicants should submit (1) a C.V. and (2) a statement explaining their interest in the position and highlighting relevant aspects of their experience.  The Board seeks applications from people committed to the work of the Review and will prioritize applicants from underrepresented groups and applicants with diverse experiences in and approaches to clinical legal education. Applications must be received no later than January 31, 2021. Please e-mail them to CLRBoardApps2021@gmail.com.  

The committee to select new Board members is always co-chaired by two current Board members whose term is expiring. We (Jeff Selbin & Jennifer Koh) will be serving this year as the co-chairs of the Selection Committee. The other members of the committee will be designated by the three organizations that sponsor the Clinical Law Review — AALS, CLEA, and NYU — each of which will designate two committee members. 

We encourage you to contact us or other current or former Board members with any questions or for information about service on the Board. We and other Board members have found the experience to be very rewarding. 

The other current members of the Board are: Muneer Ahmad, Sameer Ashar, Susan Bennett, Warren Binford, Marty Guggenheim, Margaret Johnson, Jen Lee, and Alex Scherr. The current members whose terms are ending, along with ours, are: Muneer, Susan, and Warren. 

The current Editors-in-Chief are Phyllis Goldfarb, Randy Hertz, and Michael Pinard. 

Those who previously served on the Board are: Jane Aiken, Amna Akbar, Tony Alfieri, Wendy Bach, Bev Balos, Margaret Martin Barry, Ben Barton, Juliet Brodie, Angela Burton, Stacy Caplow, Bob Dinerstein, Jon Dubin, Cecelia Espenoza, Keith Findley, Gay Gellhorn, Michele Gilman, Carolyn Grose, Peter Toll Hoffman, Jonathan Hyman, Peter Joy, Minna Kotkin, Deborah Maranville, Bridget McCormack, Binny Miller, Kim O’Leary, Ascanio Piomelli, Mae Quinn, Paul Reingold, Brenda Smith, Jim Stark, Paul Tremblay, Nina Tarr, Kim Thomas, Rod Uphoff, and Leah Wortham. The Emeritus Editors-in-Chief are Richard Boswell, Isabelle Gunning, and Kate Kruse. The late Steve Ellmann was a founding Editor-in-Chief of the Review.

We look forward to hearing from you. — Jennifer Lee Koh & Jeff Selbin 
