By: Robert R. Kuehn, Associate Dean for Clinical Education, Professor of Law
In addition to its annual ranking of law schools, U.S. News ranks nine law school specialty programs. Although the school rankings rely on a number of factors to compute a final score (peer assessment, lawyers/judges assessment, LSAT, undergraduate GPA, etc.), the specialty rankings are based solely on reputation. A faculty member teaching in the specialty area at each school is asked to assess the specialty programs at the almost 200 other ABA-accredited schools. As this column explains, there are serious problems with this type of ranking.
In previous years, the person rating other schools was directed to choose the top 15 programs in that specialty area. U.S. News would then list the top 12 to 20 programs (and even more behind a paywall) based on how often the school was listed by raters. This year, U.S. News asked survey participants to rate specialty programs on a 5-point scale of Outstanding (5) to Marginal (1), mirroring the much-criticized method used to assess the academic reputation of schools among peers and lawyers/judges.
The response rates among the faculty who received this year’s specialty program ballot varied from 44% (International Law) to 65% (Clinical Training). With the new scale, U.S. News chose to rank in order every program that received at least 10 ratings. This resulted in publicized rankings from 1st to 186th, with individual school scores ranging from 4.9 to 1.0 and average program scores from 2.1 (Intellectual Property and International Law) to 2.9 (Dispute Resolution).
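For readers who want to see the mechanics, here is a minimal Python sketch (not U.S. News' actual code) of the method as described: average each program's 1-to-5 ratings, drop programs with fewer than 10 ratings, and rank by mean score. The ratings data below are hypothetical.

```python
# A minimal sketch of the method described above: average each program's
# 1-5 ratings, keep only programs with at least 10 ratings, and rank by
# mean score. The ratings below are hypothetical.
from statistics import mean

ratings = {  # school -> list of 1-5 ratings from survey respondents
    "School A": [5, 5, 4, 4, 5, 4, 5, 4, 4, 5, 4],
    "School B": [3, 2, 3, 3, 2, 3, 3, 2, 3, 3],
    "School C": [4, 3],  # fewer than 10 ratings: excluded from the ranking
}

# Scores are rounded to one decimal, matching the published 1.0-4.9 range.
scores = {s: round(mean(r), 1) for s, r in ratings.items() if len(r) >= 10}

for rank, (school, score) in enumerate(
        sorted(scores.items(), key=lambda kv: kv[1], reverse=True), start=1):
    print(rank, school, score)
```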
A glance at the new rankings shows noticeable bunching: the schools U.S. News ranks highest overall also sit at the top of most specialty rankings, while the lowest-ranked schools are overrepresented at the bottom. Is this because the highest-ranked schools also happen to have the highest quality specialty programs, and vice-versa for the lowest-ranked schools? Or might the over-representation at the two ends be due in part to a halo effect, where the academic reputation of a school influences the rating of that school's specialty program? This effect could be especially pronounced this year, when raters were asked not just to opine on the quality of 15 specialty programs but to express an informed opinion on almost 200.
To examine the strength of the relationship between a school's peer assessment and specialty program reputation, scores for each school in every specialty were collected and analyzed. The Pearson correlation coefficients between the two scores are set out below:
Where a coefficient of 1.0 indicates a perfect relationship between two variables, coefficients of .7 or greater are often characterized as "very strong." The ratings of some specialty programs (e.g., International and Tax Law) stand out as very strongly associated with the school's academic reputation, while the correlations for two programs (Dispute Resolution and Legal Writing) were negligible and not statistically significant. Law school reputation scores explain 80% of the variability in International and Tax Law specialty rankings and over 50% of the variance in Environmental and Intellectual Property Law scores. The large correlations suggest that some raters may be cueing off a school's academic reputation rather than independently judging the quality of its specialty program. But because correlation does not prove causation, it could also be that schools with better academic reputations largely have better specialty programs.
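As an illustration of the analysis, the sketch below computes a Pearson coefficient and its square (the share of variance "explained") using Python's scipy library; the two score lists are hypothetical stand-ins for a school's peer assessment and specialty scores.

```python
# Hypothetical illustration: pearsonr returns the Pearson coefficient r and
# its p-value; r squared is the share of variance in the specialty scores
# "explained" by peer assessment (e.g., r = .89 corresponds to about 80%).
from scipy.stats import pearsonr

peer_assessment = [4.8, 4.1, 3.5, 2.9, 2.2, 1.6, 1.2]  # hypothetical
specialty_score = [4.7, 4.3, 3.1, 3.0, 2.0, 1.8, 1.4]  # hypothetical

r, p_value = pearsonr(peer_assessment, specialty_score)
print(f"r = {r:.2f}, r^2 = {r**2:.2f}, p = {p_value:.4f}")
```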
A closer look at the distribution of Clinical Training scores suggests that some haloing is going on, at least when raters are asked about schools with the highest and lowest academic reputations. Clinical scores were strongly correlated with a school’s academic reputation (r=.66), with 44% of the variation in scores explained by school reputations. The graph below plots a school’s peer assessment score against its Clinical Training score with a regression line added (β=.51, p < .01) to illustrate the positive relationship.
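A simple way to reproduce that kind of fit is scipy's linregress, which reports the slope, the correlation, and the slope's p-value; the data below are again hypothetical.

```python
# Hypothetical data: linregress fits clinical score as a linear function of
# peer assessment and reports the slope (the beta in the text), the
# correlation r, and the p-value for the slope.
from scipy.stats import linregress

peer     = [4.8, 4.1, 3.5, 2.9, 2.2, 1.6, 1.2]  # hypothetical
clinical = [4.2, 3.9, 3.4, 2.6, 2.4, 1.9, 1.7]  # hypothetical

fit = linregress(peer, clinical)
print(f"slope = {fit.slope:.2f}, r = {fit.rvalue:.2f}, "
      f"r^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.4f}")
```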
Notice that every school with a peer assessment score of at least 4.0 also had a Clinical Training score above 3.0. Conversely, only one out of thirty-two schools with a peer assessment score of 1.5 or less garnered a clinical rating above 3.0. Clinical scores for schools in the peer assessment mid-range appear to be more dispersed. Correlation analysis supports these impressions — correlation coefficients for the 50 top-rated and bottom-rated schools were .69 and .65 respectively, while the coefficient for middle-ranked schools was only .37.
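The subgroup comparison can be sketched as below, splitting schools (here, simulated data with a built-in positive relationship) into top-50, middle, and bottom-50 bands by peer assessment and computing r within each band.

```python
# Simulated data, sorted so the bands correspond to the top, middle, and
# bottom of the peer assessment range.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
peer = np.sort(rng.uniform(1.0, 5.0, 186))[::-1]                 # descending
clinical = np.clip(1.0 + 0.5 * peer + rng.normal(0, 0.5, 186), 1.0, 5.0)

def band_r(idx):
    """Pearson r within one band of schools."""
    return pearsonr(peer[idx], clinical[idx])[0]

print("top 50:   ", round(band_r(slice(0, 50)), 2))
print("middle:   ", round(band_r(slice(50, -50)), 2))
print("bottom 50:", round(band_r(slice(-50, None)), 2))
```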
Data from the ABA and surveys of clinical programs by the Center for the Study of Applied Legal Education (CSALE) provide additional variables potentially associated with clinical program rankings. There is a moderately strong association between the size of a school's J.D. enrollment and its Clinical Training score, with a statistically significant correlation of .49 (explaining 24% of the variance in clinical ranking). There is a similar association between the number of "slots available" in law clinics and program scores (r=.48), though schools' reporting of available slots is largely unregulated and varies widely. The number of full-time law clinic and externship faculty reported in CSALE surveys is also related to program ranking, although less strongly (r=.31). Do these correlations indicate that larger schools tend to have higher quality clinical programs, or are their programs and clinical faculties just larger and more noticeable?
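The same check extends naturally to several candidate predictors at once; in this hypothetical sketch, each variable is correlated against the clinical score in turn.

```python
# Hypothetical sketch: correlate each candidate predictor with the clinical
# score and report r, r^2 (variance explained), and the p-value.
from scipy.stats import pearsonr

clinical_score = [4.2, 3.1, 3.8, 2.2, 2.9, 1.8, 3.4]  # hypothetical
predictors = {                                          # hypothetical
    "jd_enrollment":  [900, 400, 750, 250, 500, 200, 650],
    "clinic_slots":   [300, 120, 260, 80, 150, 60, 210],
    "clinic_faculty": [14, 6, 11, 4, 8, 3, 10],
}

for name, values in predictors.items():
    r, p = pearsonr(values, clinical_score)
    print(f"{name}: r = {r:.2f}, r^2 = {r**2:.2f}, p = {p:.3f}")
```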
An increasing number of schools guarantee their J.D. students a law clinic or externship experience before graduation, or require one. This potential signal of the availability or quality of clinical experiences, now present at 75 schools, is only weakly associated with program ranking scores (r=.17).
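Because a guarantee is a yes/no attribute, one suitable measure is a point-biserial correlation (the Pearson correlation specialized to one binary variable), sketched below with hypothetical values.

```python
# Hypothetical values: pointbiserialr correlates a 0/1 attribute (does the
# school guarantee a clinic or externship?) with a continuous program score.
from scipy.stats import pointbiserialr

guarantees = [1, 0, 1, 0, 0, 1, 0, 1, 0, 0]                       # hypothetical
scores     = [3.1, 2.4, 2.9, 2.6, 2.2, 3.3, 2.1, 2.7, 2.5, 2.3]   # hypothetical

r, p = pointbiserialr(guarantees, scores)
print(f"r = {r:.2f}, p = {p:.3f}")
```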
U.S. News explained that its goal in moving to the new scale was to rank more schools (five to six times more) and to provide a better understanding of the scope of the differences between schools. Yet when there is strong evidence that many raters may cue off a school's overall reputation rather than independently assess the quality of its specialty program, U.S. News' rank ordering of school specialty programs down to 192 is of dubious validity, and it should also call into question the use of this method in its annual law school ranking.