Thursday, March 27, 2008

Putting my money where my mouth is....

In keeping with my comments in the recent ABA Journal article on the rankings (here), it would be helpful to talk about how to improve USNWR's rankings to make them, well, useful. Over at Concurring Opinions, Daniel Solove has started such a project (here). Here are his suggestions:
1. The reputation surveys are not sent out broadly enough. They go out only to deans and to newly-tenured professors. A broader cross-section of law school faculties should be used in the poll.

2. A more granular reputation scoring system should be used. The current 1-5 score isn't granular enough. For starters, how do top schools like Yale and Harvard have average scores less than 5? Who's giving them a score of 4? Seems fishy to me. Suppose a dean thinks Yale is the best and that Chicago is excellent -- not quite as good as Yale, but very close. Yale therefore gets a 5. Does that mean Chicago gets a 4? That's a big drop. Giving Chicago a 5 says it is equal, which may not be the dean's view. There's a problem here -- the scale isn't granular enough. [A quick numerical sketch of this distortion follows the list.]

3. The number of library volumes shouldn't be a part of the scoring system. This strikes me as a silly factor in ranking law schools.
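Solove's second point is easy to see with numbers. Here's a minimal sketch, using entirely hypothetical scores and a made-up third school, of how compressing a dean's finer-grained judgments onto a 1-5 integer scale either erases a real (if small) gap or exaggerates it:

```python
# A minimal sketch (hypothetical numbers throughout) of how a coarse
# 1-5 integer survey scale distorts close judgments (Solove's point 2).
# Suppose a dean's "true" assessments live on a finer 0-100 scale:
true_scores = {"Yale": 98, "Chicago": 95, "Solid Regional": 70}

def to_five_point(score):
    """Compress a 0-100 assessment onto the 1-5 integer survey scale."""
    return max(1, min(5, round(score / 20)))

for school, score in true_scores.items():
    print(f"{school}: true {score}/100 -> survey {to_five_point(score)}/5")

# Yale and Chicago both come out as 5, so the dean's real (small) gap
# vanishes; rounding Chicago down to 4 instead would report a drop of one
# point on a four-point-wide scale -- a 25% gap that was never there.
```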

Here's mine:

1. (Fantasyland) Get rid of ordinal rankings, and group schools into meaningful clusters (see the tiering sketch after this list). There's no way that USNWR will make this change. Ordinal rankings sell, even though the distinctions between 1 and 5 (or, for that matter, between 1 and 15, or 15 and 30, etc.) are negligible.

2. Make the reputational scores a much smaller part of the rankings. These scores are based on surveys, folks, and surveys sent to a very small number of people (see Solove's point, above). There are all sorts of problems with halo effects lingering around schools and with people rating schools without knowing much--or anything--about them.

3. If USNWR's rankings are supposed to be a consumer guide, focus on things important to the consumers (potential law students): for one thing, parse out the job placement stats, and force law schools to report how many graduates are working as lawyers, in what types of practices, in which markets, etc., so that the old trick of hiring grads for a few months is no longer advantageous. Toss out the LSAT and UGPA stuff altogether, leaving it for the ABA-LSAC Official Guide to ABA-Approved Law Schools. Who applies to a particular law school doesn't tell the consumer much about what the law school itself is like--it just perpetuates halo effects and distorts the admissions process. Keep in bar passage rates. Try to figure out if there's a way to track the learning experience for the students. (I doubt it. How do you track how good the professors are at teaching or whether they're available for office hours?) Add in measurable facilities or curricular issues, perhaps, if there's a way not to game them.
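To make the clustering idea in my first point concrete, here's a minimal sketch, using made-up composite scores and an assumed margin of error, that breaks an ordered list into tiers wherever the gap between adjacent schools is bigger than the survey noise:

```python
# A minimal sketch of clustering schools into tiers. Scores and the
# NOISE threshold are hypothetical, not USNWR's actual data or method.
scores = {  # made-up composite scores; higher is better
    "School A": 94.0, "School B": 93.1, "School C": 92.8,
    "School D": 84.5, "School E": 83.9, "School F": 71.0,
}
NOISE = 3.0  # assumed margin of error; smaller gaps are meaningless

ranked = sorted(scores.items(), key=lambda kv: -kv[1])
tiers, current = [], [ranked[0]]
for prev, cur in zip(ranked, ranked[1:]):
    if prev[1] - cur[1] > NOISE:  # gap big enough to mean something
        tiers.append(current)
        current = []
    current.append(cur)
tiers.append(current)

for i, tier in enumerate(tiers, 1):
    print(f"Tier {i}:", [name for name, _ in tier])
# Tier 1: A, B, C | Tier 2: D, E | Tier 3: F
```

The design point: within a tier, order carries no real information; only the tier boundaries do. That's exactly the information an ordinal list hides.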



The whole problem with the rankings is that they're being gamed by some schools, which further distorts an already distorted process. Figure out ways to measure what matters to law students, and (just as important) come up with measures that are hard to game. Ultimately, it's going to be hard to justify a purely quantitative model that both maintains honesty and is useful to the consumer.

But you weigh in: what should USNWR do differently in future rankings?

2 comments:

Anonymous said...

I teach pre-law courses at the undergraduate level. Given that law school applicants, particularly my young undergrads, view the USNWR rankings as a sort of consumer report, it seems to me that an important piece of information would be satisfaction levels of recent graduates with specific aspects of their legal education. (Assuming you could get an honest measurement of something like that.)

For example, my law school education, while it provided me with ample substantive knowledge, prepared me very little for the realities of practicing law.

Admittedly, part of this is my fault for not taking more procedural courses, not taking advocacy classes, and not participating in one of the many legal clinics the school offered.

And in my opinion, therein lies the flaw of American legal education. While I was in law school, I had no clue that I would eventually become a litigator. I had no desire to practice litigation. So I didn't take those courses. The truth is, the skills those courses offer are highly useful regardless of whether you become a litigator or a transactional lawyer.

My reality, post-graduation, was that I became a litigator, with no clue how to litigate. (The first time someone said "objection, form" in a deposition, I looked at the senior lawyer from my firm whom I was observing--who, thank God, was conducting the deposition--like the objecting lawyer was from Mars.)

If you go to work for a large firm, you have the possible advantage of mentoring. If not, you learn a lot through trial, error, and embarrassment your first two or so years.

Looking back on my law school education, I now wish that I had either elected, or better yet, been compelled, to take advocacy courses and maybe to participate in a clinic or two. (Maybe there is some value to aspects of the English Inns of Court model?)

If I were a law student or applicant today (which, fortunately, is not reality, but only a recurring nightmare I have from time to time), knowing what I know now, I would want to assemble the best practical education possible.

Anonymous said...

I'd like to see the rankings take into account how many people of color and women are on law faculties and within student bodies.