Anthropological Notes 5: Michigan Law Review Cite Count Project (Is this serious?)



As some readers know, the Michigan Law Review has published an article on the most cited law review articles. Before I go on to the frightful possibility that anyone takes it seriously, I need to make a few points. First, I'd be happy to be on the list. Second, the authors on the list are certainly not responsible for creating it. Finally, the assessment of SSRN is the best I have seen, and the Michigan authors' discussion of why their numbers could be off is first rate.

But ultimately there are many anthropological observations to make based on the list and its existence. Let's start with one of the last and perhaps most nonsensical passages in the work: "In the end, regardless of the publication venue, all involved in publishing legal scholarship should be striving for an environment in which authorship, affiliation, and editorial responsibility are clearly marked so that readers can fully evaluate the credibility of what they are reading." So in the end, the authors suggest, the credibility of what is written can be determined by who authored the work, their affiliation, and where it was published. To quote a famous former tennis player: "YOU CANNOT BE SERIOUS." My goodness, not only do we hire on the basis of institutional authority, but now we assess the credibility of what is written by the same standards. Here are a few points.

1. Authority

This type of thinking leads to something I found when recently reviewing a piece for a colleague. He wrote something like this: "It is well known that poor people have less access to dental care." I do not doubt that this is true. But his citation was to a Supreme Court Justice who had said just that, without any analysis whatsoever. When did a Supreme Court Justice become an authority on income and dental care? I suppose when you join the institution of the Supreme Court you are deemed to be an authority on everything. Of course that is hogwash, just as is the suggestion that credibility should be assessed by who said it (not their support), where it was published (as determined by a third-year law student), and where the author teaches (as determined by a system so rigged it would make professional wrestling look legit).

2. Why are Some Cite Counts High

A friend once told me that he had found a really good article in the Buffalo Law Review but was looking for the same general statement somewhere in the Harvard Law Review. I asked why, and he said, "So the editors will be impressed." The Michigan article notes that articles published in elite journals and written by people at elite schools are cited far more often. Thankfully, except for the quoted passage, they do not otherwise claim those are the best or most influential articles with respect to anything that matters. That would be like saying, "We have found the 100 best articles and, oh, what a coincidence, they just happen to be in elite journals and written by people at elite schools." It would overlook the reason they are cited: they are in the top one hundred in large part because of where they were published and who wrote them. Their inclusion in another article is a bit of advertising. The implicit message the citing author sends is, "What I am writing must be good, because look where what I cited was published and by whom." In short, another word for "cited" is "used," and it means used as a means to an end, the end being publication, regardless of quality.

3. Guess what.

The authors do think that, since we are talking about law review articles, it might be a good idea to see how they have affected actual law. This, however, is way too hard, especially when we can so easily count articles citing articles. The authors evidently make a stab at it, with the results appearing in Table XI. I am sure this is my mistake, but the pdf available from the law review web site does not have a Table XI that I can find. Nevertheless, I agree that assessing impact by looking at case citations is difficult. (Coase, by the way, the leading article, has 47 in 52 years according to ALLCASES in WESTLAW. That's less than one a year, and if you toss out the Seventh Circuit opinions the total shrinks.) What they conclude is that the level of legal citation to the most cited law review articles is "respectable." "Respectable" is not defined, but I suppose that includes the whopping 4 citations in 49 years for number 15 on the list. And the 10 for number 33. And 10 more for number 19. There are 18 for number 8 on the list. I did see some that were relatively high, but I am no more sure what "relatively high" means here than the authors are of what "respectable" means. Perhaps on Table XI the numbers are higher, but that does not tell us much without knowing how often other articles -- including those not in the top 100 -- were cited. Hey, I am beginning to wonder if someone will cite the Michigan article for the proposition that the courts cite the top 100 at "respectable" levels. After all, it did appear in Michigan, so it must be true.

4. Impact

The authors tell us they are measuring impact. Let's think about that. There is "impact" and there is "impact." If you write something for a fancy law review and someone cites it because of who you are and where it is published, in hopes that their own article will be cited, I suppose that is impact. But it is impact within a group whose members largely write for each other and have no impact outside their society. Think of 20 film directors. They make movies that they exchange among themselves. Very few others see them. Periodically they rank their movies based on how often each is paid homage in the others' movies. Is that impact? And is counting the number of times this happens scholarship?