Citations and Critical Commentary
10 May 2006 at 6:08 pm Peter G. Klein 3 comments
| Peter Klein |
Yet more on citations. The May 2006 issue of the always-interesting EconJournalWatch — the economics profession’s own watchdog organization — considers the decline of critical commentary in economics journals. Beginning in the 1970s, most top-tier economics journals stopped publishing comments and replies. Robert Whaples blames the Social Sciences Citation Index: regular articles tend to get more citations per page than comments and replies, so journal editors, wishing to maximize their citation counts and impact factors, prefer regular articles to critical commentary and discussion. (I think the same would apply, a fortiori, to book reviews, though they aren’t mentioned here.) Philip Coelho and James McClure add that journal editors care about not only citations to the journal, but also citations to the journal’s own “insiders” (e.g., editors and editorial board members). They provide empirical evidence that comments and replies rarely cite insiders.
Bottom line: academia’s increasing reliance on bibliometric measures of quality and performance has real effects on the kind of research we do, how we package and promote our research, the demographic characteristics of the research community, and so on. Good or bad? Comments are open.
Entry filed under: - Klein -, Methods/Methodology/Theory of Science.
3 Comments
1. Tim Swanson | 10 May 2006 at 9:03 pm
I think one of the main problems with this phenomenon is that it is really no different from numerology. Despite good intentions, article and journal quality is by and large a subjective measurement, and the Austrian critiques of positivism can be applied here as well. Just as a batting average and GDP are not holistic measurements of all aspects of a batter or an economy, so too are many of the measurements used in bibliometrics.
2. Bo Nielsen | 11 May 2006 at 2:18 pm
Just a quick remark about batting averages: those of us who follow baseball here in the US will know about Barry Bonds and the controversy surrounding his power and home run record.
I wonder when we will see the first steroid scandal in academia? (As long as coffee is not on the black list, I should be safe…)
3. Thomas Mayer | 11 May 2006 at 2:20 pm
One problem with reliance on the Citation Index is that it fosters conformity. A good way to increase your citation count is to write on a topic that is currently much discussed. For the next few years just about everyone writing on it will have to mention your paper.
A related problem is that many (I suspect most) citations are just hat-tippers, e.g., “this problem was also discussed by Jones, Smith, and ….” Serious users of the Citation Index should look at the citing papers and separate such hat-tippers from citations indicating that the cited paper was of some use to the author of the citing paper. Also, a paper can be influential without being cited. It may point out a flaw in a set of widely used models. As a result, these models are no longer cited, and no more models of this type are published. The likely result is no citations to the paper that pointed out the flaw. Finally, the Citation Index is sometimes inaccurate.
Thomas Mayer