Can we count on journal metrics?

How do you rank science, how do you rate scientists, and what metrics do you attach to their papers and to the impact of those papers? These questions are as old as the scientific literature itself, yet no one has resolved them. Independent organisations and publishers have tried, with the likes of the ISI Impact Factor. Academics weary of the prominent journals and the prominent researchers collecting all the “gold stars” have tried to overturn such metrics and devise their own, notably the h-index. But getting the measure of metrics is difficult, especially in today’s climate.
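For readers who have not met it, the h-index is easy to compute from a list of citation counts: it is the largest h for which a researcher has at least h papers each cited at least h times. Here is a minimal sketch in Python; the citation counts are invented purely for illustration.

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h papers
    have at least h citations each."""
    # Rank the papers from most cited to least cited.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Example: five papers with these citation counts give an h-index of 3.
print(h_index([10, 8, 5, 3, 1]))  # -> 3
```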

In the current journal market, and particularly given the economic climate, institutional purchasing is severely constrained. For publishers outside the coterie of the three or four best-known houses, establishing prestige and validating the research articles within their pages is critical but difficult. In survival-of-the-fittest terms, this applies equally to journals paid for under traditional, open access and other publishing models.

From the researchers’ perspective, they want to publish in journals that give their science and their team the most prominence, and so more pulling power when it comes to the next grant application or research assessment. Librarians and researchers, including those in specialist niches, have tried to apply pressure on the way funding bodies, governments and companies rely on the standard metric.

Impact Factors are a double-edged sword, of course. If yours is high, you will be happy, whether you are an author or a publisher. If it is low, the situation is difficult to remedy: without gaming the system there are few ways for important work from less prominent researchers in niche areas to make a big impact.

Institutions have recognised the problems and the biases to some extent and have begun to evaluate a title’s significance beyond the conventional approach. Perhaps a system like PeerIndex might be extended to researchers, their papers and their journals in some way. Indeed, some publishers have devised their own systems, Elsevier’s Scopus for example, and online scientific communities are beginning to find ways to rank research papers much as social bookmarking sites like reddit.com and digg.com rank links.
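To give a flavour of what ranking papers in the manner of a social bookmarking site might mean, here is a rough sketch of a reddit-style “hot” score: log-scaled net votes plus a recency bonus, so new, well-received items float to the top. The function, weights and figures are illustrative assumptions, not any community’s actual algorithm.

```python
import math
from datetime import datetime, timezone

def hot_score(upvotes, downvotes, posted_at):
    """Illustrative 'hot' ranking: log-scaled net votes plus a recency
    term, so each extra order of magnitude of votes matters less and
    newer items are favoured."""
    net = upvotes - downvotes
    order = math.log10(max(abs(net), 1))          # diminishing returns on votes
    sign = 1 if net > 0 else -1 if net < 0 else 0
    recency = posted_at.timestamp() / 45000        # newer posts score higher
    return sign * order + recency

# A recent paper bookmarked 50 times outranks a two-year-old one with 500.
recent = hot_score(50, 0, datetime(2011, 3, 1, tzinfo=timezone.utc))
older = hot_score(500, 0, datetime(2009, 3, 1, tzinfo=timezone.utc))
print(recent > older)  # -> True
```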

For some institutions and countries, following the Impact Factor is nevertheless obligatory. A recent paper by Larsen and von Ins examines the growth in scientific publication and the declining coverage of the Science Citation Index, while others have looked at how publishing the right thing in the right place can affect careers (Segalla). Assessing the Impact Factor and other metrics has become a growth area of the research literature in its own right. Cases where the Impact Factor works particularly badly have also been reported recently: The Scientist explains how a single paper in a relatively small journal boosted that journal’s position in the league tables so that it overtook one of the most prominent and well-known journals, but only for as long as that paper was topical and widely cited.
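The arithmetic behind that kind of distortion is simple: the two-year Impact Factor is just the citations received this year by a journal’s previous two years of papers, divided by the number of citable items it published in those two years. The figures below are invented to show how one heavily cited paper can swing a small journal’s number.

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year impact factor: citations this year to items published in
    the previous two years, divided by the citable items in those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# A small journal with 100 citable items over two years and 200 citations:
print(impact_factor(200, 100))        # -> 2.0

# One topical paper picking up an extra 800 citations lifts the same
# journal to 10.0, with no change to the other 99 items.
print(impact_factor(200 + 800, 100))  # -> 10.0
```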

Many academics are arguing for a change in the way papers and journals are assessed, among them Cameron Neylon, who hopes the scientific community can build more diverse alternative measures of research.

Unfortunately, there seems to be no simple answer to the problem of assessing research impact. Indeed, what is needed is some kind of ranking algorithm that can determine which of the various alternative impact-factor systems would have the greatest…well…impact…

Larsen, P., & von Ins, M. (2010). The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics, 84(3), 575-603. DOI: 10.1007/s11192-010-0202-z


Segalla, M. (2008). Publishing in the right place or publishing the right thing: journal targeting and citations’ strategies for promotion and tenure committees. European Journal of International Management, 2(2). DOI: 10.1504/EJIM.2008.017765