Data not shown: time to distribute some common sense about impact factors | Reciprocal Space 2015-06-24


" ... It’s that time of year when all clear-thinking people die a little inside: the latest set of journal impact factors has just been released. Although there was an initial flurry of activity on Twitter last week when the 2015 Journal Citation Reports* were published by Thomson Reuters, it had died down by the weekend. You might be forgiven for thinking that the short-lived burst of interest means that the obsession with this damaging metric is on the wane. But this is just the calm before the storm. Soon enough there will be wave upon wave of adverts and emails from journals trumpeting their brand new impact factors all the way to the ridiculous third decimal place. So now is the time to act – and there is something very simple that we can all can do. For journals, promotion of the impact factor makes a kind of sense since the number – a statistically dubious calculation of the mean number of citations that their papers have accumulated in the previous two years – provides an indicator of the average performance of the journal. It’s just good business: higher impact factors attract authors and readers. But the invidious effects of the impact factor on the business of science are well-known and widely acknowledged. Its problems have been recounted in detail on this blog and elsewhere. I can particularly recommend Steve Royle’s recent dissection of the statistical deficiencies of this mis-measure of research. There is no shortage of critiques but the impact factor has burrowed deep into the soul of science and is proving hard to shift. That was a recurrent theme of the recent Royal Society meeting on the Future of Scholarly Scientific Communication which, over four days, repeatedly circled back to the mis-application of impact factors as the perverse incentive that is at the root of problems with the evaluation of science and scientists, with reproducibility, with scientific fraud, and with the speed and cost of publishing research results. 
I touched on some of these issues in a recent blog post about the meeting (you can listen to recordings of the sessions or read a summary). The Royal Society meeting might have considered the impact factor problem from all angles but discovered once again – unfortunately – that there are no revolutionary solutions to be had. The San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto are commendable steps in the right direction. Both are critical of the mis-use of impact factors and foster the adoption of alternative processes for assessment. But they are just steps ..."


From feeds:

Open Access Tracking Project (OATP)

Tags: oa.comment oa.jif oa.impact oa.publishers oa.business_models oa.thomson_reuters oa.dora oa.leiden_manifesto oa.advocacy oa.metrics

Date tagged:

06/24/2015, 08:11

Date published:

06/24/2015, 04:11