Open Access and the Impact of Impact

abernard102@gmail.com 2012-08-20

Summary:

“Many scientists including myself have long been convinced that opening access to research and data is the way forward for science: it facilitates the important reproduction of results, speeds up dissemination of results, allows a wider debate, and importantly it places research outputs directly in the hands of those who paid for it, and for whose benefit it was ultimately carried out. We often point the finger at publishing companies for standing in the way of this lofty ideal. They have long been able to make huge amounts of profit by receiving content for free from scientists, publishing it, and then charging libraries and the interested lay person lots of money to access it. The debate has recently hit the mainstream, following a fed-up blog post by mathematician Tim Gowers, a large petition signed by thousands of scientists, and statements in support of open access by the Wellcome Trust and the UK Government. I previously wrote about the Dutch research council NWO making funds available to its grantees for open access publication charges. My current employer, the Max Planck Society, is launching a new top-tier open access journal called eLife with the Wellcome Trust and the Howard Hughes Medical Institute. It seems like some powerful forces are at last aligning behind a more open way of doing science. The Guardian, a big proponent of publicly available data, has been running a series of articles and blog posts on the issue. On Friday fellow astronomer-blogger Peter Coles of the University of Cardiff took his turn to make the case for open access. I was particularly happy to see Peter tackle two particular angles in his article. The first is the need for access not just to publications but also to data. He’s right that astronomy does a pretty good job of that, but this aspect of access often gets overlooked in the broader science community.
The experience with public data archives in astronomy is that they have massively increased the scientific output from our observatories. The second interesting angle is that of the UK’s Research Excellence Framework, which plays into the hands of the publishing companies. In the REF, UK universities are judged by the government on their research output. It’s a pretty complex bureaucratic procedure (if you can’t sleep tonight, you can read all about it here) but essentially it comes down to this: the more papers a university’s researchers have published, the more citations those papers have gathered, and the higher the impact factors of the journals they appear in, the higher the university will score. The higher they score, the more funding they receive from the Government. This system props up the prestige of the high-profile journals, which are almost always behind expensive paywalls. Incidentally, the REF webpages actually contain some interesting publications beyond the actual guidelines. The Centre for Science and Technology Studies at the University of Leiden carried out a study for HEFCE in 2007 entitled “Scoping study on the use of bibliometric analysis to measure the quality of research in UK higher education institutions” – and yes, it is publicly available. Essentially it looks at how well we can assess the quality of an institute’s research by studying its bibliographic output, i.e. its journal papers and citation counts. If you’re interested in such matters, it’s a pretty good read. Contrary to what I expected, it gives a balanced description of the pros and cons of using bibliometrics to assess scientific output and what it calls ‘intellectual influence’, including how using such methods affects the publishing behaviour of scientists. This is a very important point to consider. We will only become more open as a community if we are systematically rewarded for it; until then, we remain slaves to the impact factor and to our h-index.
I’ve been thinking about this stuff a lot recently. As I’m approaching the six-to-seven-year post-PhD sweet spot for securing a permanent position, I’m frustrated by the narrowly defined measures of success I’m judged on, and by how these are sometimes incompatible with being open. But I also know that it’s probably better to put up, shut up, and play the game to the best of my ability, so that one day I might be a curmudgeonly professor like Peter, instead of someone who was once an astronomer.”

Link:

http://sarahaskew.net/2012/04/22/open-access-and-the-impact-of-impact/

Updated:

08/16/2012, 06:08

From feeds:

Open Access Tracking Project (OATP) » abernard102@gmail.com

Tags:

oa.new oa.data oa.gold oa.business_models oa.publishers oa.policies oa.comment oa.government oa.advocacy oa.signatures oa.petitions oa.boycotts oa.open_science oa.netherlands oa.uk oa.impact oa.prestige oa.astronomy oa.funders oa.wellcome oa.jif oa.citations oa.studies oa.nwo oa.elife oa.h-index oa.ref oa.journals oa.metrics

Authors:

abernard

Date tagged:

08/20/2012, 18:09

Date published:

04/23/2012, 12:55