Research impact: Altmetrics make their mark (Naturejobs)

abernard102@gmail.com 2013-08-23

Summary:

"Steve Pettifer and his colleagues did not heavily promote their 2008 paper on digital library tools. So it came as a surprise when, in August 2012, Pettifer got an e-mail from the Public Library of Science (PLOS), based in San Francisco, California. A PLOS representative told him that people had viewed or downloaded the article (D. Hull et al. PLoS Comput. Biol. 4, e1000204; 2008) more than 53,000 times. It was the most-accessed review ever to be published in any of the seven PLOS journals. The paper had come out just as biologists' interest in digital publishing was building and the number of tools was exploding, says Pettifer, a computer scientist at the University of Manchester, UK. 'It hit the right note at the right time,' he says. At one time Pettifer would have listed the paper on his CV accompanied by the journal's impact factor and the article's number of citations — in this case, about 80. But when he came up for promotion this year, he realized that tracking citations was not going to tell the whole story about the paper's influence. Impact factor is a crude measure that applies only to the journal, not to specific articles, he says; citations take a long time to accumulate, and people may not cite a paper even if it influences their thinking. So he added the number of views to the CV entry. And he did not stop there. Next to many of the papers listed, Pettifer added labels indicating scholarly and public engagement. The labels were generated by ImpactStory in Carrboro, North Carolina, one of several services that gauges research impact using a combination of metrics — in this case, a wide range of data sources, including the number of times a paper has been shared on social-media websites or saved using online research tools. When Pettifer submitted his annotated CV for the first round of promotion review, his mentor expressed confusion. He took a look and said, 'What the hell are these badges doing in your CV?' recalls Pettifer. 'But once I explained them, he said, 'Well, give it a go.'' Pettifer submitted his CV for the second round — and got his promotion. He does not know for sure whether the metrics helped, but he plans to use them on future grant applications. 'I'm definitely a convert,' he says.    
'Altmetrics', a term coined in 2010 by ImpactStory co-founder Jason Priem, refers to a range of measures of research impact that go beyond citations. Several altmetrics services have emerged in the past few years (see 'Four ways to score'). They produce reports that gauge impact by taking into account not just academic citations, but also digital use and sharing of data — which can include the number of times a paper has been tweeted, 'liked' on Facebook, covered by the media or blogs, downloaded, cited on Wikipedia or bookmarked online. Some services also evaluate research products such as software, data sets and slideshows by tracking the number of people who have used or viewed the product online (see Nature 500, 243–245; 2013).  
Altmetrics offer researchers a way to showcase the impact of papers that have not yet gathered many citations, and to demonstrate engagement with the public. They can be accessed through journals or independent websites, and can track the impact of particular data sets or papers, or evaluate the combined influence of publications and products from multiple researchers in a department.
But these services must be used wisely. They are not meant for strict quantitative comparisons; nor do they always distinguish between positive and negative attention. And although scientists can include altmetrics in job and grant applications and annual reports, they must select relevant data and clearly explain the context to avoid provoking mistrust or confusion.  
Some altmetrics services generate profiles that summarize the impact of a researcher's products. ImpactStory allows scientists to import lists of items such as papers and software from existing user profiles at websites such as Google Scholar, which automatically tracks a researcher's papers, or the online software-code repository GitHub. Scientists can also manually enter the digital object identifiers (DOIs) of their papers, or input their Open Researcher and Contributor ID (ORCID), a unique identifier that can be used to tag all of a researcher's work. ImpactStory then creates a profile showing how frequently each product has been viewed, saved, discussed, cited or recommended online ..."
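As a concrete illustration of the per-paper counting described above, here is a minimal Python sketch that looks up one paper's attention counts by DOI against the free Altmetric.com v1 API (a service related to, but distinct from, ImpactStory). The endpoint shape and JSON field names are assumptions based on the public v1 API, so check the current documentation before relying on them; the example DOI is the Hull et al. 2008 review discussed in the summary.

# Minimal sketch: fetch attention counts for one DOI from the free
# Altmetric.com v1 API. The endpoint and the JSON field names below are
# assumptions based on the public v1 docs; verify against current docs.
import requests

def altmetric_summary(doi: str) -> dict:
    """Return a small dict of attention counts for one DOI, or {} if none recorded."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:  # the API answers 404 when it has no data for a DOI
        return {}
    resp.raise_for_status()
    data = resp.json()
    # .get() with defaults keeps the sketch robust if a field is absent or renamed
    return {
        "tweets": data.get("cited_by_tweeters_count", 0),
        "facebook": data.get("cited_by_fbwalls_count", 0),
        "blogs": data.get("cited_by_feeds_count", 0),
        "wikipedia": data.get("cited_by_wikipedia_count", 0),
        "score": data.get("score", 0),
    }

if __name__ == "__main__":
    # DOI of the Hull et al. 2008 paper (PLoS Comput. Biol. 4, e1000204)
    print(altmetric_summary("10.1371/journal.pcbi.1000204"))

Looping a function like this over the DOIs on a CV would yield the kind of per-paper summary that services such as ImpactStory assemble into profile badges.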

Link:

http://www.nature.com/naturejobs/science/articles/10.1038/nj7463-491a

From feeds:

Open Access Tracking Project (OATP) » abernard102@gmail.com

Tags:

oa.new oa.comment oa.universities oa.plos oa.impact oa.quality oa.tools oa.prestige oa.github oa.citations oa.colleges oa.orcid oa.altmetrics oa.dois oa.impact_story oa.metrics oa.hei

Date tagged:

08/23/2013, 19:09

Date published:

08/23/2013, 15:09