Twitter, peer review and altmetrics: the future of research impact assessment | Higher Education Network | Guardian Professional

abernard102@gmail.com 2012-09-22

Summary:

‘No one can read everything. We rely on filters to make sense of the scholarly literature, but the narrow, traditional filters are being swamped. However, the growth of new, online scholarly tools allows us to make new filters; these altmetrics reflect the broad, rapid impact of scholarship in this burgeoning ecosystem. We call for more tools and research based on altmetrics.’

This quote is taken from the introduction to the altmetrics manifesto. And the reason it's a manifesto, rather than a mission or vision statement, is arguably because changing the way scholarly impact is measured is going to need something of a revolution – and no revolution is complete without a manifesto.

So why is a revolution needed? Because long before the tools even existed to do anything about it, many in the research community have bemoaned the stranglehold the impact factor of a research paper has held over research funding, careers and reputations. As bloggers Victor Manning and William Gunn wrote: "Influence is only one dimension of importance". Other bugbears include the slowness of peer review and the fact that impact is not linked to an article, but rather to a journal, as this blog from the Scholarly Kitchen points out.

Now, though, the tools exist to consider what other factors may be used to determine importance, and they are being refined daily. Alternative metrics (or altmetrics, as they are known) have brought together the tech geeks and the research nerds who are eager to define their own measures of excellence. Though many of the communities they form or join, such as Academia.edu, Mendeley or Total-Impact, aren't new, wider changes in the research environment (namely, growing support for open access and policy shifts mandating impact measurement) have given altmetrics a new urgency.

But altmetrics are not universally popular. One commenter on the site writes: ‘[Impact factors] can be (and are) manipulated to a certain degree ... but the alt, total, ultimate, mega etc. metrics are far worse because the link to research quality is less direct, and in terms of some of the indicators, like twitter activity, it is non-existent. Moreover, these metrics will be much easier to manipulate’.

So what does the future of impact assessment hold? How will these new metrics develop, and how are they likely to be adopted by the sector? Perhaps most importantly, will altmetrics address the abuses of impact factors or simply create abuses of their own, particularly when importance is determined through social media influence?

Join our live chat, Friday, 21 September at 12 BST, to explore these questions and any others you may have...

Link:

http://www.guardian.co.uk/higher-education-network/blog/2012/sep/19/peer-review-research-impact-altmetrics

From feeds:

Open Access Tracking Project (OATP) » abernard102@gmail.com

Tags:

oa.new oa.business_models oa.publishers oa.comment oa.elsevier oa.peer_review oa.impact oa.quality oa.social_media oa.twitter oa.prestige oa.bmc oa.jif oa.aalto.u oa.u.exeter oa.ubiquity oa.academia.edu oa.altmetrics oa.blogs oa.thomson_reuters oa.mendeley oa.odi oa.total-impact oa.altmetrics_manifesto oa.avatar oa.open_book_publishers oa.informetrics oa.u.greenwich oa.bar-ilan.u oa.events oa.metrics

Date tagged:

09/22/2012, 10:57

Date published:

09/22/2012, 06:56