The next revolution in Science: Open Access will open new ways to measure scientific output
2012-04-19


“Open Access will not only change the way that science is done, it will also change the way that science is judged. The way that scientific output is measured today centers on citations. Essentially, on an author level this means the number of publications and citations of an author’s articles (author-level metrics). On a journal level, it means the average number of citations that articles published in that journal have received in a given time period (journal-level metrics).

For author-level metrics, simple author citation counts have now largely been replaced by the H-Index, introduced in 2005 by J.E. Hirsch. Here the criterion is the largest number of articles h that have each received at least h citations at a fixed date.

For journal-level metrics, the Journal Citation Report (JCR) is a database of all citations in more than 5,000 journals—about 15 million citations from 1 million source items per year. From this the Journal Impact Factor (JIF) is derived: the number of citations in the current year to items published in the previous two years (numerator) divided by the number of substantive articles and reviews published in those same two years (denominator). It effectively represents the average number of citations per year that one can expect to receive by publishing one's work in a specific journal.

Although the JIF is meant for large numbers of publications, it is also often used in the evaluation of individual scientists. Granting agencies and university committees, for instance, often substitute the number of articles that an author has published in high-impact journals for the actual citation counts. The introduction of the H-Index has diminished the use of the JIF for individual scientists, but the practice has yet to disappear. Apart from this, the JIF has other flaws. Imagine a journal that publishes only reviews.
Such a journal would evidently get a high impact factor, but the real impact of its papers on the field will clearly be much less than that of original research papers. An easy way around this problem is to apply the H-Index methodology to journals, which is precisely what Google Scholar Metrics does. Because Google has only offered this possibility since 1 April 2012, it is too early to tell whether it will become a widely accepted method for journal-level metrics.

The H-Index, Google Scholar Metrics and the JIF are all rather good indicators of scientific quality. In measuring real-world impact, however, they are seriously flawed. Think for a moment of how impact is felt for any topic you choose. Every one of us will consider the publication itself, but probably also downloads, pageviews, blogs, comments, Twitter, and other kinds of media and social network activity (Google+, Facebook), among other things. In other words, all the “talking” that can be measured through social media and other online activities can be used to give a more realistic impression of the real impact of a given research article. Since talking about articles depends on actually being able to read them, this is where open access comes into play...

A number of article-level metrics services are currently in the start-up phase. Altmetric is a small London-based start-up focused on making article-level metrics easy. It does this by watching social media sites, newspapers and magazines for any mentions of scholarly articles. The result is an ‘altmetric’ score: a quantitative measure of the quality and quantity of attention that a scholarly article has received.
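The two citation-based measures described earlier, the H-Index and the JIF, can be sketched as small functions. This is a minimal illustration of the definitions given above, not the exact computation used by any citation database:

```python
def h_index(citations):
    """H-Index: the largest h such that h articles each have >= h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this many articles have at least this many citations
        else:
            break
    return h


def impact_factor(citations_to_prev_two_years, items_in_prev_two_years):
    """JIF: citations in the current year to items published in the previous
    two years, divided by the number of citable items from those two years."""
    return citations_to_prev_two_years / items_in_prev_two_years


# An author with articles cited 10, 8, 5, 4 and 3 times has an H-Index of 4.
print(h_index([10, 8, 5, 4, 3]))
# A journal whose last two years of 80 articles drew 200 citations this year
# has an impact factor of 2.5.
print(impact_factor(200, 80))
```

The same `h_index` function applies unchanged at the journal level, which is the idea behind Google Scholar Metrics: feed it the citation counts of a journal's articles instead of an author's.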
The altmetric score is also implemented in UtopiaDocs, a PDF reader that links an article to a wealth of other online resources such as Crossref (DOI registration agency), Mendeley (scientist network), Dryad (data repository), Scibite (tools for drug discovery), Sherpa (OA policies and copyright database) and more...

PLoS also uses article-level metrics to qualify articles, displaying comprehensive information about the usage and reach of published articles on the articles themselves, so that the entire academic community can assess their value. Unlike the services above, PLoS provides a composite score built on a combination of altmetrics, citation analysis, post-publication peer review, pageviews, downloads and other criteria. Finally, Total-Impact also makes extensive use of social media analysis and other online statistics to provide a tool that measures the total impact of a given collection of scientific articles, datasets and other items. Its focus on collections represents yet another approach to the problem of evaluating scientific output.”
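The composite scores described above amount to weighted sums over heterogeneous attention sources. A toy sketch of that idea follows; the source names and weights here are invented for illustration and are not the actual formulas used by Altmetric, PLoS or Total-Impact:

```python
# Hypothetical weights: a citation "counts for more" than a tweet or a pageview.
# These values are made up for demonstration purposes only.
WEIGHTS = {
    "citations": 5.0,
    "news_mentions": 3.0,
    "blog_posts": 1.0,
    "tweets": 0.25,
    "downloads": 0.05,
    "pageviews": 0.01,
}


def composite_score(counts):
    """Weighted sum over whatever attention sources were observed.

    `counts` maps a source name to the number of events from that source;
    unknown sources contribute nothing.
    """
    return sum(WEIGHTS.get(source, 0.0) * n for source, n in counts.items())


# An article with 10 citations and 4 tweets scores 5.0*10 + 0.25*4 = 51.0.
print(composite_score({"citations": 10, "tweets": 4}))
```

The design question such services face is visible even in this sketch: the score depends entirely on the chosen weights, which is why different providers combining the same underlying signals can rank the same article very differently.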



08/16/2012, 06:08

From feeds:

Open Access Tracking Project (OATP) »
Open Access Tracking Project (OATP) » Connotea: tomolijhoek's bookmarks matching tag

Tags: oa.@ccess altmetrics oa.impact oa.comment oa.plos oa.quality oa.social_media oa.twitter oa.jif oa.citations oa.dryad oa.facebook oa.h-index oa.altmetrics oa.blogs oa.sherpa oa.mendeley oa.scibite oa.utopia_documents oa.metrics



Date tagged:

04/19/2012, 07:46

Date published:

04/19/2012, 16:28