The open-access journals that charge the most aren't necessarily the most influential, an online interactive tool suggests. The freely accessible tool, launched earlier this month, shows that a journal's fees do not correlate particularly strongly with its influence, as measured by a citation-based index. 'We have brought together a way of measuring prestige and price and come up with a metric that can be used by authors to help them decide between the different venues they could publish in,' says Jevin West, a network-science and bibliometrics researcher at the University of Washington in Seattle. West led the development of the online tool as part of the Eigenfactor Project, which seeks alternative ways to rank and map science.
The 'real goal', West says, is to help to create a transparent market in open-access publishing. 'We hope to clean up a little of the predatory publishing, where publishers might be charging more than their value merits.'

The tool, called Cost Effectiveness for Open Access Journals, incorporates pricing and prestige information for 657 open-access journals indexed by Thomson Reuters, including 356 that do not charge any fees. The data are plotted to show a journal's Article Influence (AI) score against the fee it charges per article. (Where charges are levied per page, an article length of 15 pages, which the tool's authors judge typical, is assumed.) The AI score is calculated by dividing a journal's Eigenfactor Score [1] by the number of articles it publishes, normalized so that the average journal has an AI of 1. Eigenfactor Scores, like impact factors, are based on citations, but they also take into account the source of those citations.

The plot can be filtered by any one of 35 subject fields. Doing so shows that the strength of the correlation between AI score and fee varies by field: journals in molecular and cell biology, for example, show a stronger correlation than those in physics or mathematics.

The tool also ranks journals' bang for the buck, in a table of cost-effectiveness values calculated by dividing a journal's AI score by its price to publish. Of the 301 fee-based open-access journals considered, the most cost-effective was Publications of the Astronomical Society of Japan (see Best-value journals); the least cost-effective was the Journal of Physical Therapy Science (see Least-value journals). Among the largest open-access publishers, the Public Library of Science had three journals, PLoS Biology, PLoS Genetics and PLoS Medicine, ranked within the top 15 for cost effectiveness. A BioMed Central journal, the Irish Veterinary Journal, ranked among the lowest.
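The two ratios behind these rankings are straightforward to reproduce. The sketch below, in Python, uses entirely made-up journal names, Eigenfactor scores, article counts and fees (the real tool draws its data from Thomson Reuters); it shows how a per-article influence score, normalized so the average journal scores 1, is divided by the publication fee to get a cost-effectiveness value:

```python
# Hypothetical data: (journal, Eigenfactor score, articles published, fee per article in USD).
# All names and numbers here are invented for illustration.
journals = [
    ("Journal A", 0.05, 500, 1350.0),
    ("Journal B", 0.02, 200, 900.0),
    ("Journal C", 0.01, 400, 500.0),
]

# Raw influence per article, then normalize so the average journal scores 1.0
raw = [ef / n_articles for _, ef, n_articles, _ in journals]
mean_raw = sum(raw) / len(raw)
ai_scores = [r / mean_raw for r in raw]

# Cost effectiveness: AI score divided by the per-article fee
for (name, _, _, fee), ai in zip(journals, ai_scores):
    print(f"{name}: AI = {ai:.3f}, cost-effectiveness = {ai / fee:.6f}")
```

With these invented figures, the mid-priced "Journal B" comes out most cost-effective, because its influence per article matches the pricier "Journal A" at a lower fee; this is exactly the kind of comparison the tool is designed to surface.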
Peter Suber, director of the Harvard Open Access Project in Cambridge, Massachusetts, welcomes the tool as a way to drive competition in the market. He adds, however, that he is sceptical of judging prestige by a metric based solely on citations. Bo-Christer Björk, an open-access researcher at the Hanken School of Economics in Helsinki, says that factors other than prestige, from the speed of the review process to layout quality, could also influence researchers' decisions about where to publish. But he agrees that the tool will be useful.