Just the Tip of the Iceberg | Peer to Peer Review

abernard102@gmail.com 2014-03-14

Summary:

" ... as we all found out when it was announced that publishers Springer and IEEE were withdrawing 120 papers they had published because they were found to be such computer-generated gibberish. The story broke in February in an article in Nature, which explained that the articles had apparently been 'written' by software called SCIgen, 'which randomly combines strings of words to produce fake computer-science papers.' A French researcher named Cyril Labbé subsequently wrote a program designed to detect SCIgen-produced papers and ran it against some databases of scientific publications. He found a significant number of these phony articles in two commercial databases.  Apparently all of the papers were in conference proceedings that were identified by the publishers as peer-reviewed. At least one scholar listed as an author on a paper said that his name was used without his knowledge. The parallels between this situation and the 'sting' about peer-review in open access journals that was published by Science last year make it inevitable that the usual suspects would line up to make markedly different assertions about the Labbé study. Those who tend to defend traditional publishers point out that the papers were in conference proceedings and that publishers typically have less control over those types of publications than they do over journals. Advocates for newer models of publication—and I guess I am one of these 'usual suspects'—point out that what is sauce for the open access goose is sauce for the toll-access gander. I hope to make a larger point about both studies before I am through, but let me first offer a series of observations that I hope will help with context for the debate. First, the two publishers 'caught' in this study, Springer and IEEE, both issued statements about the situation. Springer promised better procedures in the future ... 
Second, it is worth noting that there is a difference between these SCIgen papers and the pseudo-science article that John Bohannon designed for use in his open-access sting. Bohannon crafted a paper that was superficially plausible but that contained, he said, scientific errors that should have been caught by peer review. The SCIgen papers, on the other hand, literally do not make sense; the words are connected grammatically but not logically, so that any competent reader should know she is not reading anything with substance ... What this suggests is that, whether the journals are open or closed access, the SCIgen research exposes an even deeper problem with peer review. These papers didn't slip past inattentive peer reviewers; they may not have been reviewed at all, in spite of the publishers' claims.

Third, we should address the fact that the papers were all found in conference proceedings. I have no doubt that publishers have less control over conference proceedings than they do over other content that they publish; they may simply rely on the organizers of the specific conferences to have made quality judgments. But that fact alone does not excuse the huge failure that has occurred here, and it is a failure that appears systemic. From the perspective of the libraries that purchase these collections, we should remember a couple of things. First, these items were marked as peer-reviewed, which means that a common marker we use both to teach students about scholarly sources and to evaluate faculty is unreliable. Second, the reason there are so many conference proceedings included in these databases is that publishers use the ever-growing number of papers they publish to market their products and justify their price increases ... Here is where I think these two studies that document the failures of peer review come together.
Both show us that the economic forces behind publishing, both traditional and author-fee Gold OA, are at odds with peer review. Or, more accurately, that there is no hard line, based either on business model or publisher reputation, between journals that practice good peer review and those where bad peer review occurs. No journal and no publisher can assure us that it always presents exclusively well-reviewed material. The rule here for libraries is caveat emptor—let the buyer beware—and for the academy as a whole, the lesson is that we need to stop placing so much weight on the assertion that a particular journal is peer-reviewed. This is why I have called these studies the tip of an iceberg. They have shown us, in my opinion, that peer review has a systemic problem ... Finally, the way to improve peer review is to make it more transparent ..."
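[Editor's illustration] The column's point that SCIgen text is "connected grammatically but not logically" is easy to see with a toy sketch. The following Python snippet expands a tiny context-free grammar, the same general technique SCIgen is described as using; the grammar, vocabulary, and helper names here are invented for illustration and are not SCIgen's actual rules, which are far larger.

```python
import random

# Toy grammar: nonterminals map to lists of possible productions.
# Anything not in the table is treated as a literal word.
GRAMMAR = {
    "SENTENCE": [["NP", "VP", "."]],
    "NP": [["the", "ADJ", "NOUN"], ["our", "NOUN"]],
    "VP": [["VERB", "NP"]],
    "ADJ": [["scalable"], ["stochastic"], ["peer-to-peer"], ["Bayesian"]],
    "NOUN": [["framework"], ["methodology"], ["heuristic"], ["epistemology"]],
    "VERB": [["refines"], ["synthesizes"], ["obviates"], ["harnesses"]],
}

def expand(symbol, rng):
    """Recursively expand a grammar symbol into a list of words."""
    if symbol not in GRAMMAR:  # terminal: emit the word itself
        return [symbol]
    production = rng.choice(GRAMMAR[symbol])
    words = []
    for part in production:
        words.extend(expand(part, rng))
    return words

def fake_sentence(seed=None):
    """Generate one grammatical but meaningless sentence."""
    rng = random.Random(seed)
    words = expand("SENTENCE", rng)
    text = " ".join(words[:-1]) + words[-1]  # attach the final period
    return text[0].upper() + text[1:]

if __name__ == "__main__":
    for i in range(3):
        print(fake_sentence(seed=i))
```

Every output is well-formed English syntax yet says nothing, which is exactly why, as the column argues, such papers should be caught by any reader, let alone a reviewer.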

Link:

http://lj.libraryjournal.com/2014/03/industry-news/just-the-tip-of-the-iceberg-peer-to-peer-review/#_

From feeds:

Open Access Tracking Project (OATP) » abernard102@gmail.com

Tags:

oa.new oa.predatory oa.credibility oa.presentations oa.publishers oa.journals oa.quality oa.springer oa.ieee oa.peer_review oa.business_models oa.libraries oa.librarians

Date tagged:

03/14/2014, 10:17

Date published:

03/14/2014, 06:17