Impact of Social Sciences – Crowd-Sourced Peer Review: Substitute or supplement for the current outdated system?

Amsciforum 2014-08-21


"If, as rumoured, Google builds a platform for depositing un-refereed research papers to be 'peer-reviewed' via crowd-sourcing, can this create a substitute for classical peer review, or will it merely supplement classical peer review with crowd-sourcing?

In classical peer review, an expert (presumably qualified, and definitely answerable), the 'action editor,' chooses experts (presumably qualified and definitely answerable), the 'referees,' to evaluate a submitted research paper in terms of correctness, quality, reliability, validity, originality, importance and relevance, in order to determine whether it meets the standards of a journal with an established track-record for those qualities in a certain field. In each field there is usually a well-known hierarchy of journals, hence a hierarchy of peer-review standards, from the most rigorous and selective journals at the top all the way down to what is sometimes close to a vanity press at the bottom. Researchers use the journals' public track-records for quality standards as a hierarchical filter for deciding in which papers to invest their limited reading time, and in which findings to risk investing their even more limited and precious research time to try to use and build upon ... And of course no one knows whether crowd-sourced peer review, even if it could work, would be scalable or sustainable.

The key questions are hence:

[1] Would all (most? many?) authors be willing to post their un-refereed papers publicly (and in place of submitting them to journals!)?

[2] Would all (most? many?) of the posted papers attract referees? Competent experts?

[3] Who/what decides whether the refereeing is competent, and whether the author has adequately complied? (Relying on a Wikipedia-style cadre of 2nd-order crowd-sourcers who gain authority recursively in proportion to how much 1st-order crowd-sourcing they have done — rather than on the basis of expertise — sounds like a way to generate Wikipedia quality, but not peer-reviewed quality…)

[4] If any of this actually happens on any scale, will it be sustainable?

[5] Would this make the landscape (un-refereed preprints, referee comments, revised postprints) as navigable and useful as classical peer review, or not? ..."


From feeds:

Open Access Tracking Project (OATP) » Amsciforum


Date published:

08/21/2014, 06:10