There is an absence of scientific authority over research assessment as a professional practice, leaving a gap that has been filled by database providers | Impact of Social Sciences

peter.suber's bookmarks 2024-04-06


"In our recent PLoS ONE article we investigate to what extent research metrics have become established and accepted as legitimate ways to assess research performance. To do this we use a theoretical framework developed by Chicago sociologist Andrew Abbott. The purpose of our research is to offer a new perspective on the ongoing debate on research metrics....

Higher education rankings have been criticised on the grounds that non-experts have gained control over the definition of what constitutes excellent education. Clearly, the same argument applies to the definition of high-impact research as provided by Clarivate Analytics (WoS) and Elsevier (Scopus). We conclude that a growing gap exists between an academic sector with little capacity for collective action and increasing demand for routine performance assessment by research organisations and funding agencies. This gap has been filled by database providers. By selecting and distributing research metrics, these commercial providers have gained a powerful role in defining de facto standards of research excellence without being challenged by expert authority...."


From feeds:

Open Access Tracking Project (OATP) » peter.suber's bookmarks

Tags:

oa.assessment oa.publishers oa.quality oa.impact oa.metrics oa.rankings

Date tagged:

04/06/2024, 13:05

Date published:

07/25/2018, 09:05