The university research assessment dilemma: a decision support system for the next evaluation campaigns | Scientometrics
peter.suber's bookmarks 2025-03-03
Summary:
Abstract: Our study examines the UK's Research Excellence Framework (REF) 2021, employing an algorithmic method to mimic the outcomes expressed by its panel experts and introducing a decision support system for evaluating research outputs. Using the CrossRef and Scopus databases and the Chartered Association of Business Schools' journal classification, we assessed bibliometric features, finding the citation-based algorithm most effective at producing results close to those of the REF panellists. When we simulate panellists manually adjusting the algorithmic paper classifications, our results closely align with the actual evaluations, demonstrating the potential of algorithms to augment human assessments. We also show that the Grade Point Average metric may lead to evaluations that diverge substantially from those of the panellists and should be avoided.