Developing indicators of the impact of scholarly communication is a massive technical challenge – but it’s also much simpler than that | Impact of Social Sciences

abernard102@gmail.com 2013-10-29

Summary:

"Perhaps the most frequently asked question regarding impact is logistical: how can we measure the impact of our work? A recent story in the Chronicle of Higher Education suggests: “The larger conversation about how to measure scholarly impact is probably as old as scholarship itself.” Today, the question ranges from the development of article level metrics to building a shared infrastructure for all of scholarly communication.

Why should we develop new ways to measure the impact of our work? Jason Priem et al. and Heather Piwowar answer that current measures of impact don’t work; since traditional approaches to measuring scholarly communication don’t reward impact, we need a way to measure and reward other approaches that do. Similar views are shared by some and contested by others. What’s most striking about answers to the ‘why’ question is how quickly they turn toward the ‘how’ question. Altmetrics developers are doers and inventors – they take action and try to figure things out. We can’t figure out impact just by thinking about it; we have to do research, warns Piwowar in the post linked above. After noting that the system is broken, Priem and Piwowar quickly ask, 'How can we fix it?'

Instead of answering the ‘why’ question, we ask a different question: Do altmetrics work? Should we resist attempts to measure the impact of our work? Philip Moriarty offers sophisticated arguments against the ‘impact agenda’, while others are more demonstrative. Robert Frodeman and I argue that embracing impact in a way that preserves autonomy is a better strategy than mere resistance. Despite contrary answers to the question of resistance, many of us agree that not everything that counts can be counted. Our disagreement rests on different conceptions of freedom. ‘The resistance’ tends toward a negative concept of freedom that sees all forms of interference as evils. Advocates of owning impact, however, embrace a positive view of freedom, emphasizing self-determination as fundamental.

Infrequently asked questions

Although these frequently asked questions are interesting and important, it’s ultimately simpler than that. Colleagues at the Center for the Study of Interdisciplinarity (CSID), Kelli Barr, Keith Brown, and I, submitted something simple and quite messy to Nature to see whether they might join us in catalyzing a conversation on the question of impact. Frankly, we doubted it. Nevertheless, we submitted the following list, along with a brief discussion of how and why we generated it ... Of course, the editors at Nature declined to publish it. Instead, they asked us to clean up the list to make it more generally applicable and less CSID-specific. We did so, and the officially published version of our correspondence is available here. It can also be viewed free of charge here. The editors of the LSE Impact blog had a similar urge to clean up our mess. This is the table they proposed ... Comparing the original and edited versions illustrates the value of the simpler questions about impact I propose we begin to ask more frequently. I realize that people have already been asking some of these questions – but not frequently enough ..."

Link:

http://blogs.lse.ac.uk/impactofsocialsciences/2013/06/12/56-indicators-of-impact/

From feeds:

Open Access Tracking Project (OATP) » abernard102@gmail.com

Tags:

oa.new oa.comment oa.impact oa.terminology oa.definitions oa.altmetrics oa.metrics

Date tagged:

10/29/2013, 09:41

Date published:

10/29/2013, 05:41