Handful of “highly toxic” Wikipedia editors cause 9% of abuse on the site | Ars Technica
Ars Technica 2017-02-11
Summary:
Researchers at Alphabet tech incubator Jigsaw worked with the Wikimedia Foundation to analyze 100,000 comments left on English-language Wikipedia. They found predictable patterns behind who will launch personal attacks and when. The research team's goal was to lay the groundwork for an automated system to "reduce toxic discussions" on Wikipedia, and the work could one day lead to a warning system for moderators. The researchers caution that such a system would require more research to implement, but they have released a paper with some fascinating early findings. To keep the supervised machine-learning task simple, the researchers focused exclusively on ad hominem or personal attacks, which are relatively easy to identify. They defined a personal attack as one directed at a commenter (e.g., "you suck"), directed at a third party ("Bill sucks"), quoting an attack ("Bill says Henri sucks"), or simply "another kind of attack or harassment." They used CrowdFlower to crowdsource the job of reviewing 100,000 Wikipedia comments made between 2004 and 2015. Ultimately, more than 4,000 CrowdFlower workers completed the task, and each comment was annotated by
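When multiple crowdworkers annotate the same comment, their judgments are typically combined into a single training label, for instance by majority vote. A minimal sketch of that aggregation step (the label names and the conservative tie-breaking rule are illustrative assumptions, not details from the article):

```python
from collections import Counter


def aggregate_label(annotations):
    """Combine crowdworker judgments for one comment via majority vote.

    `annotations` is a list of per-worker labels such as "attack" or
    "not_attack". Ties resolve to "not_attack" — a conservative choice
    made for this sketch, not a rule stated in the research.
    """
    counts = Counter(annotations)
    attack_votes = counts.get("attack", 0)
    return "attack" if attack_votes > len(annotations) / 2 else "not_attack"


# Example: five hypothetical workers reviewing one comment.
label = aggregate_label(["attack", "attack", "not_attack", "attack", "not_attack"])
print(label)  # majority says "attack"
```

Aggregated labels like this are what a supervised classifier would then be trained on; having many annotators per comment smooths over individual workers' disagreement about borderline cases.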
Link:
https://arstechnica.com/information-technology/2017/02/one-third-of-personal-attacks-on-wikipedia-come-from-active-editors/
From feeds:
Cyberlaw » Ars Technica
Music and Digital Media » Ars Technica
Data & Society » idilali's bookmarks
Harmful Speech » amarashar's bookmarks