NSF Award Search: Award#1841354 - CHS: EAGER: Handling Online Risks and Creating Safe Spaces: Content Moderation in Live Streaming Micro Communities
amarashar's bookmarks 2018-07-30
Summary:
This research will investigate how individuals and small groups handle content moderation in real time in the context of live streaming, from both technical and social perspectives, distinguishing between professional content creators who create content for a living and hobbyists. Live streaming services such as Twitch are the latest form of social media, marrying user-generated content with the traditional concept of live television broadcasting: as someone broadcasts, viewers can post comments in a chat interface displayed alongside the broadcast, creating an interactive, synchronous media experience. This real-time interaction, however, makes the platform ripe for deviant behavior, as potential harassers can see the immediate impact of their harsh words on the person who is broadcasting. Most current forms of social media rely on crowdsourced moderation, in which users report bad content that is ultimately reviewed by a human moderator. This approach does not work well when moderation must happen in real time, posing greater social and technological challenges. This project will study the sociotechnical aspects of content moderation from the perspective of micro communities on live streaming platforms. By understanding how streamers currently moderate audiences through manual and automated labor, the research will identify opportunities for technology to assist and enhance the moderation process and will provide guidelines for sustainable and scalable moderation. Exploration of different governance structures for moderation may also yield insights into alternative models of moderation for the future of social media, and into how different moderation practices may influence the evolution of positive and negative norms in micro communities.