Online comment systems reveal multiple layers of social bias

Ars Technica » Scientific Method 2013-08-09

Angry mobs online may be less of a problem than we tend to think.

Thanks to the Internet, we can rate, judge, and evaluate nearly anything we’d like—from products to movies to fitness centers—with minimal effort. In many ways, this is a boon, since there’s a wealth of information at our fingertips advising us what car to buy and whether or not we should tune into a new TV show. But behind each of these ratings is a person's opinion, and human nature makes those opinions notoriously fickle.

In the latest issue of Science, a group of researchers decided to see whether people's tendency to follow the crowd extends into online ratings. They tackled this question by carrying out a large-scale experiment in which they rigged the comment ratings on a website. They worked with a news-aggregation site (similar to reddit) that lets users rate comments and contributions by either “upvoting” or “downvoting” them. A comment’s score, which is visible to all of the site’s users, is simply the number of upvotes minus the number of downvotes.

During the study, which lasted 5 months, the researchers manipulated the ratings that some comments received. Out of the 101,281 comments made during the span of the study, the researchers randomly assigned 4,049 comments (four percent of the total) to the positive treatment group, and these comments were given a single upvote upon submission. Another 1,942 comments (two percent of the total) were assigned to the negative treatment group and were given a single downvote; the rest of the comments made during this period served as controls. These proportions mimicked the natural frequency of up- and downvotes normally seen on the site.
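The design described above is straightforward to sketch in code. The sketch below is purely illustrative (the function names and structure are assumptions, not from the paper): each new comment is randomly assigned to a treatment group at roughly the article's stated proportions, and treated comments start with one artificial vote baked into their displayed score.

```python
import random

# Figures taken from the article; proportions mimic the site's
# natural frequency of up- and downvotes.
N_COMMENTS = 101_281
P_POSITIVE = 4_049 / N_COMMENTS   # ~4% received a single initial upvote
P_NEGATIVE = 1_942 / N_COMMENTS   # ~2% received a single initial downvote

def assign_treatment(rng: random.Random) -> str:
    """Randomly place a new comment into a treatment group."""
    r = rng.random()
    if r < P_POSITIVE:
        return "positive"
    if r < P_POSITIVE + P_NEGATIVE:
        return "negative"
    return "control"

def initial_score(treatment: str) -> int:
    """The displayed score is upvotes minus downvotes, so a single
    artificial vote shifts it by +1 or -1 at submission time."""
    return {"positive": 1, "negative": -1, "control": 0}[treatment]

rng = random.Random(42)
groups = [assign_treatment(rng) for _ in range(N_COMMENTS)]
counts = {g: groups.count(g) for g in ("positive", "negative", "control")}
print(counts)
```

Running the sketch produces group sizes close to the 4/2/94 percent split reported in the study, which is the point of randomizing per comment rather than batching the treatments.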
