Facebook’s Most Recent Transparency Report Demonstrates the Pitfalls of Automated Content Moderation

amarashar's bookmarks 2020-10-08

Summary:

More troubling, however, are what appear to be differences in whether users had access to appeal mechanisms. When content is removed on either Facebook or Instagram, people typically have the option to contest takedown decisions. Normally, once an appeal is initiated, the deleted material is reviewed by a human moderator, and the takedown decision can be reversed and the content reinstated. During the pandemic, however, that option has been seriously limited, with users receiving notification that their appeal may not be considered. According to the transparency report, there were zero appeals on Instagram during the second quarter of 2020 and very few on Facebook.

Link:

https://www.eff.org/deeplinks/2020/10/facebooks-most-recent-transparency-report-demonstrates-pitfalls-automated-content

From feeds:

Harmful Speech » amarashar's bookmarks
Fair Use Tracker » Deeplinks
CLS / ROC » Deeplinks

Tags:

harmfulspeech

Authors:

Svea Windwehr, Jillian C. York

Date tagged:

10/08/2020, 19:17

Date published:

10/08/2020, 12:36