Facebook’s Technocratic Reports Hide Its Failures on Abuse | by Chris Gilliard | Aug, 2020 | OneZero

amarashar's bookmarks 2020-08-27

Summary:

In other words, there’s too much toxic content for Facebook to ever really see it all (hello, scale), so the company relies on a formula, based on random sampling, to estimate the amount. Abstractly, perhaps everyone understands this, but to think about it another way: There’s so much sewage on the platform that Facebook must continually guess how much of it people are actually seeing.
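
As a rough illustration of the random-sampling idea the summary describes (not Facebook’s actual methodology — the function name, sample size, and counts below are hypothetical), a minimal sketch of estimating the prevalence of violating content from a random sample of views might look like this:

```python
import math
import random

def estimate_prevalence(sample_labels, confidence_z=1.96):
    """Estimate the share of violating content from a random sample of views.

    sample_labels: list of 0/1 flags (1 = judged violating by a reviewer)
    Returns (point_estimate, margin_of_error) using a normal approximation
    for a ~95% confidence interval.
    """
    n = len(sample_labels)
    p_hat = sum(sample_labels) / n
    margin = confidence_z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, margin

# Hypothetical example: 10,000 randomly sampled content views,
# of which 12 were judged violating by human reviewers.
sample = [1] * 12 + [0] * 9988
random.shuffle(sample)
p, moe = estimate_prevalence(sample)
print(f"Estimated prevalence: {p:.4%} ± {moe:.4%}")
```

The point of the sketch is that the reported figure is always an estimate with error bars, never a count of what people actually saw.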

Link:

https://onezero.medium.com/facebook-is-hiding-its-failure-to-keep-abuse-off-its-platform-behind-technocratic-reports-682d871ef1ca

From feeds:

Ethics/Gov of AI » amarashar's bookmarks
Harmful Speech » amarashar's bookmarks

Tags:

harmfulspeech

Date tagged:

08/27/2020, 14:55

Date published:

08/27/2020, 06:05