Facebook’s Technocratic Reports Hide Its Failures on Abuse | by Chris Gilliard | Aug, 2020 | OneZero

amarashar's bookmarks 2020-08-27


In other words, there’s too much toxic content for Facebook to ever really see it all (hello, scale), so Facebook has a formula for estimating the amount based on random sampling. Abstractly, perhaps everyone understands this, but to think about it another way: There’s so much sewage on the platform that the company must continually guess how much of it people are actually seeing.
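The estimation technique the excerpt describes (prevalence measured by sampling rather than exhaustive review) can be sketched generically. This is a minimal illustration of proportion estimation from a random sample with a confidence interval, assuming human reviewers label each sampled view; it is not Facebook's actual methodology, and all names here are hypothetical.

```python
import math
import random

def estimate_prevalence(views, sample_size, seed=0):
    """Estimate the fraction of violating views from a random sample.

    `views` is a list of booleans (True = violating view). In practice
    the labels would come from human review of the sampled items, not
    from a precomputed list.
    Returns (point estimate, 95% normal-approximation margin of error).
    """
    rng = random.Random(seed)
    sample = rng.sample(views, sample_size)
    p = sum(sample) / sample_size
    # Normal-approximation 95% confidence interval for a proportion.
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)
    return p, margin

# Toy population: 2% of one million views are violating.
population = [i % 50 == 0 for i in range(1_000_000)]
p, margin = estimate_prevalence(population, sample_size=10_000)
print(f"estimated prevalence: {p:.3%} +/- {margin:.3%}")
```

The point is the one the excerpt makes: the reported number is a statistical guess whose precision depends on the sample size, not a count of what people actually saw.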



From feeds:

Ethics/Gov of AI » amarashar's bookmarks
Harmful Speech » amarashar's bookmarks


Date tagged:

08/27/2020, 14:55

Date published:

08/27/2020, 06:05