Microsoft, IBM Facial Analyses Struggle With Race and Gender | WIRED

amarashar's bookmarks 2018-02-07

Summary:

The skewed accuracy appears to be due to underrepresentation of darker skin tones in the training data used to create the face-analysis algorithms.

Link:

https://www.wired.com/story/photo-algorithms-id-white-men-fineblack-women-not-so-much/?mbid=social_twitter_onsiteshare

From feeds:

Ethics/Gov of AI » amarashar's bookmarks

Tags:

bias

Date tagged:

02/07/2018, 16:00

Date published:

02/07/2018, 11:00