Facial-Recognition Software Might Have a Racial Bias Problem - The Atlantic

audrey's bookmarks 2016-04-08

Summary:

The facial-recognition algorithms used by police are not required to undergo public or independent testing for accuracy or bias before being deployed on everyday citizens. More worrying still, the limited testing that has been done on these systems has uncovered a pattern of racial bias.

Link:

http://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/

From feeds:

Data & Society » audrey's bookmarks

Tags:

Date tagged:

04/08/2016, 11:03

Date published:

04/08/2016, 07:03