The Accuracy, Fairness, and Limits of Predicting Recidivism

Berkman Center for Internet and Society: Audio Fishbowl 2018-03-15

Summary:

Algorithms for predicting recidivism are commonly used to assess a criminal defendant’s likelihood of committing a future crime. Proponents of these systems argue that big data and advanced machine learning make these analyses more accurate and less biased than predictions made by humans. In this talk, researcher Julia Dressel discusses a recent study demonstrating that the widely used commercial risk assessment software COMPAS is no more accurate or fair than predictions made by people with little or no criminal justice expertise. Learn more about this event here: http://cyber.harvard.edu/events/2018/luncheon/03/Dressel

Link:

http://feedproxy.google.com/~r/audioberkman/~3/hNd7-MeHbtU/the-accuracy-fairness-and-limits-of-predicting-recidivism

From feeds:

Fair Use Tracker » Current Berkman People and Projects
CLS / ROC » Berkman Center for Internet and Society: Audio Fishbowl

Tags:

Authors:

djones@cyber.harvard.edu (Berkman Klein Center for Internet & Society at Harvard University)

Date tagged:

03/15/2018, 13:08

Date published:

03/15/2018, 13:03