False Positives, False Negatives, and False Analyses: A Rejoinder to "Machine Bias: There's Software Used Across the Country to Predict Future Criminals. And It's Biased Against Blacks."

Volume 80 Number 2
September 2016

The authors respond to a recent ProPublica article claiming that the widely used risk assessment tool COMPAS is biased against black defendants. They conclude that ProPublica's report was based on faulty statistics and data analysis and failed to show that COMPAS itself is racially biased, let alone that other risk instruments are biased.