Archiloque

Du code et des loutres

  • Bias in Criminal Risk Scores Is Mathematically Inevitable, Researchers Say
    https://www.propublica.org/article/bias-in-criminal-risk-scores-is-mathematically-inevitable-researchers-sa

    The racial bias that ProPublica found in a formula used by courts and parole boards to forecast future criminal behavior arises inevitably from the test’s design, according to new research.

    The findings were described in scholarly papers published or circulated over the past several months. Taken together, they represent the most far-reaching critique to date of the fairness of algorithms that seek to provide an objective measure of the likelihood a defendant will commit further crimes.

    Increasingly, criminal justice officials are using similar risk prediction equations to inform their decisions about bail, sentencing and early release.

    The researchers found that the formula, and others like it, have been written in a way that guarantees black defendants will be inaccurately identified as future criminals more often than their white counterparts.

    • The problem, several researchers said in interviews, arises from the characteristic that criminologists have used as the cornerstone for creating fair algorithms: the formula must generate equally accurate forecasts for all racial groups.

      The researchers found that an algorithm crafted to achieve that goal, known as “#predictive_parity,” inevitably leads to disparities in what sorts of people are incorrectly classified as high risk when two groups have different arrest rates.

      “‘Predictive parity’ actually corresponds to ‘optimal discrimination,’” said Nathan Srebro, associate professor of computer science at the University of Chicago and the Toyota Technological Institute at Chicago. That’s because predictive parity results in a higher proportion of black defendants being wrongly rated as high-risk.

      #parité_prédictive
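The trade-off described above can be made concrete with a little arithmetic. From the definition of positive predictive value, PPV = p·TPR / (p·TPR + (1−p)·FPR), one can solve for the false positive rate: FPR = (p / (1−p)) · TPR · (1−PPV) / PPV, where p is a group's base rate. A minimal sketch, using hypothetical numbers (not figures from the study): if two groups are scored with the same PPV and the same true positive rate but have different base rates, the group with the higher base rate necessarily ends up with a higher false positive rate.

```python
def false_positive_rate(base_rate, tpr, ppv):
    """FPR implied by a given base rate, TPR, and PPV.

    Derived by solving PPV = p*TPR / (p*TPR + (1-p)*FPR) for FPR.
    """
    return (base_rate / (1 - base_rate)) * tpr * (1 - ppv) / ppv

# Hypothetical numbers: identical PPV and TPR for both groups
# ("predictive parity"), but different underlying arrest rates.
fpr_high = false_positive_rate(base_rate=0.4, tpr=0.7, ppv=0.6)
fpr_low = false_positive_rate(base_rate=0.2, tpr=0.7, ppv=0.6)

print(round(fpr_high, 3))  # ~0.311
print(round(fpr_low, 3))   # ~0.117
```

With equal accuracy by this measure, the higher-base-rate group is wrongly flagged as high risk almost three times as often, which is the disparity the researchers describe as mathematically inevitable.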