In 2016, Julia Angwin at ProPublica discovered that COMPAS exhibited racial bias, even though the program was not told the race of the defendants. Although the tool's accuracy was the same for both white and black defendants—about 61 percent—the errors it made for each race were different: the system consistently overestimated the recidivism risk of black defendants and underestimated that of white defendants.