Jeff Larson from ProPublica joins us to talk about his work on bias found in automated algorithms that compute the recidivism scores of convicted criminals.


Jeff Larson, ProPublica: http://www.propublica.org

On the show this week we have Jeff Larson, Data Editor at ProPublica, to talk about his team’s recent work on “Machine Bias”. Jeff and his colleagues have analyzed the automated scoring decisions made by COMPAS, one of the systems American judges use to assess the likelihood that a convicted criminal will re-offend.


By looking at the COMPAS data, Jeff and his colleagues sought to determine the accuracy of the algorithm and whether it introduces significant biases into the criminal justice system — racial or otherwise. (Their finding: Yes, it seems that it does.)


On the show we talk about how the software is used by judges, how the ProPublica analysis was carried out, what the team found, and what can be done to improve the situation.
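
The team’s data and analysis notebooks are public on GitHub (linked in the list below), so you can retrace their steps yourself. Here is a rough sketch of the kind of error-rate comparison discussed in the episode; the column names and row filtering follow ProPublica’s published compas-scores-two-years.csv and accompanying notebook, but treat this as a simplified illustration rather than their exact analysis:

```python
# Sketch of the false positive / false negative comparison discussed in the
# episode. Column names follow ProPublica's compas-scores-two-years.csv
# (https://github.com/propublica/compas-analysis); the filtering mirrors
# their notebook, but this is an illustration, not the full analysis.
import pandas as pd

df = pd.read_csv("compas-scores-two-years.csv")

# Keep only rows ProPublica considered valid: a COMPAS screening within
# 30 days of arrest, a known recidivism outcome, and an ordinary charge.
df = df[df.days_b_screening_arrest.between(-30, 30) &
        (df.is_recid != -1) &
        (df.c_charge_degree != "O") &
        (df.score_text != "N/A")].copy()

# Binarize the score the way the story does:
# "Medium"/"High" counts as predicted to re-offend.
df["high_risk"] = df.score_text != "Low"

for race in ["African-American", "Caucasian"]:
    g = df[df.race == race]
    did_not_reoffend = g.two_year_recid == 0
    did_reoffend = g.two_year_recid == 1
    # False positive rate: flagged high risk, but no re-offense within two years.
    fpr = (g.high_risk & did_not_reoffend).sum() / did_not_reoffend.sum()
    # False negative rate: flagged low risk, but re-offended within two years.
    fnr = (~g.high_risk & did_reoffend).sum() / did_reoffend.sum()
    print(f"{race}: false positive rate {fpr:.1%}, false negative rate {fnr:.1%}")
```

Running something like this against the published CSV surfaces the asymmetry at the center of the story: black defendants who did not re-offend were roughly twice as likely as white defendants to be misclassified as high risk, while white defendants who did re-offend were more likely to be misclassified as low risk.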


Jeff also gives us a small preview of other stories his team is working on and how you can go about developing similar projects.


Enjoy the show!

This episode of Data Stories is sponsored by Qlik, which allows you to explore the hidden relationships within your data that lead to meaningful insights. Take a look at their Presidential Election app, which lets you analyze TV network coverage of every mention of Donald Trump and Hillary Clinton. And make sure to try out Qlik Sense for free at qlik.de/datastories.



Links

Data analysis on GitHub: https://github.com/propublica/compas-analysis
Article: “Machine Bias”
Article: “Discrimination By Design”
Article: “ProPublica Responds to Company’s Critique of Machine Bias Story”
Article: “Technical Response to Northpointe” (the statistical tension at the heart of this exchange is illustrated in the sketch after this list)
Article: “What Algorithmic Injustice Looks Like in Real Life”
Article: “How We Analyzed the COMPAS Recidivism Algorithm”
Article: “How Machines Learn to Be Racist”
Workshop: FAT ML 2016: Fairness, Accountability, and Transparency in Machine Learning
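
A footnote on the Northpointe exchange linked above: Northpointe’s defense is, roughly, that COMPAS is calibrated, meaning a given score corresponds to about the same re-offense probability regardless of race, while ProPublica’s response points at the unequal false positive and false negative rates. Both can be true at once. Here is a toy calculation, with numbers invented for illustration (not taken from the episode or from the COMPAS data), showing that when two groups have different underlying re-offense rates, a score that is calibrated for both groups necessarily produces different error rates:

```python
# Toy numbers (invented, not from the episode) showing why a score can be
# calibrated for two groups and still yield very different false positive
# rates whenever the groups' underlying re-offense rates differ.

def rates(base_rate, p_recid_given_high=0.6, p_recid_given_low=0.2):
    """Return (fraction flagged high-risk, FPR) under calibration."""
    # Calibration fixes P(recid | high) and P(recid | low) for every group,
    # so the share flagged high-risk is pinned down by the group's base rate:
    #   base_rate = h * P(recid|high) + (1 - h) * P(recid|low)  =>  solve for h.
    h = (base_rate - p_recid_given_low) / (p_recid_given_high - p_recid_given_low)
    # FPR = P(flagged high AND no recid) / P(no recid)
    fpr = h * (1 - p_recid_given_high) / (1 - base_rate)
    return h, fpr

for name, base in [("Group A", 0.40), ("Group B", 0.25)]:
    h, fpr = rates(base)
    print(f"{name}: base rate {base:.0%}, flagged high risk {h:.1%}, "
          f"false positive rate {fpr:.1%}")
# Identical calibration in both groups, yet FPRs of about 33% vs 7%.
```

This trade-off is exactly the kind of question studied by the fairness-in-machine-learning community gathered at the FAT ML workshop linked above: when base rates differ, no single score can satisfy both fairness criteria at once.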