
Bytes on the Beat: How Predictive Analytics Amplifies Discriminatory Police Practices

Gravity FM

English - May 01, 2017 20:00 - 1 hour - 120 MB - ★★★★★ - 6 ratings
Tags: news, human rights, environment, sustainability, equality, democracy, public health, reproductive rights, refugee rights, free speech, protest rights


Selection Bias, Confirmation Bias, and the Feedback Loop of Predictive Policing Algorithms; the Black Box Problem of Proprietary Algorithms and the Lack of Accountability

Discussion with Kristian Lum and William Isaac on how machine learning algorithms work and how seemingly neutral police data can perpetuate systemic and institutional prejudices, producing predictive systems that forecast police enforcement rather than future crime. We explore their Oakland case study on bias in police data sets, and how selection bias can produce confirmation bias and a feedback loop that leads to over-policing of communities already overexposeded to police activity. We also discuss the lack of transparency and accountability in current proprietary predictive models, and best practices for input data and for implementing predictive systems in future police work.
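The feedback loop described above can be sketched in a toy simulation (an illustrative assumption of ours, not the guests' actual model): two districts share the same true crime rate, but one starts with slightly more recorded incidents. If patrols are allocated in proportion to recorded crime, and crime is only recorded where police patrol, the initial imbalance in the data is reproduced forever rather than corrected.

```python
# Toy selection-bias feedback loop (hypothetical parameters for illustration):
# both districts have IDENTICAL true crime, but district 0 begins with
# slightly more recorded incidents. Patrols follow recorded crime, and
# only patrolled crime gets recorded.

TRUE_INCIDENTS = 50.0     # identical true crime in each district per step
recorded = [12.0, 10.0]   # historical records: district 0 over-represented

for step in range(100):
    total = sum(recorded)
    shares = [r / total for r in recorded]      # model's "risk" estimate
    for d in range(2):
        # Only crime occurring where police patrol enters the data set.
        recorded[d] += TRUE_INCIDENTS * shares[d]

# The patrol share never converges to the true 50/50 split; it stays
# locked at the initial 12/22 ratio produced by the biased records.
print(f"patrol share, district 0: {shares[0]:.3f}")  # -> 0.545, not 0.500
```

Because each district's records grow by the same multiplicative factor per step, the ratio set by the historical data is invariant: the system learns where police have looked, not where crime actually occurs.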

For More Info: http://thegravity.fm/#/episode/22