This latest episode is a special one. Tommy has wanted to explore predictive policing for a while. He had the opportunity to do so as a sample case study for one of the classes he teaches at Queen's University: AI, Ethics & Society, a Master of Engineering course where students use any media format they wish to explore the social and ethical implications of artificial intelligence systems.


And so, this episode is presented a bit differently than what you are used to - but it is still driven by matters of confusion, and certainly in the pursuit of clarity. Tommy raises some hard questions as he looks into the history of PredPol. As he outlines right off the top, PredPol is a relatively well-known system by this point - as are its social and ethical implications. But the matter of how the system came to be so problematic is an important one, too. As Tommy argues, the biases, assumptions, and limited intellectual scope of its designers shaped how the system was built, what kind of data it uses, and what kind of algorithm it runs on. This last point is an intriguing one, precisely because the algorithm PredPol is built on has virtually nothing to do with social, cultural, or political life. Rather, it was designed to detect earthquakes...


Special shout-out to João Lobato, the brilliant mind behind LASERS, whose EP you hear on this episode. You can hear his incredible work here.


Follow your host: @whatsthatdata | @wtncast


Subscribe for updates!
Email Tommy: [email protected]
Follow What's That Noise?! on Apple Music and on Spotify 
