![Linear Digressions artwork](https://is3-ssl.mzstatic.com/image/thumb/Podcasts113/v4/4d/ca/f2/4dcaf27f-1f74-9477-477b-f7aaecb6d843/mza_1113917496893811473.jpg/100x100bb.jpg)
Discriminatory Algorithms
Linear Digressions
English - April 04, 2016 02:30 - 15 minutes - 21.1 MB - ★★★★★ - 350 ratings - Technology, data science, machine learning
Sometimes when we say an algorithm discriminates, we mean it can tell the difference between two types of items. But in this episode, we'll talk about another, more troublesome side to discrimination: algorithms can be... racist? Sexist? Ageist? Yes to all of the above. It's an important thing to be aware of, especially when doing people-centered data science. We'll discuss how and why this happens, and what solutions are out there (or not).
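One common way this happens, touched on in the links below: a model trained on historically biased labels learns to reproduce that bias. A minimal sketch of the idea, using entirely synthetic, hypothetical hiring data (the groups, numbers, and "model" here are illustrative, not from the episode):

```python
# Synthetic historical hiring records: (group, hired). Group "B" was hired
# less often for reasons unrelated to qualification -- a bias baked into
# the labels themselves.
history = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 30 + [("B", 0)] * 70

def hire_rate(records, group):
    """Fraction of records in `group` with a positive (hired) label."""
    outcomes = [hired for g, hired in records if g == group]
    return sum(outcomes) / len(outcomes)

# A naive "model" fit to this data: predict the majority historical
# outcome for each group. It faithfully minimizes training error --
# and in doing so, faithfully reproduces the historical bias.
def predict(group):
    return 1 if hire_rate(history, group) >= 0.5 else 0

print(predict("A"), predict("B"))
```

The point is that nothing in the training procedure is malicious; the discrimination comes from the data, which is why simply "letting the algorithm decide" doesn't remove human bias.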
Relevant Links:
http://www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html
http://techcrunch.com/2015/08/02/machine-learning-and-human-bias-an-uneasy-pair/
http://www.sciencefriday.com/segments/why-machines-discriminate-and-how-to-fix-them/
https://medium.com/@geomblog/when-an-algorithm-isn-t-2b9fe01b9bb5#.auxqi5srz