By Jason Middleton
 
They're in there; you just don't always know where they are. Racism, sexism, and probably just about every other -ism there is could be baked into any algorithm. We're human. And when humans code, they bring their own personal biases to the table.
 
Oh, they're in there, consciously or not. 
 
This week we host Sara Wachter-Boettcher, author of "Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech."
 

The tech-lash that's been building all year long - triggered largely by the Facebook/Cambridge Analytica scandal - is showing no signs of abating.
 
If anything, it's gaining steam. The calls for oversight and regulation of mega-tech companies are growing louder. 
 
On this show, we've drilled into just how biases like sexism and racism can get baked into algorithms - even unintentionally. 
 
Recently, that aspect of how we use and interact with our technology has been examined more through a human-behavior lens - we're finding concrete examples of bias, not just theories.
 
Case in point: Apple Health wasn't very inclusive when it came to women's health needs.
 

Please click through on any of these links for more information, check out our show page and Facebook page, or follow me on the Twitter. The podcast also lives on iTunes and Stitcher.
 

Have great weeks, everybody.

--30--
