Apple will soon start scanning photos on your iPhone for known child sexual abuse material (CSAM). Should you be worried? We discuss the CSAM database, what Apple is looking for with its scans, and (perhaps more importantly) what it is *not* looking for.

We discuss both sides of this debate, what (exactly) Apple's commitment to protecting you really means, whether or not Apple is likely to make a mistake, and the lessons Apple could (or should) learn from BlackBerry.