#apple #icloud #privacy

Apple recently announced that it will scan all images uploaded to iCloud for CSAM (Child Sexual Abuse Material), and that this scan will happen locally on users' phones. We take a look at the technical report and explore how the system works in detail, how it is designed to preserve user privacy, and what weak points it still has.

OUTLINE:
0:00 - Introduction
3:05 - System Requirements
9:15 - System Overview
14:00 - NeuralHash
20:45 - Private Set Intersection
31:15 - Threshold Secret Sharing
35:25 - Synthetic Match Vouchers
38:20 - Problem 1: Who controls the database?
42:40 - Problem 2: Adversarial Attacks
49:40 - Comments & Conclusion




Paper: https://www.apple.com/child-safety/pd...
ML News episode about CSAM: https://youtu.be/gFkBqD2hbnU




Abstract:

CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC). This process is secure, and is expressly designed to preserve user privacy.

CSAM Detection provides these privacy and security assurances:
• Apple does not learn anything about images that do not match the known CSAM database.
• Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
• The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.
• Users can’t access or view the database of known CSAM images.
• Users can’t identify which images were flagged as CSAM by the system.

For detailed information about the cryptographic protocol and security proofs that the CSAM Detection process uses, see The Apple PSI System.
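The threshold property described above (Apple can decrypt voucher contents only once enough matches accumulate) rests on threshold secret sharing. Below is a minimal toy sketch of Shamir-style secret sharing over a prime field, just to illustrate the idea; it is not Apple's actual construction, and all names and parameters here are illustrative:

```python
# Toy Shamir-style threshold secret sharing: a secret (think: a decryption key
# for the safety vouchers) is split into n shares such that any t shares
# reconstruct it, while fewer than t reveal nothing. Illustrative only.
import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for a toy secret

def split_secret(secret, t, n):
    """Split `secret` into n shares; any t of them suffice to reconstruct."""
    # Random polynomial of degree t-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # Multiply by the modular inverse of the denominator (Fermat's little theorem).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789
shares = split_secret(key, t=10, n=30)   # e.g. a threshold of 10 matches
assert reconstruct(shares[:10]) == key   # any 10 shares recover the key
```

In the analogous step of the real system, each matching image contributes one share inside its voucher, so the server crosses the reconstruction threshold only when an account has enough CSAM matches.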




Links:
TabNine Code Completion (Referral): http://bit.ly/tabnine-yannick
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://discord.gg/4H8xxDF
BitChute: https://www.bitchute.com/channel/yann...
Minds: https://www.minds.com/ykilcher
Parler: https://parler.com/profile/YannicKilcher
LinkedIn: https://www.linkedin.com/in/yannic-ki...
BiliBili: https://space.bilibili.com/1824646584




If you want to support me, the best thing to do is to share the content :)




If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannick...
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n
