Apple Wants to Violate Your Privacy Under the Guise of Saving Children

Here’s why Apple is lying to you

AJ Krow
Sep 7
Photo by Tyler Lastovich on Unsplash

Recently, Apple announced a plan to deploy a surveillance system that detects CSAM (Child Sexual Abuse Material) on people’s iPhones. Apple would do this by scanning the photos on a user’s iPhone and using artificial intelligence to determine whether any of them would…
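For context on the mechanism the article is reacting to: Apple’s announced system, NeuralHash, does not classify photos outright. It computes a perceptual hash of each image and compares it against a database of hashes of known CSAM provided by child-safety organizations. The sketch below illustrates that general match-against-known-hashes technique; it substitutes a much simpler “average hash” for NeuralHash, and the `KNOWN_BAD_HASHES` set and the match threshold are hypothetical placeholders, not Apple’s implementation.

```python
# A minimal sketch of perceptual-hash matching, assuming a toy 8x8
# "average hash" in place of Apple's neural-network-based NeuralHash.
# Requires Pillow: pip install Pillow
from PIL import Image

HASH_SIZE = 8  # an 8x8 grid yields a 64-bit hash

def average_hash(path: str) -> int:
    """Shrink the image to an 8x8 grayscale grid, then encode each
    pixel as 1 if it is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of hashes of known images (placeholder value).
KNOWN_BAD_HASHES = {0x8F3A2C1D9B7E6504}

def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Flag a photo if its hash lands within `threshold` bits of any
    known hash -- near-duplicates still match after resizing or
    recompression, which is the point of a perceptual (not
    cryptographic) hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold
               for known in KNOWN_BAD_HASHES)
```

The key property this sketch shares with Apple’s design is that matching happens against a fixed list of known images rather than by open-ended image understanding, which is also why critics worry about who controls the list.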