Apple’s CSAM System … Walking A Fine Balance?

--

We have a challenge. How do we protect the right to privacy against the right of society to protect itself? We saw this tension with the COVID-19 Bluetooth contact-matching system, and now Apple aims to use advanced machine learning and cryptography to preserve privacy while still detecting criminal activity. If the company fails, it could destroy the strong trust that its users place in it. Google does not have a strong track record of preserving user privacy, whereas Apple tends to be well trusted on these matters.

For this, Apple is integrating CSAM (Child Sexual Abuse Material) detection within its iPhone infrastructure. Overall, Apple claims that it cannot access the metadata of matched images until a given threshold of matches is reached, and that all true positives are reviewed by a human before being forwarded to the National Center for Missing and Exploited Children (NCMEC).
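To get a feel for how a "cannot access until a threshold is reached" property can be enforced cryptographically, here is a minimal Shamir-style threshold secret sharing sketch. It is an illustration of the general technique rather than Apple's implementation, and the threshold and share counts are made-up values: a decryption key is split into shares, and fewer than the threshold number of shares reveal nothing useful about it.

```python
# Minimal Shamir threshold secret sharing sketch (illustrative only, not
# Apple's code). A secret key is split into n shares; any t of them rebuild
# it, while fewer than t give no way to recover it.
import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus for this sketch

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):          # Horner evaluation of the polynomial
            y = (y * x + c) % P
        shares.append((x, y))
    return shares

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

if __name__ == "__main__":
    key = random.randrange(P)             # stands in for an account decryption key
    t, n = 30, 100                        # illustrative threshold and share count
    shares = make_shares(key, t, n)
    print(recover(shares[:t]) == key)     # True  - threshold reached
    print(recover(shares[:t - 1]) == key) # almost certainly False - key stays hidden
```

In a scheme of this shape, one share would travel with each matching image, so the server only gains the ability to decrypt the associated material once enough matches have accumulated.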

CSAM detection

CSAM detection uses a database of known image hashes, which is blinded before being sent to a client's iPhone. The matching process, built on a perceptual hash known as NeuralHash, is implemented on the device rather than in iCloud. NeuralHash analyses an image and extracts its key features. When the image is stored within iCloud, the device creates a cryptographic safety voucher, and where a threshold secret sharing method…
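The blinding of the hash database can be illustrated with a toy commutative-blinding sketch. This is a simplification rather than Apple's protocol (the real design uses elliptic-curve private set intersection and also hides the match result from the device, and none of the names below are Apple's), but it shows the core idea: the known-hash database is blinded with a server secret before it ever reaches the phone, and the phone's own hash is blinded before the server touches it, so neither side handles the other's raw hashes.

```python
# Toy blinded-matching sketch (illustration only, not Apple's protocol).
# The server blinds its database with secret alpha; the device blinds its
# image hash with secret beta; exponentiation commutes, so doubly-blinded
# values can be compared without exposing any raw hash.
import hashlib
import secrets

P = (1 << 127) - 1  # a Mersenne prime; toy modulus for illustration only

def to_group(value: bytes) -> int:
    """Map a perceptual-hash value into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(value).digest(), "big") % P

# Server side: blind every known hash with secret alpha, ship the result.
alpha = secrets.randbelow(P - 2) + 2
known_hashes = [b"hash-of-known-image-1", b"hash-of-known-image-2"]
blinded_db = {pow(to_group(h), alpha, P) for h in known_hashes}

# Device side: blind its own image hash with secret beta.
beta = secrets.randbelow(P - 2) + 2
image_hash = b"hash-of-known-image-2"       # stands in for a NeuralHash output
device_blinded = pow(to_group(image_hash), beta, P)

# Server raises the device's value to alpha (it never sees the raw hash).
doubly_blinded = pow(device_blinded, alpha, P)

# Device raises each database entry to beta and checks for a match.
match = doubly_blinded in {pow(entry, beta, P) for entry in blinded_db}
print("match:", match)   # True, because the image hash is in the known set
```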

--
