Privacy Debate: Apple Scans iCloud Photos On-Device for Child Pornography

Xiaoli Jin
Published in Geek Culture · 3 min read · Aug 23, 2021

In August 2021, Apple announced that it would scan users’ iCloud photos for child sexual abuse material (CSAM) starting with iOS 15. Unlike Google and Microsoft, which scan user photos for CSAM on their cloud servers, Apple’s scanning takes place on users’ devices, right before a photo is synced to iCloud. Apple states that this approach enhances privacy because it exposes no information about users’ other, non-matching photos, something that might not hold if it scanned photos on its servers. Critics, however, warn that it could create a slippery slope for law enforcement to access user data directly on devices.

Source: Apple Child Safety

As shown below, Apple’s algorithm decrypts a user’s photos for human inspection only if two conditions are both met. First, a photo must match known CSAM content curated by the National Center for Missing and Exploited Children. Second, the number of matching photos the user stores in iCloud must exceed a given threshold.

Source: Apple CSAM Detection — Technical Summary
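To make the two-condition rule concrete, here is a minimal Python sketch of the decision logic described above. Apple’s actual pipeline relies on NeuralHash, private set intersection, and threshold secret sharing so that the device never learns which photos matched; the plain hash comparison, the function names, and the threshold value below are all hypothetical stand-ins for illustration only.

```python
import hashlib

KNOWN_CSAM_HASHES: set[str] = set()  # hashes of known CSAM curated by NCMEC (placeholder)
MATCH_THRESHOLD = 30                 # hypothetical threshold; not a value published by Apple


def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in for Apple's NeuralHash; an ordinary SHA-256 digest here."""
    return hashlib.sha256(photo_bytes).hexdigest()


def should_escalate_for_review(icloud_photos: list[bytes]) -> bool:
    """Escalate for human inspection only if BOTH conditions hold:
    1. individual photos match known CSAM hashes, and
    2. the number of matches exceeds the threshold.
    """
    matches = sum(1 for p in icloud_photos if photo_hash(p) in KNOWN_CSAM_HASHES)
    return matches > MATCH_THRESHOLD
```

The point of the threshold is that a single accidental match never triggers decryption or review; only an account accumulating many matches does.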

Opinions among computer scientists are split. Some praise the system as a way to combat CSAM while preserving user privacy, since it leverages cryptography to minimize intrusion into private data. Others warn that Apple may now have the technical capability to work around end-to-end encryption, and might no longer be able to refuse law enforcement’s demands for user data, as it did during the 2016 San Bernardino shooting controversy.

Source: Apple Expanded Protections for Children

Related story → Court Sides with Google: Using Hash Value to Detect Child Pornography

In July 2015, William Miller attached two child pornography files to a Gmail message. Google’s hash-value algorithms detected the files and reported them to the authorities. Miller argued that he was subjected to an “unreasonable search” when law enforcement viewed the files Google had shared. The Sixth Circuit, however, held that the government did not conduct a Fourth Amendment search in this case: Google’s hash-value match created a “virtual certainty” that Miller’s images were illegal, so the government’s review disclosed nothing more than what Google had already revealed. So how do hash values detect pornographic content with virtual certainty?

(Microsoft uses hash-value-based techniques to detect child pornography in video)

A hash, in this case, is a unique digital fingerprint of an image file that is resistant to alterations such as resizing. For each file uploaded through Gmail, Google computes its hash value and compares it against the hashes of confirmed pornographic content curated by the National Center for Missing and Exploited Children. If the two hash values match, Google reports the photo to the appropriate authorities without viewing the content of the email. Google later announced that it also uses deep neural networks to detect new child sexual abuse material not previously recorded.
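As a rough illustration of “a fingerprint that survives resizing,” the sketch below uses the open-source imagehash library’s perceptual hash. This is not Google’s or Microsoft’s proprietary algorithm (such as PhotoDNA); it only demonstrates the matching idea, and the blocklist and distance threshold are placeholders.

```python
from PIL import Image
import imagehash

BLOCKLIST: set[imagehash.ImageHash] = set()  # perceptual hashes of confirmed illegal images (placeholder)
MAX_HAMMING_DISTANCE = 5                     # small distance tolerates resizing / re-encoding


def is_known_match(image_path: str) -> bool:
    """Hash the attachment and compare it to the blocklist by Hamming distance."""
    h = imagehash.phash(Image.open(image_path))
    return any(h - known <= MAX_HAMMING_DISTANCE for known in BLOCKLIST)
```

Because a perceptual hash changes only slightly when an image is resized or recompressed, a resized copy of a blocklisted photo still falls within the distance threshold and is flagged, which is what makes the match close to a “virtual certainty” without anyone reading the rest of the email.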
