
> today the scanning is only for images uploaded to iCloud, and only for CSAM

This, to me, would be exponentially less privacy invasive, as I’ve come to assume all major cloud hosting providers implement something like this (look at Google Drive). But Apple has said the scanning is done on-device, meaning that whether or not you upload your photo library to iCloud, your local photos will be scanned against an on-device database of hashes.

Essentially, iOS Photos now implements a direct API call to the feds, with some vague “human verification” layer if you go above an unknown threshold.
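To make the flow being described concrete, here is a minimal sketch: on-device hashes are checked against a database of known hashes, and matches are only surfaced for “human verification” once a count threshold is crossed. All names, the hash values, and the threshold are invented for illustration; Apple has not published the real threshold.

```python
# Hypothetical sketch of threshold-gated hash matching. The hash database
# and threshold value are made up; the real system uses perceptual hashes
# (NeuralHash), not short strings like these.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the on-device database
REVIEW_THRESHOLD = 3                      # illustrative only; real value unknown

def matches_for_review(photo_hashes):
    """Return matched hashes only once the match count crosses the threshold."""
    matches = [h for h in photo_hashes if h in KNOWN_HASHES]
    if len(matches) >= REVIEW_THRESHOLD:
        return matches  # would be escalated to human review
    return []           # below threshold: nothing is surfaced

print(matches_for_review(["a1b2", "zzzz"]))                  # below threshold
print(matches_for_review(["a1b2", "c3d4", "e5f6", "zzzz"]))  # at threshold
```

The point of the threshold design, as described, is that a single match reveals nothing; only an account exceeding the count gets reviewed.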



Right, this is what all the articles on this matter are getting wrong, unknowingly or otherwise. It would take a single flag flip to go from scanning uploaded images to scanning local images.

This is an erosion of fundamental human rights under the guise of “think of the children”, so that anyone who stands up against this tyranny can be labelled a “pedophile”. 1984 wasn’t like “1984”, but 2021 is surely looking that way.


It’s a delayed concern for us in the West. First the bait and switch will happen in the third world and under autocratic and dictatorial governments. After all, our list of hashes will be different from what their agencies will provide. Apple will assist in this across the globe.

Then one day we get another Trumpian President who adds more categories of hashes in a place like America.

What Apple is not willing to do is have that philosophical and ethical discussion with its customers. It has simply made the decision.


> I’ve come to assume all major cloud hosting providers implement something like this

They have, for the past decade.

>The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)

https://www.nbcnews.com/technolog/your-cloud-drive-really-pr...
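The matching the article describes can be sketched roughly as: the provider computes a signature for each file and checks it against the set of known signatures. This is a simplification; the real PhotoDNA system uses perceptual hashes that tolerate resizing and re-encoding, whereas the cryptographic SHA-256 used here only matches byte-identical files.

```python
# Rough sketch of provider-side signature matching. SHA-256 stands in for
# PhotoDNA's perceptual hash purely for simplicity; the sample bytes are
# invented placeholders.

import hashlib

known_signatures = set()  # would be populated from the NCMEC-provided list

def signature(data: bytes) -> str:
    """Compute a hex signature for a file's contents."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Check an uploaded file against the known-signature set."""
    return signature(data) in known_signatures

known_signatures.add(signature(b"example-flagged-bytes"))
print(is_known(b"example-flagged-bytes"))  # True
print(is_known(b"other-bytes"))            # False
```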


> but Apple has said that the scanning is done on-device

Yes, but only for images being uploaded to iCloud.

> meaning whether or not you upload your photo library to iCloud, your local photos will be scanned

Not in the currently proposed implementation, if I understand correctly.

Disclaimer: All my information about this thing is from news articles; I might be misunderstanding the details.


So child molesters will disable iCloud upload, and innocent people will have their privacy violated?


That is one plausible outcome here, yes.



