Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
The files WILL be scanned the moment they leave your device for any major cloud provider.
There are services with E2E encryption, and for those that don't offer it you can encrypt your files locally before uploading.
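For a cloud without E2E, a minimal sketch of that client-side encryption step, using Python's `cryptography` package (the file names are just placeholders):

```python
from cryptography.fernet import Fernet

# Generate a key once and keep it only on your own devices;
# the provider never sees it, so it can't scan the contents.
key = Fernet.generate_key()

with open("photo.jpg", "rb") as f:
    plaintext = f.read()

# Encrypt locally, then upload only the ciphertext.
ciphertext = Fernet(key).encrypt(plaintext)

with open("photo.jpg.enc", "wb") as f:
    f.write(ciphertext)
```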
Realistically speaking, if this were implemented, anybody with CSAM would simply stop using iPhones, and the scanning would fall entirely on everyone else.
Then, once it's implemented and with less fanfare, some authoritarian regimes (I won't name any so as not to upset the tankies) could ask Apple to scan for other material too… And since it's closed source, we wouldn't even know whether the models differ by country.