Tooki: privacy can only be reasoned about as a worst-case outcome from any technical decision. This is actually my day job: protecting people’s financial data. Regardless of the facts or technical merit of the solution, the point is that only the whole concept can be qualified, not just the technical aspects.
And how is one supposed to qualify the whole concept without thoroughly understanding and qualifying the technical aspects, too?!? A great "whole concept" can be turned to complete shit with a poor technical implementation, after all.
But again, I wasn't claiming to evaluate the entire concept; I was addressing a specific technical error in your representation of the process.
That’s probably where we got our wires crossed, so apologies if you felt I was stepping on your toes there.
Well, I just feel you're trying to correct me on things I already know, and that you would know that I already know, had you read my posts carefully. To wit:
As for recovering the data, Apple can already decrypt your iCloud Photos content; check the terms and conditions.
In my original post here, I literally said:
Remember the context: Apple has always been able and willing to share unencrypted data that’s on iCloud when subpoenaed to do so, and I don’t think they’ve ever claimed that iCloud Photos is end-to-end encrypted.
And Gruber, linked in another reply of mine, confirms that iCloud photos and files are not end-to-end encrypted.
He says:
Which in turn makes me wonder if Apple sees this initiative as a necessary first step toward providing end-to-end encryption for iCloud Photo Library and iCloud device backups. Apple has long encrypted all iCloud data that can be encrypted, both in transit and on server, but device backups, photos, and iCloud Drive are among the things that are not end-to-end encrypted. Apple has the keys to decrypt them, and can be compelled to do so by law enforcement.
That's after explaining that everyone else does their CSAM scanning server-side, which becomes literally impossible to do with end-to-end encryption.
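To spell out why: a server-side scanner needs the actual photo bytes so it can compute a perceptual hash and compare it against the known-CSAM hash list. With end-to-end encryption the server only ever holds ciphertext, so there's nothing meaningful to hash. A toy sketch in Python (names are my own placeholders; real systems use PhotoDNA/NeuralHash-style perceptual hashes, not SHA-256):

    import hashlib

    # Placeholder for the hash list providers get from NCMEC and similar bodies.
    KNOWN_CSAM_HASHES: set[str] = set()

    def server_side_scan(photo_bytes: bytes) -> bool:
        # Only meaningful if the server can read the actual photo contents.
        return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_CSAM_HASHES

    # With end-to-end encryption the server holds only ciphertext, and a hash of
    # ciphertext never matches a hash computed from the original image, so this
    # check is useless there.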
They will do that under existing rules, not as a technical outcome of this. Their process would be to use this as a basis to decrypt the rest.
Have you not read the documentation? It clearly states that Apple's verification stops at the contents of the vouchers. If they confirm it, then they disable the account and send a report to law enforcement. That's it. One can infer that at that point, the investigation continues forward just as if the clues had come from any other source: subpoenas for all the stuff on their servers, and a subpoena to the suspect to inspect their devices.
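To make that pipeline concrete, here's a rough Python sketch of the flow as the technical summary describes it. The names and the threshold value are my own placeholders; the real system uses NeuralHash, private set intersection, and threshold secret sharing rather than anything this simple:

    from dataclasses import dataclass

    MATCH_THRESHOLD = 30  # placeholder; Apple only commits to "a threshold" of matches

    @dataclass
    class SafetyVoucher:
        matched: bool             # on-device hash matched the known-CSAM database
        visual_derivative: bytes  # low-res payload, readable only past the threshold

    def review_account(vouchers: list[SafetyVoucher], reviewer_confirms) -> str:
        matches = [v for v in vouchers if v.matched]
        if len(matches) < MATCH_THRESHOLD:
            # Below the threshold, Apple cannot decrypt the voucher contents at all.
            return "no action"
        # Human review sees only the voucher contents (the visual derivatives).
        if all(reviewer_confirms(v.visual_derivative) for v in matches):
            return "disable account and file report"
        return "no action"

Everything after that report (the subpoenas, the device searches) happens through the ordinary legal process, outside this pipeline.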
The smartphone is far harder to get rid of, and that will take a few weeks of careful unpicking. I've had my eye on a Nikon DSLR for a few weeks now, so that bit is already solved. The biggest loss for me would be Apple Music, as that's actually fairly good. Everything else is replaceable or I can live without. I need to migrate my 2FA stuff over to YubiKeys. I've got a cheap Garmin eTrex 10 and paper maps for outdoors nav, and a Casio F91W to tell the time. Quite frankly I probably don't need the Internet or phone comms most of the time and spend a lot of my dead time spamming on here when I should be reading a book or something. I may just switch to a dumb phone for the sake of on-call requirements.
That's an awful lot of pain to accomplish (at most!) what you could have achieved by just disabling iCloud Photos, which you can do without disabling any other iCloud features. (The sexting filter in the Messages app isn't applicable to you, since it's an opt-in feature only for accounts belonging to kids 12 and under AND within an iCloud family account.)
I say "at most" because of the thing that everyone here is ignoring: Apple is playing catch-up regarding CSAM. Everyone else (FB, Google, etc) has been doing it at large scale for years. Unless you're operating
all the servers that run the services you need, you aren't escaping anything.