
Apple privacy letter (Law enforcement through your phone)


NiHaoMike:
Someone just needs to make a viral picture (completely legal) that triggers false positives, and the system will get DDoSed in no time.

rstofer:
You have to wonder what the DOJ has on Apple that made them come up with this idea.  Were they going to break up Apple under some 'monopoly' statute?  Maybe force them into 'right to repair' on steroids?

tooki:
While I very much share the concerns about oppressive governments legislating that this technology be used for other things, many commenters seem to be confused about what it does and does not do, technically speaking.

Nothing in the technical documentation even hints at the ability for Apple or law enforcement to decrypt the user’s devices, nor at the system forwarding the offending images themselves. Remember the context: Apple has always been able and willing to share unencrypted data that’s on iCloud when subpoenaed to do so, and I don’t think they’ve ever claimed that iCloud Photos is end-to-end encrypted. This thing seems to scan photos on the user’s device at the same time they’re uploaded to iCloud. The entire process seems to operate on the hashes themselves.

As for the sexting filter for kids, it’s kinda strange to me, insofar as I don’t really believe in empowering helicopter parents. But it is done entirely on-device, and no part of it involves breaking iMessage’s end-to-end encryption, nor forwarding the images themselves to parents or anyone else. (On the other hand, as a consenting adult, I actually wouldn’t mind the app obscuring incoming nudie pics initially, so as to not have any embarrassing surprises on the bus or at the bar.)
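
To make that flow concrete, here’s a minimal sketch of hash-list matching at upload time. Everything in it is a hypothetical illustration rather than Apple’s actual code: the real system reportedly uses a neural-network-derived perceptual hash (“NeuralHash”) plus a private set intersection protocol, so neither the device nor Apple directly learns individual match results the way this toy version does.

--- Code: ---# Hypothetical sketch of on-device hash matching at iCloud upload time.
# None of these names are Apple's; a cryptographic digest stands in for
# the perceptual hash only to keep the example self-contained.
import hashlib

# Database of known-bad image hashes, shipped to the device
# (in the real system, reportedly in blinded form).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def safety_voucher(image_bytes: bytes) -> dict:
    """Runs as a photo is queued for upload. The photo itself uploads
    exactly as before; only this small voucher about the match result
    travels alongside it -- never a decrypted image."""
    h = image_hash(image_bytes)
    return {"hash": h, "matched": h in KNOWN_HASHES}

print(safety_voucher(b"raw bytes of some photo"))
--- End code ---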

rstofer:

--- Quote from: tooki on August 07, 2021, 03:39:41 pm ---This thing seems to scan photos on the user’s device at the same time they’re uploaded to iCloud. The entire process seems to operate on the hashes themselves.

--- End quote ---

Sounds like 'theft of services' to me!  I own those compute cycles; I paid for them, and I don't want them stolen for something like this.

OK, I looked it up, and knowingly using computer services without permission is a crime in California. See Penal Code Section 502(c)(3) here:

https://leginfo.legislature.ca.gov/faces/codes_displaySection.xhtml?sectionNum=502.&lawCode=PEN


--- Quote ---(3) Knowingly and without permission uses or causes to be used computer services.

--- End quote ---

Lawsuits to follow...

bd139:

--- Quote from: tooki on August 07, 2021, 03:39:41 pm ---While I very much share the concerns about oppressive governments legislating that this technology be used for other things, many commenters seem to be confused about what it does and does not do, technically speaking.

Nothing in the technical documentation even hints at the ability for Apple or law enforcement to decrypt the user’s devices, nor at the system forwarding the offending images themselves. Remember the context: Apple has always been able and willing to share unencrypted data that’s on iCloud when subpoenaed to do so, and I don’t think they’ve ever claimed that iCloud Photos is end-to-end encrypted. This thing seems to scan photos on the user’s device at the same time they’re uploaded to iCloud. The entire process seems to operate on the hashes themselves.

As for the sexting filter for kids, it’s kinda strange to me, insofar as I don’t really believe in empowering helicopter parents. But it is done entirely on-device, and no part of it involves breaking iMessage’s end-to-end encryption, nor forwarding the images themselves to parents or anyone else. (On the other hand, as a consenting adult, I actually wouldn’t mind the app obscuring incoming nudie pics initially, so as to not have any embarrassing surprises on the bus or at the bar.)

--- End quote ---

Actually the terms state that after some number of matches (the threshold is poorly defined), they will manually review the images. That means the images will be decrypted at that point, and it will be an Apple staff member doing it.

As for the hashes, I get the feeling people don’t know what these are. These are not binary-level hashes of the image files but perceptual hashes of the content. They are easy to attack and to generate collisions against. See: https://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277

I’ve attached the key image from the article, which shows how easy it is to generate collisions.

The article says they should not be used for privacy-sensitive applications, and the authors are right.
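
To make the difference concrete, here’s a toy “average hash” (aHash), one of the simplest perceptual hashes. This is a generic illustration of the technique, not the hash Apple uses, and “photo.jpg” is just any local test image:

--- Code: ---# Toy average hash (aHash): visually similar images get similar hashes,
# which is exactly why perceptual hashes behave nothing like SHA-256.
import hashlib
from PIL import Image, ImageEnhance

def average_hash(img: Image.Image, size: int = 8) -> int:
    # Downscale to 8x8 grayscale; each bit is 1 if that pixel is
    # brighter than the mean. Small edits flip few or no bits.
    small = img.convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

img = Image.open("photo.jpg")                        # any test image
tweaked = ImageEnhance.Brightness(img).enhance(1.1)  # 10% brighter

h1, h2 = average_hash(img), average_hash(tweaked)
print(f"perceptual hashes differ in {bin(h1 ^ h2).count('1')} of 64 bits")

# The cryptographic digests of the same two images share nothing:
print(hashlib.sha256(img.tobytes()).hexdigest()[:16])
print(hashlib.sha256(tweaked.tobytes()).hexdigest()[:16])
--- End code ---

Because similar inputs land on nearby hashes by design, an attacker can run that property in reverse: perturb an innocuous image until its hash lands on a chosen target value, which is the GAN-based collision attack the article demonstrates.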

Edit: what’s worrying is the assertion that only 1 in 1 trillion images causes a false match. What’s their corpus? Where are the public proofs? Where’s the source code for independent validation?
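
For what it’s worth, the claim was reportedly one in a trillion falsely flagged accounts per year, not per image, and even that can’t be checked from outside. A back-of-envelope binomial model shows how completely the answer hinges on the unpublished per-image false-match rate and review threshold; every number below is an assumption for illustration, not Apple’s:

--- Code: ---# Probability an innocent account accumulates enough false matches to
# be flagged for review. All inputs are assumptions; none are published.
from math import comb

p = 1e-6         # assumed per-image false-match probability
n = 10_000       # assumed photos uploaded per account per year
threshold = 30   # assumed match count before human review

# Binomial tail P(matches >= threshold), assuming independent matches
# (itself a strong assumption). Terms far past the threshold are
# negligible, so a short partial sum suffices.
p_flagged = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                for k in range(threshold, threshold + 50))
print(f"P(account falsely flagged per year) ~ {p_flagged:.3e}")
--- End code ---

With these made-up inputs the result is astronomically small, but raise p to 1e-3 and it grows by dozens of orders of magnitude. That sensitivity is exactly why the corpus, the threshold, and the methodology matter.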
