OK. This is quite technical. But TL;DR: the NeuralHash system #Apple uses for their CSAM detection has met its first possible collision, found by some good hackers. This dog might be flagged by the system as suspicious. Ouch. Issue 1 at github.com/AsuharietYgvar/Appl

cc @aral FYI. This dog would be classified as possible CSAM by the Apple NeuralHash system. Do NOT save this on your iDevice.

@jwildeboer @aral As far as I understand the issue, only the hashes of the two pictures in the issue collide; there is no information on how they would be classified against the CSAM database.
According to Reddit, the hashes (of every picture on the device) are checked on Apple's servers only, so there is no way to know whether a hash is actually in the DB.
But yes, once the hashes are public, one could generate colliding pictures quite easily.
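To illustrate why a collision matters here: systems like this compare perceptual hashes for exact equality, so two visually unrelated images that produce the same hash value are indistinguishable to the matcher. A minimal sketch (the hash values and function name below are hypothetical, not Apple's actual implementation):

```python
# Hypothetical sketch: hash-based matching cannot tell a crafted
# collision apart from a genuine match, because only the hash
# values are compared, never the images themselves.

def hashes_match(hash_a: bytes, hash_b: bytes) -> bool:
    # Exact-equality comparison on the hash bytes.
    return hash_a == hash_b

# Placeholder values standing in for a real image's hash and a
# crafted second image engineered to reproduce the same hash:
original_hash = bytes.fromhex("00112233445566778899aabb")
crafted_hash = bytes.fromhex("00112233445566778899aabb")

print(hashes_match(original_hash, crafted_hash))  # True: a collision
```

The point of the thread is exactly this: once the hash function and target hashes are known, an attacker can search for an innocuous-looking image whose hash equals a target, and the matcher above cannot tell the difference.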


@jwildeboer @aral Also, I always wondered why Apple said they would only alert if there are a few matching files... which makes sense if they already knew there would be hash collisions.
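The thresholding idea above can be sketched in a few lines: a single match (which might be an accidental or crafted collision) triggers nothing, and only a count above some cutoff raises an alert. The threshold value and function name here are illustrative assumptions, not Apple's published parameters:

```python
# Hypothetical sketch of threshold-based alerting: isolated hash
# matches are tolerated as possible collisions; only a run of
# matches above the threshold flags an account for review.

MATCH_THRESHOLD = 30  # assumed value, for illustration only

def should_alert(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    # Alert only once the number of matching files reaches the cutoff.
    return match_count >= threshold

print(should_alert(1))   # False: one collision alone is ignored
print(should_alert(30))  # True: repeated matches cross the threshold
```

A threshold like this reduces the false-positive rate from random or adversarial collisions, which is consistent with the observation in the post: the design makes the most sense if collisions were anticipated from the start.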
