Apple will use an on-device matching technology that relies on a database of known child sexual abuse material (CSAM) image hashes provided by NCMEC and other child safety organizations. Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
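At its simplest, the on-device step is a set-membership check: hash the image and test whether that hash appears in the known database. The sketch below is purely illustrative; Apple's real system uses a perceptual hash ("NeuralHash") that tolerates resizing and recompression, whereas this toy uses SHA-256 and all names are hypothetical.

```python
import hashlib

# Hypothetical database of known hashes (illustrative; the real system
# uses perceptual NeuralHash values, not cryptographic SHA-256 digests).
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_known_hash(b"known-flagged-image-bytes"))  # True
print(matches_known_hash(b"ordinary-holiday-photo"))     # False
```

A cryptographic hash only matches byte-identical files, which is exactly why a perceptual hash is used in practice: it maps visually similar images to the same value.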
It uses a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos together with the image.
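The core idea of private set intersection can be sketched with a classic Diffie-Hellman-style double-blinding trick: each party exponentiates hashed elements with its own secret key, and because exponentiation commutes, doubly-blinded values are equal exactly when the underlying elements match, without either side seeing the other's raw set. This is a toy sketch under simplified assumptions, not Apple's actual protocol; the parameters and names are hypothetical, and the modulus is far too small for real security.

```python
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; toy-sized, not a secure parameter

def h(x: bytes) -> int:
    # Hash an element into the multiplicative group mod P.
    return int.from_bytes(hashlib.sha256(x).digest(), "big") % P

def blind(values, key):
    # (v**a)**b == (v**b)**a mod P, so the order of blinding is irrelevant:
    # doubly-blinded values collide iff the original elements were equal.
    return {pow(v, key, P) for v in values}

# Server holds the known-hash set; the client holds one image hash.
server_key = secrets.randbelow(P - 2) + 1
client_key = secrets.randbelow(P - 2) + 1

server_set = {h(b"known-image-1"), h(b"known-image-2")}
client_elem = h(b"known-image-1")

# Each side blinds with its own key, then the other side re-blinds.
server_twice = blind(blind(server_set, server_key), client_key)
client_twice = pow(pow(client_elem, client_key, P), server_key, P)

print(client_twice in server_twice)  # True: the element is in the set
```

In the deployed system the result of this comparison is itself hidden from both the device and the server inside the safety voucher; neither learns whether an individual image matched.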
Using another technology called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. Only when that threshold is exceeded does the cryptography allow Apple to interpret the contents of the safety vouchers associated with the matching images. Apple then manually reviews each report to confirm the match, disables the user's account, and sends a report to NCMEC.
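The threshold property typically comes from a scheme like Shamir's secret sharing: a secret is split into n shares such that any t of them reconstruct it, while fewer than t reveal nothing. The sketch below is a minimal illustration of that mechanism, assuming a toy field size; Apple's construction layers it differently (one share per matching voucher, so the decryption key only becomes recoverable once enough matches accumulate).

```python
import secrets

PRIME = 2**61 - 1  # toy field modulus for the demonstration

def make_shares(secret: int, threshold: int, n: int):
    # Random polynomial of degree threshold-1 whose constant term is the
    # secret; each share is a point (x, f(x)) on that polynomial.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat).
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

shares = make_shares(secret=123456, threshold=3, n=5)
print(reconstruct(shares[:3]))  # 123456: any 3 of the 5 shares suffice
print(reconstruct(shares[:2]) == 123456)  # almost surely False with only 2
```

With fewer than `threshold` shares, every candidate secret is equally consistent with the observed points, which is what keeps sub-threshold vouchers unreadable.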