Apple devices will scan photos for child porn

Apple will check on iPhones and iPads whether photos uploaded to iCloud depict child abuse. With the help of hashes and a database stored on the devices, Apple wants to help fight child sexual abuse material (CSAM). In addition, Messages will warn when someone shares ‘sensitive content’ such as nude photos. Finally, Siri and the search function will also show a warning for search terms related to child abuse.

Apple indicates that CSAM detection will arrive with the release of iOS 15 and iPadOS 15. The functionality will initially be available only in the United States, but may be rolled out worldwide later.

Apple has published a detailed explanation of the technical implementation of the CSAM detection feature. In short, it works like this. Organizations such as the National Center for Missing and Exploited Children provide Apple with a database of child abuse images. Apple then uses NeuralHash to assign a so-called hash value to a photo and to variations on that photo (such as black-and-white or scaled-down versions). In effect, the photo is converted into a unique identification code. This code is then further encrypted by Apple, so that only Apple can read the original hash. This prevents users from learning anything about the original photos in the database.
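
NeuralHash itself is not public, but the core idea of a perceptual hash can be sketched with a much simpler stand-in. The `average_hash` function below is a hypothetical toy, not Apple's algorithm: it only illustrates how visually similar variants of an image can end up with the same identification code, whereas a byte-for-byte checksum of the file would not.

```python
# Toy illustration of perceptual hashing (NOT Apple's NeuralHash).
# Each bit records whether a pixel is brighter than the image average,
# a property that survives simple variations such as uniform darkening.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p >= mean else "0" for p in flat)

# Two "versions" of the same tiny 2x2 image: the second is uniformly darker,
# yet both produce the same hash, unlike a cryptographic hash of the raw bytes.
original = [[200, 40], [180, 60]]
darker   = [[150, 30], [135, 45]]

assert average_hash(original) == average_hash(darker)
print(average_hash(original))  # "1010"
```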

The iPad or iPhone also uses NeuralHash to convert the photos on the device into hashes. The device can then compare each photo's hash with the hashes from the database. If there is a match, the photo is given certain cryptographic information; without this information, Apple's server cannot decrypt the image after it is uploaded. According to Apple, images without a match are not analyzed further.
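
The data flow described above can be modeled very roughly as follows. This is a simplified sketch, not Apple's protocol: in the real system the hashes are blinded and matched with a private set intersection scheme, so the device itself never learns whether a photo matched. The names `KNOWN_CSAM_HASHES` and `make_safety_voucher` are illustrative inventions.

```python
import os

# Simplified model of the matching flow: only matching photos are
# accompanied by key material that the server could later use.

KNOWN_CSAM_HASHES = {"a1b2c3", "d4e5f6"}   # stand-in for the on-device database

def make_safety_voucher(photo_hash, image_key):
    """Attach cryptographic material only for matching photos (simplified)."""
    if photo_hash in KNOWN_CSAM_HASHES:
        # The voucher carries the information the server needs to later
        # decrypt this image (and only once enough matches accumulate).
        return {"hash": photo_hash, "key_material": image_key}
    # Non-matching photos ship no usable key material, so the server
    # cannot analyze them further after upload.
    return {"hash": photo_hash, "key_material": None}

photo_hash = "a1b2c3"                       # would come from NeuralHash on-device
voucher = make_safety_voucher(photo_hash, os.urandom(16).hex())
print(voucher["key_material"] is not None)  # True only for a match
```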

Apple has built in two safeguards against false positives: a minimum number of matches (threshold secret sharing) and a manual review before a child abuse report is filed.
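
The first safeguard can be illustrated with a textbook Shamir threshold scheme: a secret (here, think of the key needed to inspect the matching images) is split into shares, and fewer shares than the threshold reveal nothing useful. This is a generic construction for illustration, not Apple's exact implementation; the function names and parameters below are chosen for the example.

```python
import random

# Textbook Shamir threshold secret sharing: the secret is recoverable
# only once at least `threshold` shares exist, mirroring the idea that
# no image can be decrypted until enough matches have accumulated.

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret

def split_secret(secret, threshold, num_shares):
    """Split `secret` into points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; correct only with >= threshold shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

secret = 123456789
shares = split_secret(secret, threshold=3, num_shares=5)
print(reconstruct(shares[:3]) == secret)  # True: 3 shares suffice
print(reconstruct(shares[:2]) == secret)  # False: 2 shares reveal nothing useful
```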

Tracking down and prosecuting criminals who create and distribute child pornography is, of course, a noble goal that no one objects to. Yet this system has consequences that go beyond catching criminals. This kind of technology puts pressure on the integrity of encryption. As cryptographer Matthew Green puts it in his tweet, “Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems”. Both the Dutch and American governments have long been asking for a way to view communications protected with end-to-end encryption (such as WhatsApp). Well, you probably don’t have much to fear from the Dutch government. However, the evolution of this technology could endanger the lives of dissidents, journalists and opposition members under an authoritarian regime. The announcement has drawn similar criticism from various quarters.
