New investigation method developed in the fight against child pornography

Computer scientists have devised a new technique for analyzing image and video material in child abuse cases. Simply by studying sensor noise, it is possible to link photos and videos to the camera that recorded them. International investigative organizations have already shown interest.

This is how the new detection method works

Photos and videos are never perfect: they always contain specks or some other form of noise. This noise is not always visible to the naked eye, but it is there, and it stems from the production process of the image sensor.

These small imperfections in the image sensor are unique. If you know a sensor's noise pattern, you can link photos and videos to it. Compare it to ballistics: each firearm leaves a distinctive pattern on a bullet when it is fired. Using that pattern, forensic experts can link a bullet found at a crime scene to a specific weapon.
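The general idea behind this kind of sensor-noise matching can be sketched in a few lines. The code below is a minimal, illustrative toy (it is not Bennabhaktula's machine learning model): it estimates each image's noise as the residual left after simple local-mean denoising, averages residuals from many photos to build a camera "fingerprint", and then correlates a new photo's residual against that fingerprint. The noise amplitude and the synthetic data are assumptions chosen to make the demo visible.

```python
import numpy as np

def noise_residual(img, k=3):
    """Estimate sensor noise: image minus a local-mean (denoised) copy."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    smoothed = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            smoothed += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return img - smoothed / (k * k)

def camera_fingerprint(images):
    """Average many residuals so scene content cancels out and the
    camera's fixed noise pattern remains."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def correlation(a, b):
    """Normalized cross-correlation between two noise patterns."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
# Two simulated "cameras", each with its own fixed sensor-noise pattern
# (noise amplitude exaggerated for the demo).
pattern_a = rng.normal(0, 4, (64, 64))
pattern_b = rng.normal(0, 4, (64, 64))

def shoot(pattern, n):
    # Every photo = random scene content + that camera's fixed noise pattern.
    return [rng.normal(128, 20, (64, 64)) + pattern for _ in range(n)]

fp_a = camera_fingerprint(shoot(pattern_a, 20))
fp_b = camera_fingerprint(shoot(pattern_b, 20))
query = shoot(pattern_a, 1)[0]          # a new photo taken with camera A

print(correlation(noise_residual(query), fp_a))  # clearly higher: same sensor
print(correlation(noise_residual(query), fp_b))  # near zero: different sensor
```

Real forensic systems use far more sophisticated denoising and statistical tests, but the principle is the same: the fingerprint survives because it is the only component shared by every photo from one camera.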

Much interest in the Dutch investigation model

Guru Bennabhaktula, a PhD student at the University of Groningen and the Spanish University of León, developed a machine learning model for his PhD research that extracts and analyzes the noise in image and video material. His model can serve as a new tool for investigative and enforcement authorities.

For example, if officers find a camera on a suspect, detectives can investigate whether the device was used to record child abuse material. Its images and videos can then be linked to photos and videos that have been shared and distributed on the internet or the dark web.

Bennabhaktula says Interpol has already expressed interest. “We also gave a presentation to the Netherlands Forensic Institute; hopefully a collaboration will come out of it,” he tells RTL Nieuws.

Apple joins the fight against child porn

Last year, Apple introduced NeuralHash, scanning software that scans images as they are synced to a user's iCloud account. The software compares a special code, or hash, of each image against hashes supplied by the National Center for Missing and Exploited Children (NCMEC). This also makes it possible to recognize modified photos.
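Apple has not published the full internals of NeuralHash, which is based on a neural network, but the underlying idea of a perceptual hash can be illustrated with a much simpler, classic technique: the "average hash". Unlike a cryptographic hash, a perceptual hash changes little when an image is only slightly modified, which is why edited copies can still be recognized. The sketch below is an assumption-laden toy, not Apple's algorithm.

```python
import numpy as np

def average_hash(img, size=8):
    """Perceptual hash: block-average the image down to size x size,
    then set each bit to whether the block is above the overall mean."""
    h, w = img.shape
    blocks = img[:h - h % size, :w - w % size]
    blocks = blocks.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()   # 64 bits for size=8

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return int(np.count_nonzero(h1 != h2))

rng = np.random.default_rng(1)
original = rng.normal(128, 40, (64, 64))
brightened = original + 10               # a simple modification
unrelated = rng.normal(128, 40, (64, 64))

print(hamming(average_hash(original), average_hash(brightened)))  # → 0
print(hamming(average_hash(original), average_hash(unrelated)))   # roughly half the bits differ
```

A match is declared when the Hamming distance falls below some threshold, so a brightened or lightly cropped copy still hits, while unrelated images do not.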

To prevent the software from producing false positives, Apple built in two safeguards. First, an employee manually reviews every match. In addition, alarm bells only go off after at least thirty hits; Apple calls the latter mechanism Threshold Secret Sharing.
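The thirty-hit rule can be enforced cryptographically rather than by a simple counter. Classic threshold secret sharing (Shamir's scheme) hides a secret in a random polynomial so that any t shares reconstruct it, while t−1 shares reveal nothing. The toy below is a sketch of that general scheme under a small illustrative prime field; Apple's actual construction differs in its details.

```python
import random

P = 2**61 - 1  # a prime modulus; arithmetic happens in the field mod P

def make_shares(secret, t, n, rng=random.Random(42)):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):  # random degree-(t-1) polynomial with f(0) == secret
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

secret = 123456789
shares = make_shares(secret, t=30, n=100)    # e.g. one share per matching image
print(reconstruct(shares[:30]) == secret)    # 30 hits: secret recovered → True
print(reconstruct(shares[:29]) == secret)    # 29 hits: secret stays hidden → False
```

In this picture, each match releases one share; only once the thirtieth share arrives can the "alarm" secret be reconstructed, so nobody learns anything about accounts below the threshold.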

Critics make themselves heard

Although Apple’s intentions were good, the American tech company received a lot of criticism. Civil rights groups such as the Electronic Frontier Foundation (EFF) fear it is the first step towards a mass surveillance system. The organization tried to change Apple’s mind through an open letter.

Apple employees are also very concerned that the system could be abused by authoritarian regimes, for example, to monitor dissidents or impose government censorship. The technique could also be used to scan mobile devices for other types of material, such as pamphlets.

Finally, Arda Gerkens, director of the Expertise Agency Online Child Abuse (EOKM), questions Apple’s NeuralHash. She fears the tool will mean more work for detectives. “Imagine what tens of thousands of reports per year mean for the workload of the police. They have to conduct an investigation, search a house and confiscate data carriers. And it is not the case that a single detective handles all this,” says Gerkens.
