BoF sees no merit in European Commission bill against child pornography

Bits of Freedom opposes the child sexual exploitation bill that the European Commission is expected to announce this week. The civil rights group fears that conversations will no longer be confidential. It has therefore sent a letter to the EU's executive body, requesting that parts of the proposal be deleted.

So writes Rejo Zenger, policy advisor at Bits of Freedom, in an opinion piece.

Legislation must be ‘effective and sustainable’

The sexual exploitation of children and the distribution of child pornographic material are major problems that need to be curbed; the civil rights group is firmly convinced of this. What matters in this battle is that the legislation is “effective and sustainable”. That means policymakers should not draft laws and regulations that try to prevent abuse at all costs.

That’s exactly what Bits of Freedom is afraid of. The European Commission is likely to present a bill this week to combat the distribution of child pornography and tackle the sexual exploitation of children.

One of the measures in the proposal is that platforms such as WhatsApp and Telegram will be forced to scan all of their users' chats, a technique known as client-side device scanning. If they find anything suspicious, they must report it to the police.
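The general shape of such scanning can be sketched as follows. This is purely an illustration of the concept, not the Commission's actual design: real deployments typically use perceptual hashing against a reference database of known material, which is far more sophisticated than the exact-match toy below. All names and hashes here are hypothetical.

```python
import hashlib

# Hypothetical blocklist of digests a provider would receive from a
# reference database of known illegal material (illustrative only).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-flagged-content").hexdigest(),
}

def scan_before_send(attachment: bytes) -> bool:
    """Run on the user's device BEFORE encryption: return True if the
    content matches the blocklist and would therefore be reported."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_BAD_HASHES

print(scan_before_send(b"example-flagged-content"))  # True: would be reported
print(scan_before_send(b"holiday photo"))            # False: sent as normal
```

The crucial point for the privacy debate is where this check runs: on the endpoint, before any encryption is applied, so every message is inspected regardless of whether it is ever suspected of anything.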

Why BoF sees no merit in the European Commission’s plans

Zenger emphasizes that the European Commission’s intentions are noble. At the same time, he thinks the plan is a bad idea, for three reasons.

Firstly, there is no independent research into the effectiveness of the type of technology the Commission envisages. “The figures available are vague and come from the makers of the technology themselves, the Dutch ‘WC-Eend’ trope of a vendor vouching for its own product. That is undesirable, because it means we are investing in an unproven technology, with all the associated risks,” says Zenger.

The Bits of Freedom policy advisor calls this constant monitoring over the shoulders of users a general monitoring obligation. “That is against European rules, and sooner or later European judges will declare such a law invalid. That makes such a measure anything but sustainable.”

Finally, client-side device scanning undermines end-to-end encryption. End-to-end encryption ensures that only the sender and the receiver can read a message. Governments, security services, police and other investigative authorities cannot read it, and neither can the tech companies that offer these chat applications.
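That property can be sketched with a deliberately simplified model. The toy XOR pad below is not a real cipher and nothing like the protocols actual messengers use (e.g. the Signal protocol); it only illustrates the key point: the secret exists solely on the two endpoints, so the relaying server only ever handles unreadable ciphertext.

```python
import secrets

# Toy end-to-end encryption model (NOT secure in practice): a one-time
# XOR pad shared only between the two endpoints. The server in the
# middle relays ciphertext but never holds the key.

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    assert len(key) >= len(plaintext), "pad must cover the message"
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

shared_key = secrets.token_bytes(64)  # agreed between sender and receiver only
message = b"meet at noon"

ciphertext = encrypt(shared_key, message)  # this is all the server ever sees
plaintext = decrypt(shared_key, ciphertext)  # only an endpoint can do this
assert plaintext == message
```

Client-side scanning does not break this math; it sidesteps it by reading the message on the device before `encrypt` is ever called, which is why critics argue it hollows out the guarantee in practice.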

‘Breach of the right to privacy can never be proportional’

In his letter to the European Commission (PDF), Zenger writes that a general monitoring obligation is anything but sustainable and effective. “The infringement of the right to privacy can never be proportionate. That is all the more relevant at a time when every aspect of our lives is becoming increasingly digital and the importance of a secure digital infrastructure is only growing. The social costs of the measures in the proposal are therefore enormous. In our democratic society, it is not appropriate to take measures at all costs.”

He therefore calls on the European Commission to withdraw parts of the bill. That is not to say the Commission should sit on its hands. “Tackling the problem requires a broader view. The European legislator can focus on streamlining cross-border criminal investigations, increasing investigative capacity, sharing knowledge and skills more effectively, and taking more preventive measures that make young children more resilient. These kinds of solutions are uncontroversial and at least as effective,” says Bits of Freedom’s policy advisor.

EOKM receives hundreds of thousands of reports of online child abuse

The Online Child Abuse Expertise Agency (EOKM) recently showed that the sexual exploitation of children on the Internet is a major problem. In its annual report, the organization wrote that it received more than 400,000 reports of (possible) child abuse last year. More than half of these reports (57 per cent) came from the Netherlands.

Half of the reported material was found in the cloud, which the EOKM calls a new and worrying trend. Under Dutch law, a cloud environment is a private environment, so the agency’s analysts are not allowed to assess images stored there. What the organization can do is pass the URLs on to the hosting company; it is then up to that party to remove any illegal material.
