
Controversial EU Proposal to Scan Private Messages for Child Abuse Material Could Result in Millions of False Positives, Warn Experts


A proposal by the European Union (EU) to require messaging platforms to scan private communications for child sexual abuse material (CSAM) has faced significant backlash from security and privacy experts. In an open letter, over 270 experts, including prominent academics and researchers, expressed concerns about the proposal’s potential for millions of false positives per day. They argue that the EU’s plan is technologically impossible and will jeopardize internet security and user privacy.

The EU’s proposal not only mandates scanning for known CSAM but also requires the use of unspecified detection technologies to identify unknown CSAM and grooming activity in real time. Critics argue that this approach is unrealistic and relies on unproven technologies, such as client-side scanning. They emphasize that there is currently no technology capable of achieving the proposed goals without causing more harm than good.

The European Council recently proposed amendments to the draft CSAM-scanning regulation, but the experts argue that these revisions still fail to address fundamental flaws in the plan. They claim that the amendments would give unprecedented surveillance and control powers to internet platforms, undermining a secure digital future and democratic processes. The experts warn that the proposed changes would also compromise communications and systems security.

One of the main concerns raised by the experts is the high likelihood of false positives. Given the vast number of messages sent on platforms like WhatsApp (approximately 140 billion per day), even a seemingly low false positive rate of 0.1% would produce on the order of 140 million false alarms every day. The experts argue that the proposed revisions do not effectively reduce the sharing of CSAM and would indiscriminately affect a massive number of people.
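The scale argument above is simple arithmetic, and it can be checked directly. The sketch below uses the figures cited in the article (140 billion daily WhatsApp messages, a 0.1% false positive rate); the exact rate of any real detection system is unknown and would vary.

```python
# Back-of-envelope estimate of daily false positives from message scanning.
# Both inputs are the illustrative figures cited in the article, not
# measured properties of any actual detection system.
daily_messages = 140_000_000_000   # ~140 billion messages/day on WhatsApp
false_positive_rate = 0.001        # 0.1% false positive rate

false_positives_per_day = daily_messages * false_positive_rate
print(f"{false_positives_per_day:,.0f} false alarms per day")
# → 140,000,000 false alarms per day
```

Even driving the rate down tenfold, to 0.01%, would still leave some 14 million false alarms daily, each one a private message flagged for review.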

The letter also highlights the contradiction between protecting encryption and enabling detection capabilities. The experts emphasize that detection in end-to-end encrypted services undermines the protection provided by encryption. They argue that enabling detection capabilities violates the confidentiality of communication and sets a dangerous precedent for filtering the internet.

While police chiefs in Europe have called for platforms to design security systems that can identify illegal activity, they have not specified the technical solutions they propose. This raises questions about the feasibility of achieving “lawful access” without compromising encryption.

If the EU continues with its current proposal, the consequences could be catastrophic, according to the experts. It would not only change how digital services are used but also have a chilling effect on individuals’ right to a private life in the digital space. The experts warn that the proposal sets a precedent for internet filtering and could negatively affect democracies worldwide.

The EU is set to discuss the proposal for a regulation to combat child sexual abuse in a working party meeting on May 8.
