Chat Control 2.0: EU governments set to approve the end of private messaging and secure encryption
By making a minor concession, EU governments hope to find a majority next week to approve the controversial "chat control" bill. According to the proposed child sexual abuse regulation (CSAR), providers of messenger, e-mail and chat services would be forced to automatically search all private messages and photos for suspicious content and report it to the EU. To find a majority for this unprecedented mass surveillance, the EU Council Presidency proposed on Tuesday that the scanners would initially search only for previously classified CSAM, while the even less reliable technology for classifying unknown imagery or conversations would be reserved for a later stage. The proposed "deal" will be discussed by ambassadors tomorrow and could be adopted by ministers next week.
Patrick Breyer, Pirate Party Member of the European Parliament and co-negotiator of the proposal, warns of the consequences of such a "deal":
"Firstly, the proposed text would mandate building surveillance bugs and vulnerabilities into currently secure end-to-end encrypted messenger apps such as WhatsApp or Signal. It would mean the end of secure encryption, because we could never be sure whether our messages or photos were being forwarded to persons we don't know and can't trust. So-called client-side scanning would either make our communications fundamentally insecure, or Europeans would no longer be able to use WhatsApp or Signal at all, as their providers have contemplated withdrawing rather than complying.
Secondly, the proposed indiscriminate mass scanning of the private communications of millions of citizens not even remotely connected with crime would inevitably be struck down by the courts, utterly betraying the hopes of children and victims. All independent legal experts and even the EU Council's own legal service agree that indiscriminate content analysis fails to comply with fundamental rights and the jurisprudence of the EU Court of Justice. The disaster surrounding the failed data retention directive would repeat itself.
Thirdly, indiscriminate scanning mass-criminalizes our children: in Germany alone, 40% of criminal suspects for possession of CSAM are minors. Young people are usually not aware of the criminal nature of seemingly funny content, which they often receive inadvertently via chat channels.
Fourthly, scanning for known, and thus old, material does not help identify and rescue victims or prevent child sexual abuse. It will actually make safeguarding victims more difficult by pushing criminals to secure, decentralised communication channels that are impossible to intercept even with a warrant. Although some US corporations such as Meta already scan European messages for previously classified CSAM 'only', up to 80% of reported messages are classified by the police as not criminally relevant, thus implicating innocent citizens. The Commission estimates that the number of reported messages will multiply as a result of mandatory scanning, which would flood law enforcement and tie up resources that are already lacking for targeted or undercover investigations into the organised producers of such material and into ongoing child sexual abuse.
Fifthly, opening the door to indiscriminate surveillance will put us on a slippery slope, with Europol already calling for scanning for other types of content.
The proposed 'compromise' does not even touch upon other fundamental problems of the draft legislation, including the end of anonymous communications and whistleblower tips as a result of mandatory age verification, and the ban on commonplace messenger, social networking, gaming and video conferencing apps for teenagers under 16, even where their parents consent.
This proposal urgently needs a fresh start that focuses on security by design instead of mass surveillance, paternalism and breaking IT security. The future of our privacy and security, and that of our children, is at stake!"