EU chat control bill: fundamental rights terrorism against trust, self-determination and security on the Internet
Today, the European Commission presented publicly for the first time an EU draft law on mandatory chat control. With the stated aim of fighting “child pornography”, the Commission plans to oblige all providers of e-mail, chat and messaging services to search for suspicious messages in a fully automated way and disclose them to the police. This would require them to monitor and scan the communications of all citizens. End-to-end encryption would have to be undermined by “client-side” scanning on all mobile phones.
MEP and civil rights activist Dr Patrick Breyer (Pirate Party), who filed a lawsuit on Monday against the chat control already voluntarily practiced by Facebook/Meta, comments:
“Apart from ineffective web blocking, the proposed chat control threatens to destroy the digital privacy of correspondence and secure encryption. Scanning personal cloud storage would result in the mass surveillance of private photos. Mandatory age verification would end anonymous communication. App store censorship would spell the end of secure messenger apps and patronise young people. The proposal includes neither the overdue obligation on law enforcement agencies to report and remove known abusive material on the net, nor Europe-wide standards for effective prevention measures, victim support and counselling, and effective criminal investigations. Von der Leyen is charting a course of censorship, mass surveillance, anonymity bans and paternalism, while leaving the activities of child porn rings completely untouched. The proposed measures deprive the entire population of trust, self-determination and security on the net. This plan is nothing other than terrorism against our digital fundamental rights, which I will fight relentlessly.
This Big Brother attack on our mobile phones, private messages and photos with the help of error-prone algorithms is a giant step towards a Chinese-style surveillance state. Chat control is like the post office opening and scanning all letters – ineffective and illegal. Even the most intimate nude photos and sex chats can suddenly end up with company personnel or the police. Those who destroy the digital secrecy of letters destroy trust. We all depend on the security and confidentiality of private communication: People in need, victims of abuse, children, the economy and also state authorities.”
Breyer summarises the content and effects of the bill in the following overview:
| EU chat control proposal | Consequences |
| --- | --- |
| Envisaged are chat control, network blocking, mandatory age verification for communication and storage apps, age verification by app stores, and the exclusion of minors from installing many apps | |
| The communication services affected include telephony, e-mail, messengers, chats (including those that are part of games, dating portals, etc.) and videoconferencing | Texts, images, videos and speech could be scanned |
| End-to-end encrypted messenger services are not excluded from the scope | Providers of end-to-end encrypted communications services will have to scan messages on every smartphone (client-side scanning) and, in case of a hit, report the message to the police |
| Hosting services affected include web hosting, social media, video streaming services, file hosting and cloud services | Even personal storage that is not being shared, such as Apple’s iCloud, will be subject to chat control |
| Services that are likely to be used for illegal material or for child grooming are obliged to search the content of personal communication and stored data (chat control) without suspicion and across the board | Since presumably every service is also used for illegal purposes, all services will be obliged to deploy chat control |
| The authority in the provider’s country of establishment is obliged to order the deployment of chat control | There is no discretion as to when and to what extent chat control is ordered |
| Chat control involves automated searches for known CSEM images and videos; suspicious messages/files will be reported to the police | According to the Swiss Federal Police, 87% of the reports they receive (usually based on the method of hashing) are criminally irrelevant |
| Chat control also involves automated searches for unknown CSEM pictures and videos; suspicious messages/files will be reported to the police | Machine searching for unknown abuse representations is an experimental procedure using machine learning (“artificial intelligence”). The algorithms are not accessible to the public or the scientific community, nor does the draft contain any disclosure requirement. The error rate is unknown and is not limited by the draft regulation. Presumably, these technologies will result in massive numbers of false reports. The draft legislation allows providers to pass on automated hit reports to the police without human review. |
| Chat control involves machine searches for possible child grooming; suspicious messages will be reported to the police | Machine searching for potential child grooming is an experimental procedure using machine learning (“artificial intelligence”). The algorithms are not available to the public or the scientific community, nor does the draft contain a disclosure requirement. The error rate is unknown and is not limited by the draft regulation; presumably, these technologies will result in massive numbers of false reports. |
| Communication services that can be misused for child grooming (thus all of them) must verify the age of their users | In practice, age verification involves full user identification, meaning that anonymous communication via e-mail, messenger, etc. would effectively be banned. Whistleblowers, human rights defenders and marginalised groups rely on the protection of anonymity. |
| App stores must verify the age of their users and block children/young people from installing apps that can be misused for solicitation purposes | All communication services, such as messenger apps, dating apps or games, can be misused for child grooming and would be blocked for children/young people |
| Internet access providers can be obliged to block access to prohibited and non-removable images and videos hosted outside the EU by means of network blocking (URL blocking) | Network blocking is technically ineffective and easy to circumvent, and it results in the construction of a technical censorship infrastructure |
Voices against the proposed law are coming from across the board: EDRi has criticised that the proposal would “allow the widespread scanning of people’s private communications” and “inevitably require the use of notoriously inaccurate AI-based scanning tools of our most intimate conversations”. The German Child Protection Association has likewise described the EU Commission’s planned warrantless scanning of private communication via messenger or e-mail as disproportionate and ineffective. Instead, they stress that the majority of child pornography material is shared via platforms and forums. What is needed, they say, is “above all the expansion of human and technical resources at the law enforcement agencies, more visible police presence on the net, more state reporting offices and the decriminalisation of the dissemination of self-generated material among young people.”