Chat Control lead negotiator proposes to add “voluntary detection orders” and metadata scanning
The European Parliament’s lead LIBE Committee yesterday circulated the draft report by conservative Rapporteur Javier Zarzalejos on the proposed regulation to fight child sexual abuse material (CSAR), also known as “chat control”. While committing to preserving end-to-end encryption, the Rapporteur proposes to add “voluntary detection orders” and metadata scanning. Pirate Party Member of the European Parliament and long-time opponent of the chat control proposal Patrick Breyer analyses the proposals and their implications.
- Chat Control: Mandatory, indiscriminate searching of the private correspondence and data of unsuspected citizens is still in. The requirement to “limit the detection order to an identifiable part or component of a service”, such as specific channels or groups (AM 128), goes in the right direction but does not ensure that searches are targeted at or limited to specific persons presumed to be connected to CSEM, as required to avoid annulment of the detection provisions by the Court of Justice. The findings of the European Parliamentary Research Service are not yet reflected.
- Unlike the Commission, the Rapporteur does not want users to be informed that their correspondence has been (falsely) reported (AM 138).
- “Voluntary detection”/Chat Control: A proposed new power for providers to search the private correspondence and data of unsuspected citizens on their own initiative, even where the conditions for a detection order are not met (AM 99). Again, searches are not targeted at or limited to specific persons presumed to be connected to CSEM, as required to avoid annulment by the Court of Justice.
- End-to-end encryption: Weakening encryption is excluded (AM 106), although the wording is not yet sufficient to rule out mandatory client-side scanning with certainty (some argue that client-side scanning would not interfere with the encryption process as such).
- Metadata control: A proposed new power for providers to retain and automatically analyse metadata for allegedly suspicious communication patterns (AM 106). Such technology is again closed-source and not independently evaluated, and likely unreliable, producing countless false positives. The Commission itself warns that “Service providers do not consider metadata as an effective tool in detecting CSAM” and that “metadata is usually insufficient to initiate investigations” (p. 29). Above all, the proposed provision again does not ensure that processing is targeted at or limited to specific persons presumed to be connected to CSEM, as required by the Court of Justice’s La Quadrature du Net judgment (paras 172 et seq.).
- Access blocking / search engine delisting orders: Ineffective blocking of access to CSEM is still in. A new power is proposed to delist CSEM from search engines and “artificial intelligence”. Both measures are ineffective because the material is not deleted at its source.
- App censorship for children: Requiring app stores to block minors from installing communications apps such as WhatsApp, games or chat apps is still in. The draft report proposes to extend this app censorship (AM 101 et seq.).
- Anonymous communications ban: The requirement for communications services to verify users’ age is still in, with verification systems effectively excluding anonymous use. This would be the end of the anonymous email and messenger accounts that whistleblowers, political activists and others depend on.
- Promising: Proposal to establish a Victims’ Consultative Forum (AM 273).
- Promising: Services may ensure a “high level of privacy, safety, and security by design and by default” (AM 79).
More on Chat Control: www.chatcontrol.eu