This article was translated by Patrick Breyer’s team, with the consent of the original authors, from the German original published at Netzpolitik.org.
Screening chat messages, scanning private photos: for the German ministries led by the liberal FDP, the EU Commission’s chat control plans cross “red lines” in many places. An internal document shows that the federal government is not united on the issue.
The FDP-led federal ministries are apparently applying pressure within the federal government because the EU Commission’s chat control plans go too far for them. This emerges from a list of “red lines” that the Ministry of Justice and the Ministry of Digital Affairs have sent to the SPD-led Ministry of the Interior, according to Tagesspiegel Background. (We publish the list in full below.)
Chat control refers to plans by the EU Commission to combat the spread of recordings of sexualised violence against children. The Commission presented a draft in May that would impose far-reaching obligations on tech companies. Among other things, they would have to automatically detect known and previously unknown depictions of sexualised violence against children, even in private chats. The plans have met with scathing criticism, including warnings from the EU data protection authorities of unnecessary mass surveillance.
The German government, too, has criticised the planned measures. In a letter, it confronted the EU Commission with more than 60 questions, some of them very pointed, for instance on the importance of encrypted communication or the error rates to be expected when detecting such images. Now a letter shows that this critical stance is apparently not shared across the entire German government. As Tagesspiegel Background reports, in the letter the ministries of justice and digital affairs address the Ministry of the Interior (BMI), led by SPD minister Nancy Faeser, which is in charge of the matter.
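The question about expected error rates touches on a well-known base-rate problem: applied to billions of mostly harmless messages, even a detector with a seemingly tiny error rate generates far more false alarms than genuine hits. A minimal back-of-the-envelope sketch, using entirely hypothetical numbers (message volume, error rates and the share of illegal content are assumptions for illustration, not figures from the draft):

```python
# Back-of-the-envelope base-rate calculation with hypothetical numbers.
messages_per_day = 5_000_000_000   # assumed scanned message volume
illegal_share = 1e-6               # assumed fraction of actually illegal content
false_positive_rate = 0.001        # assumed 0.1 % false-positive rate
true_positive_rate = 0.9           # assumed 90 % detection rate

# Genuine hits: illegal messages the detector correctly flags.
true_hits = messages_per_day * illegal_share * true_positive_rate
# False alarms: innocent messages flagged anyway.
false_alarms = messages_per_day * (1 - illegal_share) * false_positive_rate

print(f"correctly flagged per day: {true_hits:,.0f}")
print(f"falsely flagged per day:   {false_alarms:,.0f}")
print(f"share of all flags that are false: "
      f"{false_alarms / (false_alarms + true_hits):.1%}")
```

Under these assumed numbers, false alarms outnumber genuine hits by roughly a thousand to one, which is why critics ask so pointedly about error rates.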
“Red lines”: No scanning for unknown recordings
The “red lines” drawn in the letter go to the core of the planned EU legislation. Among other things, the ministries demand: “No regulations that lead to chat control”. Messages sent via messenger or email must be explicitly excluded from automated searches. This would remove from the planned regulation the central measure that gave it its colloquial name.
Material that users upload to personal cloud storage and do not share with anyone, such as a backup of the photos on their own mobile phone, should likewise not be subject to detection orders.
Another point concerns a particularly controversial part of the planned EU regulation. Tech companies would not only be forced to search their users’ data for already known material, which is possible with comparatively less invasive procedures. They would also have to track down new, previously unknown material. In addition, they would have to automatically detect the initiation of sexual contact with minors, so-called grooming. Both mean far-reaching intrusions into the privacy of millions of innocent users. According to the paper, both measures should be dropped.
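The distinction between known and unknown material matters technically: known material can be matched against a list of fingerprints of already-identified recordings, while unknown material requires error-prone classifiers. A minimal sketch of the fingerprint-matching idea; note that real systems use perceptual hashes (such as Microsoft’s PhotoDNA) that tolerate small image changes, whereas the cryptographic hash below is only a stand-in to keep the example self-contained, and the list entry is a placeholder:

```python
import hashlib

# Placeholder fingerprint list standing in for a database of
# already-known illegal recordings.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-illegal-sample").hexdigest(),
}

def matches_known_material(file_bytes: bytes) -> bool:
    """Compare a file's fingerprint against the known-material list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_FINGERPRINTS

print(matches_known_material(b"holiday photo"))          # not listed: False
print(matches_known_material(b"known-illegal-sample"))   # listed: True
```

Matching against a fixed list only ever flags exact (or, with perceptual hashes, near-exact) copies of known recordings; detecting *unknown* material or grooming requires statistical classifiers, which is where the privacy objections above come in.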
End-to-end encryption is to remain
In the paper, the ministries also address the confidentiality of private communications. According to the paper, the EU regulation should explicitly rule out companies undermining end-to-end encryption in order to automatically screen content. End-to-end encryption ensures that only sender and recipient can read a message. Stripping encryption from messages would weaken everyone’s ability to communicate securely.
The paper also explicitly rejects so-called client-side scanning. Here, providers scan their users’ messages directly on the device, before they are sent end-to-end encrypted. This method is considered one of the few ways to inspect content despite end-to-end encryption. But scanning before sending likewise undermines confidential communication. The EU data protection authorities had previously warned against it.
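The order of operations is the crux of client-side scanning: the content is inspected on the sender’s device *before* encryption, so the encryption itself is never technically broken, yet confidentiality is lost anyway. A toy sketch of that sequence (the XOR step is NOT real cryptography, merely a stand-in for end-to-end encryption, and the blocklist entry is a placeholder):

```python
import hashlib
from itertools import cycle

# Placeholder blocklist of fingerprints checked on the device.
BLOCKLIST_HASHES = {hashlib.sha256(b"flagged content").hexdigest()}

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy stand-in for end-to-end encryption (repeating-key XOR)."""
    return bytes(b ^ k for b, k in zip(plaintext, cycle(key)))

def send_message(plaintext: bytes, key: bytes):
    # Step 1: the client-side scan runs on the plaintext, on the device.
    flagged = hashlib.sha256(plaintext).hexdigest() in BLOCKLIST_HASHES
    # Step 2: only afterwards is the message "end-to-end" encrypted.
    ciphertext = toy_encrypt(plaintext, key)
    return flagged, ciphertext

flagged, ct = send_message(b"flagged content", key=b"secret")
print(flagged)   # True: matched before encryption ever happened
```

The transport encryption stays mathematically intact in this sketch, which is exactly why critics call client-side scanning a circumvention of, rather than an attack on, end-to-end encryption.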
The Commission’s plans on age verification apparently also cross non-negotiable “red lines” for the FDP ministries. The EU Commission wants providers to verify the age of their users. The ministries demand that the text of the regulation rule out any requirement to present an identity card or other means of identification. In Germany, the age of majority can be confirmed with the online ID function without revealing further data. However, such technologies are not available to all EU citizens, who could then be forced to disclose more data than necessary.
The ministries also demand that content and behaviour that is not punishable under national law be excluded from the regulation. This is a central problem that complicates international action against depictions of sexualised violence. Depending on national law, certain recordings are not punishable, for example because the age of sexual consent differs. This applies, for instance, to nude images exchanged during consensual sexting between young people. Even today, more than half of the suspects in so-called child pornography cases are themselves minors.
No screening of audio messages
The FDP ministries also take aim at a point that has received little attention so far: voice messages and real-time audio communication, i.e. telephone calls, are to be explicitly excluded from the regulation. The EU data protection authorities had recently made the same demand in their assessment. So far, the EU Commission’s draft does not explicitly rule out that providers would also have to screen audio files such as voice messages and telephone calls.
The demands in the list are mostly kept very brief and formulated mainly in the negative: they state what must not happen in the regulation. We asked the Ministry of Justice whether the ministries will also propose alternatives of their own, and when exactly the paper was sent. The answer: they do not comment on details of ongoing internal government consultations. Instead, the ministry referred to a general statement by Justice Minister Marco Buschmann on chat control. He was “very sceptical about this new draft” and rejected a “general blanket surveillance of private correspondence”.
The Federal Ministry of the Interior, for its part, wrote on request only that in the context of the current negotiations, “in line with common practice”, all ministries involved had been asked to submit their positions for further discussion.
The demands from the FDP ministries are clearly set out in the paper. The question is what the Federal Ministry of the Interior will now do with them. At least some of the demands follow directly from the federal government’s coalition agreement, which states: “We reject measures to scan private communications and an identification obligation.” As a next step, the ministry is to present a report on the Commission’s plans to the digital committee of the Bundestag, as Tagesspiegel Background reports.
Red lines for Ministry of Justice (BMJ) and the Ministry of Digital Affairs (BMDV)
For the FDP-led ministries to be able to agree to the COM’s draft regulation, the following requirements must at least be met (“red lines”):
- Clear requirements for the issuance of detection orders (sufficient limitation of the “significant risk” in the Regulation; more detailed requirements for the balancing decision under Art. 7 (4) b) of the Draft Regulation).
- No provisions leading to chat control (to be excluded by deleting the applicability of Art. 7 of the Draft Regulation to interpersonal communication services (esp. email services, messenger) according to Art. 2 b) of the Draft Regulation).
- Exclusion of personal storage that is not shared. Cloud storage that serves, for example, as a backup of one’s own photos from the mobile phone must explicitly not be covered by the provisions on detection orders (to be excluded by excluding the applicability of Art. 7 Draft Regulation to personal storage).
- Deletion of the applicability of Art. 7 Draft Regulation to so-called unknown material and grooming.
- Explicitly exclude the use of client-side scanning and the removal of end-to-end encryption to fulfil obligations under the Draft Regulation (to be excluded in a separate article of the Draft Regulation).
- Audio communications (voice recordings and real-time audio communications) are to be explicitly excluded from the scope of the Draft Regulation, as in the Interim Regulation.
- Providers must be able to fulfil the obligations under the Draft Regulation (risk assessment, risk mitigation, deletion/blocking) without using the detection technologies described in Article 10(1) Draft Regulation. This is to be specified in the text of the Draft Regulation.
- Age verification for the implementation of the obligations from the Draft Regulation (such as risk reduction, Article 4 Draft Regulation, obligation for app stores in Article 6 (1)(c) Draft Regulation) only if the possibility of anonymous or pseudonymous use of the services concerned is preserved. To this end, the text of the Draft Regulation must exclude the presentation of an identity card or other means of identification for the purpose of age verification.
- No inclusion of content or conduct that is not punishable under national law (definitions in Article 2 of the Draft Regulation must take into account the scope for decision-making granted to member states in Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography and replacing Council Framework Decision 2004/68/JHA (FD), in particular with regard to determining the age of sexual consent (Article 6 of the FD) and the impunity of certain acts (Article 5(8) of the FD)).
License: The content of this article is licensed under Creative Commons BY-NC-SA 4.0.