The EU Commission’s proposal to screen private chat messages is met with scepticism by some member states. Others, however, think it’s great. Previously unpublished minutes of the negotiations behind closed doors show: the future of the project is uncertain.
Hardly any EU net policy project is currently facing as much headwind as so-called chat control. In order to combat depictions of sexualised violence against minors on the internet, the EU is planning drastic interventions in private communication. Providers, for example, would be ordered to screen private chats and scan photos. Even the EU’s highest data protection authorities have torn the draft apart.
Now, an internal survey of the Council’s working group on law enforcement shows that some member states are sceptical – but some are also in favour. The paper dates from July 2022 and cannot be published for reasons of source protection. According to the document, the most serious concerns are about whether chat control should be extended to encrypted means of communication.
End-to-end encrypted chats in apps like WhatsApp, Signal or Threema ensure that only sender and recipient can read a message. To scan these messages, they would either have to be searched directly on the device, or the encryption would have to be broken.
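To make the technical constraint concrete, here is a deliberately simplified sketch of the end-to-end principle – a toy one-time pad standing in for real protocols such as Signal’s Double Ratchet, purely for illustration:

```python
# Deliberately simplified illustration of the end-to-end principle --
# NOT a real cryptographic protocol. A one-time pad stands in for
# "a key that only sender and recipient hold".
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each byte with a random key byte of equal length
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"hello"
key = secrets.token_bytes(len(message))     # known only to the endpoints

ciphertext = encrypt(key, message)          # this is all the server sees
assert decrypt(key, ciphertext) == message  # the endpoints can read it
assert ciphertext != message                # the server cannot

# Consequence: any scanning of content must happen on the device
# (before encryption or after decryption), or the scheme must be weakened.
```

Because the server only ever handles ciphertext, any scanning would have to take place on the device itself – which is precisely the intervention critics object to.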
According to the paper, not a single member state has explicitly consented to such chat control of encrypted communications. Germany and Austria are explicitly against it. Four states have unclear positions on this: the Netherlands, Belgium, Greece, and France.
Austria: Massive reservations
Austria has massive concerns about fundamental rights, especially the violation of the right to privacy, the document says. In fact, the contents of messages could only be read if secure end-to-end encryption were fundamentally broken. In addition, the error rate in the automatic recognition of content is problematically high.
Significantly more states say yes to the rather general question of whether comprehensive chat control should be introduced at all – without specifying whether encrypted communication would also fall under it. Experts warn against putting millions of users under suspicion. Nevertheless, seven EU states expressly approve: France, Bulgaria, Hungary, Romania, the Netherlands, Latvia, and Finland. According to the paper, only Austria is clearly against it.
Germany still undecided
Germany’s position is not clear from the paper; in general, the German government views the EU Commission’s proposal sympathetically. The German representation points out that measures are acceptable only if the confidentiality of communications is maintained. This must not be undermined either legally or technically. To this end, the planned regulation should explicitly state that encryption may not be broken.
Germany also welcomes stronger age controls, provided that anonymous use remains guaranteed. The planned law would oblige providers to introduce more age controls. These are meant to identify minors and prevent adults from posing as children in order to initiate contact – a practice known as cyber-grooming.
Germany had sent the EU Commission a long catalogue of questions on the planned law. In it, the government asked, among other things, critical questions about the planned age verification and the error rates in the detection of problematic material. The EU Commission gave only oral and in part superficial answers.
No unity in the German coalition
Behind Germany’s ambivalent position is probably a dispute within the traffic light coalition. At the end of August, the FDP-led justice and digital ministries rejected the most important building blocks of chat control. However, the SPD-led Ministry of the Interior is responsible for negotiating the law. Differing and ambivalent positions can be heard from Nancy Faeser’s ministry; clear public statements have yet to be documented.
From the ranks of the Greens, some politicians have publicly taken a critical position, among them Family Minister Lisa Paus. She said at a conference that she was against chat control of private communication. As federal minister, Paus is responsible for children and young people – precisely the group that chat control is supposed to protect.
The negotiators in Brussels distinguish between chat control in general and chat control of private, encrypted communication. This could be an indication that a slimmed-down version of the project is already on the table. The survey also makes clear, however, that many of the 27 member states have probably not yet taken a clear position. Other states contributing to the debate have in part raised critical questions: the Netherlands, for example, would like more information on scanning and encryption, while Lithuania points to the need for data protection.
Comparison to spam filters
Further details on the EU plans are provided by a “for official use only” wire report from another meeting of the Council working group, which we publish in full text. To detect known depictions of violence against children, the EU Commission sees technologies such as the Microsoft product PhotoDNA as suitable. For the detection of new, previously unknown material and grooming, “AI similar to the technologies used to detect SPAM/virus content” could be considered.
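How hash-based detection of known material works can be sketched roughly as follows. Note that this sketch uses SHA-256 as an illustrative stand-in: PhotoDNA itself is a proprietary *perceptual* hash that also matches resized or re-encoded copies, whereas a cryptographic hash only catches byte-identical files.

```python
# Minimal sketch of hash matching for *known* material, with SHA-256
# standing in for a perceptual hash like PhotoDNA (illustration only).
import hashlib

# Hypothetical database of hashes of known illegal images
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def is_known(image_bytes: bytes) -> bool:
    """Check an image's hash against the database of known material."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

assert is_known(b"known-image-bytes")        # an exact copy matches
assert not is_known(b"known-image-bytes!!")  # any change defeats this hash
```

New, previously unknown material and grooming cannot be found by hash lookup at all, which is why the Commission points to machine-learning classifiers for those cases.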
According to the report, the Netherlands, Slovenia, and France have asked for a presentation of the technologies that can check content. According to the planned regulation, a new EU centre will make such technologies available to providers free of charge. If providers receive an order for so-called chat control, they would have to apply the technology and use it to check for grooming and image material.
Experts are critical of the technologies available so far for detecting unknown material and grooming. The EU Commission itself has admitted that the detection technologies currently have high error rates, but accepts this as unavoidable.
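Why error rates matter at this scale can be illustrated with a back-of-the-envelope calculation. All numbers here are hypothetical assumptions for illustration, not figures from the draft or the Commission:

```python
# Back-of-the-envelope base-rate calculation. Every number below is a
# hypothetical assumption for illustration, not an official figure.
messages_per_day = 10_000_000_000   # assumed EU-wide chat volume
illegal_fraction = 1e-6             # assumed prevalence of illegal content
false_positive_rate = 0.001         # assumed: 99.9% of legal content passes
true_positive_rate = 0.9            # assumed detection rate

true_hits = messages_per_day * illegal_fraction * true_positive_rate
false_alarms = messages_per_day * (1 - illegal_fraction) * false_positive_rate

print(f"true hits/day:    {true_hits:,.0f}")      # 9,000
print(f"false alarms/day: {false_alarms:,.0f}")   # 9,999,990
# Even at 99.9% specificity, false alarms outnumber true hits by more
# than a thousand to one under these assumptions.
```

Under such illustrative assumptions, the overwhelming majority of flagged messages would be false alarms that still have to be reviewed – which is the core of the experts’ criticism.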
No backdoors, but “child safety by design”
According to the paper, member states also asked whether providers would have to build “backdoors” into their products from the outset, even before receiving a chat control order. According to the Commission, providers are obliged to ensure “child safety by design”, i.e. to minimise risks in their products from the outset. However, the regulation does not provide for a “backdoor”. Nor would providers be liable if no functioning technology existed for their platforms; in such a case, no detection order could be issued.
Civil society also plays a role in the debate on chat control. In Germany, civil rights organisations have already organised a street protest. Even some child protection NGOs have sharply criticised the EU proposal.
Here is the document in full text:
– Classification level: VS-For official use only
– Date: 25.07.2022
– From: Permanent Representation EU Brussels
– To: E11, Management
– Copy: BMI, BMWK, BMDV, BMFSFJ, BMBF, BMG, BMJ, BKAMT
– Subject: Meeting of the CWG on Law Enforcement (LEWP) on 20/07/2022
– Reference: Pol 350.80/1
I. Summary and evaluation
The focus of the working group meeting was the discussion of Articles 8-15 of the draft regulation on combating child sexual abuse more effectively.
The next meeting is scheduled for 6 September.
II. In detail
1. Adoption of the Agenda
Doc. CM 03928/22
The agenda was adopted without amendments.
2. Information by the Presidency
The Presidency first provided information on the main content of the informal JHA Council and the informal COSI. The focus of the discussions had been the consequences of the war in Ukraine. Another topic addressed was the CSA Regulation. The JHA Council had determined that the prevention of CSAM (child sexual abuse material) was in the foreground of the Draft Regulation; the service providers should focus on the removal of illegal content.
The President of the Council also provided information on the current Schengen area report for the year 2022, which had been adopted on 10.06.2022. Insofar as the topics identified there also affected the work of the LEWP, the President of the Council would provide more information on this.
There had been numerous negotiations with COPEN on the follow-up to the “8th round of mutual evaluations (environmental crime)”. The President of the Council would follow up on FRA activities, but there were still about 20 follow-up reports to be dealt with. This would require either three joint meetings (LEWP and COPEN) to deal with the open reports and another to deal with the final report, or only one joint meeting in which only the final report would be dealt with and adopted. EUROPOL would then also be invited to this meeting. MS were invited to discuss their reports in public before the final report was adopted; MS should report to the President of the Council by 15.9. or submit any reports that had not yet been submitted. The deadline of 15.9. must be adhered to. The President of the Council did not (yet) say which procedure would be chosen.
With regard to the WIKIPOL platform, the President of the Council announced that the General Secretariat would send some documents to be updated; the deadline for feedback was 16 September 2022.
3. Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse
The Presidency first welcomed the fact that COM had promised to draw up an overview of the competences of Europol and the planned EU Centre and to submit it by September. This was followed first by an article-by-article, then a section-by-section discussion of the Draft Regulation.
On Article 8: BEL (supported by ESP) asked about the relationship between the contact point and the main office of the providers (Art. 23 or 24 of the Draft Regulation) and asked for clarification on the time limit provided for in para. 3. POL and BEL also referred to the time limits of 24 and 12 months for CSAM and grooming respectively provided for in Art. 7(9) and asked for clarification on how a provider offering a (uniform) service should take account of the different time limits, also against the background of its obligation to cooperate 2 months before the expiry of the order.
FRA emphasised that it would definitely support the draft – despite some open questions. More flexibility and simplification of the proposed procedures should be sought in further negotiations. The role of LEAs must be strengthened in the draft regulation and the relationship to national regulations must be clarified. IRL expressed doubts about the necessity of a court/independent authority as a second decision-making instance. In view of the probably identical basis for decision-making and the high demand for (new) resources and capacities, it was questionable whether such a second level would represent real added value. POL also emphasised that a considerable increase in resources would be needed. The role of INHOPE was not sufficiently considered either in the Draft Proposal or in the flowchart provided by COM. EST supported this point; especially for smaller MS, NGOs such as INHOPE/Safer Internet were of great importance.
With regard to the language regime of the Order, HUN, BEL, HRV emphasised the requirements of the DSA. It was important that English could be chosen in addition to the national language – also for cost reasons. GRC expressed doubts about the necessity of establishing an EU centre, as this would lead to delays. Delays could also be expected with removal orders. MLT and POL also stressed the fundamental importance of removing content quickly.
COM stated that detection orders should be served either on the head office if it is in the ordering MS or otherwise on the contact point. In view of the rapidly changing digital world, it was important to provide time limits for the maximum duration of orders. It should be noted that the issuing of a first order usually takes more time than the issuing of subsequent orders. This applies, among other things, because the provider is already obliged to check whether or which circumstances have changed before an order expires. In response to a question, COM explained that the formulations “manifest errors” and “not sufficient information” in Art. 8 para. 3 were taken from the TCO and that an assessment depended on the individual case. COM was confident that established national procedures could be maintained in compliance with the standards of the CSA Regulation. With regard to the language regime, COM explained that the aim was to achieve the greatest possible standardisation; where possible, for example, through “drop-down” menus in the respective languages.
For reasons of proportionality, it was necessary that the decision to issue a detection order be taken by a court or a quasi-judicial authority. It is necessary, also in view of the depth of the intervention, that the final decision is made by the courts. For this, corresponding (additional) capacities would have to be built up at the national courts. The Draft Regulation does not provide for CSAM to be reported to LEAs, but to the EU Centre. However, reports made to LEAs could be processed by them; there was no requirement that LEAs report to the EU Centre.
FRA (supported by EST) asked again whether it was true that under the CSA Regulation, companies could not identify CSAM voluntarily, but only on the basis of a detection order. With regard to removal orders, harmonisation with national processes that already allowed for timely orders had to be ensured.
On INHOPE, COM explained that the role of INHOPE would be strengthened under the CSA Regulation, e.g. in assisting stakeholders in joint action with the Coordinating Authorities (CA) to remove CSAM and in building capacity/resources in the MS. INHOPE would be referenced directly in the recitals. The services covered by the e-privacy regulation would no longer be able to carry out voluntary identification after August 2024. The CSA Regulation stipulates that in the future, identification will only take place according to legally secured standards. If the negotiations were not concluded by August 2024, a transitional arrangement would be needed so that there would be no falling back behind the status quo. Services that do not fall under the e-privacy regulation could continue voluntary measures under the provisions of the GDPR. IRL summarised once again that the CSA Regulation does not constitute a legal basis for voluntary identification measures by interpersonal communication service providers.
Section 2: PRT argued for guidelines that are as detailed and binding as possible. ITA and IRL stressed that guidelines should take into account the tasks of LEAs. BEL asked about the relationship between the DSC and the CA and the oversight of the use of certain technologies. FRA reiterated that orders should be issued without delay; the possibility for all “affected users” to complain would pose major challenges for authorities (supported by SWE). CYP welcomed that the proposal also provides for measures against grooming. Article 10 must be formulated in a technology-neutral way. The protection of children should take precedence over the protection of other personal rights. COM was asked for confirmation that the existing bodies under the DSA and TCO can be designated as CAs under the CSA Regulation.
NLD, SVN and FRA asked for a presentation of the technologies provided for under Art. 10. EST asked whether providers would be obliged to build in so-called “backdoors” or whether they would first have to react to orders. There was also the question of the possible liability of providers if no technologies existed that enabled identification for the services they offered – especially if the EU centre could not provide suitable technologies. SWE called for more leeway for MS to determine the respective competent authority in connection with Art. 9. The deadline in Art. 10(6) (in conjunction with Art. 48) should be made more flexible in order to keep the administrative burden in view. In addition, it was important to consider the interaction with the prosecution of other offences in this context. DEU presented as instructed.
COM explained that any conflicts between the data protection supervisory authority and the CSA Regulation had been countered with the extensive process in advance of an order. However, if conflicts arose, the decision would ultimately rest with the courts, which would take into account the recommendations of the data protection supervisory authorities and data protection law. Human oversight ensures that systems function properly. Affected users could already take action against voluntary measures, and this would not regularly overburden the courts. Guidelines would be issued as delegated acts and should contribute to the uniformity of decisions. The DSC could also function as a CA, provided the requirements under the CSA Regulation are met. If different authorities are involved, it is important that they work closely together. If providers use technologies that are not made available by the EU centre, providers would have to turn to the data protection supervisory authorities. EDPB should therefore not necessarily be (additionally) involved. The EU Centre would have to check all notifications that go to LEAs or EUROPOL. The classification of technology as high-risk AI within the meaning of the AI Regulation represents a further “safeguard”, but does not lead to a delay in orders. It is necessary to subject new technologies to a conformity check under the AI Regulation before they are made available by the EU Centre.
Suitable technologies for recognising known CSAM are hash values and PhotoDNA; for new CSAM and grooming, AI is used, similar to the technologies for recognising SPAM/virus content. The draft proposal provides for technology neutrality. If no technology is available that ensures adequate protection of fundamental rights, an order cannot be issued. The role of encrypted communication in the spread of CSAM should not be underestimated. According to estimates, two thirds of all reports would be lost if E2EE were enabled by “default” in all messengers. The draft proposal is therefore technology-neutral in order to ensure the protection of children. Providers are obliged to ensure “child safety by design” as part of risk management; the draft proposal does not provide for the establishment of “backdoors by design”.
The Draft Regulation was designed to build on existing structures, such as those of the DSA and the TCO. When asked, COM clarified that it was not the providers’ task to determine the illegality of content; this is the task of the LEAs. Art. 9 para. 2 represents a further “safeguard”, but does not have suspensive effect, i.e. it does not lead to delay.
The President of the Council announced a tech workshop for late September/early October.
Section 3: DEU presented as instructed. AUT submitted a special scrutiny reservation for Art. 8 – 24, combined with the proposal to hold a special data protection workshop. From the point of view of AUT/FRA/POL/MLT/SVN and HUN, the period of 3 months provided for in Art. 12(2) subpara. 2 was too short; a period appropriate to actual requirements should be sought. IRL stated that for effective prosecution, traceability of reports was necessary.
FRA and EST raised questions on the relationship between Art. 12 and Art. 15a and 14 DSA. NLD requested that Art. 13 (g) not only take into account information on users, but also on data subjects. Regarding Art. 13 c) d), GRC asked what other (content) data was meant. HRV asked about the relationship to notifications based on national regulations.
COM explained that the wording of Art. 12 had been chosen to ensure consistency within the Regulation. While the standard is the same as for the DSA, the wording makes it clear that providers do not carry out their own checks. The provider becomes aware when content is reported to him. As a lex specialis to the DSA, the obligation to report under Article 12 of the CSA Regulation takes precedence over the obligation to report under Article 15a of the DSA. According to Art. 13 c) and d), the reports should contain all information that could also be relevant for LEAs, as well as information on data subjects. Overall, notifications would be subject to the highest data protection standards. The draft does not prevent MS from receiving notifications directly from NCMEC or other actors.
Section 4: SWE referred to the cross-border effect of the orders in the context of Art. 14(1) (para. 7). This had also been discussed in Art. 8/Art. 31 DSA. Even on the basis of the CSA Directive, there were still differences in national law. The cross-border component meant that the respective national standards had an impact on other MS. Conditions for removal orders should be discussed further. FRA was in favour of a separate procedure for cross-border removal orders. ITA asked COM for an overview of the workflow for removal orders. DEU presented as instructed. AUT stated that the deadline of 6 weeks in Art. 15(4) was too short. Information to the persons concerned could only be provided with the involvement of Europol or the LEAs. From FRA’s point of view, it would also be preferable for LEAs to be able to set the deadline for suspending information obligations to data subjects themselves.
COM confirmed that voluntary removal of content by providers would remain permissible under the CSA Regulation.
The Presidency announced that it would submit a revised version of the first and second chapters in September. The next meeting on 6 September will deal with Chapter 3 of the draft regulation.
License: The content of this article is licensed under Creative Commons BY-NC-SA 4.0.