Messaging and Chat Control

The End of the Privacy of Digital Correspondence

The EU decided to let providers search all private chats, messages and emails automatically for suspicious content – generally and indiscriminately. The stated aim: to prosecute child pornography. The result: mass surveillance by means of fully automated real-time messaging and chat control, and the end of the secrecy of digital correspondence. A majority of the Members of the European Parliament adopted the chat control regulation on 6 July 2021.

Chatcontrol 2.0 will follow soon

But this is not the end of the story: for autumn 2021, the European Commission announced that it would propose follow-up legislation making the use of chat control mandatory for all e-mail and messenger providers. This legislation might then also affect securely end-to-end encrypted communications. However, a public consultation by the Commission on this project showed that the majority of respondents, both citizens and stakeholders, oppose an obligation to use chat control. Over 80% of respondents oppose its application to end-to-end encrypted communications. As a result, the Commission postponed the draft legislation, originally announced for September, to December 2021 (page 4 of the PDF).

How did we get here?

In 2020 the European Commission proposed “temporary” legislation aimed at allowing the search of all private chats, messages and emails for illegal depictions of minors and attempted initiation of contacts with minors. It is meant to allow the providers of Facebook Messenger, Gmail and other services to scan every message for suspicious text and images. Scanning takes place in a fully automated process using error-prone “artificial intelligence”. If an algorithm considers a message suspicious, its content and metadata are disclosed automatically and without human verification to a private US-based organization and from there to national police authorities worldwide. The reported users are not notified.
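
To illustrate the kind of fully automated pipeline described above, here is a deliberately simplified, hypothetical sketch. The keyword list, threshold and reporting step are invented for illustration; real providers use proprietary machine-learning classifiers and image-matching systems, not code like this. The point is that a crude score, not a human judgement, decides whether a private message is disclosed:

```python
# Hypothetical illustration only: a naive "grooming" text filter of the kind
# criticised above. Real provider systems are proprietary and more complex,
# but follow the same pattern: score -> threshold -> automatic disclosure.

SUSPICIOUS_TERMS = {"secret", "don't tell", "how old are you", "send a picture"}
THRESHOLD = 2  # arbitrary cut-off chosen for this example


def score_message(text: str) -> int:
    """Count how many 'suspicious' phrases occur in a private message."""
    lowered = text.lower()
    return sum(term in lowered for term in SUSPICIOUS_TERMS)


def forward_report(sender: str, text: str) -> None:
    # Placeholder: stands in for the automatic disclosure of content and
    # metadata to a private organization and on to police authorities.
    print(f"REPORT: message from {sender} flagged and disclosed automatically")


def scan_and_report(sender: str, text: str) -> None:
    """Automatically forward flagged messages, without human review."""
    if score_message(text) >= THRESHOLD:
        forward_report(sender, text)


# An innocuous chat message trips the same filter:
scan_and_report("alice", "Keep it secret - don't tell anyone about the surprise party")
```

Even this toy example flags a harmless message about a surprise party; statistical classifiers make the same kind of mistake, only less transparently.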

Some U.S. providers of services such as Gmail and Outlook.com are already performing such automated messaging and chat control. Through a second piece of legislation, the EU Commission intends to oblige all providers of chat, messaging and e-mail services to deploy this mass surveillance technology. At the same time, a representative survey conducted in March 2021 clearly shows that a majority of Europeans oppose the use of chat control (detailed poll results here).

More videos on chat control are available in this playlist.

How does this affect you?

  • All of your chat conversations and emails will be automatically searched for suspicious content. Nothing remains confidential or secret. There is no requirement of a court order or an initial suspicion for searching your messages. It occurs always and automatically.
  • If an algorithm classifies the content of a message as suspicious, your private or intimate photos may be viewed by staff and contractors of international corporations and police authorities. Your private nude photos may also be looked at by people unknown to you, in whose hands they are not safe.
  • Flirts and sexting may be read by staff and contractors of international corporations and police authorities, because text recognition filters looking for “child grooming” frequently falsely flag intimate chats.
  • You can falsely be reported and investigated for allegedly disseminating child sexual exploitation material. Messaging and chat control algorithms are known to flag completely legal vacation photos of children on a beach, for example. According to Swiss federal police authorities, 86% of all machine-generated reports turn out to be without merit. 40% of all criminal investigation procedures initiated in Germany for “child pornography” target minors.
  • On your next trip overseas, you can expect big problems. Machine-generated reports on your communications may have been passed on to other countries, such as the USA, where there is no data privacy – with incalculable consequences.
  • Intelligence services and hackers may be able to spy on your private chats and emails. If secure encryption is removed in order to screen messages, the door will be open for anyone with the technical means to read your messages.
  • This is only the beginning. Once the technology for messaging and chat control has been established, it becomes very easy to use it for other purposes. And who guarantees that these incrimination machines will not be used in the future on our smartphones and laptops?

Click here for further arguments against messaging and chat control

Click here to find out what you can do to stop messaging and chat control

Timeline

Voluntary chat control has been approved! Here is my press release on the outcome of the vote.

In December 2021 the EU Commission intends to present a second legislative proposal that would compel all providers of email, messaging and chat services to search all private messages in the absence of any suspicion.

According to the case-law of the European Court of Justice, the permanent and comprehensive automated analysis of private communications violates fundamental rights and is prohibited (paragraph 177). For this reason, Member of the European Parliament Patrick Breyer has filed a complaint with the data protection authorities against the U.S. companies Facebook and Google for violating the General Data Protection Regulation. Former judge of the European Court of Justice Prof. Dr. Ninon Colneric has extensively analysed the plans and concludes in a legal assessment that the EU legislative plans on chat control are not in line with the case-law of the European Court of Justice and violate the fundamental rights of all EU citizens to respect for privacy, to data protection and to freedom of expression.

Upcoming dates

  • 14 January 2021: Internal technical negotiations of the European Parliament
  • 15 January 2021: Technical negotiations between Council, Commission and Parliament (Technical Trilogue)
  • 18 January 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 19 January 2021: Internal technical negotiations of the European Parliament
  • 20 January 2021: Technical negotiations between Council, Commission and Parliament (Technical Trilogue)
  • 25 January 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 01 February 2021: Internal technical negotiations of the European Parliament
  • 05 February 2021: Internal technical negotiations of the European Parliament
  • 22 February 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 23 February 2021: Second Political Trilogue negotiations between Council, Commission and Parliament
  • 26 February 2021: Internal technical negotiations of the European Parliament
  • 1 March 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 2 March 2021: Internal technical negotiations of the European Parliament
  • 8 March 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 9 March 2021: Third Political Trilogue negotiations between Council, Commission and Parliament
  • 23 March 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 25 March 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 19 April 2021: Internal technical negotiations of the European Parliament, Commission and Council
  • 28 April 2021: Shadow rapporteur meeting of the negotiators of the European Parliament
  • 29 April 2021: Fifth Political Trilogue negotiations between Council, Commission and Parliament
  • 26 May 2021: Committee Vote on Trilogue Agreement
  • 6 July 2021: Plenary Vote

 

  • Expected for December 2021: Commission proposal on mandatory messaging and chat control for online service providers

What you can do

  • Generate attention on social media! Use the hashtags #chatcontrol and #secrecyofcorrespondence
  • Generate media attention! So far very few media have covered the messaging and chat control plans of the EU. Get in touch with newspapers and ask them to cover the subject – online and offline.
  • Ask your e-mail, messaging and chat service providers! Ask them whether they generally monitor private messages for suspicious content, or whether they plan to do so. Avoid Gmail, Facebook Messenger, Outlook.com and the chat function of Xbox, where indiscriminate chat control is already taking place.

Additional information and arguments

  • All citizens are placed under suspicion, without cause, of possibly having committed a crime. Text and photo filters monitor all messages, without exception. No judge is required to order such monitoring – contrary to the analog world, which guarantees the privacy of correspondence and the confidentiality of written communications. According to a judgment by the European Court of Justice, the permanent and general automatic analysis of private communications violates fundamental rights (case C-511/18, paragraph 192). Nevertheless, the EU now intends to adopt such legislation. It can take years for the court to annul it. Therefore we need to prevent the adoption of the legislation in the first place.
  • The confidentiality of private electronic correspondence is being sacrificed. Users of messenger, chat and e-mail services risk having their private messages read and analyzed. Sensitive photos and text content could be forwarded to unknown entities worldwide and can fall into the wrong hands. NSA staff have reportedly circulated nude photos of female and male citizens in the past. A Google engineer was reported to have stalked minors.
  • Indiscriminate messaging and chat control wrongfully incriminates hundreds of users every day. According to the Swiss Federal Police, 90% of machine-reported content is not illegal, for example harmless holiday photos showing nude children playing at a beach (the illustrative calculation after this list shows why such error rates are to be expected).
  • Securely encrypted communication is at risk. Up to now, encrypted messages cannot be searched by the algorithms. To change that, backdoors would need to be built into messaging software. As soon as that happens, this security loophole can be exploited by anyone with the necessary technical means, for example by foreign intelligence services and criminals. Private communications, business secrets and sensitive government information would be exposed. Secure encryption is needed to protect minorities, LGBTQI people, democratic activists, journalists, etc.
  • Criminal justice is being privatized. In the future the algorithms of corporations such as Facebook, Google, and Microsoft will decide which user is a suspect and which is not. The proposed legislation contains no transparency requirements for the algorithms used. Under the rule of law the investigation of criminal offences belongs in the hands of independent judges and civil servants under court supervision.
  • Indiscriminate messaging and chat control creates a precedent and opens the floodgates. Deploying technology for automatically monitoring all online communications is dangerous: it can very easily be used for other purposes in the future, for example copyright violations, drug abuse, or “harmful content”. In authoritarian states such technology is used to identify and arrest government opponents and democracy activists. Once the technology is deployed comprehensively, there is no going back.
  • The temporary legislation on the table is ineffective. Contrary to its intent, it will not allow Facebook et al to continue the mass monitoring of private correspondence. It merely derogates from the ePrivacy Directive; the chat control will continue to violate the General Data Protection Regulation (GDPR), because it lacks a legal basis and violates the principle of proportionality. A complaint filed by Patrick Breyer is being examined by the Irish Data Protection Commission.
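
The error rates reported above are what simple arithmetic predicts for blanket scanning. The following sketch uses invented numbers (message volume, prevalence of illegal material and filter accuracy are assumptions, not official figures), but it illustrates why flagging everyone's messages produces mostly unfounded reports even with a seemingly accurate filter:

```python
# Illustrative base-rate calculation with assumed, not official, numbers.
messages_per_day = 1_000_000_000   # assumed volume of scanned messages
prevalence = 1 / 10_000            # assumed share of genuinely illegal messages
true_positive_rate = 0.95          # assumed: filter catches 95% of illegal material
false_positive_rate = 0.001        # assumed: filter wrongly flags 0.1% of legal messages

illegal = messages_per_day * prevalence
legal = messages_per_day - illegal

true_reports = illegal * true_positive_rate    # correctly flagged messages
false_reports = legal * false_positive_rate   # legal messages flagged anyway

share_unfounded = false_reports / (true_reports + false_reports)
print(f"{false_reports:,.0f} of {true_reports + false_reports:,.0f} reports "
      f"({share_unfounded:.0%}) concern perfectly legal messages")
```

With these assumed numbers roughly nine out of ten reports concern legal content, broadly in line with the 86-90% figures cited above. The exact share depends on the assumptions, but as long as illegal material is rare compared with ordinary traffic, false alarms dominate.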

Why messaging and chat control harms children and abuse victims

Proponents claim indiscriminate messaging and chat control facilitates the prosecution of child sexual exploitation. However, this argument is controversial, even among victims of child sexual abuse. In fact messaging and chat control can hurt victims and potential victims of sexual exploitation:

  1. Safe spaces are destroyed. Victims of sexual violence are especially in need of the ability to communicate safely and confidentially to seek counseling and support, for example to exchange safely with each other, with their therapists or with attorneys. The introduction of real-time monitoring takes these safe spaces away from them. This can discourage victims from seeking help and support.
  2. Self-recorded nude photos of minors (sexting) end up in the hands of company employees and police where they do not belong and are not safe.
  3. Minors are being criminalized. Young people in particular often share intimate recordings with each other (sexting). With messaging and chat control in place, their photos and videos may end up in the hands of criminal investigators. German crime statistics show that 40% of all investigations for child pornography target minors.
  4. Indiscriminate messaging and chat control does not contain the circulation of illegal material but actually makes it more difficult to prosecute child sexual exploitation. It encourages offenders to go underground and use private encrypted servers which can be impossible to detect and intercept. Even on open channels, indiscriminate messaging and chat control does not contain the volume of material circulated, as evidenced by the constantly rising number of machine reports.

Alternatives: How to protect children effectively

The right way to address the problem is police work and strengthening law enforcement capacities, including (online) undercover investigations, with enough resources and expertise to infiltrate the networks in which child sexual abuse material is distributed. We need better-staffed and more specialized police and judicial authorities, more resources for investigation and prosecution, prevention and training, and support and funding for victim services. For example:

  1. The 2011 EU directive on combating the sexual abuse and sexual exploitation of children and child pornography needs to be fully implemented in the areas of prevention (in particular prevention programmes for offenders and for people who fear that they might offend), criminal law (especially the definition of offences and level of penalties), and assistance, support and protection measures for child victims.
  2. Undercover police work needs to be intensified to identify the producers of child sexual exploitation material.
  3. To support law enforcement, the following is needed: sufficient hardware and software for data preparation and evaluation; technical means for automated selection and reduction of data (AI); systematic internal police use of hash databases (a minimal sketch of hash matching follows this list); sufficient IT capacity to process seized data storage media; careful evaluation of confiscated data carriers in order to detect new material and to identify victims; centralised processing and evaluation of seized image and video material; standards and guidelines for the handling of sexual abuse cases by the police; centralised monitoring of the number of investigations, their status and personnel capacity; specialised training, if necessary mandatory; and special psycho-social support for investigators where needed.
  4. Children need to be educated about online safety.
  5. An EU center for preventing child sexual abuse could support Member States in putting in place usable, evaluated and effective multi-disciplinary prevention measures to decrease the prevalence of child sexual abuse in the EU; serve as a hub for connecting, developing and disseminating research and expertise, and for facilitating the communication and exchange of best practices between practitioners and researchers; help develop state-of-the-art research and knowledge, including better prevention-related data; support the exchange of best practices on protection measures for victims; carry out research and serve as a hub of expertise on assistance to victims of child sexual abuse; and support evidence-based policy on assistance and support to victims.
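
To illustrate the hash-database point in item 3: known abuse material can be recognised by comparing a fingerprint (hash) of a seized file against a database of hashes of material that investigators have already verified, which is far more targeted than scanning everyone's private messages. The sketch below is a simplified illustration under stated assumptions: it uses an ordinary cryptographic hash, which only matches exact copies, whereas deployed systems typically rely on perceptual hashes (such as Microsoft's PhotoDNA) that also match re-encoded or resized copies. The empty example database stands in for real hash lists.

```python
# Minimal sketch of hash-based matching against a database of known material.
# Cryptographic hashing is used here for simplicity; it only matches exact
# byte-for-byte copies, whereas deployed systems use perceptual hashing.
import hashlib
from pathlib import Path

# Hypothetical database: would hold hex digests of already-identified material.
KNOWN_HASHES: set[str] = set()


def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_material(path: Path) -> bool:
    """Check a seized file against the database of known hashes."""
    return file_hash(path) in KNOWN_HASHES
```

Matching seized storage media against such a database targets material that investigators have already verified, in contrast to indiscriminately scanning all private communication.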

Documents on the legislative procedure

Critical commentary and further reading

“we suggest that the Commission prioritise this non-technical work, and more rapid take-down of offending websites, over client-side filtering […]”

“Due to the absence of an impact assessment accompanying the Proposal, the Commission has yet to demonstrate that the measures envisaged by the Proposal are strictly necessary, effective and proportionate for achieving their intended objective.”

“As an abuse survivor, I (and millions of other survivors across the world) rely on confidential communications to both find support and report the crimes against us – to remove our rights to privacy and confidentiality is to subject us to further injury and frankly, we have suffered enough. […] it doesn’t matter what steps we take to find abusers, it doesn’t matter how many freedoms or constitutional rights we destroy in order to meet that agenda – it WILL NOT stop children from being abused, it will simply push the abuse further underground, make it more and more difficult to detect and ultimately lead to more children being abused as the end result.”

“In practice this means that they would put private companies in charge of a matter that public authorities should handle”

“the assessment of child abuse-related facts is part of the legal profession’s area of responsibility. Accordingly, the communication exchanged between lawyers and clients will often contain relevant keywords. […] According to the Commission’s proposals, it is to be feared that in all of the aforementioned constellations there will regularly be a breach of confidentiality due to the unavoidable use of relevant terms.”

“I didn’t have confidential communications tools when I was raped; all my communications were monitored by my abusers – there was nothing I could do, there was no confidence. […] I can’t help but wonder how much different my life would have been had I had access to these modern technologies. [The planned vote on the e-Privacy Derogation] will drive abuse underground making it far more difficult to detect; it will inhibit support groups from being able to help abuse victims – IT WILL DESTROY LIVES.”

“A blanket and unprovoked monitoring of digital communication channels is neither proportionate nor necessary to detect online child abuse. The fight against sexualised violence against children must be tackled with targeted and specific measures. The investigative work is the task of the law enforcement authorities and must not be outsourced to private operators of messenger services.”

“As with other types of content scanning (whether on platforms like YouTube or in private communications) scanning everything from everyone all the time creates huge risks of leading to mass surveillance by failing the necessity and proportionality test. Furthermore, it creates a slippery slope where we start scanning for less harmful cases (copyright) and then we move on to harder issues (child sexual abuse, terrorism) and before you realise what happened scanning everything all the time becomes the new normal.”

“The DAV is explicitly in favour of combating the preparation and commission of child sexual abuse and its dissemination via the internet through effective measures at EU-level. However, the Interim Regulation proposed by the Commission would allow blatantly disproportionate infringements on the fundamental rights of users of internet-based communication services. Furthermore, the proposed Interim Regulation lacks sufficient procedural safeguards for those affected. This is why the legislative proposal should be rejected as a whole.”

“Positive hits with subsequent disclosure to governmental and non-governmental agencies would be feared not only by accused persons but above all by victims of child sexual abuse. In this context, the absolute confidentiality of legal counselling is indispensable in the interest of the victims, especially in these matters which are often fraught with shame. In these cases in particular, the client must retain the authority to decide which contents of the mandate may be disclosed to whom. Otherwise, it is to be feared that victims of child sexual abuse will not seek legal advice.”

“In the course of the initiative “Fighting child sexual abuse: detection, removal, and reporting of illegal content”, the European Union plans to abolish the digital privacy of correspondence. In order to automatically detect illegal content, all private chat messages are to be screened in the future. This should also apply to content that has so far been protected with strong end-to-end encryption. If this initiative is implemented according to the current plan it would enormously damage our European ideals and the indisputable foundations of our democracy, namely freedom of expression and the protection of privacy […]. The initiative would also severely harm Europe’s strategic autonomy and thus EU-based companies.”

Learn more

The EU Commission wants all private electronic communications to be screened for possible child pornography, in the absence of any suspicion.