
Chat Control: The EU’s CSAM scanner proposal

The End of the Privacy of Digital Correspondence

The EU Commission proposes to oblige providers to search all private chats, messages, and emails automatically for suspicious content – generally and indiscriminately. The stated aim: to prosecute child sexual exploitation material (CSEM). The result: mass surveillance by means of fully automated real-time scanning of messaging and chats, and the end of the privacy of digital correspondence.

Other aspects of the proposal include ineffective network blocking, the screening of personal cloud storage including private photos, mandatory age verification resulting in the end of anonymous communication, app store censorship, and the exclusion of minors from the digital world.

Chat Control 2.0 on every smartphone

On 11 May 2022 the European Commission presented a proposal that would make chat control searching mandatory for all e-mail and messenger providers and would even apply to communication services that have so far been securely end-to-end encrypted. Prior to the proposal, a public consultation had revealed that a majority of respondents, both citizens and stakeholders, opposed imposing an obligation to use chat control. Over 80% of respondents opposed its application to end-to-end encrypted communications.

Currently a regulation is in place allowing providers to scan communications voluntarily (so-called “Chat Control 1.0”). So far only some unencrypted US communications services such as Gmail, Facebook/Instagram Messenger, Skype, Snapchat, iCloud email and X-Box apply chat control voluntarily (more details here). As a result of the mandatory Chat Control 2.0 proposal, the Commission expects a 3.5-fold increase in scanning reports (to 354% of the current volume).

Parliament has positioned itself almost unanimously against indiscriminate chat control. With supporters and opponents of mandatory chat control irreconcilably opposed among EU governments (EU Council) and no common position reached, the EU adopted a two-year extension of voluntary chat control 1.0 in 2024 – see timeline and documents. A victim of child sexual abuse and Pirate MEP Patrick Breyer have filed lawsuits to stop the voluntary indiscriminate scanning by US big tech companies (chat control 1.0).

Explainer video

Take action to stop Chat Control now!

Chat control 2.0 is back on the agenda of EU governments. In October 2024 we managed once again to stop the unprecedented chat control plan by a narrow “blocking minority” of EU governments. But the Council Presidency is bringing it back to the table. Its proposal will be discussed and potentially endorsed by ambassadors on 6 December 2024.

Several formerly opposed governments such as France have already given up their opposition. Several still-critical governments are asking only for small modifications (e.g. searching for “known content” only, or excluding end-to-end encryption), which would still result in mass searches and leaks of our private communications. There is therefore a real threat that the required majority for mass scanning of private communications may be achieved at any time under the current Hungarian Presidency.

That is why we now need to get involved and raise our voices to our governments and raise awareness in the wider population.
→ Previously supportive governments must be convinced to change their minds
→ Critical governments need to be pushed to demand comprehensive changes, as proposed by the European Parliament, and not just minor changes to the proposal.

In the absence of such fundamental revision, the proposal should be rejected altogether.

[Sharepic: a map of Europe with the text “Help Stop #ChatControl! Is your government opposing yet?”, most of the EU coloured red for “in favour” of chat control, the call “Act now! www.chatcontrol.eu” and the logo of the European Pirates.]

This map (feel free to share online!) visualises EU governments’ positions on chat control as of 4 September according to a leak, also summarised in the table below. It helps you understand where your government stands and can help you start your journey as a digital rights advocate against chat control in your country. You will find some helpful resources below.

Is your government in favour?
→ Ask for an explanation and for your government to revert its course.

Is your government abstaining?
→ Ask why and demand that they take a strong stance against chat control.

Is your government opposing?
→ Great, but take a closer look at the reasoning: Some governments, such as Germany, object only to the scanning of encrypted communications, but are fine with the indiscriminate scanning of other private and public communication, with the end of anonymous communication through mandatory age verification, or with introducing a minimum age for “risky” communication apps. Critical governments also need to do more: exert their influence in the Council of the EU and agree on a joint list of necessary fundamental changes to the proposal. Absent such revision, they should ask the European Commission to withdraw the chat control proposal as it stands.

Where your government stands on Chat Control

In favour: Bulgaria, Croatia, Cyprus, Denmark, France, Greece, Hungary, Ireland, Latvia, Lithuania, Malta, Romania, Slovakia, Spain, Sweden

Not in favour: Austria, Belgium, Czech Republic, Estonia, Germany, Luxembourg, Netherlands, Poland, Slovenia

Undecided / unclear: Italy, Portugal, Finland

This table summarises EU governments’ positions on chat control as of 23 September 2024 (the Netherlands as of 1 October 2024, Finland as of 29 November 2024).

Note that the nine countries that have expressed criticism only just form a blocking minority (under Council voting rules, a blocking minority must comprise at least four member states representing more than 35% of the EU population). They are under massive pressure and may change their stance at any time.

Take action now

These are ideas for what you can do in the short-term or with some preparation. Start with:

  • Ask your government to call on the European Commission to withdraw the chat control proposal. Point them to a joint letter that was recently sent by children’s rights and digital rights groups from across Europe. Click here to find the letter and more information, or click here to find arguments against chat control.
  • Check your government’s position (see above) and, if they are in favour, ask them to explain why. Tell them that as a citizen you want them to reject the proposal, that chat control is widely criticised by experts and that none of the proposals tabled in the Council of the EU so far are acceptable. Ask them to protect the privacy of your communication and your IT security.
  • Share this call to action online.

When reaching out to your government, the ministries of the interior (in the lead), of justice, and of digitisation/telecommunications/economy are your best bet. You can additionally contact your country’s permanent representation to the EU.

It can also be useful to reach out to Members of your national Parliament who can determine your country’s vote (e.g. the EU affairs committee). Talk to your political representatives. Whether it is the newly elected MEPs of the European Parliament or local groups of the political parties: make sure everyone is aware of what chat control is about and that you expect politicians to defend your fundamental rights against the proposal!

When contacting politicians, writing a real letter, calling, attending a local party event or visiting a local office to have a conversation will have a stronger impact than writing an e-mail. You can find contact details on their websites. Be determined in your position, but remain polite; otherwise they will disregard what you have to say. Here is useful argumentation on chat control. And here is argumentation on why the minor modifications so far envisioned by EU governments fail to address the dangers of chat control.

As we continue the fight against chat control, we need to expand the resistance:

  • Explain to your friends why this is an important topic. This short video, translated to all European languages, is a good start – feel free to use and share it. Also available on PeerTube (EN) and YouTube (DE).
  • Taking action works better and is more motivating when you work together. So try to find allies and form alliances. Whether it is in a local hackspace or in a sports club: your local action group against chat control can start anywhere. Then you can get creative and decide which type of action suits you best.

Take action now. We are the resistance against chat control!

What is important now is to increase pressure on the negotiators:

  • Reach out to your government. Politely tell them your concerns about chat control (arguments here). Experience shows that phone calls are more effective than e-mails or letters. The official name of the planned mandatory chat control law is “Proposal for a Regulation laying down rules to prevent and combat child sexual abuse”.
  • Talk about it! Inform others about the dangers of chat control. Use the sharepics and videos available here, or the sharepics below.
  • Generate attention on social media! Use the hashtags #chatcontrol and #secrecyofcorrespondence.
  • Generate media attention! So far very few media have covered the messaging and chat control plans of the EU. Get in touch with newspapers and ask them to cover the subject – online and offline.
  • Ask your e-mail, messaging and chat service providers! Avoid Gmail, Facebook Messenger, outlook.com and X-Box, where indiscriminate chat control is already taking place. Ask your email, messaging and chat providers whether they generally monitor private messages for suspicious content, or whether they plan to do so.

Sharepics and Infographics for you to download and use:

(right click on the images and select “Save Image As….”)

The Chat Control 2.0 proposal in detail

This is what the current proposal actually entails:

Each point below compares the EU Commission’s Chat Control proposal, its consequences, the EU Parliament’s mandate, and the EU Council’s 2024 draft mandate.

Commission: Envisaged are chat control, network blocking, mandatory age verification for messages and chats, age verification for app stores, and the exclusion of minors from installing many apps.
Parliament: No chat control, optional network blocking, no mandatory age verification for messages and chats, no general exclusion of minors from installing many apps.
Council: Like Commission.

Commission: All services normally provided for remuneration (including ad-funded services) are in scope, with no threshold in size, number of users, etc.
Consequences: Only non-commercial services that are not ad-funded, such as many open source projects, are out of scope.
Parliament: Like Commission.
Council: Like Commission.

Commission: Providers established outside the EU will also be obliged to implement the Regulation.
Consequences: See Article 33 of the proposal.
Parliament: Like Commission.
Council: Like Commission.

Commission: The communication services affected include telephony, e-mail, messenger, chats (also as part of games, on dating portals, etc.) and videoconferencing.
Consequences: Texts, images, videos and speech (e.g. video meetings, voice messages, phone calls) would have to be scanned.
Parliament: Telephony excluded; no scanning of text messages, but scanning of e-mail, messenger, chat and videoconferencing services.
Council: Like Parliament.

Commission: End-to-end encrypted messenger services are not excluded from the scope.
Consequences: Providers of end-to-end encrypted communications services will have to scan messages on every smartphone (client-side scanning) and, in case of a hit, report the message to the police.
Parliament: End-to-end encrypted messenger services are excluded from the scope.
Council: Like Commission.

Commission: Hosting services affected include web hosting, social media, video streaming services, file hosting and cloud services.
Consequences: Even personal storage that is not being shared, such as Apple’s iCloud, will be subject to chat control.
Parliament: Like Commission.
Council: Like Commission; additionally search engines.

Commission: Services that are likely to be used for illegal material or for child grooming are obliged to search the content of personal communication and stored data (chat control) without suspicion and indiscriminately.
Consequences: Since presumably every service is also used for illegal purposes, all services will be obliged to deploy chat control.
Parliament: Targeted scanning only of specific individuals and groups reasonably suspected of being linked to child sexual abuse material.
Council: Like Commission. Scanning limited to “high risk” services, but practically all services would remain in scope. Users can refuse scanning but would then be blocked from receiving and sending images, videos or hyperlinks. Security and military accounts are to be exempted from scanning.

Commission: The authority in the provider’s country of establishment is obliged to order the deployment of chat control.
Consequences: There is no discretion as to whether and to what extent chat control is ordered.
Parliament: Like Commission.
Council: Like Commission. Additionally, providers would remain permitted to voluntarily scan for known/unknown material and grooming indications for a period of 54 months.

Commission: Chat control involves automated searches for known CSEM images and videos; suspicious messages/files will be reported to the police.
Consequences: According to the Swiss Federal Police, 80% of the reports they receive (usually based on the method of hashing; a minimal illustration follows below the table) are criminally irrelevant. Similarly, in Ireland only 20% of NCMEC reports received in 2020 were confirmed as actual “child abuse material”.
Parliament: Like Commission.
Council: Like Commission.

Commission: Chat control also involves automated searches for unknown CSEM pictures and videos; suspicious messages/files will be reported to the police.
Consequences: Machine searching for unknown abuse representations is an experimental procedure using machine learning (“artificial intelligence”). The algorithms are not accessible to the public and the scientific community, nor does the draft contain any disclosure requirement. The error rate is unknown and is not limited by the draft regulation. Presumably, these technologies result in massive amounts of false reports. The draft legislation allows providers to pass on automated hit reports to the police without human review.
Parliament: Like Commission.
Council: Excluded (September 2024).

Commission: Chat control involves machine searches for possible child grooming; suspicious messages will be reported to the police.
Consequences: Machine searching for potential child grooming is an experimental procedure using machine learning (“artificial intelligence”). The algorithms are not available to the public and the scientific community, nor does the draft contain a disclosure requirement. The error rate is unknown and is not limited by the draft regulation; presumably these technologies result in massive amounts of false reports.
Parliament: No searches for grooming.
Council: No searches for grooming.

Commission: Communication services that can be misused for child grooming (thus all of them) must verify the age of their users.
Consequences: In practice, age verification involves full user identification, meaning that anonymous communication via email, messenger, etc. will effectively be banned. Whistleblowers, human rights defenders and marginalised groups rely on the protection of anonymity.
Parliament: No mandatory age verification for users of communication services.
Council: Like Commission.

Commission: App stores must verify the age of their users and block children/young people under 16 from installing apps that can be misused for solicitation purposes.
Consequences: All communication services such as messenger apps, dating apps or games can be misused for child grooming (according to surveys) and would be blocked for children/young people to use.
Parliament: Where an app requires consent to data processing, dominant app stores (Google, Apple) are to make a reasonable effort to ensure parental consent for users below 16.
Council: Like Commission.

Commission: Internet access providers can be obliged to block access to prohibited and non-removable images and videos hosted outside the EU by means of network blocking (URL blocking).
Consequences: Network blocking is technically ineffective and easy to circumvent, and it results in the construction of a technical censorship infrastructure.
Parliament: Internet access providers MAY be obliged to block access (discretion).
Council: Like Commission.
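
The “method of hashing” mentioned above works roughly as follows: each file is reduced to a compact fingerprint that is looked up in a database of fingerprints of previously reported material. The Python sketch below illustrates only this principle; the file path, block list and reporting hook are hypothetical, and it uses an exact SHA-256 hash purely to stay self-contained. Real deployments (e.g. Microsoft’s PhotoDNA) use perceptual hashes so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Hypothetical block list: fingerprints of previously reported images.
# Real systems use perceptual hashes (such as PhotoDNA), not SHA-256,
# so that visually similar copies also match.
KNOWN_FINGERPRINTS = {
    "0" * 64,  # placeholder entry; real lists hold millions of hashes
}

def fingerprint(path: str) -> str:
    """Reduce a file to a fixed-size fingerprint (hex digest)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_flagged(path: str) -> bool:
    """True if the file's fingerprint appears in the block list."""
    return fingerprint(path) in KNOWN_FINGERPRINTS

# Under the proposal, every attachment a user sends would be checked
# roughly like this before or during transmission:
# if is_flagged("attachment.jpg"):
#     report_to_authority("attachment.jpg")  # hypothetical reporting hook
```

Because perceptual matching is deliberately fuzzy, it also flags near-duplicates and visually similar but innocuous images, which helps explain the high share of criminally irrelevant reports cited in the table.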

Changes proposed by the European Parliament

In November 2023, the European Parliament nearly unanimously adopted a negotiating mandate for the draft law. Pirate Party MEP Patrick Breyer, the most determined opponent of chat control, sat at the negotiating table. The result: Parliament wants to pull the following fangs out of the EU Commission’s extreme draft:

  1. We want to safeguard the digital secrecy of correspondence and remove the plans for blanket chat control, which violate fundamental rights and stand no chance in court. The current voluntary chat control of private messages (not social networks) by US internet companies is being phased out. Targeted telecommunication surveillance and searches will only be permitted with a judicial warrant and only limited to persons or groups of persons suspected of being linked to child sexual abuse material.
  2. We want to safeguard trust in secure end-to-end encryption. We clearly exclude so-called client-side scanning, i.e. the installation of surveillance functionalities and security vulnerabilities in our smartphones.
  3. We want to guarantee the right to anonymous communication and remove mandatory age verification for users of communication services. Whistleblowers can thus continue to leak wrong-doings anonymously without having to show their identity card or face.
  4. Removing instead of blocking: Internet access blocking is to be optional. Under no circumstances must legal content be collaterally blocked.
  5. We prevent the digital house arrest: We don’t want to oblige app stores to prevent young people under 16 from installing messenger apps, social networking and gaming apps ‘for their own protection’ as proposed. The General Data Protection Regulation is maintained.

Parliament wants to protect young people and victims of abuse much more effectively than the EU Commission’s extreme proposal:

  1. Security by design: In order to protect young people from grooming, Parliament wants internet services and apps to be secure by design and default. It must be possible to block and report other users. Only at the request of the user should he or she be publicly addressable and see messages or pictures of other users. Users should be asked for confirmation before sending contact details or nude pictures. Potential perpetrators and victims should be warned where appropriate, for example if they try to search for abuse material using certain search words. Public chats at high risk of grooming are to be moderated.
  2. In order to clean the net of child sexual abuse material, Parliament wants the new EU Child Protection Centre to proactively search publicly accessible internet content automatically for known CSAM. This crawling could also be used in the darknet and is thus more effective than private surveillance measures by providers.
  3. Parliament wants to oblige providers who become aware of clearly illegal material to remove it – unlike in the EU Commission’s proposal. Law enforcement agencies who become aware of illegal material would be obliged to report it to the provider for removal. This is our reaction to the case of the darknet platform Boystown, where the worst abuse material was further disseminated for months with the knowledge of Europol.

Beware: This is only the Parliament’s negotiating mandate, which usually only partially prevails. Most EU governments continue to support the original chat control proposal of the EU Commission without significant compromises. However, enough other governments have so far prevented such a common position (the so-called blocking minority). As soon as the EU governments reach an agreement in Council, Parliament, Council and Commission will start the so-called trilogue negotiations on the final version of the regulation.

Amendments discussed by EU governments

In June 2024 an extremely narrow “blocking minority” of EU governments prevented the EU Council from endorsing chat control. Chat control proponents mustered 63.7% of the vote, just short of the 65% threshold required in the Council of the EU for a qualified majority.

Minor changes to the proposal are on the table in Council to woo critical governments:

  • Chat control detection orders would be limited to “high risk” services. But governments would have broad discretion concerning risk classification, and anonymity, encryption or real-time communication are per se considered “high risk”, meaning that chat control would effectively still be applied indiscriminately and to all relevant services and apps with communication functions. In any case, the service used is no justification for searching the private chats of millions of citizens who are not even remotely connected to any wrongdoing.
  • Technologies used for detection in end-to-end encrypted services would be “vetted” with regard to their effectiveness, their impact on fundamental rights and risks to cyber security. However, the so-called “client-side scanning” on our smartphones creates risks of hacking and abuse, and destroys trust in the confidentiality of private communication. Since several providers (Signal, Threema) have announced they would rather cease services in the EU than implement bugs in their apps, the proposal would effectively cut Europeans off from communicating with other users of these apps.
  • Scanning would be limited to visual content and URLs. But it is exactly the scanning of visual content which, under the current voluntary scheme, exposes intimate family photos to viewing by unknown persons, results in thousands of false positives and leads to the mass criminalisation of teenagers.
  • Using AI to automatically search for previously unknown CSAM would result in the disclosure of chats only after two hits. But this limitation will usually be ineffective, as falsely flagged beach photos or consensual sexting rarely involve just a single photo. The EU Commissioner for Home Affairs has herself stated that three out of four of the disclosed chats and photos are not actionable for the police. Algorithms and hash databases are generally unreliable in distinguishing legal from illegal content.
  • Scanning (“upload moderation”) would be applied only to users who consent. If a user does not agree to the scanning of their chats, they would still be able to use the service for sending text messages, but would no longer be able to send or receive images, videos, iconography, stickers or URLs. Clearly this does not give users a real choice, as using messenger services purely for texting is not an option in the 21st century.
  • Professional accounts of staff of intelligence agencies, police and military would be exempted from the scanning of chats and messages. This exception proves that interior ministers know exactly just how unreliable and dangerous the snooping algorithms are that they want to unleash on us citizens. Ministers themselves do not want to suffer the consequences of the destruction of digital privacy of correspondence and secure encryption that they would impose on us.

Here is a list of the comprehensive changes that are missing to make the proposal acceptable.

Since September 2024 the new Hungarian Presidency is additionally proposing to remove the obligation to use AI to automatically search for previously unknown CSAM (this would remain voluntary). However:

  • this fails to address the fundamental concerns about chat control, including indiscriminate mass monitoring of private communications and the end of secure end-to-end encryption which keeps us safe
  • the Council’s Legal Service reaffirmed that its concerns regarding a likely violation of the fundamental right to privacy persist
  • reports on the sharing of known CSAM are routinely discarded by law enforcement as not criminally relevant or “non-actionable”, for example because it is unclear whether the participants acted intentionally or what age the persons depicted are, or because of specificities of national criminal law in a given country (e.g. whether fictional/cartoon material is criminal). No algorithm can reliably make a legal assessment. According to Meta, which is the source of the vast majority of reports, it currently only looks for known CSAM in EU communications, and yet at least 50% of NCMEC reports made to German law enforcement agencies are not criminally relevant, according to the federal crime agency (BKA). EU Commissioner Johansson admitted in late 2023 that 75% of NCMEC reports are not of a quality that the police can work with.
  • looking for recurrences of known material will not save children from ongoing abuse
  • mass prosecution of known CSAM will divert resources needed to investigate contact abuse

More videos on Chatcontrol are available in this playlist


The negotiations: A timeline

2020: The European Commission proposed “temporary” legislation allowing for chat control

The proposed “temporary” legislation allows the searching of all private chats, messages, and emails for illegal depictions of minors and attempted initiation of contacts with minors. It allows providers such as Facebook Messenger and Gmail to scan every message for suspicious text and images. This takes place in a fully automated process, in part using error-prone “artificial intelligence”. If an algorithm considers a message suspicious, its content and meta-data are disclosed (usually automatically and without human verification) to a private US-based organization and from there to national police authorities worldwide. The reported users are not notified.

6 July 2021: The European Parliament adopted the legislation allowing for chat control.

The European Parliament voted in favour of the ePrivacy Derogation, which allows for voluntary chat control by messaging and email providers. As a result, some U.S. providers of services such as Gmail and Outlook.com are already performing such automated messaging and chat control.

9 May 2022: Member of the European Parliament Patrick Breyer has filed a lawsuit against U.S. company Meta.

According to the case-law of the European Court of Justice the permanent and comprehensive automated analysis of private communications violates fundamental rights and is prohibited (paragraph 177). Former judge of the European Court of Justice Prof. Dr. Ninon Colneric has extensively analysed the plans and concludes in a legal assessment that the EU legislative plans on chat control are not in line with the case law of the European Court of Justice and violate the fundamental rights of all EU citizens to respect for privacy, to data protection and to freedom of expression. On this basis the lawsuit was filed.

11 May 2022: The Commission presented a proposal to make chat control mandatory for service providers.

On 11 May 2022 the EU Commission made a second legislative proposal, obliging all providers of chat, messaging and e-mail services to deploy this mass surveillance technology in the absence of any suspicion. However, a representative survey conducted in March 2021 clearly shows that a majority of Europeans oppose the use of chat control (detailed poll results here).


How does this affect you?

Messaging and chat control scanning:

  • All of your chat conversations and emails will be automatically searched for suspicious content. Nothing remains confidential or secret. There is no requirement of a court order or an initial suspicion for searching your messages. It occurs always and automatically.
  • If an algorithm classifies the content of a message as suspicious, your private or intimate photos may be viewed by staff and contractors of international corporations and police authorities. Your private nude photos may also be looked at by people not known to you, in whose hands your photos are not safe.
  • Flirts and sexting may be read by staff and contractors of international corporations and police authorities, because text recognition filters looking for “child grooming” frequently falsely flag intimate chats.
  • You can falsely be reported and investigated for allegedly disseminating child sexual exploitation material. Messaging and chat control algorithms are known to flag completely legal vacation photos of children on a beach, for example. According to Swiss federal police authorities, 80% of all machine-generated reports turn out to be without merit. Similarly in Ireland only 20% of NCMEC reports received in 2020 were confirmed as actual “child abuse material”. Nearly 40% of all criminal investigation procedures initiated in Germany for “child pornography” target minors.
  • On your next trip overseas, you can expect big problems. Machine-generated reports on your communications may have been passed on to other countries, such as the USA, where there is no data privacy – with incalculable results.
  • Intelligence services and hackers may be able to spy on your private chats and emails. The door will be open for anyone with the technical means to read your messages if secure encryption is removed in order to be able to screen messages.
  • This is only the beginning. Once the technology for messaging and chat control has been established, it becomes very easy to use it for other purposes. And who guarantees that these incrimination machines will not be used in the future on our smartphones and laptops?

Age verification:

  • You can no longer set up anonymous e-mail or messenger accounts or chat anonymously without needing to present an ID or your face, making you identifiable and risking data leaks. This inhibits for instance sensitive chats related to sexuality, anonymous media communications with sources (e.g. whistleblowers) as well as political activity.
  • If you are under 16, you will no longer be able to install the following apps from the app store (reason given: risk of grooming): Messenger apps such as Whatsapp, Snapchat, Telegram or Twitter, social media apps such as Instagram, TikTok or Facebook, games such as FIFA, Minecraft, GTA, Call of Duty, Roblox, dating apps, video conferencing apps such as Zoom, Skype, Facetime.
  • If you don’t use an app store, compliance with the provider’s minimum age will still be verified and enforced. If you are under the minimum age of 16 years, you can no longer use Whatsapp due to the proposed age verification requirements; the same applies to the online functions of the game FIFA 23. If you are under 13 years old, you will no longer be able to use TikTok, Snapchat or Instagram.

Click here for further arguments against messaging and chat control

Click here to find out what you can do to stop messaging and chat control


Additional information and arguments


Debunking Myths

When the draft law on chat control was first presented in May 2022, the EU Commission promoted the controversial plan with various arguments. In the following, various claims are questioned and debunked:

1. “Today, photos and videos depicting child sexual abuse are massively circulated on the Internet. In 2021, 29 million cases were reported to the U.S. National Centre for Missing and Exploited Children.”

To speak exclusively of depictions of child sexual abuse in the context of chat control is misleading. To be sure, child sexual exploitation material (CSEM) is often footage of sexual violence against minors (child sexual abuse material, CSAM). However, an international working group of child protection institutions points out that criminal material also includes recordings of sexual acts or of sexual organs of minors in which no violence is used or no other person is involved. Recordings made in everyday situations are also mentioned, such as a family picture of a girl in a bikini or naked in her mother’s boots. Recordings made or shared without the knowledge of the minor are also covered. Punishable CSEM also includes comics, drawings, manga/anime, and computer-generated depictions of fictional minors. Finally, criminal depictions also include self-made sexual recordings of minors, for example, for forwarding to partners of the same age (“sexting”). The study therefore proposes the term “depictions of sexual exploitation” of minors as a correct description. In this context, recordings of children (up to 14 years of age) and adolescents (up to 18 years of age) are equally punishable.

2. “In 2021 alone, 85 million images and videos of child sexual abuse were reported worldwide.”

There are many misleading claims circulating about how to quantify the extent of sexually exploitative images of minors (CSEM). The figure the EU Commission uses to defend its plans comes from the U.S. nongovernmental organization NCMEC (National Center for Missing and Exploited Children) and includes duplicates because CSEM is shared multiple times and often not deleted. Excluding duplicates, 22 million unique recordings remain of the 85 million reported.

75 percent of all NCMEC reports from 2021 came from Meta (FB, Instagram, Whatsapp). Facebook’s own internal analysis says that “more than 90 percent of [CSEM on Facebook in 2020] was identical or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child-exploitative content.” So the NCMEC’s much-cited numbers don’t really describe the extent of online recordings of sexual violence against children. Rather, they describe how often Facebook discovers copies of footage it already knows about. That, too, is relevant.

Not all unique recordings reported to NCMEC show violence against children. The 85 million depictions reported by NCMEC also include consensual sexting, for example. The number of depictions of abuse reported to NCMEC in 2021 was 1.4 million.

7% of NCMEC’s global SARs go to the European Union.

Moreover, even on Facebook, where chat control has long been used voluntarily, the numbers for the dissemination of abusive material continue to rise. Chat control is thus not a solution.

Source

3. “64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year.”

That the voluntary chat control algorithms of large U.S. providers have reported more CSEM does not indicate how the amount of CSEM has evolved overall. The very configuration of the algorithms has a large impact on the number of SARs. Moreover, the increase shows that the circulation of CSEM cannot be controlled by means of chat control.

4. “Europe is the global hub for most of the material.”

7% of global NCMEC SARs go to the European Union. Incidentally, European law enforcement agencies such as Europol and BKA knowingly do not report abusive material to storage services for removal, so the amount of material stored here cannot decrease.

5. “A Europol-backed investigation based on a report from an online service provider led to the rescue of 146 children worldwide, with over 100 suspects identified across the EU.”

The report was made by a cloud storage provider, not a communications service provider. To screen cloud storage, it is not necessary to mandate the monitoring of everyone’s communications. If you want to catch the perpetrators of online crimes related to child abuse material, you should use so-called honeypots and other methods that do not require monitoring the communications of the entire population.

6. “Existing means of detecting and removing child sexual exploitation material will no longer be available when the current interim regulation expires in 2024.”

Hosting service providers (filehosters, clouds) and social media providers will be allowed to continue scanning after the ePrivacy exemption expires. No regulation is needed for removing child sexual exploitation material, either. For providers of communications services, the regulation for voluntary chat scanning (chat control 1) could be extended in time without requiring all providers to scan (as per chat control 2).

7. Metaphors: Chat control is “like a spam filter” / “like a magnet looking for a needle in a haystack: the magnet doesn’t see the hay.” / “like a police dog sniffing out letters: it has no idea what’s inside.” The content of your communication will not be seen by anyone if there is no hit. “Detection for cybersecurity purposes is already taking place, such as the detection of links in WhatsApp” or spam filters.

Malware and spam filters do not disclose the content of private communications to third parties and do not result in innocent people being flagged. They do not result in the removal or long-term blocking of profiles in social media or online services.

8. “As far as the detection of new abusive material on the net is concerned, the hit rate is well over 90 %. … Some existing grooming detection technologies (such as Microsoft’s) have an “accuracy rate” of 88%, before human review.”

With the unmanageable number of messages, even a small error rate results in countless false positives that can far exceed the number of correct hits. Even with a 99% hit rate, this would mean that of the 100 billion messages sent daily via Whatsapp alone, 1 billion (i.e., 1,000,000,000) false positives would need to be verified. And that is every day, on a single platform. The “human review burden” on law enforcement would be immense, while backlogs and resource overload already work against them.
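
To make the arithmetic above explicit, here is a minimal back-of-the-envelope calculation. The 99% hit rate and the 100 billion daily messages are the figures quoted in the paragraph above; the prevalence of actually illegal messages is a purely hypothetical assumption for illustration.

```python
messages_per_day = 100_000_000_000   # ~100 billion WhatsApp messages daily
accuracy = 0.99                      # hypothetical 99% "hit rate"
false_positive_rate = 1 - accuracy   # 1% of innocent messages get flagged

# Hypothetical assumption: 1 in 1,000,000 messages is actually illegal.
prevalence = 1e-6
illegal = messages_per_day * prevalence
legal = messages_per_day - illegal

true_positives = illegal * accuracy
false_positives = legal * false_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"flagged messages per day:  {true_positives + false_positives:,.0f}")
print(f"of which false positives:  {false_positives:,.0f}")
print(f"share of flags that are correct: {precision:.4%}")
# Even at 99% accuracy, ~1 billion innocent messages are flagged per day,
# and only about 0.01% of all flags point to actually illegal content.
```

This base-rate effect is why a seemingly high accuracy figure still drowns investigators in false reports: when almost all scanned traffic is innocent, even a tiny error rate dominates the output.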

Separately, an FOI request from former MEP Felix Reda exposed the fact that these claims about accuracy come from industry – from those who have a vested interest in these claims because they want to sell you detection technology (Thorn, Microsoft). They refuse to submit their technology to independent testing, and we should not take their claims at face value.

The EU’s evaluation of child abuse detection tools is based solely on industry claims taken at face value.


Mass surveillance is the wrong approach to fighting “child pornography” and sexual exploitation

  • Scanning private messages and chats does not contain the spread of CSEM. Facebook, for example, has been practicing chat control for years, and the number of automated reports has been increasing every year, most recently reaching 22 million in 2021.
  • Mandatory chat control will not detect the perpetrators who record and share child sexual exploitation material. Abusers do not share their material via commercial email, messenger, or chat services, but organize themselves through self-run secret forums without control algorithms. Abusers also typically upload images and videos as encrypted archives and share only the links and passwords. Chat control algorithms do not recognize encrypted archives or links.
  • The right approach would be to delete stored CSEM where it is hosted online. However, Europol does not report known CSEM material.
  • Chat control harms the prosecution of child abuse by flooding investigators with millions of automated reports, most of which are criminally irrelevant.

Message and chat control harms everybody

  • All citizens are placed under suspicion, without cause, of possibly having committed a crime. Text and photo filters monitor all messages, without exception. No judge is required to order such monitoring – contrary to the analog world, which guarantees the privacy of correspondence and the confidentiality of written communications. According to a judgment by the European Court of Justice, the permanent and general automatic analysis of private communications violates fundamental rights (case C-511/18, Paragraph 192). Nevertheless, the EU now intends to adopt such legislation. It can take years for the court to annul it. Therefore we need to prevent the adoption of the legislation in the first place.
  • The confidentiality of private electronic correspondence is being sacrificed. Users of messenger, chat and e-mail services risk having their private messages read and analyzed. Sensitive photos and text content could be forwarded to unknown entities worldwide and can fall into the wrong hands. NSA staff have reportedly circulated nude photos of female and male citizens in the past. A Google engineer was reported to have stalked minors.
  • Indiscriminate messaging and chat control wrongfully incriminates hundreds of users every day. According to the Swiss Federal Police, 80% of machine-reported content is not illegal, for example harmless holiday photos showing nude children playing at a beach. Similarly, in Ireland only 20% of NCMEC reports received in 2020 were confirmed as actual “child abuse material”.
  • Securely encrypted communication is at risk. Up to now, encrypted messages cannot be searched by the algorithms. To change that, backdoors would need to be built into messaging software (see the sketch after this list). As soon as that happens, this security loophole can be exploited by anyone with the technical means needed, for example by foreign intelligence services and criminals. Private communications, business secrets and sensitive government information would be exposed. Secure encryption is needed to protect minorities, LGBTQI people, democratic activists, journalists, etc.
  • Criminal justice is being privatized. In the future the algorithms of corporations such as Facebook, Google, and Microsoft will decide which user is a suspect and which is not. The proposed legislation contains no transparency requirements for the algorithms used. Under the rule of law the investigation of criminal offences belongs in the hands of independent judges and civil servants under court supervision.
  • Indiscriminate messaging and chat control creates a precedent and opens the floodgates to more intrusive technologies and legislation. Deploying technology for automatically monitoring all online communications is dangerous: It can very easily be used for other purposes in the future, for example copyright violations, drug abuse, or “harmful content”. In authoritarian states such technology is used to identify and arrest government opponents and democracy activists. Once the technology is deployed comprehensively, there is no going back.
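
The following minimal sketch (using the PyNaCl library; the names and message are hypothetical) illustrates why scanning and end-to-end encryption are mutually exclusive: the provider relaying the message only ever sees ciphertext, so any scanning obligation can only be met by inspecting messages on the user’s device before encryption – the client-side scanning described above.

```python
from nacl.public import PrivateKey, Box  # pip install pynacl

# Alice and Bob each generate a key pair; only public keys are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob before the message ever leaves her device.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"confidential message")

# This is all the provider's server ever sees: random-looking bytes.
# No server-side scanning algorithm can inspect the content.
print(ciphertext.hex()[:64], "...")

# Only Bob can decrypt, on his own device.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'confidential message'
```

Because the server mathematically cannot read the plaintext, a scanning mandate forces providers either to weaken this scheme or to add inspection code on the device itself, which is the security vulnerability critics warn about.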

Messaging and chat control harms children and abuse victims

Proponents claim indiscriminate messaging and chat control facilitates the prosecution of child sexual exploitation. However, this argument is controversial, even among victims of child sexual abuse. In fact messaging and chat control can hurt victims and potential victims of sexual exploitation:

  1. Safe spaces are destroyed. Victims of sexual violence are especially in need of the ability to communicate safely and confidentially to seek counseling and support, for example to safely exchange among each other, with their therapists or attorneys. The introduction of real-time monitoring takes these safe spaces away from them. This can discourage victims from seeking help and support.
  2. Self-recorded nude photos of minors (sexting) end up in the hands of company employees and police where they do not belong and are not safe.
  3. Minors are being criminalized. Especially young people often share intimate recordings with each other (sexting). With messaging and chat control in place, their photos and videos may end up in the hands of criminal investigators. German crime statistics demonstrate that nearly 40% of all investigations for child pornography target minors.
  4. Indiscriminate messaging and chat control does not contain the circulation of illegal material but actually makes it more difficult to prosecute child sexual exploitation. It encourages offenders to go underground and use private encrypted servers which can be impossible to detect and intercept. Even on open channels, indiscriminate messaging and chat control does not contain the volume of material circulated, as evidenced by the constantly rising number of machine reports.

Talk at Chaos Computer Congress (29 December 2023): It’s not over yet


Alternatives

Strengthening the capacity of law enforcement

Currently, the capacity of law enforcement is so inadequate that it often takes months and years to follow up on leads and analyse collected data. Known material is often neither analysed nor removed. Those behind the abuse do not share their material via Facebook or similar channels, but on the darknet. To track down perpetrators and producers, undercover police work must take place instead of wasting scarce capacities on checking often irrelevant machine reports. It is also essential to strengthen the responsible investigative units in terms of personnel and financial resources, to ensure long-term, thorough and sustained investigations. Reliable standards/guidelines for the police handling of sexual abuse investigations need to be developed and adhered to.

Addressing not only symptoms, but the root cause

Instead of ineffective technical attempts to contain the spread of exploitation material that has been released, all efforts must focus on preventing such recordings in the first place. Prevention concepts and training play a key role because the vast majority of abuse cases never even become known. Victim protection organisations often suffer from unstable funding.

Fast and easily available support for (potential) victims

  1. Mandatory reporting mechanisms at online services: In order to achieve effective prevention of online abuse and especially grooming, online services should be required to prominently place reporting functions on the platforms. If the service is aimed at and/or used by young people or children, providers should also be required to inform them about the risks of online grooming.
  2. Hotlines and counseling centers: Many national hotlines dealing with cases of reported abuse are struggling with financial problems. It is essential to ensure there is sufficient capacity to follow up on reported cases.

Improving media literacy

Teaching digital literacy at an early age is an essential part of protecting children and young people online. The children themselves must have the knowledge and tools to navigate the Internet safely. They must be informed that dangers also lurk online and learn to recognise and question patterns of grooming. This could be achieved, for example, through targeted programs in schools and training centers, in which trained staff convey knowledge and lead discussions. Children need to learn to speak up, respond and report abuse, even if the abuse comes from within their sphere of trust (i.e., by people close to them or other people they know and trust), which is often the case. They also need to have access to safe, accessible, and age-appropriate channels to report abuse without fear.


Document pool on chat control 2.0

EP legislative observatory (continuously updated state of play)

European Commission

European Parliament

Council of the European Union (Council)

Statements and Assessments

Document pool on voluntary Chat Control 1.0


Critical commentary and further reading

“Governments seeking to limit encryption have often failed to show that the restrictions they would impose are necessary to meet a particular legitimate interest, given the availability of various other tools and approaches that provide the information needed for specific law enforcement or other legitimate purposes. Such alternative measures include improved, better-resourced traditional policing, undercover operations, metadata analysis and strengthened international police cooperation.”

“Any digital surveillance of children, together with any associated automated processing of personal data, should respect the child’s right to privacy and should not be conducted routinely, indiscriminately or without the child’s knowledge…”

“we suggest that the Commission prioritise this non-technical work, and more rapid take-down of offending websites, over client-side filtering […]”

“Due to the absence of an impact assessment accompanying the Proposal, the Commission has yet to demonstrate that the measures envisaged by the Proposal are strictly necessary, effective and proportionate for achieving their intended objective.”

“As an abuse survivor, I (and millions of other survivors across the world) rely on confidential communications to both find support and report the crimes against us – to remove our rights to privacy and confidentiality is to subject us to further injury and frankly, we have suffered enough. […] it doesn’t matter what steps we take to find abusers, it doesn’t matter how many freedoms or constitutional rights we destroy in order to meet that agenda – it WILL NOT stop children from being abused, it will simply push the abuse further underground, make it more and more difficult to detect and ultimately lead to more children being abused as the end result.”

“Using the veil of morality and the guise of protecting the most vulnerable and beloved in our societies to introduce this potential monster of an initiative is despicable.”

“Especially being a victim of sexual abuse, it is important to me that trusted communication is possible, e.g. in self-help groups and with therapists. If encryption is undermined, this also weakens the possibilities for those affected by sexual abuse to seek help.”

“Having been a victim of sexual violence myself as a child, I am convinced that the only way to move forward on this issue is through education. Generalized surveillance of communications will not help children to stop suffering from this unacceptable violence.”

“In practice this means that they would put private companies in charge of a matter that public authorities should handle”

“The assessment of child abuse-related facts is part of the legal profession’s area of responsibility. Accordingly, the communication exchanged between lawyers and clients will often contain relevant keywords. […] According to the Commission’s proposals, it is to be feared that in all of the aforementioned constellations there will regularly be a breach of confidentiality due to the unavoidable use of relevant terms.”

“I didn’t have confidential communications tools when I was raped; all my communications were monitored by my abusers – there was nothing I could do, there was no confidence. […] I can’t help but wonder how much different my life would have been had I had access to these modern technologies. [The planned vote on the e-Privacy Derogation] will drive abuse underground making it far more difficult to detect; it will inhibit support groups from being able to help abuse victims – IT WILL DESTROY LIVES.”

“A blanket and unprovoked monitoring of digital communication channels is neither proportionate nor necessary to detect online child abuse. The fight against sexualised violence against children must be tackled with targeted and specific measures. The investigative work is the task of the law enforcement authorities and must not be outsourced to private operators of messenger services.”

“As with other types of content scanning (whether on platforms like YouTube or in private communications) scanning everything from everyone all the time creates huge risks of leading to mass surveillance by failing the necessity and proportionality test. Furthermore, it creates a slippery slope where we start scanning for less harmful cases (copyright) and then we move on to harder issues (child sexual abuse, terrorism) and before you realise what happened scanning everything all the time becomes the new normal.”

“The DAV is explicitly in favour of combating the preparation and commission of child sexual abuse and its dissemination via the internet through effective measures at EU-level. However, the Interim Regulation proposed by the Commission would allow blatantly disproportionate infringements on the fundamental rights of users of internet-based communication services. Furthermore, the proposed Interim Regulation lacks sufficient procedural safeguards for those affected. This is why the legislative proposal should be rejected as a whole.”

“Positive hits with subsequent disclosure to governmental and non-governmental agencies would be feared not only by accused persons but above all by victims of child sexual abuse. In this context, the absolute confidentiality of legal counselling is indispensable in the interest of the victims, especially in these matters which are often fraught with shame. In these cases in particular, the client must retain the authority to decide which contents of the mandate may be disclosed to whom. Otherwise, it is to be feared that victims of child sexual abuse will not seek legal advice.”

“In the course of the initiative “Fighting child sexual abuse: detection, removal, and reporting of illegal content”, the European Union plans to abolish the digital privacy of correspondence. In order to automatically detect illegal content, all private chat messages are to be screened in the future. This should also apply to content that has so far been protected with strong end-to-end encryption. If this initiative is implemented according to the current plan it would enormously damage our European ideals and the indisputable foundations of our democracy, namely freedom of expression and the protection of privacy […]. The initiative would also severely harm Europe’s strategic autonomy and thus EU-based companies.”

Experts from the police and academia are rather critical of the EU’s plan: on the one hand, they fear many false reports by the scanners, and on the other hand, an alibi function of the law. Daniel Kretzschmar, spokesman for the Federal Board of the Association of German Criminal Investigators, says that the fight against child abuse depictions is “enormously important” to his association. Nevertheless, he is skeptical: unsuspected persons could easily become the focus of investigations. At the same time, he says, privatizing these initiative investigations means “making law enforcement dependent on these companies, which is actually a state and sovereign task.”

Thomas-Gabriel Rüdiger, head of the Institute for Cybercriminology at the Brandenburg Police University, is also rather critical of the EU project. “In the end, it will probably mainly hit minors again,” he told WELT. Rüdiger refers to figures from the crime statistics, according to which 43 percent of the recorded crimes in the area of child pornographic content would be traced back to children and adolescents themselves. This is the case, for example, with so-called “sexting” and “schoolyard pornography”, when 13- and 14-year-olds send each other lewd pictures.

Real perpetrators, who you actually want to catch, would probably rather not be caught. “They are aware of what they have done and use alternatives. Presumably, USB sticks and other data carriers will then be increasingly used again,” Rüdiger continues.

“In accordance with EU fundamental rights law, the surveillance or interception of private communications or their metadata for detecting, investigating or prosecuting online CSAM must be limited to genuine suspects against whom there is reasonable suspicion, must be duly justified and specifically warranted, and must follow national and EU rules on policing, due process, good administration, non-discrimination and fundamental rights safeguards.”

In the run-up to the official proposal later this year, we urge all European Commissioners to remember their responsibilities to human rights, and to ensure that a proposal which threatens the very core of people’s right to privacy, and the cornerstone of democratic society, is not put forward.

“In the view of our experts, academics and IT professionals, all efforts to intercept and extensively monitor chat communication via client site scanning has a tremendous negative impact on the IT security of millions of European internet users and businesses. Therefore, a European right to secure communication and effective encryption for all must become a standard.”