“Denunciation machine”: EU Parliament supports mass surveillance of e-mail, messengers and chats
Today the European Parliament’s Civil Liberties Committee voted to restrict the ePrivacy Directive, which aims to protect private communications. For the first time, the Parliament accepts the searching of all electronic communications for possibly prohibited content – with disastrous consequences for the protection of privacy and data security in the EU.
As proposed by Member of the European Parliament and shadow rapporteur Patrick Breyer (Pirate Party), the Greens/EFA group justifies its position to reject the legislation as follows:
The proposal does not protect children but exposes children and adults alike to major risks (such as AI algorithms falsely flagging legal intimate depictions, or conversations of children and adults relating to their health and sexual life) and violates the fundamental rights of millions of children and adults. Having private companies generally and indiscriminately analyse the content of all private correspondence of citizens not under any suspicion – as if the post office opened all letters in search of illegal content – is not only unacceptable with regard to the right to privacy, including that of children and victims themselves, but also specifically threatens the human rights of minorities, LGBTQI people, political dissidents, journalists and others. According to the Court of Justice, a permanent automated analysis of communications is proportionate only if limited to suspects (Case C-511/18), which the proposal is not.
Despite the proposed regulation, those practices will continue to violate the GDPR (there is no legal basis for private actors to detect crime, and the measure lacks proportionality). As the rising number of reports by companies using this method of blanket monitoring demonstrates, such mass surveillance does not contain the circulation of illegal material but only pushes it further underground, making it more difficult to prosecute.
Background
With the ePrivacy Derogation, the EU Commission proposes to screen and monitor all private electronic communications in the absence of any suspicion in order to search for possible child pornographic content and “child grooming”. On 10 September 2020, it presented a draft law to this effect. Providers of e-mail, chat and messenger services would be exempted from the secrecy of communications so that they can search the content of all private messages. Searching for as yet unknown material would mean that even intimate photographs of adults will frequently be exposed. In addition, error-prone algorithms are to search text messages for “solicitation of sexual contact” with minors. Supposed hits are reported to the police.
A second law planned for next year will make this mass surveillance procedure compulsory, even where secure end-to-end encryption has so far been used to protect private messages.
MEP and shadow rapporteur Patrick Breyer (Pirate Party) comments:
It is frightening that corporations like Facebook and Google are rifling through all our private communications as auxiliary policemen. The communications channels used by organized crime will not be penetrated in this way. Instead, this denunciation machine falsely incriminates thousands. Children and victims of crime have a right to effective help and court-proof legislation instead of empty promises, a right to private communications for counselling and self-help instead of mass surveillance.
Amendment to the European Electronic Communications Code
Today’s decision amends a directive on the protection of internet communications, the European Electronic Communications Code. The code aims to protect the confidentiality of messages sent via messaging services, e-mail communications and internet telephony and to extend telecommunications secrecy provisions to them. The legislative changes envisaged by the Commission will remove this confidentiality.
The next steps
The report is expected to pass the Parliament’s plenary session in mid-December. Before the end of the year, the trilogue negotiations are to begin, in which representatives of the Parliament will negotiate with the Council behind closed doors. The law is expected to enter into force in January. In the second quarter of 2021, the EU Commission intends to present a draft law that will make indiscriminate communications screening mandatory.
The practice
So far, the general and indiscriminate searching of private messages has been practised by some US corporations, affecting services such as Facebook Messenger, Gmail and outlook.com. Publications on social networks are also screened. The algorithms result in a veritable flood of criminal reports, with each sharing of a post being counted separately. According to the Swiss Federal Police, 90% of the content reported is not criminally relevant.
10 arguments against mass surveillance of private messages
- Mass surveillance is ineffective. The increasing number of reports demonstrates that the spread of child pornography cannot be curbed in this way – and encrypted channels are not included in the numbers either. The right way forward is undercover investigations into “child pornography rings”.
- Mass surveillance criminalises minors. 40% of investigations for possession of “child pornography” in Germany are directed against minors who share such depictions for completely different reasons than adults, e.g. because they are inconsiderate, consider it “fun” or want to sound an alarm.
- Mass surveillance particularly harms young people and victims of abuse. Young people often send nude pictures of themselves, which threaten to fall into the wrong hands due to such algorithms. And victims of abuse need private communication channels in order to receive advice and help (e.g. from lawyers and psychologists), to discuss with other victims and in order to report cases confidentially.
- Mass surveillance makes it more difficult to prosecute child abuse. It pushes criminal communication further underground, where it is more difficult to intercept. Furthermore, the risk of being reported to the police and having their accounts blocked can deter users from reporting “child pornography”.
- Mass surveillance threatens secure encryption without backdoors, as encryption prevents such screening. Secure encryption protects minorities, LGBTQI persons, political dissidents, journalists, etc.
- Mass surveillance wrongfully incriminates thousands of users. According to the Swiss Federal Police, 90% of the content reported via the USA (NCMEC) is not criminally relevant, for example holiday photos on the beach with naked children.
- Mass surveillance privatises law enforcement, which under the rule of law belongs in the hands of independent public officials under judicial supervision.
- The proposed legislation is ineffective. It will not even allow Facebook and others to continue mass surveillance lawfully. The proposed exception to the ePrivacy Directive is without prejudice to the General Data Protection Regulation (GDPR). Under the GDPR, mass surveillance remains illegal due to the lack of a legal basis and of proportionality. A complaint submitted by Patrick Breyer is pending with the data protection supervisory authorities.
- The proposed legislation violates fundamental rights and will not stand up in court. According to the case law of the European Court of Justice, a permanent automated analysis of communications is only proportionate if it is limited to suspects (Case C-511/18, paragraph 192).
- The proposed legislation creates a precedent for gradually destroying the fundamental right to respect for privacy: Will the same infrastructure be used in the future to search private messages for copyright infringements, defamation and content the intelligence agencies are interested in? Will, by the same reasoning, our personal smartphones and computers be searched, letters be opened and private homes be bugged with AI “violence detection technology”?
Comments
It is of course a legitimate concern to curb the circulation of these images as far as possible – and it remains a great pity that the political institutions are not investing heavily in neuroscientific studies researching paedophilic inclinations, which in the medium term might even yield treatment methods and prevent abuse.
Nevertheless, the figures cited by NCMEC and the EU Commission should be questioned critically. Facebook’s answers to questions from the US Senate (https://www.judiciary.senate.gov/imo/media/doc/Sullivan%20Responses%20to%20QFRs2.pdf) suggest that it is not only depictions of abuse that are passed on to NCMEC and hashed, but in the first instance all content that violates the guidelines on “child nudity”, such as “seemingly benign photos of unclothed children running through sprinklers in the backyard”. Another answer states: “We also use artificial intelligence (AI) and machine learning to proactively detect child nudity and previously unknown child exploitative content when they’re uploaded. We’re using this and other technology to more quickly identify this content, hash it, and report it to NCMEC, in accordance with US law”.
Our fundamental rights are thus by no means suspended “only” to remove depictions of abuse, but in many cases also to enforce specifically American moral standards, for instance concerning male and female toplessness (even at pre-pubescent ages). The picture of an 8-year-old girl who still goes swimming in nothing but swim trunks (not at all unusual here in rural Europe) can potentially end up in the NCMEC database as “child abuse material” and would then count, for Ashton Kutcher, among the “tens of millions of pieces of evidence of the rape of children going unseen”. In my view, that is a highly questionable practice.