
Lifting the veil on the secretive EU terror filter negotiations: Here’s where we stand [updated 07/12/2020]

European Parliament Freedom, democracy and transparency
On 10 December, the sixth and probably final “Trilogue” meeting will be held on the EU’s proposed terrorist content online law, infamous for its “upload filter”/“preventive measures” and “one-hour removal order” provisions. The most recent proposal by the German Council Presidency is published here.

Trilogue means: In a series of closed-door meetings, the European Parliament and the Council (representing the member state governments), supported by the Commission, hammer out a final text acceptable to both institutions. It’s the last chance to make changes before the regulation gets adopted.


In my previous blog post, I published the timetable of negotiations as well as the negotiators and external documents for further reading, including contributions by the Fundamental Rights Agency, three UN Special Rapporteurs, Digitaleurope, the Global Network Initiative and Internet pioneers.

A “4 column document” was recently leaked which reveals a compromise proposed by the European Parliament’s lead negotiator (rapporteur) as well as proposals by the Commission; the Member States’ positions have also been leaked.

Negotiations are nearing the end and are expected to be concluded by mid- to late November 2020. Below, you can see the different positions of the Parliament, Commission and Council and how these would affect everyone, from internet users to SMEs and other online enterprises.

Video: Interview on the proposed regulation (external link)

Council and Parliament positions

Let’s begin by taking a close look at the similarities and differences of the Council and Parliament positions (as proposed now), and break down what they would mean for you:

Commission and Council want | Parliament wants | What this means for you
  • The proposed regulation shall apply to anybody who makes information available online at the request of a user
  • The proposed regulation shall apply to anybody who is publishing information at the request of a user, except for “closed user groups consisting of a finite number of pre-determined people”, communications services (e.g. messengers) and cloud infrastructure providers
If you operate a website users can contribute to (e.g. a WordPress blog with a comment function or a wiki), you would have to satisfy removal orders within one hour, even at night, and possibly implement upload filters. Trolls could provoke authorities into acting against you by repeatedly posting terrorist content on your website.
  • Hosting service providers shall remove content within 1 hour of receiving a removal order
  • Hosting service providers shall remove content within 1 hour of receiving a removal order except in cases of “de facto impossibility not attributable to the hosting service provider, including for technical reasons”; 12-hour advance warning for the first order
Individuals or small organisations may decide to terminate services and platforms rather than stand ready 24/7 to respond within one hour to a removal order that will never be issued, because 99%+ of platforms are never targeted with terrorist propaganda.
The exception proposed by the European Parliament is too narrow and lacks the legal certainty to protect small businesses, non-profit organizations and individuals operating websites.
  • Removal of terrorist content can be ordered even if disseminated for educational, artistic, journalistic or research purposes, or for awareness raising purposes against terrorist activity
  • Protect and preserve content disseminated for educational, artistic, journalistic or research purposes, or for awareness raising purposes against terrorist activity
Some media reports on terrorist activities could be removed if the Commission’s proposal goes through. Video archives that document war crimes, for example in Syria, could partially disappear, resulting in impunity for perpetrators. Spain has prosecuted artists for satirical performances in the past, so recordings of performances could be deleted in the future.
  • Member states can freely decide which authority may order content take-downs
  • National authorities shall not seek or take instructions from any other body when making orders
Where no requirement of independence is in place, ministers could directly order the removal of content for political reasons.
The European Parliament’s proposal does not prevent a government from designating a ministry that can directly order content take-downs.
  • The authority of any one EU Member state can order the removal of content in any other EU Member state (cross-border removal orders)
  • Member state authorities can have content removed by hosting service providers located in their own country and request providers in other Member States to remove content
If Commission and Council have their way, content you wish to access could be removed due to orders from Member States with populist, authoritarian governments. Hungary, for example, has been found in the past to disrespect the rule of law and has called environmental activists “ecoterrorists”. Spain is using anti-terrorism laws against the Catalan independence movement, and France has been reported to excessively request content removals abroad.
  • EU authorities can order content removals with a global effect (e.g. order a US provider to remove content with effect for US citizens)
  • Service providers can satisfy removal orders by disabling access to content “in all Member States” (in the EU), without having to delete content altogether
Removing content globally could result in other states such as Turkey, Russia, Saudi Arabia or China requiring the removal of content which, in Europe, is perfectly legal and legitimate.
  • Hosting service providers shall decide with priority on whether or not to take down content due to a violation of their terms of service upon receiving a “referral” from a national authority
  • National authorities shall only report actual terrorist content as defined by the relevant legislation
Referrals would result in the removal of content that is perfectly legal and merely violates the arbitrary rules set by a private hosting service provider.
  • Hosting service providers shall include in their terms and conditions that they will not store terrorist content (Council)
  • Hosting service providers shall include in their terms and conditions provisions to address the misuse of their service for the dissemination of terrorist content
Terrorist content should be a matter of public policy, not of terms and conditions. Private policing and enforcement by the tech industry lacks democratic legitimacy, independence, transparency and judicial oversight. It also burdens small operators, many of whom do not even use terms and conditions. Where website operators use identical terms and conditions worldwide, forcing modifications could have a global effect far beyond the EU. Would we want Chinese or Turkish requirements in terms and conditions applicable to our citizens?
  • Hosting service providers shall use “proactive measures” to prevent the re-upload of content previously taken down and to detect and identify alleged terrorist content
  • Hosting service providers shall not be required to use automated tools (such as “upload filters”)
Automated upload filters are censorship machines that have been proven to suppress completely legal content (e.g. documentation of human rights violations in civil wars). Even a filter with an accuracy rate close to 99% would delete more legal than illegal content, because terrorist material is extremely rare compared to the overall number of uploads.
Even for identical content (e.g. an image), the classification as illegal “terrorist content” depends on the publisher’s intention (e.g. media coverage, awareness raising, terrorism research, criticism of terrorism). Automated tools can never assess the context and intention of a publication. The GIFCT re-upload filters currently in use are opaque and lack public control.
If service providers were obliged to use automated tools for flagging content, the amount of reported content would be so high that providers would “voluntarily” block all such content without a proper assessment.

* * *

Article 6: Upload filters / “Proactive measures”

Although mandatory upload filters have already been imposed in the context of the controversial copyright reform (Article 13), the European Court of Justice is reviewing the provision and may yet strike it down for lack of legal clarity.

With the proposed “terrorist content” law and the future “Digital Services Act” however, mandatory upload filtering may become a standard requirement for the Internet.

Upload filters are ineffective. They would not be applied by non-EU platforms hosting terrorist content such as 4chan, 8chan and Gab, and they are easy to circumvent on other platforms. For instance, Facebook has over 800 slightly modified duplicates of the Christchurch shooting video in its hash-sharing database.
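Why slight modifications defeat hash-based blocklists follows from how exact hash matching works: change even one byte and the fingerprint changes entirely. A minimal illustrative sketch in Python, using SHA-256 as a stand-in (the industry hash-sharing databases also use perceptual hashes, which are more robust but can still be evaded by larger edits):

```python
import hashlib

# Hypothetical stand-in for a video file's raw bytes.
original = b"example video payload"
modified = original + b"\x00"  # one appended byte; the file plays identically

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(modified).hexdigest()

# The exact hashes no longer match, so a hash blocklist misses the copy.
print(h1 == h2)  # False
```

This is why a single source video can spawn hundreds of database entries: every trivially edited re-upload needs its own fingerprint.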

Upload filters systematically produce false positives and accidentally suppress legal content (overblocking). Since algorithms cannot assess the context of a legal (re-)use of content, legal uses of terrorist material would be prevented (such as for educational, artistic, journalistic or research purposes, for raising awareness against terrorist activity, or for expressing polemic or controversial views in the course of public debate). For example, the Syrian Archive documents war crimes committed by terrorist organisations on YouTube, but has had a large part of its documentation deleted. The same could happen to media reports on terrorist attacks.
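The base-rate problem behind overblocking can be made concrete with a back-of-the-envelope calculation (the figures below are illustrative assumptions, not measured rates):

```python
# Assume 1 in 100,000 uploads is genuinely terrorist content,
# and a filter that classifies both classes with 99% accuracy.
uploads = 10_000_000
illegal = uploads // 100_000        # 100 genuinely illegal uploads
legal = uploads - illegal           # 9,999,900 legal uploads

true_positives = round(illegal * 0.99)  # illegal content caught: 99
false_positives = round(legal * 0.01)   # legal content wrongly blocked: 99,999

# Even at 99% accuracy, legal content dominates what gets suppressed.
print(false_positives > true_positives)  # True
```

Under these assumptions, for every piece of terrorist material caught, roughly a thousand legal uploads would be blocked, simply because legal uploads vastly outnumber illegal ones.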

Complaints procedures and subsequent judicial review are insufficient to tackle the risks of automatic filtering. In practice, affected users and companies will hardly ever invest the time and money required for a complicated, possibly international, procedure to challenge a removal order. Subsequent judicial review also takes too long to protect legal speech on current affairs: by the time content is eventually reinstated, your message will no longer be relevant.

In the context of the copyright reform we have seen hundreds of thousands of young people in the streets protesting against Internet censorship. Yet Commission and national governments seek to make compulsory upload filters a general rule, ignoring public protest as if nothing had happened.

What this means for you:

  • Error-prone upload filters will need to approve everything you want to post or upload to platforms like Instagram, YouTube, Snapchat, Facebook, Tumblr, WordPress.org, Wattpad, DeviantArt, SoundCloud, TikTok, Giphy etc. before it can appear online. This will mean delays and mistakes. Upload filters will consider you “guilty until proven innocent”, guaranteeing that perfectly legal contributions will be withheld – especially criticism and journalistic reporting on terrorist content.
  • Services you rely on will start geoblocking the EU if they can’t handle the liability.

What this means for all of us:

  • Freedom of expression would be limited as the internet turns from a place where contributions are welcome into one where they first need to pass automated scrutiny. Do we want opaque algorithms to decide what we can say and read?
  • Media coverage, education, arts, research, awareness raising and controversial debates on terrorism will suffer.
  • The censorship infrastructure established this way is sure to be expanded to other types of content. Once introduced for terrorist content, upload filters are likely to be made compulsory for any illegal material by the upcoming Digital Services Act.
  • Innovation killed: This law guarantees there will never be a European alternative to the big social networks and sharing sites. The few US giants able to invest the enormous sums required in upload filters will license them out to others and see their market dominance fortified.

* * *

Article 4: Ultra-fast cross-border removal orders

According to the proposed regulation service providers shall remove content within 1 hour of receiving a removal order. Excessive requirements endanger the freedom of expression, journalistic reporting, science, arts and culture, but also the digital economy.

Satisfying content removal orders within one hour, even at night or on weekends, is impossible for many operators of Internet services. The exception proposed by the European Parliament by way of compromise is too narrow and lacks the legal certainty needed to protect small businesses, non-profit organizations and individuals operating websites.

Furthermore, there is no consensus in the EU on what constitutes a “terrorist group”, for example regarding Catalonian separatists, Kurds or Palestinians; every Member State can take its own decision. Hungary has in the past labeled environmental activists “ecoterrorists”, and France has used anti-terror legislation against social protesters. If any EU country can request the suppression of content in another, this would affect content that is perfectly legal in the hosting country and would result in a race to the bottom on freedom of expression.

Take-down orders that do not require a judicial order could be used for political purposes by “law and order” politicians.

What this means for you:

  • Individuals or small organisations may decide to terminate services and platforms rather than stand ready 24/7 to respond within one hour to a removal order.

What this means for all of us:

  • Freedom of expression would be limited as “law and order” politicians have political content taken down by labelling it “terrorist”, even if hosted in a country where it is perfectly legal.
  • Media coverage, education, arts, research, awareness raising and controversial debates on terrorism will suffer.
  • Innovation killed: This law harms small European Internet businesses. The few US giants able to react within one hour will find their market dominance fortified.

* * *

What you can do

Do you fear that this law will cause massive damage to a free and open internet and that its benefits do not outweigh this damage?

If EU governments in Council continue to insist on ever further-reaching content removal provisions and mandatory automated filtering (upload filters) of online content, there is a risk that a majority of your representatives in the European Parliament will find it more important to be seen doing “something” about terrorist content online than to insist on respect for fundamental rights, free speech and the protection of the Internet community.

Our best bet right now is to increase pressure on the governments represented in Council. Contact the government’s permanent representations or the Ministries of the interior.

Help keep up public attention throughout the trilogues: Reach out to local media, comment on the progress of the negotiations, make videos, share my posts, keep your friends informed. Nothing is set in stone just yet – but we need to step it up to stop excessive content removal requirements and mandatory upload filters.

Timetable

This is the schedule for the trilogue negotiations:

  • First trilogue: 17/10/2019
  • Technical meeting: 4-6/11/2019
  • Technical meeting: 11/11/2019
  • Second trilogue: 20/11/2019
  • Technical meeting: 21-22/11/2019
  • Technical meeting: 5/12/2019
  • Technical meeting: 09/12/2019
  • Third trilogue: 12/12/2019
  • Internal Meeting: 22/01/2020
  • Technical meeting: 23/01/2020
  • Internal Meeting: 30/01/2020
  • Technical meeting: 03/02/2020
  • Shadows Meeting: 11/02/2020
  • Internal Meeting: 17/02/2020
  • Technical meeting: 18/02/2020
  • Shadows Meeting: 19/02/2020
  • Technical meeting: 03/03/2020
  • Shadows Meeting: 10/03/2020 – postponed due to the coronavirus
  • Fourth trilogue: 18/03/2020 – postponed due to the coronavirus, no new date yet
  • EP staff meeting: 19/05/2020
  • Shadows Meeting: 05/06/2020
  • Shadows Meeting: 11/06/2020
  • EP staff meeting: 18/09/2020
  • Shadows Meeting: 23/09/2020
  • Fourth trilogue: 24/09/2020
  • Technical meetings: 01/10/2020 and 13/10/2020
  • EP staff meeting: 12 October
  • Shadows Meeting: 27/10/2020
  • Fifth trilogue: 29/10/2020
  • Shadows Meeting (tbc): 09/11/2020 – postponed to 16/11/2020
  • Shadows Meeting: 02/12/2020 – postponed to 07/12/2020
  • Sixth trilogue: 10/12/2020

Negotiators

The Parliament is represented in the trilogues by the following members of the Civil Liberties Committee:

Rapporteur: JAKI Patryk (ECR)

Shadow Rapporteurs: ZARZALEJOS Javier (EPP), KALJURAND Marina (S&D), ERNST Cornelia (GUE/NGL), PAGAZAURTUNDÚA Maite (Renew), BREYER Patrick (Greens/EFA), GARRAUD Jean-Paul (ID)

The Committee on Internal Market and Consumer Protection (IMCO) is represented by the Rapporteur for Opinion KOLAJA Marcel (Greens/EFA), while the Committee on Culture and Education (CULT) is represented by the Rapporteur for Opinion KAMMEREVERT Petra (S&D).

Expanded list of contacts

Documents

Here is some useful material that can guide you through the process so far.

  1. Initial Commission proposal
  2. Commission’s Impact Assessment
  3. Council’s General Approach
  4. Parliament’s mandate
  5. Council’s comparison of the two texts (Parliament and Commission texts)
  6. EDRi document pool

Company responses to EU Commission consultation in mid 2018:

Government responses to EU Commission consultation in mid 2018 (knowing your government’s position is useful for reaching out):

Some parts were redacted at government request (reasoning here).
