
EU terror filter negotiations: Here’s where we stand [updated 10 December 2019]

“Trilogue” negotiations are underway on the EU scheme to “prevent the dissemination of terrorist content online”, which may, among other things, make upload filters mandatory.

In a series of closed-door meetings, the European Parliament and the Council (representing the member state governments) will hammer out a final text acceptable to both institutions. It’s the last chance to make changes before the Regulation gets adopted. Meetings started in October.

State of play

Let’s take a close look at the most contested differences between the Commission/Council and Parliament positions, and break down what they would mean for you:

Commission and Council want:
  • Terrorist content shall be removed not only by platforms that make user uploads available to the general public, but also by electronic communications services (Commission’s proposal) and cloud services.
Parliament wants:
  • Terrorist content shall be removed only by platforms that make user uploads available to the general public.
What this means for you: Your private file storage and communications, so far often end-to-end encrypted, could be subject to content screening and removal. So could private communications in closed user groups.

Commission and Council want:
  • Hosting service providers shall remove content within 1 hour of receiving a removal order.
Parliament wants:
  • Hosting service providers shall remove content within 1 hour of receiving a removal order, except in cases of “de facto impossibility not attributable to the hosting service provider, including for technical or operational reasons”; a 12-hour advance warning precedes the first order.
What this means for you: Individuals or small organisations may decide to terminate their services and platforms rather than stand ready 24/7 to respond within one hour to a removal order that will never come, because more than 99% of platforms are never targeted with terrorist propaganda.

Commission and Council want:
  • Removal of terrorist content can be ordered even if it is disseminated for educational, artistic, journalistic or research purposes, or for raising awareness against terrorist activity.
Parliament wants:
  • Content disseminated for educational, artistic, journalistic or research purposes, or for raising awareness against terrorist activity, shall be protected and preserved.
What this means for you: Media reports on terrorist attacks could disappear if the Commission’s proposal goes through. Video archives that document war crimes, for example in Syria, would partially disappear, resulting in impunity for perpetrators. Spain has prosecuted artists for satirical performances in the past, so recordings of such performances could be deleted in the future.

Commission and Council want:
  • Member states can freely decide which authority may order content take-downs.
Parliament wants:
  • Only a judicial or functionally independent administrative authority may order the removal of content.
What this means for you: Where no requirement of independence is in place, ministers could directly order the removal of content for political reasons.

Commission and Council want:
  • Member state authorities can have content blocked by hosting service providers anywhere in the EU (Commission’s proposal).
Parliament wants:
  • Member state authorities can have content removed by hosting service providers located in their own country, and can request that the Member State of establishment issue an additional EU-wide removal order.
What this means for you: If the Commission’s approach succeeds, content you wish to access could be removed due to orders from Member States with populist, authoritarian governments. For example, Hungary and Poland have been found in the past to disrespect the rule of law, and France has been reported to request content removals excessively.

Commission and Council want:
  • EU authorities can order content removals with global effect (e.g. order a US provider to remove content with effect for US citizens).
Parliament wants:
  • Service providers can satisfy removal orders by disabling access to content “in all Member States” (i.e. in the EU), without having to delete the content altogether.
What this means for you: Removing content globally could result in other states such as Turkey, Russia, Saudi Arabia or China requiring the removal of content which, in Europe, is perfectly legal and legitimate.

Commission and Council want:
  • Hosting service providers shall decide with priority whether or not to take down content upon receiving a “referral” for consideration from a national police authority.
Parliament wants:
  • National authorities shall issue a removal order in due process or refrain from action (no referrals).
What this means for you: Referrals would result in the removal of content that is perfectly legal and merely violates the arbitrary rules set by a private hosting service provider.

Commission and Council want:
  • Hosting service providers shall use automated upload filters to identify and block alleged terrorist content.
Parliament wants:
  • Hosting service providers shall not be required to use automated upload filters or to generally monitor their platforms.
What this means for you: Automated upload filters are censorship machines that have been proven to suppress completely legal content (e.g. documentation of human rights violations in civil wars). Even a filter with an extremely high accuracy rate close to 99% would delete more legal content than illegal content, because terrorist material is extremely rare compared to the overall number of uploads.
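The base-rate arithmetic behind this last claim can be checked with a small calculation. The figures below (99% accuracy and one terrorist item per 100,000 uploads) are hypothetical illustrations, not numbers from the proposal:

```python
# Illustrative base-rate calculation: why a "99% accurate" filter
# still blocks far more legal content than terrorist content.
# All numbers are assumed for illustration only.

uploads = 1_000_000
terrorist_rate = 1 / 100_000   # assumed: 1 terrorist item per 100,000 uploads
accuracy = 0.99                # assumed: 99% detection, 1% false-positive rate

terrorist_items = uploads * terrorist_rate       # ~10 items
legal_items = uploads - terrorist_items          # ~999,990 items

true_positives = terrorist_items * accuracy      # terrorist items blocked
false_positives = legal_items * (1 - accuracy)   # legal items wrongly blocked

print(f"Blocked terrorist items: {true_positives:.0f}")
print(f"Blocked legal items:     {false_positives:.0f}")
```

Under these assumptions the filter wrongly blocks roughly a thousand legal uploads for every terrorist item it catches, which is the asymmetry the paragraph above describes.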

Removal orders and their effects

My primary concern with the text voted by Parliament remains the so-called “one-hour” rule. Many operators do not have the resources to block content within one hour of receiving an order (especially at night-time), and we do not want Internet services to cease operations as a result. Beyond that, the trilogue will be about defending the Parliament’s amendments in order to protect free speech online. The Commission has so far not demonstrated willingness to accept the changes requested by the Parliament.

Despite the improvements made by the European Parliament, I still fundamentally doubt whether this regulation will effectively prevent terrorism. Even if terrorist content did cause terrorist acts (which is disputable), content “removals” are too easy to circumvent. The traditional platforms can be expected to apply simple geolocation techniques when receiving blocking orders: the regulation will not require them to actually delete “terrorist content”, only to block it for EU users. Such geo-blocking is technically easy to circumvent, for example by using foreign proxy servers. Moreover, terrorist groups are increasingly moving away from the traditional platforms and using encrypted private communications channels or alternative platforms that will not honour European removal requests. The spreading of terrorist propaganda will hardly be contained by this regulation.
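To see why EU-only blocking is so easy to circumvent, consider a minimal sketch of how a provider might honour a removal order via geolocation. All names and the country set here are hypothetical; real providers derive the client’s country from its IP address using a geolocation database:

```python
# Hypothetical sketch of EU-only geo-blocking of content under a removal
# order. The provider never deletes the file; it only refuses to serve it
# to requests that appear to come from inside the EU.

EU_MEMBER_STATES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE",
}

def is_access_blocked(content_id: str, client_country: str,
                      eu_removal_orders: set) -> bool:
    """Return True if this request must be refused under an EU removal order.

    client_country is the country inferred from the client's IP address.
    A request routed through a non-EU proxy or VPN appears with a non-EU
    country code and is served normally.
    """
    return content_id in eu_removal_orders and client_country in EU_MEMBER_STATES
```

An EU user routing traffic through, say, a US proxy would appear with a client country of "US" and bypass the block entirely, which is exactly the circumvention described above.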

Timetable

This is the schedule for the trilogue negotiations:

  • First trilogue: 17/10/2019
  • Technical meeting: 4-6/11/2019
  • Technical meeting: 11/11/2019
  • Second trilogue: 20/11/2019
  • Technical meeting: 21-22/11/2019
  • Technical meeting: 5/12/2019
  • Technical meeting: 9/12/2019
  • Third trilogue: 12/12/2019

Negotiations will continue in 2020.

Negotiators

The Parliament will be represented in the trilogues by the following members of the Civil Liberties Committee:

Rapporteur: JAKI Patryk (ECR)

Shadow Rapporteurs: ZARZALEJOS Javier (EPP), KALJURAND Marina (S&D), ERNST Cornelia (GUE/NGL), PAGAZAURTUNDÚA Maite (Renew), BREYER Patrick (Greens/EFA)

The Committee on the Internal Market and Consumer Protection (IMCO) will be represented by the Rapporteur for Opinion KOLAJA Marcel (Greens/EFA), while the Committee on Culture and Education (CULT) will be represented by the Rapporteur for Opinion WARD Julie (S&D).

Documents

Here is some useful material that can guide you through the process so far:

  1. Initial Commission proposal
  2. Commission’s Impact Assessment
  3. Council’s General Approach
  4. Parliament’s mandate
  5. Council’s comparison of the Parliament and Commission texts
  6. EDRi document pool

3 comments on “EU terror filter negotiations: Here’s where we stand [updated 10 December 2019]”

  1. EU terror filter negotiations to begin: Here’s where we stand https://www.patrick-breyer.de/?p=589500&lang=en

  2. @patrickbreyer

    One of the first questions is: what *is* “terrorist content”? As pointed out, this is guaranteed to be abused by the likes of (as they already do), (if staying in), etc.

    The 2nd question is: what does it try to achieve? How likely is it to succeed and at what cost?

  3. @patrickbreyer

    Lastly: what allowances are made for the size of a) a service provider, and b) the potential audience? One thing is someone posting objectionable material to Facebook or Medium. Quite another posting on a gopher server.
