
EU terror filter negotiations: Here’s where we stand [updated 11 March 2020]

“Trilogue” negotiations on the EU scheme to “prevent the dissemination of terrorist content online” are underway which may make, inter alia, upload filters mandatory.

In a series of closed-door meetings, the European Parliament and the Council (representing the member state governments) will hammer out a final text acceptable to both institutions. It’s the last chance to make changes before the Regulation gets adopted. Meetings started in October.


State of play

Let’s take a close look at the most contested points, comparing the position of the Commission and Council with that of the Parliament, and break down what they would mean for you:

  • Commission and Council: Terrorist content shall be removed not only by platforms that make user uploads available to the general public, but also by electronic communications services (Commission’s proposal) and cloud services.
  • Parliament: Terrorist content shall be removed only by platforms that make user uploads available to the general public.
  • What this means for you: Your private file storage and communications, so far often end-to-end encrypted, could be subject to content screening and removal. So could private communications in closed user groups.

  • Commission and Council: Hosting service providers shall remove content within 1 hour of receiving a removal order.
  • Parliament: Hosting service providers shall remove content within 1 hour of receiving a removal order, except in cases of “de facto impossibility not attributable to the hosting service provider, including for technical or operational reasons”; a 12-hour advance warning precedes the first order.
  • What this means for you: Individuals or small organisations may decide to shut down their services and platforms rather than stand ready 24/7 to respond within one hour to a removal order that will never come, because more than 99% of platforms are never targeted with terrorist propaganda.

  • Commission and Council: Removal of terrorist content can be ordered even if it is disseminated for educational, artistic, journalistic or research purposes, or for awareness raising against terrorist activity.
  • Parliament: Protect and preserve content disseminated for educational, artistic, journalistic or research purposes, or for awareness raising against terrorist activity.
  • What this means for you: Media reports on terrorist attacks could disappear if the Commission’s proposal goes through. Video archives that document war crimes, for example in Syria, would partially disappear, resulting in impunity for the perpetrators. Spain has prosecuted artists for satirical performances in the past, so recordings of such performances could be deleted in the future.

  • Commission and Council: Member states can freely decide which authority may order content take-downs.
  • Parliament: Only a judicial or functionally independent administrative authority may order the removal of content.
  • What this means for you: Where no requirement of independence is in place, ministers could directly order the removal of content for political reasons.

  • Commission and Council: Member state authorities can have content blocked by hosting service providers anywhere in the EU (Commission’s proposal).
  • Parliament: Member state authorities can have content removed by hosting service providers located in their own country and can request the Member State of establishment to issue an additional EU-wide removal order.
  • What this means for you: If the Commission’s approach succeeds, content you wish to access could be removed due to orders from Member States with populist, authoritarian governments. For example, Hungary and Poland have in the past been found to disrespect the rule of law. France has also been reported to request content removals excessively.

  • Commission and Council: EU authorities can order content removals with global effect (e.g. order a US provider to remove content with effect for US citizens).
  • Parliament: Service providers can satisfy removal orders by disabling access to content “in all Member States” (i.e. in the EU), without having to delete the content altogether.
  • What this means for you: Removing content globally could result in other states such as Turkey, Russia, Saudi Arabia or China requiring the removal of content which, in Europe, is perfectly legal and legitimate.

  • Commission and Council: Hosting service providers shall decide with priority whether or not to take down content upon receiving a “referral” for consideration from a national police authority.
  • Parliament: National authorities shall issue a removal order in due process or refrain from action (no referrals).
  • What this means for you: Referrals would result in the removal of content that is perfectly legal and merely violates the arbitrary rules set by a private hosting service provider.

  • Commission and Council: Hosting service providers shall use automated upload filters to identify and block alleged terrorist content.
  • Parliament: Hosting service providers shall not be required to use automated upload filters or to generally monitor their platforms.
  • What this means for you: Automated upload filters are censorship machines that have been proven to suppress completely legal content (e.g. documentation of human rights violations in civil wars). Even a filter with an extremely high accuracy rate close to 99% would delete more legal content than illegal content, because terrorist material is extremely rare compared to the overall number of uploads (see the back-of-the-envelope calculation after this list).
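To make the base-rate argument in the last point concrete, here is a minimal back-of-the-envelope sketch in Python. All numbers (uploads per day, share of terrorist material, filter error rates) are illustrative assumptions rather than real platform figures; the point is only how quickly false positives outweigh true positives when the targeted material is rare.

```python
# Back-of-the-envelope estimate of how a highly "accurate" upload filter
# still blocks mostly legal content when terrorist material is rare.
# All numbers below are illustrative assumptions, not real platform data.

daily_uploads = 10_000_000     # assumed uploads per day on a large platform
terrorist_share = 0.0001       # assumed 0.01% of uploads are actually terrorist content
true_positive_rate = 0.99      # filter catches 99% of terrorist uploads
false_positive_rate = 0.01     # filter wrongly flags 1% of legal uploads

terrorist_uploads = daily_uploads * terrorist_share
legal_uploads = daily_uploads - terrorist_uploads

correctly_flagged = terrorist_uploads * true_positive_rate  # illegal content blocked
wrongly_flagged = legal_uploads * false_positive_rate       # legal content blocked

print(f"Illegal uploads blocked: {correctly_flagged:,.0f}")
print(f"Legal uploads blocked:   {wrongly_flagged:,.0f}")
print(f"Share of blocked content that is legal: "
      f"{wrongly_flagged / (wrongly_flagged + correctly_flagged):.1%}")
```

With these assumed numbers, roughly 99% of everything the filter blocks would be perfectly legal content, even though the filter itself is 99% accurate on each individual upload.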

Removal orders and their effects

My primary concern with the text voted by Parliament remains the so-called “one-hour rule”. Many operators do not have the resources to block content within one hour of receiving an order (even at night-time), and we don’t want Internet services to cease operations as a result. Beyond that, the trilogue will be about defending the Parliament’s amendments in order to protect free speech online. The Commission has so far not demonstrated any willingness to accept the changes requested by the Parliament.

Despite the improvements made by the European Parliament, I still fundamentally doubt whether this regulation will effectively prevent terrorism. Even if terrorist content did cause terrorist acts (which is disputable), content “removals” are too easy to circumvent. The traditional platforms can be expected to apply simple geolocation techniques when complying with blocking orders: the regulation will not require them to actually delete “terrorist content”, only to block it for EU users. It will be technically easy to circumvent such geo-blocking, for example by using foreign proxy servers (see the sketch below). Also, terrorist groups are increasingly moving away from the traditional platforms towards encrypted private communication channels or alternative platforms that will not honour European removal requests. The spread of terrorist propaganda will hardly be contained by this regulation.
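As an illustration of why EU-only blocking is so easy to circumvent, here is a minimal sketch of IP-based geo-blocking. The IP addresses, the country lookup table and the content identifier are made-up placeholders; real platforms use commercial GeoIP databases, but the weakness is the same: the block only sees where a request appears to come from.

```python
# Minimal sketch of how IP-based geo-blocking works and why a proxy defeats it.
# The IP-to-country table, blocklist and content ID are made-up examples.

EU_COUNTRIES = {"DE", "FR", "ES", "IT", "PL", "HU"}  # (abridged) member states
BLOCKED_IN_EU = {"video-123"}                        # content under an EU-wide blocking order

# Toy lookup standing in for a real GeoIP database.
IP_TO_COUNTRY = {
    "192.0.2.10": "DE",    # user connecting directly from Germany
    "198.51.100.7": "US",  # US-based proxy/VPN exit the same user could route through
}

def is_served(content_id: str, client_ip: str) -> bool:
    """Serve the content unless the request appears to come from the EU
    and the content is subject to an EU-wide blocking order."""
    country = IP_TO_COUNTRY.get(client_ip, "unknown")
    if content_id in BLOCKED_IN_EU and country in EU_COUNTRIES:
        return False
    return True

print(is_served("video-123", "192.0.2.10"))    # False: direct EU request is blocked
print(is_served("video-123", "198.51.100.7"))  # True: the same user via a foreign proxy gets through
```

Routing a request through a non-EU proxy or VPN makes the same blocked content reappear, which is why geo-blocking does little against anyone determined to find it.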

More information

For information on what you can do, the timetable, the negotiators and further reading, please refer to my more recent blog post Lifting the veil on the secretive EU terror filter negotiations: Here’s where we stand.

Comments

4 Comments
  • Patrick Breyer

    EU terror filter negotiations to begin: Here’s where we stand https://www.patrick-breyer.de/?p=589500&lang=en

  • Anonymous

    @patrickbreyer

    One of the first questions is: what *is* “terrorist content”? As pointed out, this is guaranteed to be abused by the likes of #Spain (as they already do), #UK (if staying in), #France, etc.

    The 2nd question is: what does it try to achieve? How likely is it to succeed and at what cost?

  • Anonymous

    @patrickbreyer

    Lastly: what allowances are made for the size of a) a service provider, and b) the potential audience? One thing is someone posting objectionable material to Facebook or Medium. Quite another posting on a gopher server.
