
Digital Services Act: Legal Affairs Committee attacks user privacy and free speech online

Press release

Today, the European Parliament’s Committee on Legal Affairs (JURI) adopted its recommendations on the Digital Services Act, as proposed by French opinion rapporteur Geoffroy Didier (EPP). 

For the benefit of citizens, the Committee calls for the following:

  • a right to use and pay for digital services anonymously, and a ban on behavioral tracking and advertising (AM411);
  • no ex-ante control measures based on upload filters as a result of voluntary own-initiative investigations by online platforms (Art. 6);
  • generally no obligation for companies to use the controversial upload filters (Art. 7), as “such tools have difficulties of effectively understanding the subtlety of context and meaning in human communication, which is necessary to determine whether assessed content violates the law or terms of service”;
  • nothing in the DSA preventing the offering of end-to-end encrypted services (Art. 7);
  • a right for public authorities to order the reinstatement of legal content that was removed by platforms (Art. 8a);
  • a ban on dark patterns (Art. 13a).

However, MEP Patrick Breyer (Pirate Party), shadow rapporteur of the Greens/EFA group, warns about other parts of the opinion:

“These proposals threaten the confidentiality of private correspondence, encourage error-prone ex-ante upload filtering, introduce excessively short content take-down deadlines, enforce excessive national laws (e.g. in Poland or Hungary) throughout the EU, turn ‘trusted flaggers’ into ‘trusted censors’ and much more. I don’t think all my colleagues in the Legal Affairs Committee are aware of the implications. These proposals reflect massive lobbying by the content and rights holder industry.”

Attack on confidentiality of instant messaging

Specifically, the proposed Article 1 would add private communications/messaging services to the scope of the DSA. This threatens the privacy of correspondence and secure encryption. Obliging messaging providers to review and remove the content of private messages (Art. 8, 14) would prohibit the secure end-to-end encryption that citizens, businesses and governments rely on. The Committee’s proposal to exempt the personal use of messaging services does not work, because it is impossible for a service to know the purpose of an account or message without reading the correspondence and breaking encryption.


Risk of overblocking

Furthermore, the proposed Article 5 would fundamentally change the liability regime, burden businesses, favour overblocking of content and threaten fundamental rights of users:

  • Par. 1(b) would mandate error-prone upload filters by requiring providers to “permanently” remove certain pieces of content. Algorithms cannot reliably identify illegal content, and their use currently routinely results in the suppression of legal content, including media content. Reappearing content can be legal in a new context, for a new purpose or when posted by another author.
  • Par. 1a would impose inflexible and excessively short take-down deadlines, some even shorter than those for terrorist content. Without time for proper scrutiny, providers will either have to underblock illegal content (“we didn’t have time to establish this is illegal”) or overblock legal content (“we’ll take it down just to be on the safe side”). This is a major threat to the fundamental right to free speech.

Race to the bottom regarding free speech

The proposed Article 8 would allow one Member State with extreme national legislation to order the removal of content published legally in another Member State. This would result in a race to the bottom regarding freedom of speech, with the most repressive legislation of all prevailing throughout the Union. Likewise, enforcing EU law globally by removing content published legally in non-EU countries would invite retaliation: those countries (e.g. Russia, China, Turkey) could demand that EU providers remove perfectly legal and legitimate content on the basis of their own excessive national rules.


Error-prone upload filtering

The proposed Article 14 would introduce a strict 72-hour time limit for deciding on reported content. Without time for proper scrutiny, providers will either have to underblock illegal content (“we didn’t have time to establish this is illegal”) or overblock legal content (“we’ll take it down just to be on the safe side”). The article would also allow providers to use error-prone re-upload filters to block the uploading of previously deleted content (“stay-down”). Filtering algorithms cannot reliably tell legal from illegal content; their use currently routinely results in the suppression of legal content, including media content, and reappearing content can be legal in a new context, for a new purpose or when posted by another author.

“Trusted censors”

Art. 14a(2a) would essentially allow private “trusted flaggers” to have content removed or blocked directly, without the provider even needing to assess its legality. This would turn “trusted flaggers” into “trusted censors” and threaten the accessibility of legal content.

Art. 20(3c) would indirectly abolish anonymous accounts and mandate identification of all users in order to prevent suspended users from using or registering another account. Multiple online identities are essential to activists, whistleblowers, human rights defenders, women, children and many others who cannot disclose their real identity.

 

Outlook

The Legal Affairs Committee’s recommendations will be discussed in the lead Internal Market (IMCO) Committee, which plans to finalise the text before the end of the year. Next week, the IMCO negotiators will meet for a first round of debate on politically controversial issues.

 
