Upcoming vote on Digital Services Act in JURI committee: Pirate MEP Patrick Breyer fears massive threats to fundamental rights
On Thursday, the European Parliament’s Committee on Legal Affairs (JURI) will vote on the compromise proposals drafted by French opinion rapporteur Geoffroy Didier (EPP) on the Digital Services Act. MEP Patrick Breyer (Pirate Party), shadow rapporteur of the Greens/EFA group, considers the proposals dangerous in many respects. Together with his Renew and S&D colleagues, he is putting alternative compromise amendments to the vote:
“The Rapporteur’s proposals are radical. They would, among other things, threaten the secrecy of private correspondence and end-to-end encryption, mandate and encourage error-prone ex-ante upload filtering, introduce excessively short content take-down deadlines, enforce excessive national laws (e.g. in Poland or Hungary) throughout the EU and even globally, turn ‘trusted flaggers’ into ‘trusted censors’ and much more. I expect the vote to be very tight on several of these issues.
Private communications/messaging services
For example, compromise amendment (CA) 1 on Article 1 would add private communications/messaging services to the scope of the DSA. This threatens the privacy of correspondence and secure encryption. Obliging messaging providers to review and remove the content of private messages (Art. 8, 14) would prohibit the secure end-to-end encryption that citizens, businesses and governments rely on. The Rapporteur’s proposal to exempt only the personal use of messaging services does not work, because it is impossible for the service to know the purpose of an account or message without reading the correspondence and breaking encryption. Our alternative compromise amendment 1A proposes to include instant messaging services only where they offer public channels to publish content (e.g. public user groups that anybody can subscribe to). Publishing posts is different from interpersonal electronic communications.
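As a purely illustrative sketch of this technical point (using the PyNaCl library; the names and the message text are hypothetical): with end-to-end encryption, the service only ever relays ciphertext, so it has no way of telling a personal message from a commercial one without breaking the encryption.

```python
# Illustrative sketch only: end-to-end encryption with PyNaCl.
# The point: the messaging service relays opaque bytes and cannot classify the content.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly for Bob; only the two endpoints hold the keys.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"Is this message personal or business? The server cannot tell.")

# This is all the provider ever sees: ciphertext.
print(ciphertext.hex()[:40], "...")

# Only Bob can decrypt it.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext).decode())
```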
Overblocking
Furthermore, the Rapporteur’s compromise amendment (CA) 4 on Article 5 would fundamentally change the liability regime, burden businesses, favour overblocking of content and threaten fundamental rights of users:
- Par. 1(b) would mandate error-prone upload filters by requiring providers to “permanently” remove certain pieces of content. Algorithms cannot reliably identify illegal content, and their use routinely results in the suppression of legal content, including media content. Reappearing content can be legal in a new context, for a new purpose or when posted by another author (see the illustrative sketch below).
- Par. 1a would impose inflexible and excessively short take-down deadlines, some even shorter than for terrorist content. Without the time for proper scrutiny, providers will either have to underblock illegal content (“we didn’t have time to establish this is illegal”) or overblock legal content (“we’ll take it down just to be on the safe side”). This is a major threat to the fundamental right to free speech.
- Par. 2 would introduce extensive conditions into the liability regime which are extremely difficult to establish and verify. According to JURI opinion PE652.326v02, par. 9, the legal regime for digital providers’ liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers. Case-law to this effect has resulted in legal uncertainty, including due to conflicting decisions by different levels of courts as to the ‘active’ or ‘passive’ role of the same type of service (see also Impact Assessment, p. 105). The safe harbour provisions should apply where providers have neither knowledge nor control; no additional criteria should be introduced in recitals or case-law.
Our alternative compromise amendment 4A maintains the text of the Commission’s proposal and excludes references to the unclear concept of active/passive providers.
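To illustrate why “stay-down” obligations invite overblocking, here is a minimal, hypothetical sketch of the typical building block of a re-upload filter, a fingerprint match: it compares bytes, not context, so a legal re-use of the same file (e.g. by a journalist reporting on it) is blocked exactly like the original upload.

```python
# Illustrative sketch: a naive hash-based re-upload ("stay-down") filter.
# It matches bytes, not context, so a legal re-use (reporting, criticism, satire,
# a different author or purpose) is blocked exactly like the original infringement.
import hashlib

blocklist: set[str] = set()

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def take_down(content: bytes) -> None:
    """Content found illegal once is added to the stay-down list."""
    blocklist.add(fingerprint(content))

def upload_allowed(content: bytes) -> bool:
    """Ex-ante check on every upload: pure byte matching, no legal assessment."""
    return fingerprint(content) not in blocklist

video = b"<some video found illegal in one specific context>"
take_down(video)

# A journalist re-uploads the identical file to report on it: legal, but filtered out.
print(upload_allowed(video))  # False - the filter cannot see the new, legal context
```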
Race to the bottom regarding free speech
The Rapporteur’s compromise amendment (CA) 7 on Article 8 would allow one Member State with extreme national legislation to order the removal of content published legally in another Member State. This would result in a race to the bottom regarding freedom of speech, with the most repressive legislation of all prevailing throughout the Union. Likewise, enforcing EU law globally by removing content published legally in non-EU countries would invite retaliation, with those non-EU countries (e.g. Russia, China, Turkey) asking EU providers to remove perfectly legal and legitimate content on the basis of their own excessive national rules. Our alternative compromise amendment 7A proposes to align the territorial effect of cross-border removal orders with the territorial effect of the relevant prohibition. Cross-border removal orders based on national law, for example, should only have national effect (by means of geo-blocking the content in that country).
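As a rough, hypothetical sketch of what such a territorially limited order could look like in practice (the country codes and the lookup are placeholders; real systems typically rely on GeoIP databases): the content stays online everywhere and is withheld only in the Member State whose national law prohibits it.

```python
# Illustrative sketch: territorially limited removal via geo-blocking (hypothetical).
# An order based purely on national law hides the content only in that country,
# instead of deleting it for the whole EU or the whole world.
from dataclasses import dataclass

@dataclass
class RemovalOrder:
    content_id: str
    blocked_countries: frozenset[str]  # e.g. only the Member State issuing the order

orders = {
    "post-42": RemovalOrder("post-42", frozenset({"PL"})),  # national-law order from PL
}

def is_visible(content_id: str, viewer_country: str) -> bool:
    """Serve the content unless the viewer is in a country covered by an order."""
    order = orders.get(content_id)
    return order is None or viewer_country not in order.blocked_countries

print(is_visible("post-42", "PL"))  # False - blocked where the national prohibition applies
print(is_visible("post-42", "DE"))  # True  - still available elsewhere in the Union
```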
Error-prone upload filtering
The Rapporteur’s compromise amendment (CA) 14 on Article 14 would:
- apply notice and action to private communications/messaging services, threatening the privacy of correspondence and secure encryption (see above)
- apply notice and action to violations of terms and conditions which address legal content. It should be up to the provider how to deal with ToS violations.
- introduce a strict 72-hour time limit. Without the time for proper scrutiny, providers will either have to underblock illegal content (“we didn’t have time to establish this is illegal”) or overblock legal content (“we’ll take it down just to be on the safe side”).
- allow providers to use error-prone re-upload filters for ex-ante control. Filtering algorithms cannot reliably tell legal from illegal content and routinely result in the suppression of legal content, including media content. Reappearing content can be legal in a new context, for a new purpose or when posted by another author. Simply mentioning human review is much too unspecific to prevent overblocking.
Our alternative compromise amendment 14A fixes these issues. In addition, it removes paragraph 2, which contradicts the case-law on “actual knowledge”. A complete notice triggers awareness of content once it is read, but the provider will often not know whether the reported content is illegal or not. According to CJEU case-law, precise and substantiated notices only represent a factor of which the court must take account when determining whether the provider was actually aware of facts or circumstances on the basis of which a diligent economic operator should have identified the illegality (Judgment of 12 July 2011, L’Oréal, C-324/09, ECLI:EU:C:2011:474, par. 122).”