“We want the Internet to be an open space in which all users communicate with each other freely and on an equal level. Instead, what we are seeing is monopolisation by a few large corporations whose business model is based on spying on and manipulating users for advertising purposes, and which results in censoring and de-platforming them at will. Similar to the General Data Protection Regulation (GDPR), the Digital Services Act could be a turning point in the digital era with a global impact.”
Breyer calls for effective protection of fundamental rights in the digital era
In line with the responsibilities of the LIBE Committee, the measures proposed by Breyer focus on better protection of fundamental rights. Most of the proposals implement reports and opinions that have already had a majority backing in committee or in plenary. Key proposals are:
Right to privacy
- Anonymous use of digital services: The Digital Services Act should provide for anonymous use of and payment for internet services wherever technically possible. This will prevent data scandals, identity theft, stalking and other forms of misuse of personal data, as most recently seen in the Facebook data breach that affected 500 million people.
- Digital privacy: The collection and exploitation of a user's online activities (“tracking”) is to be limited to what is technically strictly necessary. A person's online activities allow for deep insights into their (past and future) behaviour and thus enable targeted manipulation.
- Contextual instead of surveillance-driven advertising: Personalised and behavioural advertising is to be phased out to protect users' privacy as well as the professional media.
- Secure encryption: Public authorities should not be allowed to restrict end-to-end encryption, as it is essential for online security.
Right to freedom of expression and information
- Independent judiciary to decide: To protect freedom of expression and media freedom, decisions on the legality of content should rest with the independent judiciary, not with administrative authorities subject to political orders, nor with corporations.
- No foreign removal orders: Content published legally in one country should not have to be deleted because it violates the laws of another EU country. This protects against “illiberal democracies” like Hungary and Poland taking down content published elsewhere.
- No internet blocking: Illegal content should be deleted where it is hosted. Internet access providers should no longer be obliged to block access to content.
- No upload filters: Error-prone algorithms should not be allowed to prevent publications. Such upload filters cannot reliably identify illegal content and keep resulting in the suppression of legal content.
- No mandatory account suspension: Providers should not be forced to suspend users for posting allegedly illegal content, as such an obligation would circumvent the sanctions laid down in law and the requirement of a court decision.
- Preventive measures in line with fundamental rights: In order to curb the spread of illegal content, supervisory authorities should be given the power to impose specific measures, subject to judicial redress, instead of relying on opaque and unaccountable self-regulation and co-regulation, as the Commission proposes.
Users’ right to decide
- User control over content recommendations: Users should be able to switch off the platform algorithms for proposing content (e.g. in the timeline) and to choose third-party recommender systems instead. This will curb the profit-driven spread of problematic content such as misinformation, conspiracy theories or hate speech.
- Cross-platform interaction: Currently, users are locked in to the dominant platforms because they need to stay in touch with other people who use those platforms (e.g. employers, educational institutions). In order to overcome this lock-in and to open up real competition (also in terms of data protection and data security), users of large platforms should be able to switch to alternative platforms and still interact with their contacts across platforms.
After the General Data Protection Regulation, the Digital Services Act (DSA) is considered the next major project for regulating digitalisation at EU level: the Act is to replace the e-Commerce Directive, in place since 2000, and thus establish fundamentally new rules for digital platforms. Following three resolutions of the European Parliament, the EU Commission presented its legislative proposal in December 2020.