European Commission’s Proposed Digital Services Act Got Several Things Right, But Improvements Are Necessary to Put Users in Control
Deeplinks 2020-12-15
Summary:
The European Commission is set to release today a draft of the Digital Services Act, the most significant reform of European Internet regulation in two decades. The proposal, which will modernize the backbone of the EU's Internet legislation—the e-Commerce Directive—sets out new responsibilities and rules for how Facebook, Amazon, and other companies that host content handle and make decisions about billions of users' posts, comments, messages, photos, and videos.

This is a great opportunity for the EU to reinvigorate principles like transparency, openness, and informational self-determination. Many users feel locked into a few powerful platforms and at the mercy of algorithmic decision systems they don't understand. It's time to change this.

We obtained a copy of the 85-page draft and, while we are still reviewing all the sections, we zeroed in on several provisions pertaining to liability for illegal content, content moderation, and interoperability, three of the most important issues that affect users' fundamental rights to free speech and expression on the Internet. What we found is a mixed bag with some promising proposals. The Commission got it right in setting limits on content removal and allowing users to challenge censorship decisions. We are also glad to see that general monitoring of users is not a policy option and that liability for speech rests with the speaker, not with the platforms that host what users post or share online. But the proposal doesn't address user control over data or establish requirements that the mega-platforms work towards interoperability. There is thus room for improvement, and we will work with the EU Parliament and the Council, which must agree on a text for it to become law, to make sure that the EU fixes what is broken and puts users back in control.

Content liability and monitoring

The new EU Internet bill preserves the key pillars of the current Internet rules embodied in the EU's e-Commerce Directive.
The Commission followed our recommendation to refrain from forcing platforms to monitor and censor what users say or upload online. It seems to have learned a lesson from recent disastrous Internet bills like Article 17, which makes platforms police users' speech. The draft allows intermediaries to continue to benefit from comprehensive liability exemptions, so, as a principle, they will not be held liable for user content. Thanks to a European-style "Good Samaritan" clause, this includes situations where platforms voluntarily act against illegal content. However, the devil lies in the details, and we need to make sure that platforms are not nudged into employing "voluntary" upload filters.

New due diligence obligations

The DSA sets out new due diligence obligations for flagging illegal content for all providers of intermediary services, and establishes special obligations geared to the type and size of online platforms, including the very large ones. We said from the start that a one-size-fits-all approach to Internet regulation for social media networks does not work for an Internet that is monopolized by a few powerful platforms. We can therefore only support new due diligence obligations that are matched to the type and size of the platform. The Commission rightly recognizes that the silencing of speech is a systemic risk on very large platforms and that transparency about content moderation can improve the status quo. However, we will carefully analyze other, potentially problematic provisions, such as requiring platforms to report certain types of illegal content to law enforcement authorities. Rules on supervision, investigation, and enforcement deserve in-depth scrutiny from the European Parliament and the Council.
Takedown notices and complaint handling

Here, the Commission has taken a welcome first step towards more procedural justice. Significantly, the Commission acknowledges that platforms frequently make mistakes when moderating content. Recognizing that users deserve more transparency about platforms' decisions to remove content or close accounts, the draft regulation calls for online platforms to provide a user-friendly complaint-handling system and to restore content or accounts that were wrongly removed. However, we are concerned that platforms, rather than courts, are increasingly becoming the arbiters of what speech can or cannot be posted online. A harmonized notification system for all sorts of content will also increase the risk that a platform becomes aware of the illegality of content and is thus held liable for it.

Interoperability measures are missing

The Commission missed the mark on giving users more freedom and control over their Internet experience, as rules on interoperability are absent from the proposal. That may be addressed in the Digital Markets Act draft proposal. If the EU wants to break the power of platforms that monopolize the
Link:
https://www.eff.org/deeplinks/2020/12/european-commissions-digital-services-act-proposal