Systemic Risk Reporting: A System in Crisis?
Deeplinks 2025-01-16
Summary:
The first batch of reports assessing the so-called “systemic risks” posed by the largest online platforms is in. These reports are a result of the Digital Services Act (DSA), Europe’s new law regulating platforms like Google, Meta, Amazon, or X, and have been eagerly awaited by civil society groups across the globe. In their reports, companies are supposed to assess whether their services contribute to a wide range of barely defined risks. These go beyond the dissemination of illegal content and include vaguely defined categories such as negative effects on the integrity of elections, impediments to the exercise of fundamental rights, or the undermining of civic discourse. We have previously warned that the subjectivity of these categories invites a politicization of the DSA.
In view of a new DSA investigation into TikTok’s potential role in Romania’s presidential election, we take a look at the reports and the framework that has produced them to understand their value and limitations.
A Short DSA Explainer
The DSA covers a lot of different services. It regulates online marketplaces like Amazon or Shein, social networks like Instagram and TikTok, search engines like Google and Bing, and even app stores like those run by Apple and Google. Different obligations apply to different services, depending on their type and size. Generally, the lower the degree of control a service provider has over content shared via its product, the fewer obligations it needs to comply with.
For example, hosting services like cloud computing providers must offer points of contact for government authorities and users, and must publish basic transparency reports. Online platforms, meaning any service that makes user-generated content available to the public, must meet additional requirements, such as providing users with detailed information about content moderation decisions and the right to appeal them. They must also comply with additional transparency obligations.
While the DSA is a necessary update to the EU’s liability rules and improves users’ rights, we have plenty of concerns about the route it takes:
- We worry about the powers it gives to authorities to request user data and the obligation on providers to proactively share user data with law enforcement.
- We are also concerned about the ways in which the trusted flagger mechanism could lead to the over-removal of speech.
- We caution against the misuse of the DSA’s mechanism to deal with emergencies like a pandemic.
Introducing Systemic Risks
The most stringent DSA obligations apply to large online platforms and search engines that have more than 45 million monthly active users in the EU. The European Commission has so far designated more than 20 services as such “very large online platforms” (VLOPs) or “very large online search engines” (VLOSEs). These companies, which include X, TikTok, Amazon, Google Search, Maps and Play, YouTube, and several porn platforms, must proactively assess and mitigate “systemic risks” related to the design, operation, and use of their services. The DSA’s non-exhaustive list of risks includes four broad categories: 1) the dissemination of illegal content, 2) negative effects on the exercise of fundamental rights, 3) threats to elections, civic discourse, and public safety, and 4) negative effects in relation to gender-based violence, the protection of minors, public health, and a person’s physical and mental wellbeing.
The DSA does not provide much guidance on how VLOPs and VLOSEs are supposed to analyze whether they contribute to this somewhat arbitrary-seeming list of risks. Nor does the law offer clear definitions of the risk categories themselves.
Link:
https://www.eff.org/deeplinks/2025/01/systemic-risk-reporting-system-crisis