Facebook’s White Paper on the Future of Online Content Regulation: Hard Questions for Lawmakers

lrosenberg's bookmarks 2020-02-18

Summary:

I once wrote, after Australia passed some particularly terrible legislation, that maybe Facebook CEO Mark Zuckerberg should have been more specific when he publicly asked for more regulation of “harmful content.” It seems that Zuckerberg has finally gotten around to clarifying what he meant. His recent comments at the Munich Security Conference, where he suggested that online platforms should be treated somewhere between the telecom and media industries, created more confusion than clarity. But the white paper Facebook released today, written by Monika Bickert, the company’s vice president of content policy, is a more constructive step forward.

Facebook’s motivation in releasing the paper is obvious. The next year is looking to be critical for the governance of online speech: Regulators around the world are considering legislation and rethinking their previously hands-off approach. The paper focuses on regulatory structures “outside the United States,” which are likely to be more aggressive, given that tolerance for governmental regulation of speech is higher in those jurisdictions than it is under the First Amendment. Facebook obviously has an interest in what that regulation looks like. But motivations aside, the white paper is a thoughtful document that raises serious questions that regulators, and the rest of us interested in the future of online content regulation, need to reckon with.

Overview

The report begins by listing four characteristics of online platforms that make prior regulatory models a poor fit and that new frameworks will need to address:

  • Platforms are global: Many internet platforms have a global user base and straddle jurisdictions with very different legal rules and expectations for what speech is acceptable.
  • Platforms are constantly changing: Internet platforms are not homogeneous—each has its own affordances and dynamics, and all are “constantly changing to compete and succeed.”
  • Platforms will always get some speech decisions wrong: The unfathomable scale of modern internet platforms—which requires millions of speech decisions to be made every day—means that enforcement of platform standards will always be imperfect.
  • Platforms are intermediaries, not speakers: Platforms facilitate speech, but they do not and cannot review every piece of content posted before it is posted—and therefore should not be treated the same as publishers.

The white paper identifies four key questions that need to be answered in order to design a framework that meets these new challenges:

  1. How can content regulation best achieve the goal of reducing harmful speech while preserving free expression?
  2. How should regulation enhance the accountability of internet platforms?
  3. Should regulation require companies to meet certain performance targets?
  4. Should regulation define which “harmful content” should be prohibited on internet platforms?

The white paper then proceeds to engage with these questions in a substantive and reasonably detailed way and is worth reading in full. Here, I’ll highlight some key themes that emerge.

Incentives Matter

The paper emphasizes the importance of ensuring that legislation does not create perverse incentives. When performance targets are enshrined in law, they risk incentivizing actors to focus on hitting the targets themselves rather than on the overarching goal of the regulatory scheme. Regulators therefore need to design performance targets and metrics that platforms cannot game to reduce their enforcement burdens. The white paper gives a number of examples of how this might occur, such as:

  • Measuring response times to user or government reports of violating content, which could incentivize companies to define violation categories (like hate speech) narrowly—or to make it harder for users to report violations, thus decreasing the number of reports needing a quick response.
  • Transparency mandates in certain areas (such as the rate at which platforms find content proactively, before users flag it) could incentivize companies to neglect other areas (like the accuracy of this detection) to boost performance in measured areas.
  • Hard deadlines for removal (such as “within 24 hours of upload”) could push platforms to err on the side of taking borderline content down without adequate review in order to meet the deadline.

Link:

https://www.lawfareblog.com/facebooks-white-paper-future-online-content-regulation-hard-questions-lawmakers

From feeds:

Berkman Klein » lrosenberg's bookmarks

Tags:

community, addedcommunity

Authors:

Evelyn Douek

Date tagged:

02/18/2020, 12:49

Date published:

02/18/2020, 12:07