When “Jawboning” Creates Private Liability

Deeplinks 2022-06-22

Summary:

A (Very) Narrow Path to Holding Social Media Companies Legally Liable for Collaborating with Government in Content Moderation

For the last several years we have seen numerous arguments that social media platforms are “state actors” that “must carry” all user speech. According to this argument, the platforms are legally required to publish all user speech and treat it equally. Under U.S. law, this is almost always incorrect. The First Amendment generally binds only governments to honor free speech rights, and it protects the rights of private entities like social media sites to curate the content on their sites and impose content rules on their users.

Among the state actor theories presented is one based on collaboration with the government on content moderation. “Jawboning”—when government authorities influence companies’ social media policies—is extremely common. At what point, if any, does a private company that acts on such government influence become a state actor?

Deleting posts or cancelling accounts because a government official or agency requested or required it—just like spying on people’s communications on behalf of the government—raises serious human rights concerns. The newly revised Santa Clara Principles, which outline standards that tech platforms should consider to ensure adequate transparency and accountability, specifically scrutinize “State Involvement in Content Moderation.” As set forth in the Principles: “Companies should recognise the particular risks to users’ rights that result from state involvement in content moderation processes. This includes a state’s involvement in the development and enforcement of the company’s rules and policies, either to comply with local law or serve other state interests. Special concerns are raised by demands and requests from state actors (including government bodies, regulatory authorities, law enforcement agencies and courts) for the removal of content or the suspension of accounts.”

So, it is important that there be a defined, though narrow, avenue for holding social media companies liable for certain censorial collaborations with the government. But the bar for holding platforms accountable for such conduct must be high to preserve their First Amendment rights to edit and curate their sites. 

Testing Whether a Jawboned Platform Is a State Actor

We propose the following test. At a minimum: (1) the government must replace the intermediary’s editorial policy with its own; (2) the intermediary must willingly cede to the government the editorial implementation of that policy with respect to the specific user speech; and (3) the censored party must lack an adequate remedy against the government. These findings are necessary but not per se sufficient to establish that the social media service is a state actor; there may always be “some countervailing reason against attributing activity to the government.”

In creating the test, we had two guiding principles.

First, when the government coerces or otherwise pressures private publishers to censor, the censored party’s first and favored recourse is against the government. Governmental manipulation of the already fraught content moderation systems to control public dialogue and silence disfavored voices raises classic First Amendment concerns, and both platforms and users should be able to sue the government for this. In First Amendment cases, there is a low threshold for suits against government agencies and officials that coerce private censorship: the government may violate speakers’ First Amendment rights with “system[s] of informal censorship” aimed at speech intermediaries. In 2015, for example, EFF supported a lawsuit by Backpage.com after the Cook County sheriff pressured credit card processors to stop processing payments to the website. 

Second, social media companies should retain their First Amendment rights to edit and curate the user posts on their sites as long as they are the ones controlling the editorial process. So, we sought to distinguish those situations where the platforms clearly abandoned editorial power and ceded editorial control to the government from those in which the government’s desires were influential but not determinative.


Link:

https://www.eff.org/deeplinks/2022/06/when-jawboning-creates-private-liability

From feeds:

Fair Use Tracker » Deeplinks
CLS / ROC » Deeplinks

Tags:

analysis speech social media legal government free blocking

Authors:

David Greene

Date tagged:

06/22/2022, 02:03

Date published:

06/21/2022, 16:55