The PACT Act Is Not The Solution To The Problem Of Harmful Online Content

Deeplinks 2020-07-30

Summary:

The Senate Commerce Committee’s Tuesday hearing on the PACT Act and Section 230 was a refreshingly substantive bipartisan discussion about the thorny issues related to how online platforms moderate user content, and to what extent these companies should be held liable for harmful user content.

The hearing brought into focus several real and significant problems that Congress should continue to consider. It also showed that, whatever its good intentions, the PACT Act in its current form does not address those problems, much less deal with how to lessen the power of the handful of major online services we all rely on to connect with each other.

EFF Remains Opposed to the PACT Act

As we recently wrote, the Platform Accountability and Consumer Transparency (PACT) Act, introduced last month by Senators Brian Schatz (D-HI) and John Thune (R-SD), is a serious effort to tackle a serious problem: that a handful of large online platforms dominate users’ ability to speak online. The bill builds on good ideas, such as requiring greater transparency around platforms’ decisions to moderate their users’ content—something EFF has championed as a voluntary effort as part of the Santa Clara Principles.

However, we ultimately oppose the bill because weakening Section 230 (47 U.S.C. § 230) would lead to more illegitimate censorship of user content. The bill would also threaten small platforms and would-be competitors to the current dominant players, and it raises First Amendment problems.

Important Issues Related to Content Moderation Remain

One important issue that came up during the hearing is to what extent online platforms should be required to take down user content that a court has determined is illegal. The PACT Act provides that platforms would lose Section 230 immunity for user content if the companies failed to remove material after receiving notice that a court has declared that material illegal. It’s not unreasonable to question whether Section 230 should protect platforms for hosting content after a court has found the material to be illegal or unprotected by the First Amendment.

However, we remain concerned about whether any legislative proposal, including the PACT Act, can provide sufficient guardrails to prevent abuse and to ensure that user content is not unnecessarily censored. Courts often issue non-final judgments, opining on the legality of content in an opinion on a motion to dismiss, for example, before reaching the merits stage of a case. Some court decisions are default judgments entered because the defendant did not show up to defend herself for whatever reason; any determination about the illegality of the content the defendant posted is therefore suspect, because the question was never subjected to a robust adversarial process. And even when a trial court issues a final order, that decision is often appealed and sometimes reversed by a higher court.

Additionally, some lawsuits targeting user content are harassing suits that might be dismissed under anti-SLAPP laws, but not all states have such laws and there is no anti-SLAPP statute that consistently applies in federal court. Finally, some documents that appear to be final court judgments may be falsified, which would lead to the illegitimate censorship of user speech if platforms don't spend considerable resources investigating each takedown request.

We were pleased to see that many of these concerns were discussed at the hearing, even if a consensus wasn’t reached. It’s refreshing to see elected leaders trying to balance competing interests, including how to protect Internet users who are victims of illegal activity while avoiding the creation of broad legal tools that can censor speech that others do not like. But as we’ve said previously, the PACT Act, as currently written, doesn’t attempt to balance these or other concerns. Rather, by requiring the removal of any material that someone claims a court has declared illegal, it tips the balance toward broad censorship.

Another thorny but important issue is the question of competition among online platforms. Sen. Mike Lee (R-UT) expressed his preference for finding market solutions to the problems associated with the dominant platforms and how they moderate user content. EFF has urged the government to consider a more robust use of

Link:

https://www.eff.org/deeplinks/2020/07/pact-act-not-solution-problem-harmful-online-content

Tags:

230

Authors:

Sophia Cope, Aaron Mackey

Date published:

07/30/2020, 18:38