O (no!) Canada: Fast-moving proposal creates filtering, blocking and reporting rules—and speech police to enforce them
Deeplinks 2021-08-11
Summary:
Policymakers around the world are contemplating a wide variety of proposals to address “harmful” online expression. Many of these proposals are dangerously misguided and will inevitably result in the censorship of all kinds of lawful and valuable expression. And one of the most dangerous proposals may be adopted in Canada. How bad is it? As Stanford’s Daphne Keller observes, “It's like a list of the worst ideas around the world.” She’s right.
These ideas include:
- broad “harmful content” categories that explicitly include speech that is legal but potentially upsetting or hurtful
- a hair-trigger 24-hour takedown requirement (far too short for reasonable consideration of context and nuance)
- an effective filtering requirement (the proposal says service providers must take reasonable measures which “may include” filters, but, in practice, compliance will require them)
- penalties of up to 3 percent of a provider's gross revenues or up to 10 million dollars, whichever is higher
- mandatory reporting of potentially harmful content (and the users who post it) to law enforcement and national security agencies
- website blocking (platforms deemed to have violated some of the proposal’s requirements too often might be blocked completely by Canadian ISPs)
- onerous data-retention obligations
All of this is terrible, but perhaps the most terrifying aspect of the proposal is that it would create a new internet speech czar with broad powers to ensure compliance, and continuously redefine what compliance means.
These powers include the right to enter and inspect any place (other than a home):
“in which they believe on reasonable grounds there is any document, information or any other thing, including computer algorithms and software, relevant to the purpose of verifying compliance and preventing non-compliance . . . and examine the document, information or thing or remove it for examination or reproduction”; to hold hearings in response to public complaints; and to “do any act or thing . . . necessary to ensure compliance.”
But don’t worry—ISPs can avoid having their doors kicked in by coordinating with the speech police, who will give them "advice" on their content moderation practices. Follow that advice and you may be safe. Ignore it and be prepared to forfeit your computers and millions of dollars.
The potential harms here are vast, and they'll only grow because so much of the regulation is left open. For example, platforms will likely be forced to rely on automated filters to assess and discover "harmful" content on their platforms, and users caught up in these sweeps could end up on file with the local cops—or with Canada’s national security agencies, thanks to the proposed reporting obligations.
Private communications are nominally excluded, but that is cold comfort—the Canadian government may decide, as contemplated by other countries, that chat groups of various sizes are not ‘private.’ If so, end-to-end encryption will be under further threat, with platforms pressured to undermine the security and integrity of their services in order to fulfill their filtering obligations. And regulators will likely demand that Apple expand its controversial new image assessment tool to address the broad "harmful content" categories covered by the proposal.
In the United States and elsewhere, we have seen how rules like this hurt marginalized groups, both online and offline. Faced with expansive and vague moderation obligations, little time for analysis, and major legal consequences if they guess wrong, companies inevitably overcensor—and users pay the price.
For example, a U.S. law intended to penalize sites that hosted speech related to child sexual abuse and trafficking led large and small internet platforms to censor broad swaths of speech with adult content. The consequences of this censorship have been devastating for marginalized communities and the groups that serve them, especially organizations that provide support and services to victims of trafficking and child abuse, sex workers, and groups and individuals promoting sexual freedom. Among other things, the law prevented sex workers from organizing and using tools that have kept them safe. Taking away online forums, client-screening capabilities, "bad date" lists, and other intra-community safety tips means putting more workers on the street, at higher risk, which leads to increased violence and trafficking. The impact was particularly harmful for
Link:
https://www.eff.org/deeplinks/2021/08/o-no-canada-fast-moving-proposal-creates-filtering-blocking-and-reporting-rules-1