The Filter Mandate Bill Is a Privacy and Security Mess

Deeplinks 2022-11-08

Summary:

Among its many other problems, the Strengthening Measures to Advance Rights Technologies Copyright Act would mandate a slew of filtering technologies that online service providers must "accommodate." And that mandate is so broad, so poorly conceived, and so technically misguided that it will inevitably create serious privacy and security risks.

Since 1998, the Digital Millennium Copyright Act (DMCA) has required services to accommodate "standard technical measures" to reduce infringement. The DMCA’s definition of standard technical measures (STMs) requires that they be developed by broad consensus in an open, fair, multi-industry, and, perhaps most importantly, voluntary process. In other words, current law reflects an understanding that most technologies shouldn’t be adopted as standards, because standards affect many, many stakeholders who all deserve a say.

But the filter mandate bill is clearly designed to undermine the measured provisions of the DMCA. It changes the definition of standard technical measures to also include technologies supported by only a small number of rightsholders and technology companies. 

It also adds a new category of filters called "designated technical measures" (DTMs), which must be "accommodated" by online services. "Accommodating" is broadly defined as "adapting, implementing, integrating, adjusting, and conforming" to the designated technical measure. A failure to do so could mean losing the DMCA’s safe harbors and thereby risking crushing liability for the actions of your users.  

The Copyright Office would be in charge of designating those measures. Anyone can petition for such a designation, including companies that make these technologies and want to guarantee a market for them.

The sheer breadth of potential petitions would put a lot of pressure on the Copyright Office—which exists to register copyrights, not evaluate technology. It would put even more pressure on people who have internet users' rights at heart—independent creators, technologists, and civil society—to oppose the petitions and present evidence of the dangers they'd produce. Those dangers are far too likely, given the number of technologies that the new rules would require services to "accommodate."

Requiring This "Accommodation" Would Endanger Security

The filter mandate allows the Copyright Office to mandate "accommodation" for both specific technologies and general categories of technologies. That opens up a number of security issues.

Standardization is a long, arduous process for a reason: to find all the potential problems with a technology before it is required across the board. Requiring unproven, unaudited technology to be universally distributed would be a disaster for security.

Consider a piece of software developed to scan uploaded content for copyrighted works. Even leaving aside questions of fair use, the bill text places no constraints on the security expertise of the developer. At large companies, third-party software is typically audited thoroughly by an in-house security team before being integrated into the software stack. A law, especially one requiring only the minimal approval of the Copyright Office, should not be able to bypass these checks, and it certainly shouldn’t impose unvetted software on companies that lack the resources to audit it themselves. Poorly implemented software leaves security vulnerabilities that malicious hackers can exploit to exfiltrate the personal information of a service’s users.

Security is hard enough as it is. Mistakes that lead to database breaches happen all the time, even with teams doing their best at security; who doesn’t have free credit monitoring from a breach at this point? Under this bill, what incentive does a company that makes content-matching technology have to invest the time and money to build secure software? The Copyright Office isn’t going to check for buffer overflows. And what happens when a critical vulnerability is found after the software has been approved and widely implemented? Companies would have to choose: turn the software off, giving up their DMCA protection and risking being sued out of existence, or leave it running and let their users be exposed to the bug. No one wins in that scenario, and users lose the most.

"Accommodation" Would Also Hurt Privacy

Similar concerns arise over privacy. It’s bad enough that potential bugs could be exploited to divulge user data, but this bill also leaves the door wide open for direct collection of user data. That’s because a DTM could include a program that identifies potential infringement by collecting personal data while a service is being used.

Link:

https://www.eff.org/deeplinks/2022/11/filter-mandate-bill-privacy-and-security-mess

Tags:

blocking

Authors:

Katharine Trendacosta, Erica Portnoy

Date published:

11/08/2022, 18:32