Oppose STOP CSAM: Protecting Kids Shouldn’t Mean Breaking the Tools That Keep Us Safe
Deeplinks 2025-06-11
Summary:
A Senate bill re-introduced this week threatens security and free speech on the internet. EFF urges Congress to reject the STOP CSAM Act of 2025 (S. 1829), which would undermine services offering end-to-end encryption and force internet companies to take down lawful user content.
Tell Congress Not to Outlaw Encrypted Apps
As in the version introduced last Congress, S. 1829 purports to limit the online spread of child sexual abuse material (CSAM), also known as child pornography. CSAM is already highly illegal. Existing law already requires online service providers who have actual knowledge of “apparent” CSAM on their platforms to report that content to the National Center for Missing and Exploited Children (NCMEC). NCMEC then forwards actionable reports to law enforcement agencies for investigation.
S. 1829 goes much further than current law and threatens to punish any service that works to keep its users secure, including those that do their best to eliminate and report CSAM. The bill applies to “interactive computer services,” which broadly includes private messaging and email apps, social media platforms, cloud storage providers, and many other internet intermediaries and online service providers.
The Bill Threatens End-to-End Encryption
The bill makes it a crime to intentionally “host or store child pornography” or knowingly “promote or facilitate” the sexual exploitation of children. The bill also opens the door for civil lawsuits against providers for the intentional, knowing, or even reckless “promotion or facilitation” of conduct relating to child exploitation, the “hosting or storing of child pornography,” or for “making child pornography available to any person.”
The terms “promote” and “facilitate” are broad, and civil liability may be imposed based on a low “recklessness” state-of-mind standard. This means a court could find an app or website liable for hosting CSAM even if it did not know the CSAM was there, for example because the provider employs end-to-end encryption and cannot view the content its users upload.
Creating new criminal and civil claims against providers based on broad terms and low standards will undermine digital security for all internet users. Because the law already prohibits the distribution of CSAM, the bill’s broad terms could be interpreted as reaching more passive conduct, like merely providing an encrypted app.
Due to the nature of their services, encrypted communications providers who receive a notice of CSAM may be deemed to have “knowledge” under the criminal law even if they cannot verify and act on that notice. And there is little doubt that plaintiffs’ lawyers will (wrongly) argue that merely providing an encrypted service that can be used to store any image—not necessarily CSAM—recklessly facilitates the sharing of illegal content.
Affirmative Defense Is Expensive and Insufficient
While the bill includes an affirmative defense that a provider can raise if it is “technologically impossible” to remove the CSAM without “compromising encryption,” it is not sufficient to protect our security. Online services that offer encryption shouldn’t have to face the impossible task of proving a negative in order to avoid lawsuits over content they can’t see or control.
First, by making this protection an affirmative defense, providers must still defend against litigation, with significant costs to their business. Not every platform will have the resources to fight these threats in court, especially newcomers that compete with entrenched giants like Meta and Google. Encrypted platforms should not have to rely on prosecutorial discretion or favorable court rulings after protracted litigation. Instead, specific exemptions for encrypted providers should be addressed in the text of the bill.
Second, although technologies like client-side scanning break encryption, members of Congress have misleadingly claimed otherwise. Plaintiffs are likely to argue that providers who do not use these techniques are acting recklessly, pressuring many apps and websites to weaken or abandon their encryption.
Link:
https://www.eff.org/deeplinks/2025/06/oppose-stop-csam-protecting-kids-shouldnt-mean-breaking-tools-keep-us-safe