Encryption as protest

Freedom to Tinker 2014-06-10

As a computer scientist who studies Privacy-Enhancing Technologies (PETs), I remember my surprise when I first learned that some groups of people view and use them very differently than I’m used to. In computer science, PETs are used for protecting anonymity or confidentiality, often via application of cryptography, and are intended to be bullet-proof against an adversary who is trying to breach privacy.

By contrast, Helen Nissenbaum and others have developed a political and ethical theory of obfuscation [1], “a strategy for individuals, groups or communities to hide; to protect themselves; to protest or enact civil disobedience, especially in the context of monitoring, aggregated analysis, and profiling.” CV Dazzle and AdNauseam are good examples.

Let’s consider the use of traditional PETs like Tor for obfuscation. The computer science literature is very comfortable with the first two uses (to hide and to protect oneself), but not the latter two (protest and civil disobedience). Adversarial thinking, the model used to analyze PETs by the computer security community, has nothing to say about these uses.

In this post, I want to examine the hypothesis that users of encryption tools also have protest and civil disobedience in mind, instead of (or in addition to) self-defense and anonymity. Encryption is explicitly outside the ambit of obfuscation as its theorists conceive it. Brunton and Nissenbaum are downright deferential:

Obfuscation, as we have presented it here, is at once richer and less rigorous than academically well–established methods of digital privacy protection, like encryption. It is far more ad hoc and contextual, without the quantifiable protection of cryptographic methods — a “weapon of the weak”

Can there ever be a science of obfuscation? With encryption, for example, algorithms have standard metrics based on objective measures such as key length, machine power, and length of time to inform community evaluations of their strength. By contrast, the success of obfuscation is a function of the goals and motives of both those who obfuscate and those to whom obfuscation is directed, the targets. We are tempted, for this reason, to characterize obfuscation as a relatively weak practice. Yet, when strong solutions, such as avoidance, disappearance, hiding (e.g., through encryption) are not available and flat out refusal is not permitted, obfuscation may emerge as a plausible alternative, perhaps the only alternative.
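As a concrete aside, the “objective measures” the quote mentions (key length, machine power, length of time) really do lend themselves to back-of-the-envelope calculation, which is what makes encryption quantifiable in a way obfuscation is not. The sketch below is illustrative only; the function name and the guessing rates are my own assumptions, not figures from the post:

```python
def brute_force_years(key_bits, keys_per_second):
    """Expected years for an exhaustive key search.

    On average an attacker must try half the keyspace,
    i.e. 2^(key_bits - 1) keys, before finding the right one.
    """
    seconds = (2 ** (key_bits - 1)) / keys_per_second
    return seconds / (60 * 60 * 24 * 365)

# A 64-bit key against a billion guesses per second: centuries.
print(brute_force_years(64, 1e9))

# A 128-bit key against a trillion guesses per second:
# on the order of 10^18 years, far beyond any adversary.
print(brute_force_years(128, 1e12))
```

No comparable formula exists for obfuscation, whose success depends on the goals and motives of both parties, which is exactly the contrast Brunton and Nissenbaum draw.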

My claim, then, is that even when the supposedly strong weapon of encryption is available, it is often used for the “weak” purpose of obfuscation, specifically as a form of protest.

The key difference when encryption is used as protest is that it is a collective and participatory activity, rather than individualistic. Such users hope, in conjunction with other users, to make life a little bit harder for the powers that be and to protest the surveillance regime. Further, they would like to signal to their peers that they are conscientious citizens who will not accept the status quo. [2]

As a corollary, such users will seek the simplest possible tools to achieve the objectives of protest and signalling, even at the expense of security. This is because there aren’t any major personal benefits to encrypting, nor any repercussions if the encryption is defeated. It’s a bit like recycling — we’d like to act responsibly, but won’t do it if it’s too hard. Of course, users always favor convenience over security more than developers would like, but this is an extreme version.

My evidence for all this is primarily anecdotal, but there’s a lot of it. Here’s one comment that brings together a lot of what I’ve said above:

IMO, the “encryption as protest” idea has a lot of merit. A big challenge is finding tools that to make it easy for non-technical users. I’ve been toying with the apps from these guys for encrypting text and voice on Android based devices. All and all, they are pretty darn easy to use and seem to do the trick.

On a more puerile level, I’ve also been trying to figure out how to merge the “forbidden keyword” idea with photo-bombing our good workers at the NSA pictures of my naked butt. Since we know they don’t look at stuff like “Let’s ram buildings in America with commercial airliners” … I was thinking maybe something like an email saying “Let’s all picket the XL pipeline next Thursday with this poster” might be a better trigger.

So far, so good — some people use encryption to hide; many others use it to protest. But here’s the catch. There is an inescapable trade-off between convenience and security — tools that put security first require user training and informed decisions, whereas insecure ones pitch themselves as one-click solutions. [3] Because of the network effects in the market for these tools, those that cater to the lowest common denominator will win. This might explain why the vast majority of encryption tools that have cropped up over the last year appear to be insecure.

What can we learn from this? First, security and privacy researchers should study how users actually use PETs instead of assuming that all users have the same set of values and preferences. Even an insecure encryption tool is perfectly fine if used for obfuscation or protest. The security community regularly gives users too little credit. Conversely, users who do care about cryptographic security of their encryption tools should be aware that there are a lot of misleading claims out there, and strong security is probably not achievable without effort, training, and vigilance. Finally, I’d love for someone with a background in sociology or digital anthropology to research this topic and improve our understanding of why people encrypt!

[1] While it is tempting as a technologist to map concepts like obfuscation into existing technical terms, it is important to remember that no such mapping exists. Obfuscation is not a technology but the act of using certain technologies for certain ends.

[2] Kate Crawford takes this a step further and argues that it has become a status symbol to blend in, whether digitally or in the real world, and connects it to the normcore fashion trend.

[3] This was the root of Pete Zimmerman’s criticism of Wickr last week.

Thanks to Solon Barocas for comments on a draft.