The Good News and the Troubling News: We’re Not Going Dark - Lawfare

data_society's bookmarks 2016-02-01

Summary:

Just over a year ago, with support from the William and Flora Hewlett Foundation, the Berkman Center for Internet & Society at Harvard University convened a diverse group of security and policy experts from academia, civil society, and the U.S. intelligence community to begin to work through some of the particularly vexing and enduring problems of surveillance and cybersecurity.

The group came together understanding that there had been no shortage of debate. Our goals were to foster a straightforward, non-talking-point exchange among people who do not normally have a chance to engage with each other, and then to contribute in meaningful and concrete ways to the discourse on these issues.

A public debate unfolded alongside our meetings: the claims and questions around the government finding a landscape that is “going dark” due to new forms of encryption introduced into mainstream consumer products and services by the companies who offer them. We have sought to distill our conversations and some conclusions in a report, which can be found here.

We also invited participants to draft individual comments — mine is below; Bruce Schneier and Susan Landau will share their own thoughts this week on Lawfare as well.

***

Two trends have dominated the U.S. foreign intelligence landscape for the past fifteen years.

The first arises from the terrorist attacks of 9/11. The attacks reshaped the priorities of the U.S. intelligence community, as extraordinary resources have been allocated to prevent and counter terrorism. Our national security establishment has pioneered new technological tools and new legal authorities (or interpretations of existing ones) in an effort to secure safety.

The second trend is the mainstreaming of the Internet and the technologies built around and upon it, which has led to an unprecedented proliferation of data that can be analyzed by the intelligence services. In late 2001 there were no smartphones and no social media. Facebook and Twitter were still years away from capturing our imagination, our time — and our data. The more bits we generate, actively through typing and talking, and passively by sharing our location, our social relationships, and other information as we go about our lives, the more there is for vendors — and the governments to whom they answer — to potentially review, whether in bulk or individually.

The intersection of these trends led to what Peter Swire and Kenesa Ahmad in 2011 called “the Golden Age of Surveillance.” Since then, that high-water mark for surveillance opportunities has receded in places. Some communications and data previously accessible to governments through vendors are no longer so easily obtained, because some vendors have refined the technologies they offer to prevent even themselves from seeing the data their users generate and exchange with one another. Such technologies, including the use of encryption, are not new as a category, but their entry into mainstream usage perhaps is. Losing a tool, rather than never having had it to begin with, is no doubt highly salient for the director of the FBI and others charged with protecting security. They ask: if we have a warrant or other legal authority, why should previously accessible information now be off-limits to us?
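To make that technical shift concrete, here is a minimal sketch of end-to-end encryption, added for illustration; the PyNaCl library and the names used are assumptions, not anything drawn from the report. The point it shows is simple: each user's device holds its own private key, so the vendor relaying a message never possesses material that could decrypt it.

# Minimal sketch of end-to-end encryption using PyNaCl (libsodium bindings).
# Library choice and names are illustrative assumptions, not from the post.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The vendor's server stores and forwards only ciphertext; holding neither
# private key, it cannot recover the plaintext and has nothing readable to hand over.
relayed = ciphertext

# Bob decrypts on his own device with his private key and Alice's public key.
assert Box(bob_key, alice_key.public_key).decrypt(relayed) == b"meet at noon"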

I empathize with the idea that just how much government can learn about us should not depend on the cat-and-mouse game of technological measure and countermeasure. Ideally, a polity would carefully calibrate its legal authorities to permit access exactly and only where it comports with the imperatives of legitimate security — and with basic human rights as recognized through the protections of conventions and constitutions. For one intriguing attempt to reconcile government use of technological hacking tools with appropriate privacy protections, you might read the proposal for “lawful hacking” that civil liberties-minded computer scientists Steven Bellovin, Matt Blaze, Sandy Clark, and fellow project participant Susan Landau have advocated.

But it is a very large step — a leap, even — to go beyond the legal demand for information already in a company’s possession, and beyond the use of technological tools to reveal what otherwise is obscure, to requirements on how technology must be deployed to begin with. I’ve written about why this leap is ill-advised. To try to constrain the generative Internet ecosystem in that way would either be futile or require that we, in the fitting words of the U.S. Supreme Court, “burn the house to roast the pig.”

Link:

https://www.lawfareblog.com/good-news-and-troubling-news-were-not-going-dark

From feeds:

Data & Society » data_society's bookmarks

Tags:

dsreads privacy security surveillance governance generativity law regulation internet of things

Authors:

Jonathan Zittrain

Date tagged:

02/01/2016, 10:30

Date published:

02/01/2016, 06:34