The second day of the Privacy Enhancing Technologies Symposium (PETS) here in Amsterdam hosted a panel on PETs post-Snowden: implications of the revelations about the NSA and GCHQ surveillance programs for the PETs community. The panel consisted of Susan Landau,
Wendy Seltzer, Nadia Heninger, Marek Tuszynski, and George Danezis. Seda Gürses prepared and moderated the panel excellently. The Privacy & Identity Lab and NWO provided financial support. Here is a brief summary of the discussion that ensued. (There is also a handout that Seda produced.)

Susan: Started by saying that it is extremely important, when (re)negotiating the balance between security and privacy, to understand the other side’s needs and take them into serious consideration. She also noted that there was a massive change in US SIGINT collection policy that was essentially initiated by a relatively junior official at the Justice Department. She asked how anything like this was possible, and what technical and policy measures should be taken to prevent it from happening again in the future.

Marek: The USA has set a bad example and precedent. There is no longer a good example one can point to when trying to convince other nations to respect (digital) freedom. Governments are no longer in control. Big corporations (like Google and Facebook) sit on the bulk of personal data, controlling it (and essentially free to do with it whatever they wish). These corporations are also a fierce lobbying power.

George: Privacy is not the only thing at stake. The issue is much bigger, something George calls Cyber-Colonialism. If you own a valuable resource, nations will try to subvert it and gain access to it for their own good (e.g. for economic benefit). In fact, nations try to control as much of the international infrastructure as they can, because if you control the infrastructure you control the destiny of whole nations.

Wendy: Concerned about power imbalances and fragmentation. Because nations try to get tighter control over their own infrastructure, we are gradually moving towards many small, separate internets (without ordinary people necessarily being aware of this fact). (Note: the same effect also plagues commercial Internet services that do not interconnect at all, creating all kinds of ‘filter bubbles’.)

Susan: Law enforcement must be able to hack systems, because people protect their communications and systems better these days. This raises the question: do we, as a community, develop protection for the masses (to prevent dragnet surveillance) or protection for the individual (to prevent targeted surveillance)?

Marek: The private sphere has become much more political. We have not lost privacy; it has simply become a different value; it has taken on a different meaning.

Nadia: The problem we as a community face is that we now have a new threat model: the global passive adversary, one we usually assumed did not exist. (Tor explicitly states that it cannot resist a global passive adversary.) This is problematic because to build security you need some basis of trust to build on, yet very few sources of trust are left.
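
As a toy illustration of why this threat model is so damaging (this is my own sketch, not anything presented at the panel, and it is not Tor's actual design): an adversary that passively observes timestamps at both the entry and exit of a low-latency anonymity network can link a user's flow to its destination by simple timing correlation, without breaking any cryptography.

```python
# Toy sketch: a global passive adversary linking an encrypted, relayed
# flow by correlating packet timings at the network's entry and exit.
# All names and parameters here are illustrative assumptions.
import random

def correlate(entry_times, exit_times, latency=0.05, tolerance=0.01):
    """Fraction of entry packets matched by an exit packet ~latency later."""
    hits = 0
    for t in entry_times:
        if any(abs((e - t) - latency) < tolerance for e in exit_times):
            hits += 1
    return hits / len(entry_times)

random.seed(1)
# Alice's flow: the exit side sees the same packet pattern, shifted by
# network latency plus a little jitter.
alice_entry = sorted(random.uniform(0, 10) for _ in range(50))
alice_exit = [t + 0.05 + random.gauss(0, 0.002) for t in alice_entry]
# An unrelated flow observed at the same exit.
other_exit = sorted(random.uniform(0, 10) for _ in range(50))

print(correlate(alice_entry, alice_exit))   # near 1.0: flows linked
print(correlate(alice_entry, other_exit))   # much lower: no link
```

The point of the sketch is that encryption and relaying do not hide *when* packets flow; only an adversary with a partial view (the usual assumption) is defeated by relaying.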

Wendy: Does not believe that nothing can be trusted anymore. We need ways to restore justifiable trust, for example by being transparent, both in terms of technology and in terms of policy. We are all stakeholders in this risk analysis.

Nadia: Considers it a huge loss that NIST’s technical standards were subverted (by the NSA) in an unexpected and non-public way. You don’t want Daniel J. Bernstein to be the only source of trustable cryptography…

George: It takes collaborative action to create a secure, trustworthy infrastructure (not only IT, but also e.g. water and electricity). Typically this action is coordinated by the nation state, yet in the case of security the nation state has just proven not to be a benign actor. So: can we define our own processes? Can we create action ourselves, without the nation state?

Marek: Can we learn from environmentalists? In that domain there is a similar lack of incentives for governments and businesses to act, yet the political power of users achieved some movement there.

?: Align the incentives of companies with those of privacy-conscious users. For example, because of the Snowden revelations US cloud service providers have been hit financially, and Google is now applying cryptography to protect its customers.

Marek: This is bullshit. Google has got all the data about us it wants. They do this to protect the data they have from their competitors!

Susan: Suggests moving the frame from security (preventing all terrorism, which people in their hearts know is impossible) to resilience (restoring systems after terrorism). Resilience is much easier to align with privacy.

George: “Every multi-billion dollar business is based on a scam.” In Google’s case the scam is convincing everyone that it is logical to implement services for users in a centralised way, such that Google gets to collect all the data. This may be necessary for search, but it is definitely not necessary for email or social networking. Surveillance is an existential threat to cloud service providers.

This is where the audience was invited to join the discussion.

Jacob Appelbaum: Argues that secret surveillance must be destroyed, and that with the right cryptographic tools one can detect (and thus prevent) secret surveillance.

Susan: Disagrees. Surveillance is sometimes appropriate. (Which was not what Jake said; it is unclear whether Susan agrees that secret surveillance is sometimes necessary.) But bulk global surveillance is god-awful: what kind of moral standard allows a democratic society to do a thing like that? Susan also points to a WEIS paper by Ross Anderson on the network effects of surveillance.

George: We are moving towards a world in which (secret) surveillance becomes harder.

Marek: Wonders about the “Theatre of Accountability”: we see so many bad, obviously illegal things revealed, yet nobody is held accountable.

Nadia: We need to distinguish methods, which don’t need to be (and in fact should not be) secret, from targets, which may need to be secret. Also worried about US policy makers’ near-total lack of understanding of technology.

Wendy: Sitting on vulnerabilities (i.e. not disclosing them) leaves them unpatched, so that others can abuse them.

Jean-Pierre Hubaux: Secret surveillance is not going away. Can we design systems that make surveillance accountable, to deal with this fact?

Rene (NIST): Claims that the DUAL EC random number generator debacle could not happen again. Backdooring is hard, especially ensuring that you are the only entity that can access the system through the backdoor. It is important to keep watch over what NIST produces: you may not be able to hold the NSA accountable, but NIST is accountable.

Caspar Bowden: Put law in the threat model: some countries have laws that violate fundamental human rights.

Rop Gonggrijp: Democracy is under threat: people should realise the very real risk that we fall into the abyss of totalitarianism.

Nadia: Because of the Dual EC debacle, can we still trust NIST for other standards, like those on elliptic curves? Probably so, but we cannot be sure, because we can no longer trust the process.

George (answering Jean-Pierre): Most surveillance occurs outside the rule of law. Fixing this is equivalent to solving other global problems, like climate change or achieving world peace. Hence George calls this problem “World Peace Complete”.

Paul Syverson: George is wrong. The NSA is insanely concerned with law: everything must be vetted. We may not agree with the law or the process, but there is a process.

Joris van Hoboken: Is it feasible to develop cryptographic standards at the international level (to avoid dependence on NIST)?

Axel Arnbak: Why do we debate the issues within their frame? Why don’t we discuss much more the bizarreness of it all, for example the Star Trek control room commissioned by Keith Alexander? Why don’t we engage other communities (art!) to make this bizarreness much more visible?

Jacob Appelbaum (responding to certain figures mentioned by Susan): the NSA will only publish figures that serve its own interests. “Why should we trust these people?”

George: Step outside the threat model; some technology already creates difficulties for the NSA. Secret services are truly concerned about ‘blackouts’ should the technology we have developed actually be used on a large scale.

Nadia (answering Joris’ question): there are international cryptographic standards; the problem is that they just aren’t as good… We should aim for transparency, to rebuild trust in institutions.

Susan: As said before, we need to change the conversation to talk about resilience, not security. And remember: in the end we elect the people in power….

In conclusion: this was a very interesting panel. My main concern (and frustration) was the continued US-centricity of the debate. Personally, I have no reason to trust NIST, and in fact I think any standardisation should take place at the international level anyway.