Clearghost: Using the laws of nature to limit digital surveillance by law enforcement.

October 24, 2023

Digitisation owes its disruptive power to the near zero marginal cost of digital products and services. Although the initial investment to create a product or service may be huge, creating a new digital copy, adding new users, or processing more work costs next to nothing. As a result, these products and services can scale up very quickly and unchecked, creating all kinds of societal problems. In this blog post I will focus on the particular problem of digital surveillance by law enforcement, and will study a speculative approach based on laws of nature to inherently limit its reach.

Introduction

Traditionally, surveillance by law enforcement was a resource intensive affair. Following someone, or intercepting someone’s communication, required significant amounts of time, would involve several law enforcement officers, and would require physical proximity. Broadening the scope of an investigation (following more suspects, listening in on more conversations) created more work that increased linearly with the number of people under surveillance. To surveil twice the number of people would require twice as many law enforcement officers. Given clear resource limitations (on staff and budget), law enforcement had to make tough choices about which crimes to investigate, and whom to put under surveillance. Moreover, staying under the radar while surveilling someone is harder in the traditional setting. We see that certain laws of nature (doing twice as much requires twice the effort, hiding physically takes effort) inherently restrict the power and reach of law enforcement.

As criminals move their operations to the digital domain, law enforcement follows suit. They have to. But given the disruptive nature of digital tools sketched above, the shift to such tools for surveillance alters the precarious balance between law enforcement surveillance and the fundamental right to privacy. It is much easier to tail someone by attaching a GPS tracker to their car, or by tracing the location of their mobile phone. It is much easier to monitor social networks and scan for hateful content, death threats, or calls for unrest. It is much easier, at least in principle, to eavesdrop on a digital service.

On the other hand, the security of digital tools and services has improved, and many use encryption to protect the data we store or communicate. Cloud services and messaging are increasingly end-to-end encrypted, meaning that only the person storing the data, or the sender or recipient of a message, has access: the providers of these services (and anybody else, like law enforcement) no longer do.

The crypto-wars

As a result, the so-called ‘crypto-wars’ have been raging over the past decades. These crypto-wars concern the efforts of law enforcement to get (conditional) lawful access to encrypted communications and stored encrypted data, and the push-back of privacy advocates and cryptographers against such weakening of encryption. The first group claims there is a severe problem (‘we are going dark’), while the second claims there are plenty of opportunities for law enforcement (‘this is the golden age of surveillance’). The question this blog post is concerned with is whether the trenches in this war can somehow be bridged. Is the nature of digital surveillance really an all-or-nothing affair? Do we either have total surveillance, or unconditional privacy? It is my strong conviction, even as a privacy researcher, that both would be undesirable.

Several years ago, two proposals for lawfully bypassing encryption were heavily debated: Clear (to get access to stored encrypted data, proposed by Ray Ozzie, of Lotus Notes fame) and Ghost (to get access to encrypted messaging data, proposed by Ian Levy and Crispin Robinson from GCHQ). These provide the perfect examples to illustrate our ideas.

Ghost: lawful access to encrypted messages

Let us start with lawful access to encrypted communications. The debate has shifted since the first proposals for key escrow appeared almost thirty years ago. In particular, current proposals call for key recovery. Rather than keeping a copy of the device key used to encrypt or decrypt messages in government custody (which could provide access to all current and past communications), the idea is to create a mechanism that allows the government to recover, on request, only the short-lived session keys used to encrypt specific communications. One recent example is the Ghost proposal from GCHQ, the UK intelligence organisation.

End-to-end encrypted messaging services like WhatsApp or iMessage maintain a central repository of public keys to use when encrypting messages to the associated user. Every time you send a message, the necessary keys of the recipients are requested from this central repository, maintained by Facebook (in the case of WhatsApp) or Apple (in the case of iMessage), and are subsequently used locally to generate a session key with which to encrypt the message before sending it out. GCHQ’s Ghost proposal is essentially to silently add a law enforcement key to this list of retrieved keys when necessary, ensuring that the session key with which the message is encrypted is also recoverable by law enforcement. As a result, after intercepting the encrypted message, law enforcement can also decrypt it. The law enforcement key should only be added to the conversation after a properly motivated request to the service provider, for example using a warrant. This prevents access to messages exchanged before the request was made. Also, as soon as access is no longer warranted, the law enforcement key is no longer added to the conversation. This blocks law enforcement access to any future conversations. The crucial part of the proposal is the need to add such a law enforcement key silently, i.e. without users being aware. Otherwise, people would stop using the service as soon as they saw law enforcement taking an interest in their conversations.
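To make this concrete, here is a minimal Python sketch of the idea. All names (`KeyServer`, `wrap`, and so on) are invented for illustration, and the XOR ‘encryption’ is merely a placeholder for the real public-key and symmetric schemes a messaging app would use:

```python
import os
from typing import Dict, List, Tuple

def wrap(session_key: bytes, public_key: bytes) -> bytes:
    # Placeholder for wrapping the session key under a recipient's public
    # key; a real client would use e.g. X25519 plus an AEAD cipher.
    return bytes(a ^ b for a, b in zip(session_key, public_key))

def encrypt(plaintext: bytes, session_key: bytes) -> bytes:
    # Placeholder for symmetric message encryption, e.g. AES-GCM.
    return bytes(p ^ session_key[i % len(session_key)] for i, p in enumerate(plaintext))

class KeyServer:
    """The provider's central repository of users' public keys."""

    def __init__(self) -> None:
        self.user_keys: Dict[str, bytes] = {}
        self.wiretap_keys: Dict[str, bytes] = {}  # conversation id -> LE key, while a warrant is active

    def keys_for(self, recipients: List[str], conversation: str) -> List[bytes]:
        keys = [self.user_keys[r] for r in recipients]
        # The crux of Ghost: while a warrant is active for this conversation,
        # a law enforcement key is silently appended; the sending client
        # cannot tell it apart from a legitimate device key.
        if conversation in self.wiretap_keys:
            keys.append(self.wiretap_keys[conversation])
        return keys

def send(server: KeyServer, recipients: List[str],
         conversation: str, plaintext: bytes) -> Tuple[List[bytes], bytes]:
    session_key = os.urandom(32)
    # The fresh session key is wrapped for every key the server returned,
    # including, invisibly, the law enforcement key when one was added.
    wrapped = [wrap(session_key, pk) for pk in server.keys_for(recipients, conversation)]
    return wrapped, encrypt(plaintext, session_key)
```

Note that nothing on the client changes: the surveillance capability lives entirely in which keys the server chooses to return.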

Analysis

The problem, as Susan Landau observes, is that “the service provider is being asked to change its communication system to provide exactly what the end-to-end encryption system was designed to prevent: access by a silent listener to the communication.” As a result, the modification proposed by Ghost enables a technically unbounded surveillance capability for law enforcement, constrained only by procedural means that the service provider must enforce. This is undesirable.

Clear: lawful access to stored encrypted data

Let us now consider the more interesting case of lawful access to stored encrypted data. Modern smartphones encrypt almost all the data they store, releasing the device decryption key when the user unlocks the phone with a PIN code or a biometric, and destroying that key as soon as the user locks the phone. As a result, law enforcement has no access to the data of a locked phone.
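In outline, and heavily simplified (real phones derive and hold these keys inside dedicated secure hardware, with rate limiting against PIN guessing; all names here are illustrative), the life cycle of the device key looks something like this Python sketch:

```python
import hashlib
import hmac
import os
from typing import Optional

class Device:
    def __init__(self, pin: str) -> None:
        self.salt = os.urandom(16)
        device_key = os.urandom(32)               # encrypts all stored data
        kek = self._kek(pin)                      # key derived from the PIN
        self.wrapped_key = bytes(a ^ b for a, b in zip(device_key, kek))
        self.check = hmac.new(device_key, b"unlock", "sha256").digest()
        self.device_key: Optional[bytes] = None   # only present while unlocked

    def _kek(self, pin: str) -> bytes:
        # Toy key derivation; real devices do this inside a secure element.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), self.salt, 100_000)

    def unlock(self, pin: str) -> bool:
        candidate = bytes(a ^ b for a, b in zip(self.wrapped_key, self._kek(pin)))
        if hmac.compare_digest(hmac.new(candidate, b"unlock", "sha256").digest(), self.check):
            self.device_key = candidate           # data is now readable
            return True
        return False

    def lock(self) -> None:
        # Destroying the in-memory key is what keeps law enforcement (and
        # everyone else) out of the data on a locked phone.
        self.device_key = None
```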

The naive approach to implement lawful access to encrypted data stored on smartphones is to demand that all device decryption keys are escrowed and handed to law enforcement. This would provide law enforcement access to the data even remotely (given some type of network access to the phone). Clear is different in that it requires law enforcement to have physical access to the phone in order to get access to the data. This is guaranteed as follows.

Alongside the normal user PIN code, every phone also generates a vendor PIN code that can be used to unlock the phone (and hence provide access to the data it stores). The vendor PIN is encrypted against the public key of the phone manufacturer, and then stored on the phone. This encrypted PIN can be retrieved (for example as a QR code) through a separate law enforcement unlock screen on the phone. Law enforcement with physical access to the phone can take a snapshot of this QR code and send it to the manufacturer with a request to decrypt it (together with the legal documents that authorise this request). If the manufacturer is convinced of the legality of the request, it decrypts the vendor PIN and sends it back to the authorities, who can then enter it in the law enforcement unlock screen to gain access. If a phone is unlocked in this special way, a physical fuse inside the phone is blown to freeze its internal state. Each phone has a different, secret, vendor PIN, so this only provides access to a single phone. Also, the procedure outlined guarantees that the authorities have physical access to the phone. (The Clear proposal even uses WiFi or Bluetooth MAC addresses as a further attempt to prove physical proximity.)
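The following Python sketch captures this flow, under the same caveats as before: the XOR ‘encryption’ stands in for a real public-key scheme (with the private key living in the manufacturer’s hardware security module), and all names are invented for illustration:

```python
import os
import secrets
from typing import Optional

MANUFACTURER_KEY = os.urandom(8)  # stand-in for the manufacturer's key pair

def pk_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Placeholder for real public-key encryption, e.g. RSA-OAEP.
    return bytes(a ^ b for a, b in zip(plaintext, key))

pk_decrypt = pk_encrypt  # the XOR placeholder is its own inverse

class Phone:
    def __init__(self) -> None:
        # A per-device vendor PIN, generated at manufacture; only its
        # encryption is stored, on the phone itself and nowhere else.
        self._vendor_pin = f"{secrets.randbelow(10**8):08d}"
        self.encrypted_vendor_pin = pk_encrypt(self._vendor_pin.encode(), MANUFACTURER_KEY)
        self.fuse_blown = False

    def law_enforcement_screen(self) -> bytes:
        # Displayed as a QR code: photographing it forces physical access.
        return self.encrypted_vendor_pin

    def unlock_with_vendor_pin(self, pin: str) -> bool:
        if secrets.compare_digest(pin, self._vendor_pin):
            self.fuse_blown = True   # freeze the phone's internal state
            return True
        return False

def manufacturer_decrypt(blob: bytes, warrant_valid: bool) -> Optional[str]:
    # Done inside the manufacturer's HSM, and only after the legal
    # paperwork accompanying the request has been checked.
    return pk_decrypt(blob, MANUFACTURER_KEY).decode() if warrant_valid else None
```

The vendor PIN thus exists in the clear only inside the phone itself and, briefly, at the manufacturer while handling a lawful request.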

(It is worth noting that I sketched a similar idea, forcing law enforcement to have physical access to a device for which they want to access stored data, a few years before the Clear proposal was published.)

Analysis

On the face of it, Clear looks like yet another key-escrow proposal. But it is subtly different in that the law enforcement access key (in the case of Clear the vendor PIN) is not stored remotely on some government or manufacturer server, but stored only on the phone, accessible only given physical access to it.

Modern smartphones come with a remote software update feature that allows manufacturers to update the operating system of the phone. For security reasons, the remote update feature requires the new software to be signed by the manufacturer. Phones verify this signature using the public key of the manufacturer. Essentially, this means that anybody with access to the software signing key can create a malicious software update that, once installed, would offer access to all encrypted data on the phone (once the user unlocks it at least once: in the worst case, for the attacker, the device storage encryption key is kept in trusted hardware that even the operating system itself cannot access). Given this, storing an encrypted access key against yet another manufacturer public key should not significantly alter the risk, provided that the private decryption key is as closely guarded (in a hardware security module) as the software signing key is: compromise of either gives access to the data on all phones.
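For comparison, here is what the signed-update check amounts to, sketched with the pyca/cryptography package (the key names and the choice of Ed25519 are illustrative; vendors use their own schemes):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()   # guarded in the manufacturer's HSM
verify_key = signing_key.public_key()        # baked into every phone

firmware = b"...new operating system image..."
signature = signing_key.sign(firmware)       # done once per release

def phone_installs(image: bytes, sig: bytes) -> bool:
    # A phone only installs an update carrying a valid manufacturer
    # signature; whoever holds the signing key can therefore push code to
    # every phone, which is why its compromise would be just as
    # catastrophic as a leak of Clear's vendor PIN decryption key.
    try:
        verify_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False
```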

Now the Clear proposal has received strong criticism. But this criticism isn’t always entirely fair. For example, the Clear proposal does not require a manufacturer to keep a huge vault of billions of device keys secret: it only needs to keep a single private decryption key secret (the one used to decrypt the encrypted vendor PINs), just as the software update mechanism requires the manufacturer to keep a single software signing key secret. The main difference is the number of times the key is used: once every few months in the case of software signing keys, versus dozens of times a day in the case of the decryption key. Yes, this increases the risk, but does it really increase it by a significant factor? I am not convinced.

As it stands, Clear doesn’t allow revocation of the decryption key. But just as a software update can revoke old signing keys, Clear can be altered to re-encrypt the vendor PIN against a new encryption key with every software update (see the sketch below). Even stronger mechanisms can be envisioned to guarantee that physical access to the device is required to decrypt the data on the phone, for example by ensuring that access to the encrypted vendor PIN is only possible through a separate hardware connector inside the phone, and that obtaining the encrypted device key requires a separate access request command signed by the manufacturer.
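Continuing the hypothetical Phone sketch from above, one way to implement this rotation is to have each signed update generate a fresh vendor PIN under the new manufacturer key:

```python
def rotate_vendor_pin(phone: Phone, new_manufacturer_key: bytes) -> None:
    # Run while installing a signed software update: the phone generates a
    # brand new vendor PIN and encrypts it under the new manufacturer key,
    # so a leak of the old decryption key no longer opens this phone.
    phone._vendor_pin = f"{secrets.randbelow(10**8):08d}"
    phone.encrypted_vendor_pin = pk_encrypt(phone._vendor_pin.encode(), new_manufacturer_key)
```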

Anyway, the point here is not to argue that this method of offering law enforcement access is 100% foolproof. Rather, it offers a glimpse of an approach that might help re-calibrate the balance between surveillance and privacy, by again subjecting law enforcement’s surveillance capabilities to laws of nature: in order to get access to the data, they need physical access to the device. (And this still holds even if the vendor PIN decryption key were to leak.)

Generalising: towards a Laws of Nature doctrine

The Clear proposal shows us a glimpse of how a physical law (you need physical access to open a box) can be used to place an inherent limit on the surveillance powers of law enforcement. Can this be generalised? Are there other examples where similar laws of nature have been put to good use? In fact, there are.

Hardware-based approaches, for example, are commonly used to improve security. The whole concept of two-factor authentication (using a phone or a physical token for authentication) is based on the ‘something you have’ principle: without having something in your physical possession, you cannot gain access to an account. Such approaches increase security, as they require an attacker to somehow gain physical access to your phone or token (or trick you into using it yourself). Similarly, secure elements and hardware security modules are used to store secret information that can only be accessed or used given physical access (and they often deploy tamper-resistant measures to prevent the secret information from being extracted).

Another interesting approach (even though I am not a fan of Bitcoin, or the blockchain in general) is proof-of-work: a technique that requires an entity to provably carry out a certain amount of work before getting access. In the case of Bitcoin, proof-of-work is used to ensure that a random participant in the network is given the power to add a block to the blockchain, namely the first one that manages to provably solve a cryptographic puzzle that takes, on average, ten minutes to solve on current hardware. (Note that any real-world application of this technique should first seriously consider its impact on the climate, given its huge energy cost, before actual deployment.)
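As an illustration, here is a toy hashcash-style puzzle in Python, of the kind Bitcoin builds on: finding the nonce takes on the order of 2^difficulty hash evaluations, while checking it takes just one:

```python
import hashlib
from itertools import count

def solve(data: bytes, difficulty: int) -> int:
    # Search for a nonce such that SHA-256(data || nonce), read as a number,
    # falls below a target; expected work is about 2**difficulty hashes.
    target = 1 << (256 - difficulty)
    for nonce in count():
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(data: bytes, nonce: int, difficulty: int) -> bool:
    # Verification is a single hash: the work is provable yet cheap to check.
    digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty))

nonce = solve(b"block header", 20)        # roughly a million hashes of work
assert verify(b"block header", nonce, 20)
```

Bitcoin continuously adjusts the difficulty so that the network as a whole needs about ten minutes per block; the work itself is the law of nature doing the rate limiting.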

We see that the idea of incorporating laws of nature into the design of a system, to add intentional friction, has been used before, without necessarily making explicit reference to the concept itself. This approach is an example of the ‘code as law’ principle¹ (code regulates and limits how software works and what it can and cannot do), a principle that also underlies the privacy by design paradigm: if we, by design, through code, protect privacy within the software itself, it cannot be violated. And this level of protection is much stronger than any protection that could be offered by laws and norms alone.

As Emile van Bergen succinctly put it on LinkedIn:

Only physical, not legal laws can protect against mission creep and mitigate the social problems inherent in the frictionless scaling found in the digital domain.

So we can ask ourselves whether it would be useful to develop a more general ‘laws of nature doctrine’, defined as follows:

Incorporate physical laws in the design of digital systems and services to curb their otherwise frictionless scaling, to counteract the societal problems this would otherwise cause.

I intend to explore this in future blog posts.


  1. Or perhaps it could even be framed as a separate ‘physics as law’ principle.
