Who should have access to the data on our devices?

March 16, 2016

The recent order of a US court for Apple to comply with the FBI's request for technical assistance in recovering data on an iPhone 5C used by a terrorist has sparked a huge debate.

And rightly so. This is an important case, not so much because of its particulars, but because of the broader issues that are at stake. Unfortunately, the debate still centers on the particulars of the case and is not really broadening out to this wider perspective.

Smart phones are very intimate devices. They know a lot about us, and store a lot of sensitive, personal data about us. But with the advent of the Internet of Things we will be surrounded by more and more similarly intimate devices. Think of smart thermostats (that know when you are at home), smart TVs (that know what you watch), and self-driving cars (that know where you are going and what you are doing in your car). All these devices collect and store personal data about you. Moreover, they are networked and can typically be updated remotely.

Therefore the way this particular case is decided will have a bearing on how (if at all) government access to the data on all these other devices will be mandated in similar (and maybe not so similar) cases in the future. There is a significant risk that this sets a precedent that poses a real threat to privacy and significantly weakens the security of the Internet at large.

On the other hand it can be argued that requiring government access to data on specific devices in the custody of law enforcement officials is what we have argued for all along. As a response to the Snowden revelations, among others, many of us have said we wanted targeted investigation instead of dragnet surveillance. If this is not what we want either, then what do we want?

The central question then is the following. Under which circumstances, if any, should government have access to the (personal) data stored on our devices? And if such access is deemed warranted, how should that access be provided?

I believe this question should not be answered by a single judge who happens to be involved in this court case. It should be answered through a more fundamental political and societal debate, informed by sound technical, legal and ethical insights. This debate should deliver guiding principles that will also help us decide on similar cases in the future.

But first let us look at the specific case, because it does help identify some important aspects that are relevant to addressing this question. (Note: what I write below about Apple and iPhones concerning the case at hand applies equally to Android and other devices.)

iOS security

Data on an iPhone is always stored encrypted with a phone-specific key that not even Apple itself knows. When the phone is locked, this key is destroyed, making all data on a locked phone inaccessible. When unlocking the phone, the key is recreated using some secret data stored on the phone and the passcode entered on the phone (or the data from the fingerprint sensor). The phone only allows a limited number of attempts to unlock it (and even imposes a delay after a few attempts). If you try too many times, the phone remains locked forever. This test is, unfortunately, not implemented in hardware and can therefore be bypassed with a software update.
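To make this concrete, here is a minimal, purely illustrative sketch (in Python, not Apple's actual code) of the idea: the passcode is stretched together with a device-unique secret into the data-encryption key, and the try counter is enforced in ordinary software.

```python
import hashlib
import os

# Purely illustrative: a per-device secret (in reality fused into the hardware
# and not known to Apple) is mixed with the user's passcode, so neither alone
# is enough to derive the data-encryption key.
DEVICE_SECRET = os.urandom(32)

MAX_TRIES = 10   # the software-enforced attempt limit
tries = 0

def derive_key(passcode: str) -> bytes:
    """Stretch passcode + device secret into a 256-bit key (illustrative KDF)."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET, 100_000)

def try_unlock(passcode: str, correct_key: bytes) -> bool:
    """Software-only try counter: a modified OS could simply omit this check."""
    global tries
    if tries >= MAX_TRIES:
        raise RuntimeError("Too many attempts: device stays locked")
    tries += 1
    return derive_key(passcode) == correct_key
```

Because the counter lives in replaceable software rather than in hardware, an update that leaves out the check removes the limit entirely.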

What the FBI is requesting

The FBI possesses a locked iPhone that belonged to one of the San Bernardino terrorists. It believes the phone contains important information relevant to the ongoing investigation into the attack. (Let us, for argument's sake, assume this is true.)

The FBI is requesting that Apple provide it with a software update that bypasses the passcode try-counter test and that does not slow down the passcode test after a few wrong attempts (as the 'normal' version of iOS does). The update should allow the FBI to submit passcode tries electronically to the iPhone in its possession, and should not modify iOS or the data on the phone in any other way. The FBI itself should be given this 'backdoored' software update to upload to the iPhone in its possession. The update should be tailored in such a way that it only installs and runs on this particular iPhone.

It is unclear whether this last requirement, that the update only works on a specific device, can be met by Apple. (This blog contains very useful background information on the exact nature and consequences of the FBI's request.)
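To see why electronic submission without delays matters: a six-digit numeric passcode has only a million possibilities, so once the try limit and the delays are gone, exhaustive search is trivial. A hypothetical sketch, where `submit_guess` stands in for whatever interface the requested iOS build would expose:

```python
from itertools import product

def brute_force(submit_guess, digits: int = 6):
    """Exhaustively try every numeric passcode of the given length.

    `submit_guess` is a stand-in for the electronic guess interface the FBI
    is asking for; it returns True when the passcode unlocks the phone.
    Without the try counter and the delays, only the speed of the phone's
    key-derivation hardware limits how fast this loop runs.
    """
    for combo in product("0123456789", repeat=digits):
        guess = "".join(combo)
        if submit_guess(guess):
            return guess
    return None
```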

The update process: a golden key in disguise

Apple could have made the update process more secure, for example by requiring that updates on locked phones always wipe all user data. This makes data on locked phones secure. For normal users this is not desirable, however. Unless they made an encrypted backup (through iTunes, not iCloud) they will lose all their data if the device for some reason needs to be updated when bricked. A reasonable compromise would be to offer the option to disable such user-data wiping when updating the phone. Of course this setting should only be changeable when the device is unlocked.
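A small sketch of what such an update policy could look like (hypothetical, just to illustrate the compromise; the `phone` interface is invented):

```python
def set_wipe_on_locked_update(phone, enabled: bool):
    """The opt-out can only be changed while the phone is unlocked."""
    if phone.is_locked():
        raise PermissionError("Setting can only be changed on an unlocked phone")
    phone.settings["wipe_on_locked_update"] = enabled

def install_update(phone, update):
    """Hypothetical policy: updating a locked phone erases user data first,
    unless the owner explicitly disabled that while the phone was unlocked."""
    if phone.is_locked() and phone.settings.get("wipe_on_locked_update", True):
        phone.erase_user_data()
    phone.apply(update)
```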

Unlocked phones are of course still insecure. Users will typically accept any update offered to them. And they have no way to verify what is inside the update anyway. Unless we move to a radically different update system, where updates are not offered by Apple directly but are built from trusted sources using a reliable build process, this vulnerability remains. For unlocked phones, Apple can always access (or always be forced to access) all user data on the phone by pushing an update that collects this data. It is worth stressing that this vulnerability is inherent to the current update process.

Note, by the way, that this whole idea of automatic updates has dramatically increased the security of smart phones and PCs alike. Any bugs are patched with the next update soon after their discovery. This no longer depends on explicit actions by individual owners. As a result the number of unpatched and vulnerable devices is much lower. So moving away from automatic updates has (other) security consequences.

The difference between stored data and communications data

As I already wrote in my critique of the Keys Under Doormats report, there is a big difference between stored data and communications data, in terms of the way in which government access can be provided.

In particular, it appears to be possible to design a protocol to access data on devices (like smart phones) in a very restricted way such that the data can only be accessed by the device manufacturer, and only when the device is physically present.

For example, Apple could create a special update of its iOS operating system once, sign it, and securely store it in a box (the 'exfiltrator') next to the box that securely stores the cryptographic keys with which Apple signs all its updates. After that the update is destroyed; it only exists inside that box. The box has an activation mechanism and a USB port. If the box is activated and an iPhone is physically connected to the USB port, the following happens:

  • The special update is installed on this iPhone.
  • When the iPhone is switched on, the special update allows the box to send an unlimited number of passcode attempts to unlock the phone.
  • When the correct passcode is found, the phone will dump all its data to the external hard disc, also connected to the box.

For additional security, the phone could even have some special internal port to connect with the 'exfiltrator', and the update could be signed with a separate key bound to this special port. (I.e. updates signed with this special key are not accepted as normal updates pushed to the phone either over the WiFi channel or the standard Lightning port.)

This assumes the current situation in which

  • an update of a locked phone does not completely reset it, and does not erase (or make unreadable) all user data, and
  • the passcode try counter test is not implemented in hardware.
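Putting these pieces together, a hypothetical sketch of the 'exfiltrator' flow (all names and interfaces are invented for illustration; this is not an existing Apple mechanism):

```python
class Exfiltrator:
    """Sealed box holding the single signed 'special' update.

    The update image never leaves the box; a phone has to be physically
    connected (modelled here as the `phone` argument) before anything happens.
    """

    def __init__(self, special_update, external_disk):
        self._update = special_update   # exists only inside this box
        self._disk = external_disk

    def recover(self, phone, passcode_candidates):
        # Step 1: install the special update over the dedicated port.
        phone.install_special_update(self._update)

        # Step 2: submit an unlimited number of passcode guesses electronically.
        for guess in passcode_candidates:
            if phone.try_passcode(guess):
                # Step 3: dump all user data to the attached external disk.
                self._disk.write(phone.dump_user_data())
                return True
        return False
```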

Apple may choose to tighten its security and change the default to erase all data when updating a locked phone. However, it can always reverse this decision (when forced to do so) with a later version of iOS that all users will happily install on their unlocked phones.

Increasing smart phone security even further

In newer iPhone models, the secret data and the passcode verification used to derive the phone-specific encryption/decryption key are all handled in a special piece of secure hardware that is very hard to tamper with. As a result it is quite difficult to extract the secret data, so the FBI cannot try all possible passcodes offline to create a key and test whether it decrypts the data on the phone. Instead the FBI has to give its guess of the passcode to this special piece of hardware and see if it unlocks the phone.

This would have been very secure if the mechanism to limit the number of passcode tries had been implemented in the same piece of hardware. It appears, however, that this test is implemented in software. As a result, a software update can bypass this test and allow the FBI to send an unlimited number of passcode guesses to the hardware until it finds the correct one.

Apple could fix this hole in newer iPhone models to ensure that the whole passcode test, including the functionality to limit the number of tries, is done within the special piece of secure hardware. Then, no software update will allow anyone an unlimited number of passcode tries. Attempts to recover any data from a locked phone will then require more sophisticated hardware attacks.
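In terms of the earlier sketch, the essential change is that the counter moves inside the tamper-resistant hardware, out of reach of any iOS update (again a toy model, not Apple's actual design):

```python
class SecureEnclave:
    """Toy model of secure hardware that enforces the try limit itself.

    The counter and the comparison live inside the tamper-resistant element,
    so replacing or patching the operating system cannot reset or skip them.
    """

    MAX_TRIES = 10

    def __init__(self, derive_key, correct_key: bytes):
        self._derive_key = derive_key   # entangles the passcode with the device secret
        self._correct_key = correct_key
        self._tries = 0

    def verify(self, passcode: str) -> bool:
        if self._tries >= self.MAX_TRIES:
            raise RuntimeError("Enclave locked: no further attempts accepted")
        self._tries += 1
        return self._derive_key(passcode) == self._correct_key
```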

Unlike a software change, Apple cannot be forced to undo this change for models already sold. It can only change the hardware design for future models.

What does all this mean?

The above setup is much more restrictive than what the FBI is requesting. There is only one box, in Apple's possession and guarded as vigilantly as its signing keys, that can access user data on an iPhone (and only if the passcode is easy enough to guess), and only if that iPhone is physically connected to the box. The security risk associated with this setup is no larger than the security risk introduced by an automatic update mechanism. (And we have already argued that such a mechanism has huge security benefits.)

All in all I think that the above setup is a good example of targeted surveillance, and a reasonable compromise between our right to privacy and the need for law enforcement to get access to personal data in exceptional cases. (It is not without problems, however; see below.)

However, Apple can easily increase the security of its phones by changing the update mechanism to wipe locked phones and by fixing the passcode try-counter test. In that case, the above setup would no longer work: data on locked phones becomes much harder to access, and would require hardware-based attacks. Data on unlocked phones remains as vulnerable as ever. This leaves us in the bizarre situation that, when the phone is designed properly, Apple can convincingly claim to be unable to recover any data from a locked phone, while for unlocked phones Apple can always obtain access to the data if it wants to (or is forced to).

So the real question is the following: is it reasonable to force Apple not to fix this security vulnerability? In other words: do we, as a society, think there should always be some kind of special mechanism (a backdoor, a front door, a golden key) that provides government access to the data stored on our devices? If so: what should that access mechanism look like, under which circumstances should that access be provided, who decides whether access is given, and what additional security risks does this mechanism introduce?

For example, instead of requiring the FBI to try all possible passcodes, Apple could escrow the device encryption/decryption key under a key Apple itself possesses. The secure hardware responsible for protecting this device key could be instructed to release it, but only when connected to the 'exfiltrator' sketched above. (Note again that we are talking about a very restricted scenario in which Apple can get access to data on locked phones, while we have already established that Apple can always get the data on unlocked phones if it so desires, even remotely.)
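A sketch of what such an escrow variant could look like: the device key is stored wrapped under a key only Apple holds, and the secure hardware releases the wrapped blob exclusively over the dedicated 'exfiltrator' connection (all names are hypothetical, not an existing mechanism):

```python
class EscrowingEnclave:
    """Toy model of key escrow inside the secure hardware.

    The device key is stored already encrypted ('wrapped') under an escrow key
    that only Apple possesses, so the blob by itself is useless. The enclave
    hands out the blob solely over the dedicated exfiltrator port, never over
    WiFi, Lightning, or a normal software update.
    """

    def __init__(self, wrapped_device_key: bytes):
        self._blob = wrapped_device_key

    def release_escrow_blob(self, via_exfiltrator_port: bool) -> bytes:
        if not via_exfiltrator_port:
            raise PermissionError("Escrow blob is only released to the exfiltrator")
        return self._blob   # Apple still needs its escrow key to unwrap it
```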

All this is not without issues that need to be taken into account.

Apple decides whether it will comply with a request by the FBI to recover the data on an iPhone the FBI hands over to it. This is similar to how Apple (and any other company) currently cooperates with law enforcement requests for user data, e.g. data stored on iCloud. And similar schemes also exist to assist law enforcement in placing wiretaps.

This reliance on a single commercial entity to balance government access requests against our right to privacy is a concern. Government, especially the government of the country in which the company is based or where it has a large presence, is in a position to exert strong pressure to comply with a request for access. This could be mitigated somewhat by distributing the mechanism to activate the 'exfiltrator' over several geographically separate entities.

All this puts an extra burden on Apple (and all other manufacturers of similar devices), especially if a special internal port needs to be provided in all future iPhone models. But this is also not unheard of in other domains. Wiretapping and data retention requirements impose significant costs on telecommunication providers, because they need to adjust their infrastructure. Note, however, that those laws apply to a small number of large (national) telecom operators, which only need to comply with governmental access requests from their 'own' country. An important argument against the approach outlined above is that it imposes a large burden on companies like Apple to handle requests for access from all over the world, and to handle them all in a fair way.

Another concern is that this creates a precedent for other countries, think China or Syria, to press for similar arrangements with Apple (and other manufacturers). Some people I spoke with recently told me they believe such arrangements are in place already, and that they believe even the NSA has some kind of arrangement with Apple to obtain access. In any case we have to realise that any move towards targeted surveillance, or targeted investigation, opens the door for all countries across the globe to aim for similarly targeted (or less targeted) access. (And I tend to side with the cynics in thinking that this type of access is already happening anyway, but in secret. I'd rather have it in the open, through accountable mechanisms.)

This is an important political and societal debate, that I hope will deliver guiding principles that will help us decide on similar cases in the future.
