Apple has taken a first step. Now it needs to pursue its privacy pledge.

September 22, 2014

Apple recently updated its privacy policy, making clear that, unlike other companies (Hi, Google, Facebook, ...!), in many cases it cannot or does not access your personal data. At the same time, in an open letter, its CEO Tim Cook states:

Our business model is very straightforward: We sell great products. We don’t build a profile based on your email content or web browsing habits to sell to advertisers. We don’t “monetize” the information you store on your iPhone or in iCloud. And we don’t read your email or your messages to get information to market to you. Our software and services are designed to make our devices better. Plain and simple.

Even though I believe this is a very significant step that improves the privacy of ordinary users worldwide, it is only the first step on a long road towards properly protecting our privacy.

From a technical perspective not a lot has changed (apart from fixing several important bugs). Unlike Google's Android, iOS has encrypted local data by default for some time now. The change is that since iOS 8 Apple can no longer bypass your passcode to access your personal data (such as photos, messages, contacts and call history). For a device running an earlier iOS version, law enforcement can still get access to some of that data by sending the device to Apple to have it partially dumped. (Interestingly, the link describing that process now points to the aforementioned page stating that Apple can no longer do this...)
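
To make this concrete: the underlying mechanism is Apple's Data Protection API, which lets an app store files under keys derived from the user's passcode (and the device's hardware key). The sketch below is not Apple sample code, and the helper function is hypothetical; it only illustrates how an app opts into the strongest protection class, so the file is inaccessible while the device is locked.

```swift
import Foundation

// Hypothetical helper, not Apple sample code: store a note under the
// "complete" Data Protection class. The file key is wrapped with a key
// derived from the device passcode (and the hardware UID), so the file
// cannot be read while the device is locked.
func saveNote(_ text: String, named name: String) throws {
    let documents = try FileManager.default.url(for: .documentDirectory,
                                                in: .userDomainMask,
                                                appropriateFor: nil,
                                                create: true)
    let url = documents.appendingPathComponent(name)
    try Data(text.utf8).write(to: url, options: [.atomic, .completeFileProtection])
}
```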

As pointed out by others, this by itself offers only limited protection. In general mobile device security is a very hard problem, and there are so many attack vectors (the proprietary baseband processor used for cellular communication, the main hardware, all software running on the device) that we will not see this solved any time soon.

But there are quite a few other things Apple can and needs to do to bring privacy to an adequate level for users of Apple products.

First of all, the default iOS settings are not privacy-friendly yet. For example, Apple insists on switching Bluetooth back on after each iOS update, even if you had switched it off. Similarly, when you allow location-based services, iOS enables all system services to track your location.

Secondly, privacy guarantees should be extended to the apps allowed to run on the device. Apple is in a unique position to do so, as it tightly controls which apps are allowed into its App Store. But despite Apple's claims to enforce this, there are iOS apps that violate your privacy in several ways. A very recent and very bad example is Facebook Messenger. According to iOS security researcher Jonathan Zdziarski, this app does more monitoring of user behaviour than 'professional' surveillance apps do. It even has access to undocumented iOS system calls. Given Apple's pledge to care for our privacy, it is unbelievable that such an app gets approved for the App Store. Such excesses need to be fixed.

It's good to see that Apple continues to publish technical information about its security (as it started doing in February 2014). But this is not enough. The published documents lack the detail (like exact protocol flows and specifics of the cryptographic implementations) needed for any thorough independent evaluation. Such an evaluation would be a first step towards confirming Apple's claims about increased security and privacy protection. Right now all we have to go on are slick presentations and high-level design documents. It's a start, and could be used to do some protocol reverse engineering, but I'd rather see Apple throw us some real bones to chew on. Ideally Apple would publish the source code (like even Microsoft has been doing for some of its products). This would also make it possible to verify, at least partially, whether there are backdoors in the system. That is, not only verify that Apple does what it claims, but also verify that it doesn't do 'more'.

Apple has designed some of its services, like iMessage, in such a way that there is no central component with access to the content of the information exchanged. All information is encrypted to guarantee end-to-end security. A residual risk remains in that we need to trust Apple to give us the correct keys for the intended recipient. More importantly though, Apple does not protect our metadata. Its central servers still see who is sending a message to whom, or who is connecting to whom over FaceTime. Such metadata is also very privacy-sensitive: it allows people to infer all kinds of information about you. Moreover, as Michael Hayden (former CIA/NSA director) said: "We kill people based on metadata".
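
To illustrate both points, here is a minimal sketch (using modern CryptoKit primitives, not Apple's actual iMessage protocol) of end-to-end encryption via a key directory: the server never sees plaintext, but it is the one handing out public keys (which we must trust it to do honestly), and it still learns who talks to whom and when.

```swift
import Foundation
import CryptoKit

// Sketch only, not Apple's iMessage protocol. Bob's public key would be
// fetched from Apple's key directory; his private key never leaves his device.
let bobPrivateKey = Curve25519.KeyAgreement.PrivateKey()
let bobPublicKey = bobPrivateKey.publicKey          // what the directory hands to Alice

let alicePrivateKey = Curve25519.KeyAgreement.PrivateKey()

do {
    // Alice derives a message key only she and Bob can compute...
    let sharedSecret = try alicePrivateKey.sharedSecretFromKeyAgreement(with: bobPublicKey)
    let messageKey = sharedSecret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                          salt: Data(),
                                                          sharedInfo: Data("message".utf8),
                                                          outputByteCount: 32)
    // ...and the server only ever relays this ciphertext.
    let ciphertext = try AES.GCM.seal(Data("See you at 8".utf8), using: messageKey)

    // But the routing envelope (from Alice, to Bob, at this time) remains visible
    // to the server: exactly the metadata that end-to-end encryption does not hide.
    print("ciphertext bytes:", ciphertext.combined!.count)
} catch {
    print("crypto error:", error)
}
```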

Finally, there is iCloud. Like most cloud services, it is not secure. The information Apple provides about iCloud security is misleading: it claims that for many of the iOS services the data is encrypted both in transit and at rest (when stored). Although technically speaking this is true, Apple does not state very explicitly that separate keys are involved: the data Apple receives is first decrypted, and then re-encrypted with keys Apple itself holds, before it is stored. This means Apple can decrypt the data you store in the cloud (for example when confronted with a legal obligation to do so) without your cooperation or even knowledge. This is easily fixed (although it is very hard to hide all metadata, like when files are accessed, and by whom).
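
The fix is client-side encryption: encrypt on the device with a key Apple never sees, and upload only ciphertext. A minimal sketch (hypothetical function names, not Apple's iCloud design) could look like this:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch, not Apple's iCloud implementation: the key is generated
// on the device (and would be kept in the keychain), so the server stores only
// ciphertext it cannot decrypt, even when legally compelled to hand it over.
let deviceKey = SymmetricKey(size: .bits256)

func encryptForUpload(_ fileData: Data, using key: SymmetricKey) throws -> Data {
    let sealedBox = try AES.GCM.seal(fileData, using: key)
    return sealedBox.combined!        // nonce + ciphertext + authentication tag
}

func decryptAfterDownload(_ blob: Data, using key: SymmetricKey) throws -> Data {
    let sealedBox = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(sealedBox, using: key)
}
```

Even with such a scheme, as noted above, metadata (file sizes, access times, who shares what with whom) remains visible to the server.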

Apple has taken a first step towards taking our privacy seriously, by publicly pledging that our privacy is important to it. This is significant. But the pledge has to be followed up by concrete action to build the necessary privacy safeguards into its products.

Martin, 2014-09-22 08:42:06

Not sure whether what Tim Cook claims is true. The iAd advertising program says: “With a few quick clicks you can define your target audience by specifying targeting criteria such as device, gender, age, location, context, time of day.”

Each specific data point is most likely not unique to a single user, but the combination of them most likely is.
