Google Apple Contact Tracing (GACT): a wolf in sheep's clothes.

April 19, 2020

I wrote a critical piece about the Google Apple Contact Tracing (GACT) platform a few days ago. This resulted in quite some discussion, which brought some more arguments to the fore that I would like to address and clarify here. Here I focus on my area of expertise: privacy. My colleague Tamar Sharon wrote an eloquent article about the (much) broader picture, which we should definitely not lose sight of.

Note 2020-12-15: I wrote an updated and much more detailed analysis based on this blog post, and published this on arXiv.

Quite a few people actually like the GACT platform. According to them, the platform protects privacy, because it uses a decentralised form of contact tracing (where no central server learns who was in close contact with whom). Moreover, if Apple and Google only offer access to the Bluetooth hardware (necessary to do contact tracing in the first place) through this interface, this would prevent apps from doing contact tracing in a centralised way (where the central server does learn which people were in close contact). In this view, the move by Google and Apple saves the whole world from a centralised contact tracing disaster.

I beg to differ, however, for the following reasons (expanding on my previous blog post and clarifying some of my arguments).

How GACT works

First, let's describe in a little more detail how GACT is supposed to work. Note that some of this is a good-faith reconstruction: the published specifications (Google, Apple) are sparse and still in flux. For example, the version I saw earlier mentioned that apps could see the location where a contact took place. This is no longer the case in the current versions of the standard.

Detecting nearby phones

Phones detect proximity of other phones by broadcasting random looking proximity identifiers over Bluetooth while listening for such random looking identifiers of phones nearby. A phone stores all proximity identifiers it receives in a local database, together with an estimate of the distance between the two phones (based on the signal strength). Identifiers received more than 14 days ago are deleted.

These random looking proximity identifiers are generated as follows. Each phone (or other device) generates a random tracing key once. From this tracing key a daily tracing key is derived each day. And from this daily tracing key the proximity identifiers are generated, creating a new identifier whenever the MAC address of the Bluetooth interface changes, which is every 10 to 20 minutes. A cryptographic hash function is used to derive the daily tracing keys and the proximity identifiers. This ensures that given two different proximity identifiers you cannot determine whether they belong to the same device. In particular, you cannot recreate the daily key used to generate them. (This preserves privacy against eavesdroppers.) However, if you know the daily tracing key of a device, you can generate all proximity identifiers the device broadcast that day. This allows the contact tracing to work, even though the proximity identifiers look random and cannot normally be linked to each other.
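
To make this key schedule concrete, here is a minimal Python sketch of the derivation, loosely following the structure of the draft v1 cryptography specification (a one-time tracing key, an HKDF-derived daily tracing key, and HMAC-derived rolling proximity identifiers per 10-minute interval). The exact labels, lengths and parameters are taken from that draft and may have changed in later versions; treat this as an illustration of the structure, not a reference implementation.

```python
import hmac, hashlib, os

def hkdf_sha256(key, info, length=16):
    """Minimal HKDF (RFC 5869) with an empty salt, using SHA-256."""
    prk = hmac.new(b"\x00" * 32, key, hashlib.sha256).digest()          # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                            # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

tracing_key = os.urandom(32)            # generated once per device, never leaves it

def daily_tracing_key(day_number):
    """Daily tracing key for a given day number (days since the Unix epoch)."""
    return hkdf_sha256(tracing_key, b"CT-DTK" + day_number.to_bytes(4, "little"))

def proximity_identifier(dtk, interval):
    """Rolling proximity identifier for a 10-minute interval (0..143) of a day."""
    return hmac.new(dtk, b"CT-RPI" + bytes([interval]), hashlib.sha256).digest()[:16]

# Publishing one daily key reveals exactly that day's 144 identifiers, and nothing more.
dtk = daily_tracing_key(18371)
identifiers = [proximity_identifier(dtk, j) for j in range(144)]
```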

This generating, broadcasting and collecting of proximity identifiers happens as soon as users install an Android or iOS update containing the new standard, presumably after giving explicit consent to this (and assuming users have enabled Bluetooth to begin with). It is important to note here that this is not limited to users that actually have a Covid-19 tracing app installed.

Notifying contacts

Users can only notify or be notified if a Covid-19 tracing app is installed on the device. This app uses the API provided by the GACT platform that offers (controlled) access to the set of daily keys used by a phone, and the database of proximity identifiers it collected from other nearby phones over the last 14 days.

Google and Apple themselves only standardise the API and the scanning phase described above. They (apparently) do not provide a service that coordinates the notification of contacts. This has consequences for how the API works, and hence for what the contact tracing app sees and what the servers used by the contact tracing app see.

Whenever a user tests positive, the daily keys his or her devices used over the last 14 days can be retrieved by the app through the GACT API, presumably only after an authorised request from the health authorities. How this exactly works, and in particular how a health authority gets authorised to sign such a request or generate a valid confirmation code, is not clear (yet).

The assumption is that these keys are submitted to a central server set up by the contact tracing app. Other instances of the same app on other people's phones are supposed to regularly poll this central server to see if new daily keys of phones of recently infected people have been uploaded. Another function in the GACT API allows the app to submit these daily keys to the operating system for analysis. The OS then uses these keys to derive all possible proximity identifiers from them, and compares each of these with the proximity identifiers it has stored in the database of identifiers recently received over Bluetooth. Whenever a match is found, the app is informed, and given the duration and time of contact (where the time may be rounded to daily intervals).
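
Conceptually, the OS-side matching step then looks something like the sketch below (reusing the proximity_identifier function from the previous sketch). The Observation record and the find_exposures name are illustrative, not part of the GACT API: the real API keeps the raw identifiers inside the OS and only hands the app aggregate exposure information.

```python
from collections import namedtuple

# Hypothetical shape of what the phone stores while scanning: the identifier it
# heard, when it heard it, and a signal-strength based distance estimate.
Observation = namedtuple("Observation", ["rpi", "day_number", "interval", "distance"])

def find_exposures(infected_daily_keys, observations):
    """Regenerate every possible identifier from the uploaded daily keys of
    infected users and intersect them with the identifiers this phone heard."""
    heard = {obs.rpi: obs for obs in observations}
    matches = []
    for day_number, dtk in infected_daily_keys:        # (day number, daily tracing key) pairs
        for interval in range(144):                    # 144 ten-minute slots per day
            candidate = proximity_identifier(dtk, interval)
            if candidate in heard:
                matches.append(heard[candidate])
    return matches
```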

The problem with GACT

Contact tracing moves from the app layer down to the OS layer

Google and Apple announced they intend to release the API's in May and build this functionality into the underlying platforms in the months to follow. This means that at some point in time operating system updates (through Google Play Services updates in the case of Android) will contain the new contact tracing code, ensuring that all users of a modern iPhone or Android smartphone will be tracked as soon as they accept the OS update. (Again, to be clear: this happens already even if you decide not to install a contact tracing app!) It is unclear yet how consent is handled, whether there will be OS settings allowing one to switch on or off contact tracing, what the default will be.

Also it is unclear whether switching off Bluetooth will actually affect the contact tracing capability (i.e. whether switching off Bluetooth will switch off all Bluetooth services except the Bluetooth radio itself and the contact tracing functionality). (This is one of the reasons why a physical on-off switch for both WiFi and Bluetooth, that is known to switch off these networks at the hardware level, is desirable.)

As I wrote earlier, this change has serious ramifications:

Instead of an app, the technology is pushed down the stack into the operating system layer creating a Bluetooth-based contact tracing platform. This means the technology is available all the time, for all kinds of applications. Contact tracing is therefore no longer limited in time, or limited in use purely to trace and contain the spread of the COVID-19 virus. This means that two very important safeguards to protect our privacy are thrown out of the window.

Moving contact tracing down the stack fundamentally changes the amount of control users have: you can uninstall a (contact tracing) app, you cannot uninstall the entire OS (although on Android you can in theory disable and even delete Google Play Services).

This (technically) creates a dormant functionality for mass surveillance

But the bigger picture is this: it creates a platform for contact tracing that works all across the globe for most modern smart phones (Android Marshmallow and up, and iOS 13 capable devices) across both OS platforms. Unless appropriate safeguards are in place (including, but not limited to, the design of the system as described above - we will discuss this more below) this would create a global mass-surveillance system that would reliably track who has been in contact with whom, at what time and for how long. (And where, if GPS is used to record the location.) GACT works much more reliably and extensively than any other system based on either GPS or mobile phone location data (based on cell towers) would be able to (under normal conditions). I want to stress this point because some people have responded to this threat saying that this is something companies like Google (using their GPS and WiFi names based location history tool) can already do for years. This is not the case. This type of contact tracing really brings it to another level.

Even though the data collection related to contact tracing starts as soon as you accept the operating system update, real contact tracing only starts to happen when people install a contact tracing app that uses the API to find contacts based on the data phones have already collected. But this assumes that Apple and Google indeed refrain from offering other apps access to the contact tracing platform (through the API), whether under pressure or because of economic incentives, and do not suddenly decide to use the platform themselves. GACT creates a dormant functionality for mass surveillance, that can be turned on with the flip of a virtual switch at Apple or Google HQ.

GACT is leveraged because contact tracing apps are required to use it

Apple and Google's move is significant for another reason: especially on Apple iOS devices, access to the hardware is severely restricted. This is also the case for access to Bluetooth. In fact, without approval from Apple, you cannot use Bluetooth 'in the background' for your app (functionality you need to collect information about nearby phones even if the user's phone is locked). Some people have interpreted GACT as a way for Apple to say: if you want to build a contact tracing app, you need us to give you special access to Bluetooth. We are not going to give you that. Instead, we allow you to use the GACT API. If this is the case, it would prevent downright centralised implementations of contact tracing (like the ones apparently favoured by the European Commission). Seen in this light, the GACT platform strengthens privacy (if we trust Apple and Google).

This means the contact tracing microdata is under Apple/Google control

Seen from another angle however, it creates an enormous leverage for the GACT platform, as it essentially becomes the only way to do contact tracing using smartphones in the first place. With this move, Apple and Google make themselves indispensable, ensuring that this potentially global surveillance technology is forced upon us. And as a consequence all microdata underlying any contact tracing system is stored on the phones they control.

(Note: some people have claimed that Apple would offer other distributed contact tracing solutions (and in particular DP-3T) access to Bluetooth. This has not been confirmed however.)

All in all this means we all have to place massive trust in Apple and Google to properly monitor the use of the GACT API by others, as well as trusting that they will not abuse GACT themselves. They do not necessarily have an impeccable track record that warrants such trust…

Distributed can be made centralised

The discussion in the preceding paragraphs implicitly assumes that the GACT platform truly enforces a decentralised form of contact tracing, and that it prevents contact tracing apps from automatically collecting information on a central server about who was in contact with whom. This assumption is not necessarily valid however (although it can be enforced provided Apple and Google are very strict in the vetting process used to grant apps access to the GACT platform).

In fact, GACT can easily be used to create a centralised form of contact tracing, at least when we limit our discussion to centrally storing information about who has been in contact with an infected person.

The idea is as follows. GACT allows a contact tracing app on a phone to test daily tracing keys of infected users (which the app has to download from a central server) against the proximity identifiers collected by the phone over the last few days. This test is local; this is why GACT is considered decentralised. However, the app could immediately report back the result of this test to the central server, without user intervention (or without the user even noticing). It could even send a user specific identifier along with the result, thus allowing the authorities to immediately contact anybody who has recently been in the proximity of an infected person. This is the hallmark of a centralised solution.
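
As a purely hypothetical illustration (none of the names below come from the GACT specification), the app-side code for such a quietly centralised design could be as simple as the sketch below: the matching still happens locally, as GACT intends, but the outcome, tied to a stable user identifier, is pushed straight to the server.

```python
import requests  # third-party HTTP library, used here for illustration only

SERVER = "https://tracing.example.org"              # hypothetical app backend
USER_ID = "token-issued-when-the-app-registered"    # hypothetical per-user identifier

def local_match(downloaded_daily_keys):
    """Stand-in for the platform call that matches downloaded daily keys against
    the identifiers stored on this phone and returns any exposures found."""
    raise NotImplementedError  # provided by the OS in the real system

def check_and_report(downloaded_daily_keys):
    # The test itself is local and decentralised; silently reporting its result,
    # linked to the user, is what turns the design into a centralised one.
    exposures = local_match(downloaded_daily_keys)
    requests.post(SERVER + "/report", json={"user": USER_ID, "exposed": bool(exposures)})
```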

In other words: the GACT technology itself does not prevent a centralised solution. The only thing preventing it would be Apple and Google being strict in vetting contact tracing apps. They could already do so now, without rolling out their GACT platform. Which raises the question of what the real purpose of this platform is…

Malicious apps can learn which people an infected person has been in contact with

The centralisation mechanism outlined above does not reveal which infected person this particular user has been in contact with. So far however, the GACT API does not seem to put a minimum on the number of keys that the app can test. This means that if the app tests all daily tracing keys one by one, it could keep track of each daily tracing key for which the test was positive, and report these back to the server. As the server knows which daily tracing key belongs to which infected person, this allows the server to know exactly which infected persons the user of this phone has been in contact with. What's worse, the current version of the API allows the app to retrieve the duration of the contact in 5 minute increments, and even when the contact occurred (although that timestamp may have reduced precision, such as within one day of the actual time).
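
A sketch of this per-key probing, again with hypothetical names (local_match is the same stand-in for the platform matching call used in the previous sketch):

```python
def identify_contacts(keys_by_patient, local_match):
    """If the API accepts arbitrarily small batches, an app can test each infected
    person's daily keys separately; a positive result then reveals *which* infected
    people this user has met, plus the duration and (coarse) time of each contact."""
    met = []
    for patient_id, daily_keys in keys_by_patient.items():
        exposures = local_match(daily_keys)            # one matching call per infected person
        if exposures:
            met.append((patient_id, exposures))
    return met                                         # could be reported back to the server
```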

Even if the GACT API would put additional restrictions in place (test at least a minimum number of keys at the same time, and restrict the number of times keys can be tested), the API could still be abused to obtain some additional information that allows the server to link people, for example by performing timing correlation attacks, or by testing sets of keys that are constructed in such a way that they belong to people known to be from different geographical regions.

Malicious apps can also track non-infected persons

The GACT API is supposed to release the daily tracing keys of an infected user only after receiving a one-time confirmation code from the health authorities (https://www.getrevue.co/profile/caseynewton/issues/apple-and-google-answer-our-questions-239830). How this should work is not described yet, however. Regardless, abuse is certainly conceivable.

For example, if the (health) authorities themselves are malicious (which is not an entirely unreasonable assumption in certain countries), then such a confirmation code is useless. In other words, a malicious app could act as if the user is infected (in a way that is unnoticeable to the user), extract the daily tracing keys, and upload them to the server surreptitiously. Together with the trick outlined above, this essentially allows the providers of such a malicious app to map the full social graph (i.e. who has been in contact with whom, when, where and for how long).

Google and Apple can decide to do the same at any time. Again we have to trust them not to do so, or not to be forced to do so.

Function creep

The use of contact tracing functionality as offered through GACT is not limited to controlling just the spread of the COVID-19 virus. As this is not the first coronavirus, it is only a matter of time before a new dangerous virus rears its ugly head. In other words, contact tracing is here to stay.

And with that, the risk of function creep appears: with the technology rolled out and ready to be (re)activated, other uses of contact tracing will at some point in time be considered, and deemed proportionate.

The tracking technology could be embedded within software libraries used in innocent looking apps (or apps that you are more or less required to have installed). Ordinary smartphones containing the GACT technology could be bought by governments and other organisations and installed at fixed locations of interest for monitoring purposes. China could use it to further monitor the Uyghurs. Israel could use it to further monitor Palestinians. You could monitor the visitors of abortion clinics, coffee shops, gay bars, …

Contact tracing also has tremendous commercial value. Facebook used a crude version of contact tracing (using the access it had to WhatsApp address books) to recommend friends on Facebook. The kind of contact tracing offered by GACT (and other Bluetooth based systems) gives a much more detailed, real time, insight into people's social graph and its dynamics. How much more precise could targeted advertising become?

Other concerns

Many of the attacks on DP-3T described by Serge Vaudenay also apply to GACT, like generating false possible infection alerts, and privacy risks related to the fact that the derivation of proximity identifiers from daily tracing keys is public. This allows an adversary to collect proximity identifiers and later correlate them to daily tracing keys of infected people released by the server. In other words, even decentralised contact tracing has privacy risks.
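
For example, because the derivation of proximity identifiers from daily tracing keys is public, anyone who passively logged identifiers together with time and place can later re-derive the identifiers of infected users from the published daily keys and learn where and when they were observed. A sketch, reusing proximity_identifier from the first sketch (names and data shapes are illustrative):

```python
def correlate(published_daily_keys, sniffed_log):
    """Match a passively collected log of (identifier, time, place) tuples against
    identifiers re-derived from the published daily keys of infected users."""
    hits = []
    for day_number, dtk in published_daily_keys:
        derived = {proximity_identifier(dtk, j) for j in range(144)}
        hits.extend(entry for entry in sniffed_log if entry[0] in derived)
    return hits   # each hit reveals where and when an infected person was observed
```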

Concluding remarks

From a purely technological perspective, decentralised contact tracing is preferred over centralised solutions because it better protects our privacy. Unfortunately, embedding it in the operating system, as proposed by Google and Apple, does not make centralised solutions impossible.

Preventing them requires strong oversight by Apple and Google over the apps that want to use the GACT platform for contact tracing. Which makes the whole GACT platform a smokescreen really, as exactly the same strong oversight would be required if the GACT platform were not there and apps requested special access to Bluetooth to implement their own contact tracing technology. It is one thing to agree on a few best practices for contact tracing and force app providers to adhere to those standards. It is quite another to bake that standard into the operating system layer, active in the background, ready to be deployed.

We have to trust Apple and Google to diligently perform this strict vetting of apps, to resist any coercion by governments, and to withstand the temptation of commercial exploitation of the data under their control. Remember: the data is collected by the operating system, whether we have an app installed or not. That is an awful lot of trust…

In case you spot any errors on this page, please notify me!
Or, leave a comment.
Sven Türpe, 2020-04-19 11:16:15

So how should an institutional and legal framework for contact tracing be designed to address your concerns regardless of the used technology?

Hassan, 2020-04-19 13:02:28

Can’t speak for the author, but my personal answer would be “not”. Contact tracing just must not happen, because there is no concept, neither centralised nor decentralised, that both works and respects users’ rights. Just give up on the apps and spend the money on face masks and hospital workers.

jb, 2020-04-19 13:54:03

It should not be there in the first place, period. It goes against the fundamental right to privacy.

Casper, 2020-04-21 08:09:39

GDPR art 9(2)(i) would allow for processing of special categories of personal data “for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health”. The processing must be “necessary”. So, it is not an inconceivable matter to be addressed, and privacy would need to take a step back in certain cases. Such as Covid-19, I would argue. Maybe some would say not, for Covid-19. If so, what other situation would fit under this article? I want to thank the author for a thorough description and understand that there is political pressure for governments and big tech to act, but legally speaking, this kind of data processing is not unfounded. Of course, the question is HOW it is done and what to improve, but the idea is not to be bashed.

Jaap-Henk, 2020-04-21 08:34:24

The idea per se is not to be bashed indeed, but to be challenged for sure. And there are less intrusive ways to implement such an approach.

Anonymous, 2020-04-19 12:29:07

My question is, what is the fundamental difference between the contact tracing data and all other personal data saved on your phone (contacts, SMS)? If we couldn’t trust Apple and Google, why bother using smartphones at all?

Jaap-Henk, 2020-04-19 23:52:37

Indeed, we actually do trust Apple and Google with a lot of our personal data, and there is very little we can do to verify that trust. We can decide to not use the phone for certain things, or not store certain data on it, but that’s about all the real control we have left. For me the difference here is that a global, interoperable (between Google and Apple) system for contact tracing is embedded in all modern smartphones. It adds another capability, that was not present yet.

Martin, 2020-05-30 08:04:41

I do not own or use a smartphone (in fact, I’m not using any mobile phone), because of that and many other reasons.

But if I would like to use one, I would select either PinePhone or Librem 5, which do not run Android or iOS, but some flavour of Debian Linux.

DM, 2020-04-19 12:54:08

Wait until you hear about Apple’s “Find My” feature:

“In upcoming versions of iOS and macOS, the new Find My feature will broadcast Bluetooth signals from Apple devices even when they’re offline, allowing nearby Apple devices to relay their location to the cloud. That should help you locate your stolen laptop even when it’s sleeping in a thief’s bag. And it turns out that Apple’s elaborate encryption scheme is also designed not only to prevent interlopers from identifying or tracking an iDevice from its Bluetooth signal, but also to keep Apple itself from learning device locations, even as it allows you to pinpoint yours.”

If you don’t trust Apple or Google for this, then there’s nothing else you should trust them with, so you should probably stop using them completely.

x, 2020-04-19 13:19:01

https://www.euromomo.eu/ The C virus doesn’t look unusual regarding excess mortality. We don’t need an app. And would it even be based on scientific evidence? Google and Apple might lose trust because of the APIs. Losing trust could lead to privacy friendly Android devices that exclude Google software. New app stores with privacy could appear.

craig, 2020-04-19 14:29:21

“Also it is unclear whether switching off Bluetooth will actually affect the contact tracing capability (i.e. whether switching off Bluetooth will switch off all Bluetooth services except the Bluetooth radio itself and the contact tracing functionality). (This is one of the reasons why a physical on-off switch for both WiFi and Bluetooth, that is known to switch off these networks at the hardware level, is desirable.)”

It’s NOT unclear as turning off the radios is a HARD requirement for Airplane mode. Can’t take you seriously when you spew conspiratorial junk like this.

Jaap-Henk, 2020-04-19 20:18:02

I was not talking about Airplane mode, but Bluetooth only. And remember that in iOS 11, the Bluetooth switch in the control panel confusingly did not turn off the actual Bluetooth radio.

Kim Schulz, 2020-04-19 20:58:52

Actually not… try going to flight mode and then notice that you can easily turn on wifi and bluetooth. They are both considered non-intrusive for airplanes which is also why you can use bluetooth headsets and in-air wifi access on planes.

Igor, 2020-04-19 16:50:41

So, if I patch Android to give me a list of proximity identifiers as Bluetooth picks them up, I can log them together with GPS coordinates. My patched kernel will also log all daily keys of infected people Google’s app downloaded from central servers.

Now I can drive by my town, collecting proximity identifiers from people in their homes. Then I will know exactly who the sick people are, once their phones upload their daily keys to the central servers and my phone downloads them.

Jaap-Henk, 2020-04-19 20:21:25

Nice. Didn’t even consider that possibility.

Which raises another concern: what if the server collecting the daily tracing keys (remember, this server is hosted by some health authority) is poorly secured or does not properly authenticate requests for such keys by the app. Then anybody could get these keys, and the attack you describe could be mounted with any raspberry pi…

Marco, 2020-04-22 17:01:59

Not sure what you mean with that? Requests for diagnosis keys are not meant to be authenticated. They are supposed to be public. Only the posting of new diagnosis keys is to be protected, but it’s outside of the specification how that is to be done. This is something every government or health authority should decide for themselves. And yes, if that is done incorrectly it might allow people to wrongly post infections and possibly scare other people.

Jaap-Henk, 2020-04-22 17:40:11

I sure hope requests for daily (diagnosis) keys are authenticated. There is a kind of ‘hidden assumption’ in the discussion on contact tracing apps that the logic of detecting whether you have been close to someone who turns out to be infected is hidden from the user (in the GACT case this matching is done in the OS layer). If this is not the case, or if you are able to get the diagnosis keys and log all proximity identifiers you see around you yourself (together with the time and location), you can ‘see’ the matching or do the matching yourself, see which proximity identifier matched a diagnosis key, and with the additional information about location and time you might be able to deduce who in fact turns out to be infected.

Marco, 2020-04-23 08:52:32

There is really no need for it to be authenticated. The logic of detecting whether you have been in proximity of an infected person is not hidden, it’s just taken care of by the OS. It could very well be done by the app (it’s documented how it’s done, it’s no secret) but every app would need to implement that, so why not build it into the OS (or better said, a framework / library that is bundled with the OS).

At least on iOS an app would not be able to log identifiers with GPS locations without using the GACT framework. An app will simply not have access to the necessary bluetooth SIG. Of course you could build your own device (e.g. a Raspberry Pi with a bluetooth dongle, assuming it supports that SIG) and log it all with your own code and then match it all up. Yes, that’s why it’s an open specification, to allow you to do that. Remember, this functionality is not limited to iOS or Android.

It’s not very likely that the extra GPS location would help you much in identifying who it was that might be the infected person. If you can already remember who you saw or were close to at that moment, it’s very likely that you already know that person (with or without GPS location).

Gilles Ampt, 2020-04-23 15:04:21

This attack is an example of deanonymizing known reported users as described by Serge Vaudenay/EPFL in the DP3T analysis, https://eprint.iacr.org/2020/399. The attack scenario can be a drive-by as Igor mentioned, or an employer, or a paparazzo, or you name it. Although this type of data collection is illegal according to the GDPR, it can be done with little effort. The authentication of infected key requests alone will not be a sufficient security control here. The uploaded infection keys should be encrypted such that only genuine apps are able to compare infection keys with broadcasted contact keys. The genuine apps should be protected against eavesdropping.

Jaap-Henk, 2020-04-23 16:50:59

I assumed here the sandbox offered by the OS is strong enough to (more or less) ensure that other apps cannot access the data processed by the contact tracing app. And that the keys were downloaded over an encrypted channel.

Ottoman, 2020-04-25 02:58:08

Or wardrive for dissidents with low social credit scores.

Marco, 2020-04-22 16:54:59

No you won’t know WHO is sick, you will only know that you have been in the proximity of somebody who is now sick at a certain GPS location. Given that Bluetooth LE can work up to 100m it’s still uncertain who that is (especially in densely populated areas like towns). It might even be somebody in another car near you.

(, 2020-04-19 18:43:18

After reading the entire article, I understand your concerns with this issue, but I’ve got two things left over. 1) Why does this honestly matter? What are you/individuals doing that you want to hide from this? (I’m assuming nothing, but still.) 2) When making arguments for your points, could you thoroughly explain them? There was a point where you said “users claim Google can already track its users, but this is different from contact tracing” (it’s not exact, my bad). I was waiting for an explanation though, a good reason, but I felt as though I just got a huge analysis of the app’s functionality and how it works versus actual threats. The scenarios you listed are possibilities, but it seems a bit ‘half-baked’ if you will.

It was a great article though, kudos!

Sam, 2020-04-20 04:40:02

Would you let someone install a camera in your bathroom? You’re probably not doing anything that other people aren’t also doing, right? So then, what’s the problem?

Same concept. It’s about your right to privacy. You have a right to privacy in your bathroom, that’s why they have locks. Even public restrooms have stalls. Do some people go in there and do drugs? Yes. Does that mean we should put cameras in there to catch those people? No. It’s about privacy, regardless of what you are or are not doing.

Ottoman, 2020-04-25 03:00:04

Without privacy there is no freedom. Think about that for a while.

Ian Witham, 2020-04-19 18:58:07

Hi

I’m not sure I understand how this works.

“.. could even send a user specific identifier along with the result, this allowing the authorities to immediately contact anybody who has recently been in the proximity of an infected person”

How does the identity of someone who has been near someone infected allow the authorities to identify who the infected person might be?

Jaap-Henk, 2020-04-19 23:39:32

This assumes that you have to register yourself (and thus getting this specific identifier) when installing a contact tracing app. Then the app can send this identifier, both when reporting infection and when reporting close contact.

Kevin, 2020-04-29 10:39:04

The whole point is that you don’t need to register yourself.

The only thing the system should know is that the device sending in infected keys is legitimate. That can be done on several levels and may require identifying the person at that point. Technically this would not even be needed.

There’s no need to report close contact to the server. The device will be notified of infected keys and can draw this conclusion by itself. Then it’s up to the user to take action.

Erik Poll, 2020-04-19 21:17:34

As you point out in the last paragraphs, some (all?) of the issues with GACT that you raise would also arise if the OS provides a more basic API for accessing Bluetooth which would leave it up to the app to decide on a scheme for coming up with proximity identifiers etc. For instance, such functionality might also be abused by malicious apps and might also lead to function creep. It would be nice to factor out these generic problems with any access to Bluetooth by some tracing app, to get a clearer understanding of in what ways GACT is making things better, worse, or essentially the same as a more basic API. (Obviously it is worse in that it makes it impossible to come up with better forms of pseudonymisation than GACT, and better in that it makes it impossible to come up with worse forms. It’s not so clear to me if the uniformity between different apps, say from different countries, introduces additional risks.)

I can see an advantage for Google and Apple in offering the GACT API as opposed to a more bare bones API: vetting apps becomes a lot easier if all apps use the same GACT API. If every app designs its own contact tracing solution it will be a lot harder to vet them. So what you call a “smokescreen” probably saves them a lot of work, or at least makes it easier to claim they have done their due diligence.

Jaap-Henk, 2020-04-19 23:37:12

The problem (as I explain in the blog post) is that by moving the functionality to the OS it is always enabled (even if no app is installed) and global. This goes beyond mere issues with Bluetooth based contact tracing apps in general. I’m not so sure that vetting apps becomes that much easier: the app still has to contain a lot of logic. And as you point out, it becomes impossible to come up with better forms of pseudonymisation - GACT is pretty basic to be honest.

dave, 2020-04-19 22:07:24

There is no reasonable way to make this work, unless we explicitly trust our phone OS platform. I do not. I have to use something on a cell phone network, and have been roped into using iOS or Android. There are no other good choices. This whole “we will track you for your protection” is just another security theatre episode. I am getting to a point where I will stop using a mobile phone. I may be the 1%, but at least I will not be tracked.

Ruud, 2020-04-24 13:49:32

“There is no reasonable way to make this work”, is just an opinion.

Math has ways to do this without hurting privacy, is what I know :).

None, 2020-04-19 23:04:54

Welcome to 1984

Conrad, 2020-04-19 23:37:20

But phones are supposed to be exchanging proximity identifiers, not daily keys.

Arjen Schoneveld, 2020-04-20 21:43:51

The fact that the Contact Tracing becomes part of the Bluetooth stack does not mean it is always enabled. According to the API published by Apple/Google you need to explicitly enable Contact Tracing (CTStateGetRequest).

Edwin, 2020-04-20 23:35:54

>> by moving the functionality to the OS it is always enabled

This seems to be a key part of the argument, but sounds like an assumption that is not based on anything concrete. There is no reason these vendors wouldn’t be able to make this user configurable. Most likely it would be managed the same way as all of the already existing privacy sensitive resources in the platform. Like the mic or camera. It is a bit like claiming that having a mic in a mobile phone platform is undesirable because it is “always enabled”.

Jaap-Henk, 2020-04-21 08:36:50

As I write in the blog post itself: “It is unclear yet how consent is handled, whether there will be OS settings allowing one to switch on or off contact tracing, what the default will be.”

Winfried, 2020-04-25 15:21:24

Consent can be meaningless depending on the app built on top of it. Take the hypothetical case of a government that restricts access to shops, public places, transportation, passing roadblocks, work etc. if you can’t show a ‘green’ status on your app. You will be forced to ‘consent’ to GACT.

The small issue is: this API is designed for an activity that is by definition a violation of fundamental rights. Such a violation can be justified, but the bar for that is high.

The big issue is: Google and Apple are deciding who can make use of GACT, so they have legislative, executive and judiciary power over a fundamental rights issue.

Should we bring the use of this API under democratic control, or the API itself? Should that be regulation or a ban? What are the consequences for other APIs that are an inherent violation of fundamental rights, like Google’s advertisement tracing API?

This is not an issue that can be solved with an opt-in.

Patrick, 2020-04-21 12:24:23

You conclude with “the data is collected by the operating system, whether we have an app installed or not”, implying that the pure collection of contact data is out of the user’s control. At least https://covid19-static.cdn-apple.com/applications/covid19/current/static/contact-tracing/pdf/ContactTracing-FrameworkDocumentation.pdf implies otherwise though.

I agree that a non-user-controllable collection of contacts would indeed raise serious concerns, but that’s valid for a lot of potential privacy issues. So is there any indication so far that this indeed will happen automatically?

Jaap-Henk, 2020-04-21 13:27:03

The documents imply otherwise, but do not make specific what kind of control will be offered. Will there be a switch that turns both broadcasting identifiers and collecting such identifiers broadcast by others off at the same time? Will there be two independent switches? What will be the default setting?

In any case the control is opaque and counterintuitive, as the collection may start before any app is installed, and may continue after the app is deinstalled.

Marco, 2020-04-23 08:36:02

The GACT Bluetooth specification says “Users decide whether to contribute to contact tracing”. How exactly that is implemented is up to Apple or Google to decide how they want to implement that in their OS. It’s not up to a specification to detail how that is to be implemented in a UI.

My assumption (and I’m allowed one as you make dozens) for iOS will be that Apple will add a per app switch that enables access to contact tracing (after user consent). Only if at least one app has access to contact tracing is the tracing enabled. And the default for any app will of course be off, as it is also for e.g. GPS location or bluetooth access.

Herbert Rutgers (PA0SU), 2020-04-21 15:59:49

‘…. together with an estimate of the distance between the two phones (based on the signal strength). Identifiers…’

This is absolute nonsense. There is NO relation at all between the signal strength and the distance between a transmitter and a receiver if the environment is not well defined, as it could be on a lake (without wind) or in a desert.

Jaap-Henk, 2020-04-21 19:24:03

Interesting… This makes it even more unlikely that a contact tracing app is going to work reliably.

Marco, 2020-04-23 13:23:11

There will be a lot of identifiers received that will be of no interest. Signal strength is a good measure as only the very strong receptions are of interest. To be of potential harm to someone you must be pretty close and therefore the signal strength must be very strong. It doesn’t matter if I also receive a signal from someone on another boat on a lake. The signal strength is never gonna be as strong unless that person is within a couple of meters. But still, yes, there is always the potential of false positives, but by filtering on only very strong signal strengths I think it will still be pretty good.

Petros, 2020-04-23 13:20:12

Hi, Thanks for this blogpost. One q from a non-CS reader: Assuming that contact-tracing does NOT move down to the OS layer + that a government wishes to launch a contact-tracing app; in such a case, what other alternatives would there be other than using the GACT API? What I am trying to figure out is the quality and the degree of the leverage created by the additional decision of moving the functionality down to the OS layer. Hope that makes sense.

Jaap-Henk, 2020-04-23 13:31:00

Many proposals for contact tracing have appeared recently: DP3T, PACT (MIT), and ROBERT (INRIA), some of which are more advanced than the GACT proposal. The main ‘leverage’ purely from the contact tracing perspective is the fact that by moving it to the OS, there is a much better chance of making this interoperable across countries.

Petros, 2020-04-23 14:34:22

Thanks. As it stands, if a government chooses DP-3T or ROBERT, or PACT over GACT it will still need Apple’s approval for enabling bluetooth in the background. Will this change after the functionality moves into the OS?

Jaap-Henk, 2020-04-23 16:48:11

Apple would still need to approve the app, and approve its use of the GACT API. Approval to use Bluetooth would no longer be required as the app no longer needs to access Bluetooth directly.

Marco, 2020-04-24 10:47:19

There is now also a Pan-European Privacy-Preserving Proximity Tracing initiative that is mostly driven by “experts” from Germany, France and Italy. See: https://github.com/pepp-pt/pepp-pt-documentation

Well Jaap-Henk, it’s very easy to bash this as this has centralized mass surveillance written all over it. All collected IDs are sent to a central server and all processing is done by the server, which then sends a notification to the users. No really, I still prefer the GACT specification if correctly implemented as documented.

Anne van Rossum, 2020-04-27 16:13:59

I actually don’t think function creep or any of your other concerns are the most important from a privacy perspective. I think the most important phenomenon is privacy-erosion creep. People care less and less about privacy because being pro-privacy is framed as being pro-virus, pro-terrorist, or pro anything that is bad. I’m not so much concerned about current actors in my part of town; the government is responsible, the companies will be fined if they go too far. However, what is currently too far seems to shift. “They are tracking us anyway, why do you care?” It is becoming normal that companies use you, that they know things about you they shouldn’t know, that they can manipulate you or even elections. It is the new normal.

The problem is, legislation will follow. It will not be on the side of the civilian. Or it will actually be… if all those civilians don’t care. It will be against the law to withhold information.

There will be a lot of organizations profiting from this move towards a nonprivate world. I don’t think those will be the most exemplary organizations. If we leave our children a world, it will be a world where they don’t care about their privacy, don’t care about their individuality, don’t care about their human rights.

Ralf Rottmann, 2020-05-03 20:01:14

I disagree. The current iOS beta has an opt-out switch in Settings, which turns off contact tracing completely. Once you decide to opt out, iOS won’t collect any tokens. Nor will it broadcast any. This can easily be proven with a nearby BLE sniffer. (Or just another iPhone with one of the BLE sniffing apps.) –– Now, one could argue that Apple could provide an OS update, which just ignores the switch completely and tracks you even against your explicitly denied consent. The thing is: Once you open up the possibility for Apple (and Google) to bluntly betray their users like this, it doesn’t take the newly introduced contact tracing features to be scared. You’d have to drop their platforms entirely. There’s one universal truth: At some stage, you’ve got to trust operating systems vendors to a certain degree, since they ultimately control the platform. So, while I appreciate your article, I disagree with your interpretation. Nobody creates a dormant functionality for mass surveillance. When we all decided to take always-online handheld computers with us to wherever we go, we made an educated decision for potentially being mass surveilled. No news here.

Davey, 2020-05-11 09:05:46

Thank you for this excellent article, some at least are paying attention.

Anyone still in the “I’ve done nothing wrong, I have nothing to hide” hive mind should spend a moment - perhaps longer - reading about IBM Dehomag

https://en.wikipedia.org/wiki/Dehomag

Esther Hoorn, 2020-05-11 12:22:05

Maybe it is relevant for this discussion that in Germany a group of scientists and data protection experts conducted a reference Data Protection Impact Assessment for COVID 19 contact tracing apps: https://www.researchgate.net/project/Data-Protection-Impact-Assessment-for-COVID-19-Contact-Tracing-Apps

Pete Toon, 2020-06-04 18:03:34

Well, it’s on my (Android) phone now, unsolicited, and I have to explicitly download a local health authority app in order for it to do anything. “Users decide whether to contribute to contact tracing” This year maybe…

Joki, 2020-07-09 02:12:47

Perhaps the point of the apps and/or OS changes are simply to help further indoctrinate people into acceptance of surveillance in general?

It isn’t as if you cannot already be tracked to an alarming (to me, anyway) degree, assuming you use a credit card, a cell phone, drive a car equipped with OnStar or GPS-tracking, walk down a street or drive through an intersection with a camera or three…. Should I bother to continue?

But as someone else already mentioned above, there are a growing number of people who seem willing to take a very “meh” attitude to all of that, so why should they start worrying now? Your article is well-written and, to my layman’s eye, technically sound and deserves to be read far wider than it perhaps is. But I fear it wouldn’t matter anyway. Hell, I’ve talked to friends who agree completely that the world is becoming more orwellian by the minute but… meh. They can’t be bothered to do anything because what would they do? They certainly aren’t going to give up their phones at this point.

More important to me than anything you brought up in your article is this: is the tracing necessary, at all, in the first place? I’d say no, which makes the rest of the debate pointless as far as I’m concerned. I don’t care whether it’s Apple, Google or the reanimated corpse of Abraham Lincoln in charge of it, we don’t need tracking thankyouverymuch.

(Incidentally, speaking of orwellian, it was a clever bit of newspeak for someone to call it contact tracing and avoid the word tracking. Tracing sounds far less intrusive, doesn’t it? Prey are tracked.)