Lessons to be learned from the Polar incident

July 9, 2018

Last Sunday, journalists from The Correspondent revealed that it was trivially easy to find the names and addresses of military and intelligence service personnel who use Polar, a popular fitness wearable and app. All runs (even private ones) made by owners of a Polar fitness device are stored on a central server, and can be viewed on a map. Even though the user interface restricted access to public runs only, bypassing the user interface and entering URLs manually allowed the journalists to extract all runs made by anyone since 2014. Polar recently switched off access to the map to prevent further abuse. What can we learn from this incident?

First of all: how can a name and address be found using this information? It works like this. Runs are attributed to unique user identifiers, and personal profiles were accessible as well. Looking for runs made close to military bases or known intelligence agency offices allows one to identify likely military personnel. Subsequently, looking for many runs made by the same person that start and stop at the same location will, in all likelihood, reveal that person's home address. A little more digging (e.g. in property owner registers) will then reveal the name.
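
To make the attack concrete, here is a minimal sketch of the two steps described above. The data layout, coordinates and helper names are all hypothetical (Polar's actual API looked nothing like this); the point is only that run metadata alone suffices.

```python
import math
from collections import Counter

def near(a, b, threshold_km=0.5):
    """Rough distance check using an equirectangular approximation."""
    lat1, lon1 = a
    lat2, lon2 = b
    x = (lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = lat2 - lat1
    return math.hypot(x, y) * 111.32 <= threshold_km  # ~111 km per degree

def suspects(runs, base):
    """Step 1: users with at least one run starting near a sensitive location."""
    return {r["user"] for r in runs if near(r["start"], base)}

def likely_home(runs, user_id):
    """Step 2: the most frequent start/end point of a user's runs is,
    in all likelihood, close to their home address."""
    points = [r["start"] for r in runs if r["user"] == user_id]
    points += [r["end"] for r in runs if r["user"] == user_id]
    # Round to ~100 m cells so nearby GPS fixes count as the same place.
    cells = Counter((round(lat, 3), round(lon, 3)) for lat, lon in points)
    return cells.most_common(1)[0][0]
```

Nothing here requires a name: cross-referencing the resulting coordinate with a property register does the rest.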

This has been a known privacy issue with centralised location-based services for a long time, at least within computer science. In that sense, we should not be surprised at all. Moreover, we were warned a few months ago, when a similar issue with Strava, another fitness app, proved that you could determine the (secret) locations of military and intelligence bases by inspecting the runs shared with the app.

It is important to realise that the problem does not go away if Polar gets their security act together and makes sure that runs cannot simply be collected from the server by anyone smart enough to guess a URL. Even then a significant security and privacy problem remains: Polar itself has access to all runs made by anyone using the app! In other words, Polar can perform the analysis the journalists did by itself, or can be asked or forced to perform this analysis (or hand over the raw data) by law enforcement agencies or intelligence services. Now, Polar is a Finnish company, so the risk of that may be low. But suppose the company were Chinese, Iranian or even American: would we want these countries to be able to determine the names and addresses of our secret agents or the members of our special forces?

Moreover, the Polar app is clearly poorly designed, and quite possibly in violation of the General Data Protection Regulation (GDPR). First of all, the app offers the option to keep runs private. But in practice this does not mean they are actually private: even private runs are uploaded to Polar's central servers. This is insane. Runs that are private should really be private and not be stored on any central server at all. Why not store them simply on the smartphone with which the Polar device is paired?
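
A sketch of that alternative: private runs kept only in a database on the paired smartphone, with an explicit flag gating what may ever leave the device. The schema and field names are made up (here `sqlite3` stands in for app-private on-device storage).

```python
import json
import sqlite3

db = sqlite3.connect(":memory:")  # on a phone: a file in app-private storage
db.execute("CREATE TABLE runs (id INTEGER PRIMARY KEY, private INTEGER, route TEXT)")

def save_run(route, private=True):
    """All runs land in local storage first; private is the default."""
    db.execute("INSERT INTO runs (private, route) VALUES (?, ?)",
               (int(private), json.dumps(route)))

def runs_to_upload():
    """Only runs explicitly marked public are ever sent to a server."""
    return [json.loads(r) for (r,) in
            db.execute("SELECT route FROM runs WHERE private = 0")]
```

With this design, a breach of the central server simply cannot expose private runs, because they were never there.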

Even when runs are meant to be shared, it is certainly possible to design the sharing in such a way that only the people you intend to share them with have access. You could even design the system so that run data is “end-to-end encrypted”, i.e. stored encrypted on the central server so that even the service provider cannot access it, while those who have been given explicit access can still view your runs.
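
The data flow of such a design can be illustrated in a few lines. For simplicity this toy sketch uses a one-time pad as the cipher (a real system would use an authenticated scheme such as AES-GCM, and would encrypt the run key to each recipient's public key); what matters is that the key is generated on the phone and never uploaded.

```python
import json
import secrets

def xor(data: bytes, pad: bytes) -> bytes:
    """Toy one-time-pad encryption/decryption (illustration only)."""
    return bytes(a ^ b for a, b in zip(data, pad))

run = json.dumps({"user": 42, "route": [[52.1, 4.3]]}).encode()

# Key material is generated on the user's phone and stays there.
pad = secrets.token_bytes(len(run))
ciphertext = xor(run, pad)

# The central server stores only `ciphertext`; without `pad` the route
# is irrecoverable. A friend who has been given `pad` can decrypt:
assert xor(ciphertext, pad) == run
```

The service provider then holds the runs without being able to read them, which also neutralises the forced-disclosure scenario sketched above.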

Finally, we should be very aware of the fact that these problems are not restricted to fitness apps and wearables like those offered by Polar. Many other apps have access to location data and collect and process it centrally. In fact, smartphones in general have (of course…) access to that data, and even though sharing of that data is no longer turned on by default, users are very much enticed to turn location services on to make photo albums, maps, etc. much more personalised and useful. In that case, location data is again collected and processed centrally by the providers of these systems. It is important to keep in mind that these providers have access to that information, may be forced to provide it to the authorities, and may also use it for their own benefit.

In case you spot any errors on this page, please notify me!
Or, leave a comment.
Ad Gerrits, 2018-07-17 10:51:50

Not storing private data on a server is of course a way to prevent misuse of your data. But it does require local infrastructure that offers (part of) the functionality that nowadays is mostly offered by servers. The ability to back up data safely, in order to be able to restore it in case of loss or crash, is already a considerable challenge for many people (‘oh wait, Apple is so friendly to solve this for me’). So I totally agree with you about the direction for future solutions, but I still see a lot of ‘bears on the road’.