The day before the annual CPDP conference, EDRi (the association of civil and human rights organisations from across Europe) organised Privacy Camp 2017 with a panel on the Internet of Things. Here is a summary.

Finn Myrstad (from the Norwegian consumer organisation) kicked off the panel, talking about the serious problems posed by internet-connected dolls and fitness trackers.

Internet-connected dolls show what happens when a traditionally ‘analog’ company like a toy maker goes digital: it makes the same ‘beginner’ mistakes that e.g. the energy grid companies made when developing the first-generation smart meters. The dolls have poor security: the Bluetooth connection is unauthenticated, meaning anybody can connect to a doll, play a sound over its speaker, or listen in using the built-in microphone. The dolls are also a privacy nightmare: the microphone is always on, and voice data is sent to a third party, which also processes it for other purposes. In fact, the terms and conditions were so unacceptable that sales of these dolls were halted in both the US and several EU countries after complaints.
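The Bluetooth flaw boils down to the doll accepting commands from any device that connects, with no pairing or authentication step. A toy sketch of that pattern, and of the minimal fix (all class and method names here are hypothetical illustrations, not the dolls’ actual protocol):

```python
# Hypothetical illustration: an unauthenticated command handler (the
# reported flaw) vs. one that requires a pairing key shared out of band.
import hmac
import hashlib
import secrets

class InsecureDollService:
    """Mimics the flaw: any connected client is trusted."""
    def handle(self, command: str) -> str:
        if command == "play_sound":
            return "playing attacker-chosen audio"
        if command == "read_mic":
            return "streaming microphone audio"
        return "unknown command"

class PairedDollService:
    """Accepts only commands authenticated with the pairing key."""
    def __init__(self, pairing_key: bytes):
        self._key = pairing_key

    def handle(self, command: str, tag: bytes) -> str:
        expected = hmac.new(self._key, command.encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tag):
            return "rejected: unauthenticated client"
        return f"executing {command}"

key = secrets.token_bytes(32)
doll = PairedDollService(key)
# An attacker without the key cannot forge a valid tag:
print(doll.handle("read_mic", b"\x00" * 32))
# The legitimately paired app can:
tag = hmac.new(key, b"read_mic", hashlib.sha256).digest()
print(doll.handle("read_mic", tag))
```

The point is not the particular primitive (HMAC is used for brevity) but that *some* authentication step has to stand between a radio connection and the doll’s microphone.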

Finn also talked about fitness trackers and their ridiculously long and complex terms and conditions, exemplified by this hilarious video they made. He could also have talked about the insecurity of sex toys or cameras and DVRs. In other words: many different types of internet-connected devices are insecure, easily hacked, and a threat to your privacy. (If you want to ‘hack’ a device yourself, search for a vulnerable one using Shodan.)

One approach to deal with this is to improve product safety standards (the famous CE label), which currently do not include security requirements. This point was raised during the Privacy Camp as well as during a discussion at CPDP. By including security and even privacy requirements in these safety standards, non-compliant products can be stopped at the border and prevented from flooding the market. The level of protection offered by this measure of course depends on the exact security and privacy requirements added to the safety standards.

This is going to be an interesting balancing act between setting requirements strong enough to offer a significant level of protection, while still respecting business interests and taking the specific characteristics of the Internet of Things into account. For example, while it would make sense to require products to have a remote update function so that vulnerabilities can be patched, the software of many Internet of Things devices is actually implemented in firmware, which is harder to update (as noted by Katitza Rodriguez (EFF)). This has to be discussed further, and it looks like it is going to be on the agenda of the next IPEN meeting in Vienna on June 9, after the Annual Privacy Forum.
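Whatever form a remote-update requirement ends up taking, the update channel itself has to be secured, or it becomes yet another attack surface. The core check is small: the device flashes a new image only if it carries a valid vendor signature. A minimal sketch under simplifying assumptions (HMAC with a shared key stands in for the asymmetric signatures a real device would use; all names are hypothetical):

```python
# Sketch of verifying a firmware update before applying it.
# VENDOR_KEY and the image format are hypothetical placeholders; a real
# device would verify an asymmetric signature against a baked-in public key.
import hmac
import hashlib

VENDOR_KEY = b"vendor-signing-key"

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """Run by the vendor when publishing an update."""
    return hmac.new(key, image, hashlib.sha256).digest()

def apply_update(image: bytes, signature: bytes, key: bytes) -> bool:
    """Run on the device: accept only images with a valid signature."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return False  # reject tampered or attacker-supplied firmware
    # ... here the device would write the image to flash ...
    return True

image = b"firmware v2.0"
sig = sign_firmware(image, VENDOR_KEY)
assert apply_update(image, sig, VENDOR_KEY)            # genuine update accepted
assert not apply_update(b"evil image", sig, VENDOR_KEY)  # forgery rejected
```

A safety standard could mandate exactly this kind of check without prescribing any particular implementation.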

In any case, we need something stronger than simply requiring companies to be more transparent or to open up their code. This point was well argued by Fieke Jansen (Tactical Tech): few people will be able to read and understand the code. Moreover, it is a fundamentally flawed approach, as it shifts responsibility to the user, which is exactly what companies have tried to do all along with their long and complex terms and conditions. Andreas Krisch (EDRi) also mentioned later that technology that shows what the problem is (e.g. showing how insecure a device is or what information it leaks) is not enough, as it takes a lot of expertise and time to apply the tools, analyse the results and reach a conclusion.

All the while, surveillance possibilities are skyrocketing, as the initial design mistakes made with the first-generation smartphones (fixed MAC addresses and other identifiers) are being made again with internet-connected objects, and better, more personalised tracking becomes possible (again a point made by Katitza Rodriguez).

One glimmer of hope was offered by Fieke when she pointed out that the internet of things can also rekindle interest in peer-to-peer approaches and thus challenge the currently very centralised nature of many internet services. She mentioned the example of remote software updates distributed in a peer-to-peer fashion, where one doll with the latest software updates ‘heals’ another doll it encounters. Of course the opportunities do not stop there…
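Fieke’s doll-to-doll idea can be sketched as a simple gossip step: whenever two devices meet, the one running the newer firmware passes it on. A toy illustration (names are hypothetical; in a real system each image would of course carry a vendor signature verified before adoption, which is elided here):

```python
# Toy sketch of peer-to-peer ('gossip') firmware propagation between dolls.
# Signature verification of the shared image is elided for brevity.
from dataclasses import dataclass

@dataclass
class Doll:
    name: str
    version: int
    firmware: bytes

def meet(a: Doll, b: Doll) -> None:
    """When two dolls encounter each other, the older one is 'healed'."""
    if a.version > b.version:
        b.version, b.firmware = a.version, a.firmware
    elif b.version > a.version:
        a.version, a.firmware = b.version, b.firmware

cayla = Doll("cayla", version=2, firmware=b"v2")
friend = Doll("friend", version=1, firmware=b"v1")
meet(cayla, friend)
print(friend.version)  # 2: friend picked up the newer firmware
```

Repeated pairwise encounters like this would spread an update through a population of devices without any central server, which is exactly the decentralising potential Fieke pointed to.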

In the end, as (I think) Andreas Krisch said later during the panel:

We need to organise society such that things work like we expect them to work.

I couldn’t agree more. Current laws (such as the GDPR) are not strong enough. Perhaps the approaches discussed during the panel and summarised here will help.