Several important distinctions were made (again) during the event.
- Privacy versus data protection: data protection gives people control over their data and implements a form of informational self-determination that goes beyond the mere ‘right to be left alone’.
- Social versus institutional privacy: social privacy describes how users manage their interactions with their social circle, whereas institutional privacy describes how users can control the collection of their information by institutions and governments. Roughly speaking, the first class is served by privacy settings in e.g. social networks, whereas the second is served by privacy enhancing technologies. The question is whether we also need to distinguish a third form (let’s call it commercial privacy) that describes how people manage their privacy when engaging in transactions beyond their immediate social circle.
- Attitudes versus behaviour: people’s attitudes towards privacy are something entirely different from how they behave in practice (when confronted with a privacy-sensitive choice).
Users are focused on utility. They engage with systems to achieve a certain goal; they use systems because the functionality is useful. For example, teens use Facebook to stay in touch with their peers. Their mode of communication (pull) versus the older ways (push) is important to them and will not change. Privacy enhancing technologies should take very good note of this fact of life if they want to actually be used in practice. This is important to stress: we should create PETs that people really want and actually use. One approach is to research ways to make privacy enhancing technologies ‘socialise’. Moreover, designers should not be paternalistic.
An interesting case in point is age verification in social networking sites like Facebook. Because teens want to join Facebook at an early age, they lie about their age. Facebook makes this very easy because it is an honour-based system. If Facebook were to enforce the age restriction more strictly, this would have two consequences. First, a real need for cross-border age verification would arise. Second, teens would be really frustrated and feel left out.
Many factors influence the choices people make, and the way they actually behave with respect to their privacy. For example, users typically reveal a lot of information voluntarily, but when forced to fill in mandatory fields, they leave the optional ones empty. Also, users rarely change the default option. I think it is an interesting avenue of psychological and human-computer interaction research to investigate this further (similar to the research of Acquisti and Cranor, but more oriented towards the psychological causes of the behaviour observed). Nudging alone (which puts the onus on the user) is not enough, however, and other measures that make privacy infringements less of an externality need to be considered. Framing privacy in terms of property rights is not very useful in that respect.
Finally, incentives pretty much determine not only how people act online, but also what options service providers offer in terms of privacy protection. The current incentive structure is not very promising, unfortunately. The fact that unique identifiers are the currency of the Internet makes the (business) case for invasive tracking of people and objects quite compelling. The alternative business models explored in the third panel were not very convincing, so this may be a tough nut to crack.