Nothing is more pervasive than the “If you’ve got nothing to hide, then what do you have to fear?” myth. A common response is to list things that people do want to hide and have every reason to hide. But such a response actually falls into the trap that this argument cleverly sets up: it subscribes to the frame that privacy is about hiding bad things. As Daniel Solove puts it: “The problem, in short, is not with finding an answer to the question […]. The problem is in the very question itself.” So what, then, is the problem with the question itself?
(This is the third myth discussed in my book Privacy Is Hard and Seven Other Myths. Achieving Privacy through Careful Design, which will appear on October 5, 2021, with MIT Press. The image is courtesy of Gea Smidt.)
First, let us consider the act of hiding itself. Hiding is natural, and part of being human. In fact, it is impossible not to hide our thoughts and feelings; hiding them is unavoidable. Perhaps hiding things is the default human condition, and sharing the deliberate choice (instead of the other way around). And often we are not even clearly aware ourselves of what we think or feel.
If being fully transparent about our thoughts and feelings were possible, and were in fact the default, the result would be devastating. Imagine, for example, that you have a decent job but that every time you contemplate applying for a job elsewhere, your boss immediately reads this in your mind. Every once in a while, people have doubts about their relationships with their friends or partner. This is entirely normal. But imagine the sense of insecurity you would experience if you were immediately aware of even the slightest doubt felt by your significant other.
Another argument against this myth focuses less on the act of hiding itself and more on the question of what should or should not be hidden. The myth assumes that this question is easily answered: either “you did something you shouldn’t have done in the first place”, or there is really nothing to worry about. In practice, this is not clear at all. What is wrong or not depends a lot on context and interpretation: is something you did illegal, or morally wrong? In which jurisdiction, or which cultural context? Is being gay wrong? Smoking pot? In all likelihood we have all done something wrong at some point in our lives, according to some “definition” of wrong.
Moreover, each individual piece of information about you may seem insignificant and harmless enough. However, all these little pieces of information may be combined into one telling profile of you that can be used to classify and judge you. It is impossible to tell whether this happens, and which small breadcrumbs of information are used when it does. In other words, saying “you have nothing to hide” completely ignores the power imbalance between the governments and large corporations that collect the information and the individuals it pertains to (and who will suffer the consequences).
The apparent strength of the myth is in fact its most fundamental weakness. It is based on the assumption that privacy is merely an individual right, while security benefits society as a whole. But this frame is wrong, and the assumption is flawed. Privacy is also of societal, public value: it is essential in a democratic society. It allows us to think the unthinkable, and to discuss it with like-minded people, without interference. It prevents unreasonable forms of self-censorship and conformity to existing norms. Labour rights, women’s voting rights, and gay rights all needed this space to develop. A lack of privacy also erodes solidarity. Insurance is useless if the insurer can perfectly predict the future cost of insuring me and charge me accordingly. Privacy also has a collective value: it is a public good threatened by market forces that cannot be withstood by individual privacy choices, and hence needs protection at the collective level.
The book discusses this in much more detail, and then describes technologies that help to hide and protect sensitive information: symmetric and public-key encryption, of course, and their application to secure communications (virtual private networks, secure browsing, and end-to-end encryption) and secure data storage (in the cloud and on your local devices). It also explains techniques to make databases more privacy-friendly, like statistical disclosure control and differential privacy, and discusses their strengths and weaknesses.
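To give a flavour of one of these techniques: differential privacy adds carefully calibrated random noise to the answer of a database query, so that the presence or absence of any single individual barely changes what an observer sees. The sketch below is my own minimal illustration (not taken from the book); the dataset, predicate, and function names are invented for the example. It implements the classic Laplace mechanism for a counting query, which has sensitivity 1: adding or removing one person changes the true count by at most one.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def dp_count(records, predicate, epsilon: float) -> float:
    """Answer a counting query with epsilon-differential privacy.

    Counting queries have sensitivity 1, so Laplace noise with
    scale 1/epsilon suffices. Smaller epsilon means more noise
    and therefore stronger privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical example: how many people in this tiny dataset smoke?
people = [{"smokes": True}, {"smokes": False}, {"smokes": True}]
noisy_answer = dp_count(people, lambda p: p["smokes"], epsilon=0.5)
```

Note the trade-off this makes explicit: each released answer is only approximately correct, and repeated queries consume privacy budget, which is exactly the kind of strength-versus-weakness discussion the book goes into.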
(For all other posts related to my book see here)