The Future of Authenticating Websites (FAWN) - 1

October 6, 2011

The recent hack of DigiNotar and the resulting upheaval (it was even discussed in Dutch parliament yesterday) has made painfully clear that the current system of certifying websites is insecure and needs replacement. During a discussion on this topic with my colleagues in the Digital Security group of the Radboud University Nijmegen, the following issues and ideas came up. I'd like to share them with you, and welcome any comments you may have.

Some remarks on the problem

Concerning the scope and cause of the problem, it was noted that this kind of weakness will probably be exploited only by governments, not criminals. Governments have the means to control large parts of the national Internet; for criminals, it is easier to attack victims using Trojans and viruses. This has some ramifications for the attack model to consider when trying to improve the certification system used by SSL. Moreover, we have to trust our browser to function correctly.

Although it might seem that there is no single root CA in the current certification model, the browser vendor actually acts as a root CA (for all the 'root' CAs it includes in its browser). As there are several different browser vendors, the current model actually has more than one root, although the nodes and leaves in those certificate trees are shared.

Some CAs are too big to fail. Comodo was hacked before DigiNotar was, but could not be removed from the list of trusted CAs because then a significant portion of all secure websites would suddenly have become untrusted by the browser (until the website obtained a new certificate).

Requirements

During our discussions, a few requirements came up that any potential solution should satisfy.

  • Any solution should be user-centric, meaning that it is easy to use for ordinary Internet users, protecting them in an intuitive and unobtrusive way. Also, users should be able to decide for themselves whom to trust.
  • Trust is dynamic: it arrives on foot and leaves on horseback. Any solution should be able to cope with sudden changes in trust relationships, both on an individual and on a national or global scale. Moxie Marlinspike calls this trust agility.
  • Given this requirement, a solution should ideally not rely on a single trusted third party (because in that case it is hard to withdraw trust from this party). Note that even a sub-CA in the certificate tree is such a trusted third party (for those websites it issued certificates to).

Potential solutions

The following ideas towards solving the problem of authenticating websites were suggested.

  • Since you have to trust your browser anyway, why not hardcode the certificates of the 1000 most important domains in the browser?
  • Add redundancy. Websites should have more than one certificate, issued by different CAs. If one such CA is no longer trusted, it can be removed from the list of trusted CAs and the website can immediately use one of the remaining certificates for authentication.
  • Use out-of-band signalling, like printing the fingerprint of a certificate on the bank card (so the certificate of the Internet banking site cannot be spoofed). Perhaps QR codes can be used for this purpose (as a particular instance of mobile identity management).
  • A CA should only sign certificates for a domain if the domain has a business relationship with the CA. This relationship should be publicly registered, like a marriage register.
  • Similarly, a certification authority should only issue a certificate for a domain if the domain cooperates.
  • Perhaps identity based cryptography can be used to directly link a domain name with the corresponding public key, avoiding the use of certificates altogether. One can only get the corresponding private key if one can prove to own the associated domain.
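Several of the ideas above (hardcoding certificates in the browser, printing fingerprints on a bank card) boil down to the same check: compare the certificate the server presents against a fingerprint obtained through a trusted channel. A minimal sketch in Python; the domain name and pin store are made up for illustration:

```python
import hashlib

def fingerprint(der_cert: bytes) -> str:
    """Hex-encoded SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()

def check_pin(domain: str, der_cert: bytes, pins: dict) -> bool:
    """Accept the presented certificate only if its fingerprint matches
    the pin distributed out of band (e.g. printed on a bank card, or
    hardcoded in the browser). Unknown domains are rejected here; a real
    browser would fall back to ordinary CA validation instead."""
    expected = pins.get(domain)
    return expected is not None and expected == fingerprint(der_cert)

# Hypothetical example: the pin store maps a domain to the fingerprint
# the user (or browser vendor) obtained through a trusted channel.
cert = b"...DER bytes of bank.example's certificate..."
pins = {"bank.example": fingerprint(cert)}

print(check_pin("bank.example", cert, pins))       # genuine certificate
print(check_pin("bank.example", b"forged", pins))  # MitM certificate
```

Note that this only shifts the problem to distributing the pins securely; that is exactly what the out-of-band channel (bank card, QR code) is for.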

I will not elaborate on other approaches I discussed earlier (like using DNSSEC with DANE, Perspectives or Convergence).

On the name of this topic

This topic is not called "the future of certification of Internet websites" or something similar, because using certificates is but one way to authenticate websites. (Moreover, the term certification implies that the website has undergone some kind of audit process before being issued a certificate; this is typically not the case.)

The real issue at hand is how to authenticate a website. That is the problem that needs to be solved.

In case you spot any errors on this page, please notify me!
Or, leave a comment.
martijno
, 2011-10-06 20:29:42

Hi JHH,

Here’s some thoughts…

  1. The root CA list that comes pre-installed with browsers is just a default list IMO. Users can remove anchors that they no longer trust or add new ones (such as cacert.org) as they see fit. How’s that for user-centricity? Maybe not for the average user? Browser vendors could perhaps make this easier? And I’m not sure how this interacts with automatic updates, especially in systems where the trust store is part of the OS and shared amongst different applications.

  2. This whole thing was discovered through “public key pinning” (http://www.imperialviolet.org/2011/05/04/pinning.html). I.e., one of the Iranian victims used Chrome which had some special knowledge about (the public key inside) google.com’s certificate. It’s like your hard-coding solution (but smarter). Clearly a solution that doesn’t scale, though.

  3. There’s a nice movie by Fox-IT (here: http://www.youtube.com/watch?v=wZsWoSxxwVY) showing OCSP traffic as a result of the MitM attack in progress. Could real-time monitoring of such traffic have alerted someone earlier on? (At DigiNotar? At Google? At GOVCERT.nl?)

(Oh, and you have a dangling pointer to “earlier”.)

Best, Martijn

Jaap-Henk
, 2011-10-06 20:45:48

Manually editing a list of CAs is not something an average user will (or even should) do. If he deletes the wrong one, he may end up with loads of warnings about untrusted sites (with which nothing is wrong). So: not user-centric at all, I’m afraid.

I agree that key pinning (or hard-coding certificates) is not a proper long term solution. However, it is a quick fix that helps protect the major websites (and thus the majority of Internet users).

P.S.: Thanks for spotting the dangling pointer! Fixed now. And thanks for the link to the movie. I’ll check it out later, once I’m off the train and back to a normal bandwidth connection…

The Future of Authenticating Websites (FAWN) – 2 « Jaap-Henk Hoepman – on security, privacy and…
, 2011-10-24 16:29:52

[…] the discussion at the Radboud University on the future of authenticating websites, I lead a similar discussion at […]

Exploring innovations in trust mechanisms | Maarten Wegdam's Blog
, 2013-09-04 15:04:39

[…] Much has been written about the weaknesses of our current systems for website certificates (e.g. here and in 2000). As the DigiNotar hack showed, any of the many Certificate Authorities that exist can […]