ECRYPT II workshop on Crypto for 2020, day #2

January 27, 2013

This is a brief summary of day #2 of the ECRYPT II workshop Crypto for 2020 held on Tenerife on January 23 and 24, 2013. The summary of day #1 can be found here.

Panel on Crypto for security and privacy

Panel members: Daniel J Bernstein, Matthew Green, Nick Mathewson and Zooko Wilcox-O'Hearn. Despite the title, the panel mainly focused on the (lack of) connection between cryptographic theory and the practice of Internet security and privacy today.

Nick started by explaining his position. According to him, practical design lags what is known academically by 15 years. Current implementations are full of holes. In particular, Nick has a gripe against TLS, of which he says all parts have serious security issues: the handshake, the record format, etc. Looking ahead to 2020, Nick suggested that the cryptographic community should reach out to the implementers. Secondly, he would love to see a complete redesign of TLS.

Zooko followed. His main observation was that in 2020 things will be very different from now. If you look back seven years, Google had just had their IPO, smartphones weren't ubiquitous (PDAs mostly existed as separate entities from phones), and attack code was not deployed for profit on the scale we see now. In fact, we now have companies insuring themselves against the cost of cybercrime at a total of 1 billion dollars a year. His recommendation: work on weird ideas to solve the domain name problem.

Dan was next. He remarked that communication is usually not encrypted and not authenticated. TLS (despite its problems mentioned above) is not as widely used as it could be (although there are initiatives like HTTPS Everywhere, and recent pressure on Google and Twitter has resulted in them adopting https as the default). Dan also noted that the solution space is remarkably narrow: there are very few pieces of software out there that you can actually use. (This is similar to the recent observation by Matt Blaze that there are very few usable security solutions out there.) The interesting question is: why are there not more usable solutions? No (hint for an) answer was given...

Matt was last. He observed that, compared to 20 years ago, cryptography is starting to be deployed more and more in practice, and hence will actually be attacked. The biggest problem in improving security in the real world is the need for backwards compatibility. Therefore, more research into the formal analysis of protocol negotiation is needed. HTTP Strict Transport Security (HSTS) and certificate or public key pinning have a large effect on the security of the Internet in practice. However, these tools have all been developed independently of the cryptographic community. There needs to be a bridge between crypto theory and engineering.
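
As an aside: HSTS is just a response header telling the browser to only ever contact the site over https. A minimal Python sketch to check whether a given server sends it (the host name is a placeholder, not a site from the talk):

```python
# Minimal sketch: check whether a server sends the HSTS header.
# The host name below is a placeholder, not a site from the talk.
import urllib.request

resp = urllib.request.urlopen("https://example.com")
hsts = resp.headers.get("Strict-Transport-Security")
if hsts:
    print("HSTS enabled:", hsts)   # e.g. "max-age=31536000; includeSubDomains"
else:
    print("no HSTS header; the first visit can still be downgraded")
```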

In the discussion that followed, I raised the point that the panel (especially Nick) appears to fall into the trap that in theory, theory and practice are the same, whereas in practice they are not. What I meant was that theoretically perfect solutions may have (non-security-related) drawbacks that in the end lead businesses to implement things differently in practice than theory prescribes. I really do agree that the crypto community should reach out to the implementers. But reaching out is something different from shouting that you are right and the others are wrong. Instead we need to understand the language of the other parties, and need to understand their needs, requirements, intentions, and ways of working. In standardisation the outcome is a compromise, the result of power struggles between the major, often commercial, players that each have different, often conflicting, interests. It is important to realise that all these factors play a role. In my mind, this realisation is the first step towards crossing the divide between theory and practice.

Getting Post-Quantum Crypto Algorithms Ready for Deployment

Tim Güneysu presented a nice overview of four flavours of post-quantum cryptography (aka Alternative Public Key Cryptography, APKC): code based, hash based, multivariate-quadratic equations based, and lattice based.

The code based approach is based on the hard problem of syndrome decoding of a random linear code. The focus in this domain has been on encryption. There are only a few signature constructions, and they are not very efficient. In general the Niederreiter scheme is faster than McEliece. Key sizes for these schemes (for a security level comparable to 80 bits in the symmetric setting) are in the order of 500-1000 bytes.
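
To make the underlying hard problem concrete, here is a toy Python sketch of syndrome computation; the parameters are purely illustrative (real schemes use code lengths in the thousands), and this is my own illustration, not material from the talk:

```python
# Toy illustration of the syndrome decoding problem over GF(2):
# given a random parity-check matrix H and a syndrome s = H*e,
# recover the low-weight error vector e. Parameters are tiny.
import random

n, k, w = 16, 8, 2                      # code length, dimension, error weight
H = [[random.randint(0, 1) for _ in range(n)] for _ in range(n - k)]

e = [0] * n                             # secret low-weight error vector
for i in random.sample(range(n), w):
    e[i] = 1

# The public "ciphertext" in a Niederreiter-style scheme is the syndrome.
s = [sum(H[r][c] * e[c] for c in range(n)) % 2 for r in range(n - k)]
print("syndrome:", s)                   # finding e from (H, s) is the hard problem
```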

The hash-based approach is based on the hardness of finding a pre-image (of a 2nd-preimage resistant hash function). This approach yields reasonably small key sizes, and the signature size is a few kilobytes. However, there are no hash-based encryption schemes available so far! Also, the height of the hash tree (used in all constructions) determines the maximum number of signatures you can create with a given key. Moreover, for security you need to keep track of the signatures already used. In other words: the signer needs to keep (a huge) state.
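
To illustrate where the signature limit and the state come from, here is a minimal Python sketch of a Lamport one-time signature, the building block underneath hash-based schemes (the hash choice and sizes are my own, not from the talk):

```python
# Minimal sketch of a Lamport one-time signature. One key pair signs
# exactly ONE message; Merkle trees combine many such pairs, which is
# where the tree height and the signer's state come from.
import hashlib, os

H = lambda x: hashlib.sha256(x).digest()

# Key generation: two random preimages per message-digest bit.
sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
pk = [[H(a), H(b)] for a, b in sk]

def bits_of(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes):
    return [sk[i][b] for i, b in enumerate(bits_of(msg))]  # reveals half the secrets

def verify(msg: bytes, sig) -> bool:
    return all(H(s) == pk[i][b] for i, (b, s) in enumerate(zip(bits_of(msg), sig)))

sig = sign(b"hello")
print(verify(b"hello", sig))   # True; signing a second message would leak sk
```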

Multivariate-quadratic equations approaches rely on the hardness of finding solutions to systems of such equations. The key size is large, typically 10-100 KB (for 80-bit symmetric equivalent security), and there are only signature schemes known, no encryption schemes.
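
For concreteness, a toy Python sketch of the one-way direction: evaluating random quadratic polynomials over GF(2) is easy, while inverting the map (which is what forging a signature amounts to) is the hard problem. Parameters are tiny and mine, not from the talk:

```python
# Toy illustration of the multivariate-quadratic (MQ) problem over GF(2).
import random

n, m = 8, 8   # variables, equations

# Each polynomial: a quadratic coefficient table plus a linear part.
polys = [([[random.randint(0, 1) for _ in range(n)] for _ in range(n)],
          [random.randint(0, 1) for _ in range(n)]) for _ in range(m)]

def evaluate(x):
    out = []
    for Q, L in polys:
        v = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        v += sum(L[i] * x[i] for i in range(n))
        out.append(v % 2)
    return out

x = [random.randint(0, 1) for _ in range(n)]
print("p(x) =", evaluate(x))   # recovering x from polys and p(x) is the hard problem
```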

The lattice based approach relies on the hardness of finding the shortest/closest vector in an n-dimensional lattice. Traditionally, lattice based systems were either impractical but provably secure (extremely large keys) or practical but without proofs (NTRU). Currently, ideal lattices can be used to create constructions that are both practical and provably secure, both for encryption and signing. Moreover, they can be used to construct hash functions, pseudorandom functions, identity based primitives, or homomorphic schemes. Some versions for creating signatures have huge keys (in the order of 500 kB). For encryption on ideal lattices, 500 byte keys suffice. For this class of problems, more cryptanalysis is needed. Moreover, most protocols use R = Z_p[x]/(x^n + 1), so it would be worthwhile to implement a fast library for this case.
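
To illustrate why a fast library for this ring would help, here is a naive Python sketch of multiplication in R = Z_p[x]/(x^n + 1) with toy parameters of my choosing; real implementations replace this O(n^2) loop with a number-theoretic transform:

```python
# Schoolbook multiplication in R = Z_p[x]/(x^n + 1). Reduction modulo
# x^n + 1 means x^n = -1, i.e. a negacyclic wrap-around of coefficients.
p, n = 17, 8

def ring_mul(a, b):
    res = [0] * n
    for i in range(n):
        for j in range(n):
            k = i + j
            if k < n:
                res[k] = (res[k] + a[i] * b[j]) % p
            else:                       # x^(i+j) = -x^(i+j-n)
                res[k - n] = (res[k - n] - a[i] * b[j]) % p
    return res

a = [1, 2, 3, 4, 5, 6, 7, 8]
b = [8, 7, 6, 5, 4, 3, 2, 1]
print(ring_mul(a, b))
```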

The main problems in getting APKC applied are that the constructions are too novel, and therefore have not been properly analysed yet. Moreover, for implementers, no instances and/or parameters have been specified or standardised yet. Also, implementations are currently too slow.

Tim concluded by saying that code based approaches are quite mature, and that lattice based approaches have big potential. For deployment in the real world, we need more solid implementations. Finally, the physical (side channel) security of post-quantum cryptography has not been studied at all yet.

Quantum Computing

Charles Bennett, of BB84 fame, presented.

Classically, the state of a physical system can be observed reliably without disturbing it, and can be described by describing the state of each of the individual components separately. In the quantum world neither holds: there is disturbance, and there is entanglement.

Charles very nicely explained disturbance as follows. Quantum state is like a dream: if you try to describe your dream, your memory of it changes until it becomes what you already told others about it. This property can be used to implement quantum money and quantum key distribution (QKD).
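
For concreteness, here is a toy Python simulation of the sifting step of BB84 (no eavesdropper is modelled; this is my own illustration, not part of the talk):

```python
# Toy simulation of the BB84 idea: Alice sends random bits in random
# bases; Bob measures in random bases; they keep only the positions
# where the bases agree.
import random

N = 16
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]
bob_bases   = [random.choice("+x") for _ in range(N)]

# If Bob measures in the wrong basis, disturbance gives a random result.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print("sifted key:", key)   # on average half the positions survive
```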

Entanglement means that the state of one subsystem is related to the state of the other subsystem. Understanding entanglement is tricky: the wrong understanding might make you believe that information can travel faster than light. Charles explained that for two particles in an entangled state, a measurement of the state of one particle (hence disturbing its state) sends a message backward in time to fix the state of the other particle (but the sender cannot control the message being sent). Moreover, entanglement is monogamous: two particles that are maximally entangled cannot be entangled with anything else.
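
A toy Python simulation of the measurement statistics of a maximally entangled pair in the standard basis, just to illustrate the point about correlation without control:

```python
# Measuring (|00> + |11>)/sqrt(2) in the standard basis: each party sees
# a uniformly random bit, but the bits always agree. Neither party can
# choose the outcome, which is why nothing travels faster than light.
import random

def measure_bell_pair():
    outcome = random.randint(0, 1)   # nature picks; neither party controls it
    return outcome, outcome          # perfectly correlated

for _ in range(5):
    print(measure_bell_pair())
```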

A classical wire is a quantum channel with an eavesdropper (it conducts 0 and 1 faithfully, but randomises superpositions of 0 and 1). This makes a classical computer a handicapped quantum computer with all its wires eavesdropped. On a quantum computer we have exponential speedup for factoring and quadratic speedup for search (so, for example, Grover's algorithm roughly halves the effective key length of a symmetric cipher). This has huge implications for cryptography, hence the study of Post-Quantum Cryptography mentioned earlier.

Rephrased with the above eavesdropping analogy in mind: for a quantum computer, certain problems become much harder when the internal state is eavesdropped. Such eavesdropping is related to decoherence, which can be prevented by quantum error correction techniques. Those techniques are necessary to build quantum computers that can maintain a quantum state for a sufficient amount of time.
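
As a (purely classical!) analogy for the error-correction idea, here is a Python sketch of the three-fold repetition code with majority-vote decoding; quantum codes are far more subtle, since they must correct errors without reading out the state:

```python
# Classical analogy: encode each bit three times, decode by majority
# vote. Redundancy lets occasional bit flips be corrected.
import random

def encode(bit):
    return [bit] * 3

def noisy(codeword, p=0.1):
    return [b ^ (random.random() < p) for b in codeword]  # flip each bit w.p. p

def decode(codeword):
    return int(sum(codeword) >= 2)

print(decode(noisy(encode(1))))   # usually recovers 1
```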

Remarks

I did not take notes during Stefan Tillich's talk on Physical Security: Status and Outlook, nor during the discussion after Charles Bennett's talk.

In case you spot any errors on this page, please notify me!
Or, leave a comment.