This is a (not so) brief summary of day #1 of the ECRYPT II workshop Crypto for 2020 held on Tenerife on January 23 and 24, 2013. The summary of day #2 can be found here.

Introduction

Bart Preneel kicked off the meeting with his outlook on the achievements of ECRYPT and the challenges ahead. According to Bart, the world (for cryptographers) has changed in two respects. First of all, cryptography is now everywhere (although that was challenged on day 2 in the panel on cryptography for security and privacy). Secondly, there is now a continuum of software and hardware that cryptographers need to be aware of.

According to him, the main challenges in the area of algorithms are the following.

  • Design for security that lasts for 50-100 years.
  • Authenticated encryption for Tb/s networks.
  • Cryptography with an ultra-low footprint and power/energy consumption.

Also poorly understood right now is algorithm agility, i.e. how to change deployed algorithms (to improve performance or security).

With respect to cryptographic protocols, current trends are

  • multi-party computation has become practical
  • fully homomorphic encryption, and its applications
  • privacy protecting data mining
  • social and group cryptography

Bart finally noted that the cryptography community is not good at explaining its long-term research questions, or at raising awareness and support for them (roadmapping), because there is no culture for this. According to him, the unique selling points of ECRYPT are its relevance for practice (e.g. the recurring study on recommended key lengths) and the Real World Cryptography workshop series started by Nigel Smart and Kenny Paterson.

Future Challenges for Lightweight Crypto

François-Xavier Standaert presented. He observed that implementing lightweight cryptography is much better understood in software than in hardware. He also noticed that most research effort is aimed at block ciphers instead of hash functions.

There is no clear definition or standard of what lightweight cryptography really is. Evaluation criteria are usually relative, and reflect algorithmic and implementation choices (e.g. underlying hardware assumptions). An often-used measure of efficiency for software implementations is “code size × cycle count / block size”. This comparison is easy because the hardware platform is fixed.
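The metric is simple to compute; the sketch below applies it to two hypothetical implementations (all figures are made up for illustration, not measurements of any real cipher):

```python
def sw_efficiency(code_size_bytes, cycles_per_block, block_size_bits):
    """Combined software metric: code size x cycle count / block size.
    Smaller is better; only meaningful when the hardware platform is fixed."""
    return code_size_bytes * cycles_per_block / block_size_bits

# Hypothetical figures: a compact but slow cipher can score the same as a
# larger, faster one, because the metric trades code size against speed.
print(sw_efficiency(2000, 4000, 128))  # -> 62500.0
print(sw_efficiency(500, 8000, 64))    # -> 62500.0
```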

In hardware, the situation is less clear. Area and power consumption are correlated. So are throughput and energy usage. In his research, François found that area is mainly determined by the size of the register needed to hold intermediate results, and not by the number of block cipher calls within one round, so loop unrolling is generally not a bad idea in terms of area. He also noted that the key schedule has a huge impact on hardware efficiency.

François found it very interesting to discover that the most efficient implementation of each block cipher needs essentially the same number of clock cycles per encryption, independent of the cipher. It looks as if this is an absolute lower bound for a given security level. He suggested this effect may be caused by the fact that everybody is using the same design principles for block ciphers. (This effect is not present in hash functions.)

According to François, AES is good enough for most applications. If you need a lightweight cipher, one probably already exists for your application. Moreover, changing a cipher is expensive (in terms of analysis) and delivers less performance improvement than simply changing the underlying hardware platform technology.

Future research challenges are the following.

  • There are huge differences in performance among software implementations of hash functions on single metrics. This needs to be understood better. Do we have larger security margins for hash functions than for block ciphers?
  • This is also true for hardware implementations. For AES, there are clear tradeoffs. For SHA3 candidates these are much less clear. Compact KECCAK implementations seem to be non-trivial.
  • How to design a key schedule (sometimes these are complex, sometimes as simple as XOR-ing in the master key).
  • Authenticated encryption modes for block ciphers, and making them side-channel resistant.

Permutation-based symmetric cryptography and KECCAK

Joan Daemen presented. Hash functions are usually presented as the Swiss army knife of cryptography. But this is wrong, as you can do everything with a block cipher (block and stream encryption, MACs, hashing, authenticated encryption).

Block ciphers in general have a separate key schedule and data path, with diffusion only from the key schedule to the data path (and not vice versa). This is because a block cipher needs to be invertible. If you remove this restriction, you arrive at the sponge design Joan presented.

The plain sponge cannot be used to construct authenticated encryption; for that you need a combined absorb and squeeze phase. This is fixed in the duplex construction, whose generic security is as good as that of the basic sponge.
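As a rough illustration of the duplex idea (each call absorbs input and returns output from the same state), here is a toy SpongeWrap-style sketch. The permutation, sizes, tag length, and the names `wrap`/`unwrap` are all placeholders of my own; this is not KECCAK and is of course not secure:

```python
RATE, CAPACITY = 8, 8          # toy sizes in bytes; real sponges are far wider
WIDTH = RATE + CAPACITY

def toy_permutation(state):
    # Stand-in for a real permutation like Keccak-f; NOT cryptographically sound.
    s = bytearray(state)
    for r in range(8):
        for i in range(WIDTH):
            s[i] = (s[i] + 13 * s[(i + 1) % WIDTH] + r + i) & 0xFF
    return bytes(s)

class Duplex:
    def __init__(self):
        self.state = bytes(WIDTH)

    def duplexing(self, block):
        # Absorb up to RATE bytes into the outer state, permute, and
        # return RATE bytes of output -- all in a single call.
        s = bytearray(self.state)
        for i, b in enumerate(block[:RATE]):
            s[i] ^= b
        self.state = toy_permutation(bytes(s))
        return self.state[:RATE]

def wrap(key, blocks):
    # Keystream for block i is the duplex output after absorbing block i-1
    # (the key, for the first block); the final output doubles as the tag.
    d = Duplex()
    z, ct = d.duplexing(key), []
    for p in blocks:
        ct.append(bytes(a ^ b for a, b in zip(p, z)))
        z = d.duplexing(p)
    return ct, z[:4]

def unwrap(key, ct, tag):
    d = Duplex()
    z, pt = d.duplexing(key), []
    for c in ct:
        p = bytes(a ^ b for a, b in zip(c, z))
        pt.append(p)
        z = d.duplexing(p)
    if z[:4] != tag:
        raise ValueError("authentication failed")
    return pt
```

Because absorbing and squeezing happen in one call, encryption and authentication share a single pass over the state, which is exactly what the plain sponge cannot offer.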

Joan noted that in keyed modes there are very few known attacks on hash functions, compared to unkeyed modes. This has to do with the fact that the attacker does not know the (full) internal state. As a consequence, in keyed modes you can reduce the number of rounds for the same level of security.

Lightweight Cryptography: Mission Accomplished?

Peter Rombouts presented. Applications of lightweight cryptography lie in the anti-counterfeiting of luxury goods and the quality monitoring of perishable goods, while respecting privacy. The constraints are

  • cost (determined by chip area of tag, silicon process technology, and
    assembly – which is much more expensive than the tag itself, by the way)
  • read range (determined by operating frequency and power consumption)
  • transaction time (clock speed and memory access speed)

The new EPC Global Gen-2 UHF RFID protocol (860 MHz – 960 MHz) version 2.0.0 standard includes new commands for security (challenge, authentication) and file management (ISO 29192-2).

There are still quite a few unsolved issues w.r.t. making really secure RFID tags. You not only need a good crypto core, but also a good source of randomness, secure storage for the keys, and countermeasures to make the design more robust against side channels. Peter noted that physically unclonable functions (PUFs) are not small, and consume quite some power.

Future research questions:

  • Integrate side-channel countermeasures into the design of a cipher, instead of adding them later on top of the design (as is mostly done now).
  • Public-key cryptography that is efficient enough for RFID tags, as this makes key management easier (whether it actually does was challenged later during the day).
  • Achieve low latency, for when response time is critical (e.g. solid state disks, real time applications) or when the clock frequency is limited (e.g. on FPGAs).

SYMLAB Panel

Members: Peter Rombouts, Joan Daemen and François-Xavier Standaert.

Main observations:

  • Compared to 10 years ago, we now really know how to make a lightweight cipher. No major progress is to be expected.
  • In future applications low latency becomes more important.
  • You can design ciphers with different S-boxes or round functions for each round. This may improve security (fewer rounds needed, and hence lower latency and less power), but the lack of structure then makes analysis much harder.
  • Research on lightweight ciphers has focused on hardware, while 99% of cryptography in the wild is in software. Research should focus on this.
  • Joan Daemen questioned whether key management really is easier when using public-key cryptography. Look at the problems with the certificates needed to make the web secure (within SSL/TLS).

Key Reuse: Theory, Practice, and Future

Kenny Paterson presented his study of the problem of key reuse, which happens a lot in practice. Keys are mainly reused to save storage space, or to reduce the number of certificates needed (which also reduces the cost of certification itself). This practice breaks the key separation principle, of course.

Standards encourage key reuse. X.509 does not specify for which purposes or with which algorithms the certified key should be used. The keyUsage extension contains 9 bits for 9 usages, but does not restrict any combination of those bits!
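The nine bits, and the absence of any restriction on combining them, can be made concrete with a small decoder (a sketch of my own; the bit names and their order follow RFC 5280, everything else is illustrative):

```python
# The nine keyUsage bit names from RFC 5280, in bit order (bit 0 first).
KEY_USAGE_BITS = [
    "digitalSignature", "nonRepudiation", "keyEncipherment",
    "dataEncipherment", "keyAgreement", "keyCertSign",
    "cRLSign", "encipherOnly", "decipherOnly",
]

def decode_key_usage(bits):
    """Interpret a 9-bit value (bit 0 = digitalSignature, most significant)."""
    return [name for i, name in enumerate(KEY_USAGE_BITS)
            if bits & (1 << (8 - i))]

# Any of the 2^9 combinations is syntactically valid. For instance, a
# certificate marking its key for both signing and encryption is exactly
# the key reuse that breaks the key separation principle:
print(decode_key_usage(0b101000000))  # -> ['digitalSignature', 'keyEncipherment']
```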

As an example, Kenny presented key reuse attacks on EMV. According to him, EMV is more important than SSL as it is more widely deployed: there were 1.55 billion cards in use in Q2 2012. (The details of the attack are omitted here, as they are quite specific.)

Looking ahead to 2020, Kenny observed that

  • Standards keep wanting to include weak cryptography (e.g. PKCS #1 v1.5).
  • Supporting legacy trumps supporting security.
  • We are good at theory, but bad at building and testing.
  • We are poor at theory for “supporting infrastructure” like key management, key hierarchies, software libraries, randomness, or crypto negotiation protocols.
  • We should analyse standards and implementations, and engage with standards bodies.

Kenny urged the CRYPTO/EUROCRYPT conference community to also consider accepting more such real-world cryptography papers, which currently only get accepted at conferences like USENIX Security. Interestingly, Matt Green responded that USENIX Security really welcomes good (more theoretical) cryptography papers.

Remaining remarks and observations

I have left out the talks of Giuseppe Persiano on Functional Encryptions and Cloudy Applications and Tanja Lange on Post Quantum Cryptography, as well as the closing panel, in this summary.