Förderjahr / Science Call #2 / ProjektID: / Projekt: PROFET
It’s been a while since the last blog post, but now seems to be a good time to reflect on the year 2020 and the first half of 2021 (so this post is going to be a bit long).
With the COVID-19 pandemic hitting the world in early 2020, it has been impressively demonstrated how important science in general, and international scientific collaboration in particular, is to our society. One notable aspect is that continuous scientific progress in fields that may subjectively seem unimportant to large parts of society up to a certain point in time can pay off quite significantly if certain (unexpected) bad things happen. The rapid development of vaccines to fight COVID-19 is one impressive example of scientific work that probably did not have strong visibility before (despite continuous high-quality research and important developments). But once a previously underestimated risk materializes seemingly out of thin air, certain disciplines become the center of worldwide attention. Since these fields are then expected to deliver solutions immediately, this demonstrates that continuous funding of research is essential (as good research essentially lives from continuity).
Let us build the bridge to the field of IT security, which, simply speaking, is all about risk and in particular risk management. Whether a certain security mechanism should be put in place is typically hard to assess beforehand, and whether it actually pays off usually only manifests when one did not implement it and something bad happens. But then it is typically already too late and the damage is done. Moreover, we are confronted with a phenomenon quite similar to the prevention paradox: if we implement some measure and nothing bad happens for a longer period of time, this might suggest that the measure was not necessary in the first place. Unfortunately, the latter can give the false impression of being safe, and sloppiness kicks in (arguably, there is a parallel to the COVID-19 pandemic).
Now the COVID-19 pandemic also significantly impacts IT security in many aspects, and in essence IT security becomes increasingly important. For instance, the number of people working from home has significantly increased. However, the IT infrastructure in the home office is different from what is available in companies, and guaranteeing adequate security measures is not always an easy task. And as studies have shown, criminals have increasingly shifted their activity toward online crime. Moreover, the pandemic has been a catalyst for digitalization in many sectors. Since in such a situation new digital services need to be developed and deployed within a short time frame, it is advisable to put a significant amount of resources into designing and deploying secure and privacy-friendly solutions (a necessity that seems to have been missed by quite a number of recent projects – at least in their initial design proposals).
One of the central technical tools for providing security guarantees when implementing IT security measures is cryptography, which finally brings us back to our project. There are many challenging issues when designing cryptography for our modern, highly connected, information-centric society. One of them is the field of quantum computing (read the excellent article by Scott Aaronson here), which hangs like a sword of Damocles over cryptography. In particular, a sufficiently powerful quantum computer represents an enormous threat to large parts of the cryptography deployed in our current communication infrastructure. Loosely speaking, it would immediately break all the asymmetric cryptography currently deployed all over the world. And while it is unclear when such powerful quantum computers will be available (if at all), it is again all about risk and risk management. Luckily, we have quite a number of mathematical problems which are assumed to be quantum safe (i.e., assumed not to be efficiently breakable even with a powerful quantum computer at hand) and can be used as drop-in replacements for what is currently used. Given the enormous impact this event would have, it seems advisable to have quantum-safe cryptography (so-called post-quantum cryptography) ready early enough to protect against the potential disasters if this risk really kicks in. In order to attract strong interest from the cryptographic community and provide a timeline for industry as to when such schemes will be standardized, in 2017 the National Institute of Standards and Technology (NIST) initiated a post-quantum cryptography competition (PQC), which is currently in its final round. NIST is well recognized for running cryptographic competitions that lead to cryptographic algorithms used predominantly all over the world, and the final portfolio of algorithms within the PQC is scheduled to be determined by the end of this year.
Coming back to the pandemic for the last time in this blog post: among its many impacts on our daily lives, it has also significantly changed what scientific interaction looks like. While online collaboration was already common in pre-COVID-19 times, meetings, networking and conferences have moved entirely online. And they will very likely stay fully online at least for a significant part of the year 2021. While this comes with quite a number of drawbacks, a positive side effect is that all conference talks are immediately and permanently available online to the general public. This finally brings us to the initially promised recap of the last year and a half from a scientific perspective.
As the scientific results obtained during the last year and a half are numerous and cover many different aspects, we will only briefly discuss a selection of them below and refer to the project website for links to all publications.
26th Annual International Conference on the Theory and Application of Cryptology and Information Security (AsiaCrypt 2020)
In this work published at ASIACRYPT 2020 we are mainly concerned with public key encryption (PKE) schemes, i.e., encryption schemes where we have a public key for encryption and a corresponding secret key for decryption. Intuitively, what we want is that whenever we encrypt some message with the public key, decryption with the secret key gives us back the original message. However, in PKE schemes based on mathematical problems that are conjectured to be quantum safe (and in particular those relying on codes or lattices), there is some small (but typically not negligible) probability that decryption fails and does not give back the original message. While this probability might be so small that it does not really matter in practical use of the scheme, it turns out to give rise to a host of attacks against such schemes (even in practical scenarios), and thus it needs to be considered a problematic “feature”. In this work we investigate whether we can come up with compilers that immunize PKE schemes against such attacks. More precisely, we have designed compilers that take a scheme with such decryption errors and turn it into a scheme with a negligible decryption error, the latter no longer being prone to such attacks. This is important, as many schemes in the NIST Post-Quantum Competition suffer from such problems and will be central building blocks for the secure communication protocols of tomorrow’s Internet.
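To make the notion of a decryption error a bit more tangible, here is a minimal toy experiment (our own illustration, not any concrete NIST candidate): as in lattice-based schemes, a bit is encoded as a multiple of Q/2 and perturbed by noise, and decryption rounds back to the nearest encoding. Decryption fails exactly when the noise exceeds Q/4 – rarely, but not never.

```python
import random

Q = 2048          # toy modulus (illustrative only, not a real parameter set)
HALF = Q // 2     # encoding of the bit 1
THRESH = Q // 4   # decryption succeeds iff |noise| < Q/4

def noisy_encode(bit, sigma):
    # Toy "ciphertext": the encoded bit plus Gaussian-ish noise (mod Q),
    # standing in for the noise that accumulates in lattice/code-based PKE.
    noise = round(random.gauss(0, sigma))
    return (bit * HALF + noise) % Q

def decode(c):
    # Round back to the nearest multiple of Q/2.
    return 0 if (c < THRESH or c > Q - THRESH) else 1

random.seed(1)
trials = 100_000
failures = 0
for _ in range(trials):
    b = random.randrange(2)
    failures += decode(noisy_encode(b, sigma=180)) != b

print(f"empirical decryption failure rate: {failures / trials:.4f}")
```

With this noise width the failure rate is tiny but clearly nonzero – exactly the regime in which such errors can be exploited by an attacker, and which the compilers in the paper are designed to eliminate.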
Watch Valerio’s talk for more details.
25th International Conference on Financial Cryptography and Data Security (FC 2021)
We will briefly discuss two works that we have published at FC 2021.
The first work is concerned with a specific variant of a digital signature scheme called an adaptor signature scheme. Loosely speaking, in an adaptor signature scheme signing runs in two phases and binds an instance of a cryptographic problem (a so-called word in a hard relation) to the signature. In contrast to conventional signatures, the first phase of signature generation takes, in addition to the signing key, an instance of the cryptographic problem and generates a pre-signature. The second phase builds on the pre-signature and can then be performed without knowledge of the signing key, but instead requires knowledge of the solution to the instance of the cryptographic problem. Finally, given both the pre-signature and the final signature, it is efficiently possible to extract the solution to the instance of the cryptographic problem. The work gives a construction of post-quantum adaptor signatures relying on the hardness of problems based on isogenies. While the concept of adaptor signatures might sound a bit strange at first sight, adaptor signatures have proven to be of significant practical interest in blockchain applications.
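The two-phase flow is easiest to see in the classical Schnorr setting (to be clear: the paper’s construction is isogeny-based; the following is our own toy Schnorr-based sketch over a deliberately insecure small group, just to illustrate the interface). The statement is Y = G^y, the pre-signature can be completed by anyone who knows the witness y, and the completed signature leaks y to the holder of the pre-signature:

```python
import hashlib
import random

# Toy Schnorr group (insecure parameters!): P = 2q + 1, G generates the
# subgroup of prime order Q_ORD.
P, Q_ORD, G = 2039, 1019, 4

def h(point, msg):
    data = f"{point}|{msg}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q_ORD

def pre_sign(x, msg, Y):
    # Phase 1: needs the signing key x and the statement Y = G^y (y unknown).
    r = random.randrange(1, Q_ORD)
    R = pow(G, r, P)
    c = h(R * Y % P, msg)          # challenge is bound to R*Y, not R
    return R, (r + c * x) % Q_ORD  # pre-signature (R, s_hat)

def pre_verify(X, msg, Y, R, s_hat):
    c = h(R * Y % P, msg)
    return pow(G, s_hat, P) == R * pow(X, c, P) % P

def adapt(s_hat, y):
    # Phase 2: completing the signature needs only the witness y, not x.
    return (s_hat + y) % Q_ORD

def verify(X, msg, sig):
    # Plain Schnorr verification of the completed signature.
    R_full, s = sig
    c = h(R_full, msg)
    return pow(G, s, P) == R_full * pow(X, c, P) % P

def extract(s_hat, s):
    # Pre-signature plus full signature reveal the witness y.
    return (s - s_hat) % Q_ORD

random.seed(7)
x = random.randrange(1, Q_ORD); X = pow(G, x, P)   # signing key pair
y = random.randrange(1, Q_ORD); Y = pow(G, y, P)   # hard-relation instance
msg = "pay 1 coin to Bob"

R, s_hat = pre_sign(x, msg, Y)
assert pre_verify(X, msg, Y, R, s_hat)
s = adapt(s_hat, y)
assert verify(X, msg, (R * Y % P, s))   # full signature is (R*Y, s)
assert extract(s_hat, s) == y
print("adaptor signature round-trip ok")
```

This "signature completes only if the secret is revealed" mechanism is precisely what makes adaptor signatures useful for atomic swaps and payment channels on blockchains.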
Watch Erkan’s talk for more details.
The second work is concerned with the construction of public key encryption schemes that provide very fine-grained forward security guarantees. Forward security here means that in such encryption schemes the secret keys evolve over time in a way that iteratively restricts the decryption capabilities of the key. We will not go further into the details here, since together with Sebastian Ramacher and Christoph Striecks we wrote a three-part blog series on this topic which gives the interested reader a good introduction. The blog posts can be found on the project website, where Part I covers introduction and motivation, Part II constructions, and finally Part III applications and implementation.
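As a very rough intuition for evolving keys (our own simplified illustration in the symmetric setting, not the public-key construction from the paper): derive each period’s key from the previous one via a one-way function and delete old keys. Since the update cannot be inverted, a key leaked in period i reveals nothing about keys of earlier periods.

```python
import hashlib

def evolve(key: bytes) -> bytes:
    # One-way key update: maps the key of period i to the key of period i+1.
    # Inverting this step would require inverting SHA-256, so leaking the
    # current key does not expose any earlier key (forward security).
    return hashlib.sha256(b"evolve" + key).digest()

k0 = hashlib.sha256(b"initial secret").digest()
keys = [k0]
for _ in range(3):
    keys.append(evolve(keys[-1]))

# Each period i uses keys[i]; on moving to period i+1, keys[i] is deleted.
print([k.hex()[:8] for k in keys])
```

The schemes in the paper achieve a public-key and much more fine-grained version of this idea, where decryption capabilities can be restricted per ciphertext rather than per coarse time period.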
Also, watch Christoph’s, Daniel’s and Sebastian’s talk for more details.
24th International Conference on Practice and Theory of Public-Key Cryptography (PKC 2021)
In this work published at PKC 2021 we are interested in authentication primitives, i.e., cryptographic primitives that protect the integrity and guarantee the authenticity (i.e., origin) of messages. More precisely, we consider digital signatures, where one uses a secret key to sign a message and the corresponding public key can be used to verify a signature, and message authentication codes (MACs), where signer and verifier share the same secret key (and thus there is no public verifiability). What we study in this work is how to provide an updatability feature for these two primitives. In the practical use of authentication primitives, a common practice to mitigate the risk of leakage of the secret key is to perform key rotation on a regular basis, i.e., one switches to a new key and deletes the old one. Updatability now means that when switching to a new key, one can compute a compact token that can be used to efficiently update all existing signatures under the old key to the new key. So there is no need to recompute all the signatures (or MACs) under the new key. And this updating task can even be delegated to some semi-trusted party such as the Cloud. In the work we show how such schemes can be constructed from classical assumptions as well as from assumptions that are conjectured to be quantum safe.
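To illustrate the token idea (again our own toy sketch over an insecure small group, not the paper’s construction): in a key-homomorphic MAC of the form tag = H(m)^k, a single compact token Δ = k_new / k_old suffices to move every existing tag from the old key to the new one, without touching the messages or knowing either key in full.

```python
import hashlib

# Toy group (insecure parameters!): P = 2q + 1, subgroup of prime order Q_ORD.
P, Q_ORD = 2039, 1019

def hash_to_group(msg: str) -> int:
    e = int.from_bytes(hashlib.sha256(msg.encode()).digest(), "big") % P
    return pow(e, 2, P)  # squaring lands in the order-Q_ORD subgroup

def mac(k: int, msg: str) -> int:
    # Key-homomorphic toy MAC: tag = H(msg)^k mod P.
    return pow(hash_to_group(msg), k, P)

def verify(k: int, msg: str, tag: int) -> bool:
    return mac(k, msg) == tag

def update_token(k_old: int, k_new: int) -> int:
    # Compact token delta = k_new / k_old (mod Q_ORD): one value updates
    # *all* tags, so nothing needs to be re-MACed under the new key.
    return k_new * pow(k_old, -1, Q_ORD) % Q_ORD

def update(tag: int, delta: int) -> int:
    # tag^delta = H(msg)^(k_old * delta) = H(msg)^k_new.
    return pow(tag, delta, P)

k_old, k_new = 123, 777
t = mac(k_old, "invoice-42")
t_new = update(t, update_token(k_old, k_new))
assert verify(k_new, "invoice-42", t_new)
print("updated tag verifies under the new key")
```

Note that the update here can be carried out by anyone holding only the token, which is exactly the property that allows delegating the updating of authenticated data to a semi-trusted cloud provider.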
Watch Erkan’s and Valerio’s talk for more details.
This finally concludes our blog post, and we hope that we can soon provide updates on scientific results that we present at physical meetings (just like back in the old days)!