ETSI Security Conference 2024 – Post-Quantum Cryptography with Daniele Lacamera


Daniele Lacamera is a software technologist and researcher. He is an expert in operating systems and TCP/IP, with more than 20 academic publications on transport protocol optimization. He began his career as a Linux kernel developer and had his first contribution accepted into Linux 2.6. Since 2012, he has been working on microcontroller-based architectures, focusing on the design, development, and integration of software for embedded systems. He is an active contributor to many free software projects, and co-author of a TCP/IP stack and a POSIX operating system (OS) for Cortex-M devices, both distributed under the GPL. Nowadays, as a software engineer at wolfSSL Inc., his activities focus on IoT security, cryptography, secure boot, and custom transport protocols.

 

To start off, could you provide some background on wolfSSL? What are the main areas of focus for the company, especially when it comes to encryption solutions and security?

wolfSSL was founded in 2004 by Larry Stefonic and Todd Ouska. Our main focus is security, and we provide it through three main areas.

  • Securing data at rest: everything related to cryptography, meaning all the cryptographic algorithms we have implemented in what we call wolfCrypt. wolfCrypt is a C-language embedded cryptography engine offering low memory usage, optimized speed, high portability, and a strong feature set (a minimal usage sketch follows this list).
  • Securing data in transit: TLS – historically called SSL, nowadays TLS – up to TLS 1.3. Here, too, we provide protocol implementations that are extremely portable, so you can run the same protocol stack from a small, resource-constrained device up to regular computers or even cloud servers.
  • The third area we focus on nowadays is secure boot and firmware authentication. We saw this demand coming from the market more and more: devices need to verify the authenticity and the integrity of software before running it, and especially verify that it comes from a trusted source.
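
To make the wolfCrypt side concrete, here is a minimal sketch of a one-shot hash using the wolfCrypt API. It assumes a default wolfSSL build (hence the wolfssl/options.h include) and uses wc_Sha256Hash() from wolfssl/wolfcrypt/sha256.h; it is only a sketch, not a complete integration.

    #include <stdio.h>
    #include <wolfssl/options.h>
    #include <wolfssl/wolfcrypt/sha256.h>

    int main(void)
    {
        const byte msg[] = "hello, wolfCrypt";
        byte digest[WC_SHA256_DIGEST_SIZE];

        /* One-shot SHA-256 over the message (excluding the terminator) */
        if (wc_Sha256Hash(msg, (word32)(sizeof(msg) - 1), digest) != 0) {
            fprintf(stderr, "hashing failed\n");
            return 1;
        }

        for (int i = 0; i < WC_SHA256_DIGEST_SIZE; i++)
            printf("%02x", digest[i]);
        printf("\n");
        return 0;
    }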

 

Where are we right now with post-quantum cryptography standardization? What are some of the challenges in this space?

Historically, we started working on post-quantum cryptography back in 2010, when we implemented NTRU – which was later deprecated – and we provided other implementations in the meantime. We were a bit ahead of the standardization work at the time. Now there are more and more standards around PQC algorithms, selected specifically by NIST, and the NIST documents give us the right guidelines on how to implement these algorithms, and especially which of the available algorithms need to be implemented first. The point where we are now is that we have abandoned the third-party implementations, which we considered just an intermediate step, in favor of our own implementations of these crypto algorithms, built on the experience of our cryptographers and optimized for specific platforms. We provide assembly optimizations for different architectures to address the performance of these algorithms in critical use cases. For instance, coming back to firmware verification: the time spent verifying a signature adds to the boot time of a device that has to verify its firmware at every boot, and some industries have specific requirements on boot times that cannot be worsened just because we are switching to quantum-resistant algorithms. So the effort of the development team at this point is to provide these kinds of optimizations, to make the switch to post-quantum cryptography smoother without impacting boot times. This is the biggest challenge currently.
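
To make the boot-time concern concrete, here is a small, self-contained sketch of how the cost that a signature check adds to the boot path could be measured. The verifier below is a stub standing in for a real post-quantum verification call (for example ML-DSA or LMS); only the timing pattern is the point, and all names here are placeholders.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    /* Stub standing in for a real PQC verify call (e.g. ML-DSA or LMS);
     * a boot loader would call into its crypto library here instead. */
    static int verify_firmware_signature(const uint8_t *img, size_t img_len,
                                         const uint8_t *sig, size_t sig_len)
    {
        (void)img; (void)img_len; (void)sig; (void)sig_len;
        return 0; /* 0 = signature accepted */
    }

    int main(void)
    {
        uint8_t image[1024] = {0};
        uint8_t sig[64] = {0};
        struct timespec t0, t1;

        clock_gettime(CLOCK_MONOTONIC, &t0);
        int ok = verify_firmware_signature(image, sizeof image,
                                           sig, sizeof sig);
        clock_gettime(CLOCK_MONOTONIC, &t1);

        /* This delta is paid on every boot that verifies the firmware */
        double ms = (t1.tv_sec - t0.tv_sec) * 1e3
                  + (t1.tv_nsec - t0.tv_nsec) / 1e6;
        printf("verify: %s, %.3f ms on the boot path\n",
               ok == 0 ? "ok" : "FAILED", ms);
        return ok;
    }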

 

The NSA’s Commercial National Security Algorithm Suite 2.0 (CNSA 2.0) has been mentioned. Can you explain what CNSA 2.0 is, why it’s critical for security, and what the expected timeline for implementation looks like?

CNSA 2.0 recommends switching to post-quantum cryptography, with different timelines for different use cases – and those timelines are not that far in the future. For firmware authentication, for instance, it recommends already deploying these algorithms alongside the classic ones, and switching to post-quantum algorithms as the default at the beginning of next year. For these kinds of operations it is also very important that it helps us identify which algorithms, and which variants of the algorithms, need to be implemented, according to the advice given in CNSA 2.0. One particularity of the CNSA specification is that it includes LMS and XMSS, but without the multi-tree HSS and XMSS^MT variants – for some well-documented reasons, even though this seems to go a bit in the opposite direction from what NIST was indicating in its standards at that point. So we're still going to provide both interfaces, but eventually it's usually the market that decides. In our case that is especially true, because we often rely on hardware partners that need to implement those algorithms themselves inside the hardware, and once those parts become available on the market, this is likely the direction that some implementations will be forced to follow.
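
As a sketch of what the CNSA 2.0 restriction looks like in code, the fragment below verifies an LMS signature with a single tree (one level), i.e. without the multi-tree HSS construction. The function names follow recent wolfSSL releases (wolfssl/wolfcrypt/lms.h), but the exact names, parameters, and signatures should be treated as assumptions and checked against the header in your version.

    #include <wolfssl/options.h>
    #include <wolfssl/wolfcrypt/lms.h>

    /* Verify an LMS signature with a single tree (levels = 1), matching
     * CNSA 2.0's exclusion of the multi-tree HSS variant. The parameter
     * set (tree height 10, Winternitz 8) is only an example. */
    int verify_lms_single_tree(const byte* pub, word32 pubSz,
                               const byte* msg, int msgSz,
                               const byte* sig, word32 sigSz)
    {
        LmsKey key;
        int ret = wc_LmsKey_Init(&key, NULL, INVALID_DEVID);
        if (ret == 0)
            ret = wc_LmsKey_SetParameters(&key, 1, 10, 8);
        if (ret == 0)
            ret = wc_LmsKey_ImportPubRaw(&key, pub, pubSz);
        if (ret == 0)
            ret = wc_LmsKey_Verify(&key, sig, sigSz, msg, msgSz);
        wc_LmsKey_Free(&key);
        return ret; /* 0 on success */
    }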

 

How do Europe and European standardization efforts compare with initiatives like CNSA 2.0? Are there significant differences in approach or focus? Is Europe lagging behind? 

Yes, I think the head start that NIST gained by beginning its standardization process back in 2016 has allowed it to already define a family of crypto algorithms that should be the main building block. The approach I'm seeing these days at this ETSI conference targets a broader scope: it also looks at how the provisioning of keys needs to change and adapt to these new algorithms, and at how existing products need to migrate – not just at the crypto implementations. This has probably held ETSI a little behind on the standardization of the crypto modules themselves, but they are doing a great job ensuring that the crypto modules will be used in the right way, and providing guidelines for a smooth migration towards adopting these technologies, not just the crypto modules themselves. So I can't say that we are lagging behind; it's just that we are looking at the problem in a broader scope, and probably didn't have as much focus on specific cryptography algorithms as NIST.

 

When we talk about post-quantum readiness, what are some of the immediate concerns organizations should be focusing on? How can they begin mitigating risks today, even as standards are still evolving?

For some embedded projects, especially in specific verticals such as space – where we have a few customers and users – the lifetime of the product is really the challenge, as was also pointed out in other presentations in this morning's session. Sending an item into space now means it will have to stay alive for the next 20 years, and – if it's still reachable from this planet – it might run into trouble once its crypto algorithms have been defeated, because there is no way we can go up to space and replace them. So we need a way to be ready to switch out those algorithms when the time comes: either we ship the implementations of the new algorithms when we launch the next object into space, or we provide a reliable and secure update mechanism that can deliver those implementations when the time is right.
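
One common way to prepare for that switch is crypto agility: making the verification algorithm a run-time choice rather than a hard-wired call. The sketch below is a generic, hypothetical dispatch table – none of these names come from a specific product – where the firmware header carries an algorithm identifier and a later update can register a new verifier without touching the boot logic.

    #include <stddef.h>
    #include <stdint.h>

    typedef int (*verify_fn)(const uint8_t *msg, size_t msg_len,
                             const uint8_t *sig, size_t sig_len);

    /* Algorithm identifiers carried in the firmware image header */
    enum sig_alg { SIG_ECDSA_P256 = 1, SIG_ML_DSA_65 = 2 };

    /* Placeholder verifiers; a real device would call into its crypto
     * library here, classical and post-quantum side by side. */
    static int verify_ecdsa_p256(const uint8_t *m, size_t ml,
                                 const uint8_t *s, size_t sl)
    { (void)m; (void)ml; (void)s; (void)sl; return -1; }

    static int verify_ml_dsa_65(const uint8_t *m, size_t ml,
                                const uint8_t *s, size_t sl)
    { (void)m; (void)ml; (void)s; (void)sl; return -1; }

    static const struct { enum sig_alg id; verify_fn verify; } registry[] = {
        { SIG_ECDSA_P256, verify_ecdsa_p256 },
        { SIG_ML_DSA_65,  verify_ml_dsa_65  },
    };

    /* Dispatch on the identifier found in the image header;
     * unknown algorithms are rejected, never silently accepted. */
    int verify_with_alg(enum sig_alg id,
                        const uint8_t *msg, size_t msg_len,
                        const uint8_t *sig, size_t sig_len)
    {
        for (size_t i = 0; i < sizeof registry / sizeof registry[0]; i++) {
            if (registry[i].id == id)
                return registry[i].verify(msg, msg_len, sig, sig_len);
        }
        return -1;
    }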

 

Could you talk about wolfSSL’s current status in terms of post-quantum cryptography? What steps have you already taken, what’s currently in progress, and what should we expect in the near future as part of your roadmap?

Concerning post-quantum cryptography, we are working to introduce the new algorithms being standardized by NIST. But as I said earlier, we're also working hard on optimization – not just of performance but also of code size – for resource-constrained devices such as microcontroller-based embedded systems, because a big part of our users and customers run on restricted devices. As a company, we understand that we need to follow where the market is going and really address our customers' needs. We try to interact as much as possible with the customers that have specific post-quantum cryptography needs, to understand how they are using it and what their use case is, in order to facilitate the path to their actual target.
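
On the TLS side, a hybrid key exchange is the typical first step. The sketch below asks a wolfSSL TLS 1.3 client to offer an ECDHE + Kyber hybrid key share; the HAVE_PQC macro and the WOLFSSL_P256_KYBER_LEVEL1 group name match PQC-enabled wolfSSL builds at the time of writing, but naming has been moving toward ML-KEM, so treat both as assumptions and check ssl.h in your release.

    #include <wolfssl/options.h>
    #include <wolfssl/ssl.h>

    /* Offer a hybrid ECDHE + Kyber key share on a TLS 1.3 client.
     * Returns WOLFSSL_SUCCESS on success; call before wolfSSL_connect(). */
    int offer_hybrid_keyshare(WOLFSSL* ssl)
    {
    #ifdef HAVE_PQC
        /* Group name is an assumption; newer releases use ML-KEM naming */
        return wolfSSL_UseKeyShare(ssl, WOLFSSL_P256_KYBER_LEVEL1);
    #else
        (void)ssl;
        return WOLFSSL_FAILURE; /* built without post-quantum support */
    #endif
    }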

 

Finally, what’s the key message or takeaway you’d like to leave us with regarding the future of cryptography and the path toward post-quantum readiness?

Security is very important, that's true. But it doesn't depend only on cryptography: it depends on having good cryptography that is certified as respecting the standards, and also certified for safety. Nowadays this is a very important topic, especially if you're dealing with life-critical embedded systems, and it's something we need to take into account. The takeaway would be: if you're implementing an embedded system, think about security – and especially secure updates – very early in the process. Don't postpone the design of those components until later; try to anticipate these kinds of problems before it's too late, in the sense that once your architecture is already well-defined there is less room to move, and you end up taking shortcuts or trade-offs that penalize your security.


