Last year, the National Institute of Standards and Technology (NIST) started the standardization process for its selected post-quantum cryptography (PQC) algorithms, the final step before these mathematical tools become available for organizations around the world to integrate into their encryption infrastructure. Following this, the National Security Agency (NSA), the Cybersecurity and Infrastructure Security Agency (CISA) and NIST released a joint report containing recommendations for organizations to develop a roadmap for quantum readiness and prepare for future implementation of the PQC standards.
But another story also grabbed the headlines: Google announced that it was implementing a hybrid key encapsulation mechanism (KEM) to protect the sharing of cryptographic secrets when establishing secure network connections with the Transport Layer Security (TLS) protocol. Simply put, the world’s most popular browser began the process of “quantum-proofing” much of the public Internet.
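To make the jargon concrete, a KEM is a public-key scheme with three operations: key generation, encapsulation (which produces a ciphertext plus a shared secret for a given public key) and decapsulation (which recovers that shared secret from the ciphertext). The sketch below illustrates that generic interface in Python; the class and method names are placeholders for any Kyber binding, not Google’s or Chrome’s actual code.

```python
# Generic sketch of the KEM interface that schemes like Kyber follow.
# This is an illustrative placeholder, not Chrome's implementation.
from typing import Tuple

class KEM:
    def generate_keypair(self) -> Tuple[bytes, bytes]:
        """Return (public_key, secret_key)."""
        raise NotImplementedError

    def encapsulate(self, public_key: bytes) -> Tuple[bytes, bytes]:
        """Sender side: return (ciphertext, shared_secret) for the given public key."""
        raise NotImplementedError

    def decapsulate(self, ciphertext: bytes, secret_key: bytes) -> bytes:
        """Receiver side: recover the same shared_secret from the ciphertext."""
        raise NotImplementedError
```

In a TLS handshake that uses a KEM for key agreement, one side sends its public key, the other encapsulates against it and returns the ciphertext, and both end up with the same shared secret that seeds the session keys, without that secret ever crossing the wire in the clear.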
Google’s announcement was the product of a long chain of events, triggered by NIST’s selection of Kyber as a candidate for general encryption last year. The NIST process has been running since 2016, established in response to the growing threat that a cryptographically relevant quantum computer (CRQC) poses. When a working CRQC emerges, the encryption we rely on so extensively to protect our Internet sessions will melt away.
As a result, Google announced that it has added Kyber support, starting with version 116 of its Chrome browser. This was done through Google’s own custom implementation within TLS, a standard used widely in Internet communications.
Additionally, Google’s implementation of Kyber is hybrid, meaning that traditional elliptic curve cryptography is kept in place alongside Kyber. This mitigates risk by preserving continuous, proven protection against attacks from today’s classical computers, and it also hedges against the possibility that a weakness is later found in the newer Kyber algorithm.
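As a rough illustration of what “hybrid” means in practice, the sketch below combines a classical X25519 Diffie-Hellman shared secret with a Kyber shared secret by feeding both through a key derivation function. The X25519 and HKDF calls use the real Python `cryptography` package; the Kyber part is a hypothetical binding shown as a placeholder, and this is not the exact construction Chrome negotiates in TLS.

```python
# Minimal sketch of a hybrid key exchange: classical X25519 + post-quantum Kyber.
# X25519 and HKDF come from the real `cryptography` package; the Kyber shared
# secret is a placeholder standing in for a hypothetical Kyber/ML-KEM binding.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def derive_hybrid_secret(x25519_shared: bytes, kyber_shared: bytes) -> bytes:
    # Concatenate both shared secrets and run them through a KDF, so the
    # resulting key stays safe as long as either component remains unbroken.
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"hybrid x25519+kyber example",
    ).derive(x25519_shared + kyber_shared)

# Classical part: an ordinary X25519 Diffie-Hellman exchange.
client_ecdh = X25519PrivateKey.generate()
server_ecdh = X25519PrivateKey.generate()
x25519_shared = client_ecdh.exchange(server_ecdh.public_key())

# Post-quantum part (hypothetical Kyber binding would produce this via
# encapsulate/decapsulate); a fixed placeholder keeps the sketch runnable.
kyber_shared = b"\x00" * 32

session_key = derive_hybrid_secret(x25519_shared, kyber_shared)
```

Because the two secrets are hashed together, an attacker would have to break both the elliptic curve component and the Kyber component to recover the session key.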
Why you’re not safe yet
Google’s action is significant in many ways: the world’s largest Internet browser, used by online users everywhere, has begun its migration to post-quantum cryptographic protection. This is a huge step in migration efforts that are already underway and that, if we take harvest now, decrypt later (HNDL) attacks into consideration, are already late. But it will still take some time before we can truly say that users are protected from a quantum attack.
First, it appears that Google has only updated the Chrome browser on the client side. For any connection to be quantum-secure, the servers in question must also be upgraded to Kyber, but Google doesn’t appear to have done this for its own apps yet.
Added to this, the surface we need to protect goes beyond simply securing connections: we need to consider apps beyond the Google environment. Each cloud application provider will also have to do server-side work to ensure that Chrome users can establish a secure connection with them using Kyber, and that isn’t going to happen anytime soon.
This becomes more complex when we consider that the TLS protocol, into which Google has added its custom Kyber implementation, is managed by the Internet Engineering Task Force (IETF). The IETF has not yet ratified a standard way for companies to add post-quantum algorithms to TLS, which must happen before widespread adoption can occur.
The final caveat is the question of how communications are secured deeper behind the scenes, such as the links between data centers. There is no point in securing user-to-application connections if data is harvested en masse as it moves between data centers. This will require a separate solution, such as the quantum-secure virtual private network that NATO uses.
What if you can’t wait?
It is now well documented that HNDL attacks, in which sensitive data with a long shelf life is collected by those intending to decrypt it once a sufficiently powerful quantum computer arrives, are already happening. For many, the shopping list above will not be good news, and even less so for those who need to keep highly sensitive data safe for a long time. For them, mitigation measures need to come much earlier. You cannot wait until new post-quantum algorithms are integrated into shared public infrastructure, because you will probably be waiting more than a decade.
As a result, the Google news highlights the urgency for organizations to chart their own migration path rather than wait to be pushed by others. Instead of waiting for public infrastructure to be upgraded, for example, an organization can create a bespoke end-to-end infrastructure that is quantum-safe by design, where everything from business processes to daily internal communications is protected. This way you do not have to wait for others to update or for algorithms to be approved. You can have the protection you need for the next 50 years, today.
The first/last mile problem is still present
Google’s update does not relieve the pressure for many, but it is certainly a milestone if we look at it through the lens of a broader public infrastructure update. Post-quantum migration is a multi-year journey, and it may not be complete until after a functioning CRQC arrives, which will be too late.
To borrow a phrase from the world of logistics and telecommunications, we still have a first/last mile problem. While those industries have largely solved the efficiency and speed challenges of bringing goods and services to the home, this final stretch is where things can go horribly wrong from an end-to-end cybersecurity perspective. Organizations that need the most urgent protection from the quantum threat require a tailored approach, and they need it today.
A hybrid approach, in which post-quantum and traditional encryption algorithms are combined, offers public key cryptography that is truly interoperable and resistant to both quantum and traditional threats. However, this work goes beyond simply implementing algorithms: it can have unintended consequences for performance and can introduce new risks. An organization will only be truly quantum-secure when it is secure end-to-end, which means new approaches to identity, access management and human risk will be essential.