Sometime around 2010, sophisticated malware known as Flame hijacked the mechanism that Microsoft used to distribute updates to millions of Windows computers around the world. The malware—reportedly jointly developed by the US and Israel—pushed a malicious update throughout an infected network belonging to the Iranian government.
The linchpin of the attack was a chosen-prefix “collision” exploiting MD5, a cryptographic hash function Microsoft was using to authenticate digital certificates. By minting a valid MD5-based digital signature, the attackers forged a certificate that authenticated their malicious update server. Had the attack been used more broadly, it would have had catastrophic consequences worldwide.
The event, which came to light in 2012, now serves as a cautionary tale for cryptography engineers as they contemplate the downfall of two crucial cryptography algorithms used everywhere. Since 2004, MD5 has been known to be vulnerable to “collisions,” a fatal flaw that allows adversaries to generate two distinct inputs that produce identical outputs.
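Real MD5 breaks, like the chosen-prefix collision Flame used, exploit structural weaknesses in the algorithm rather than brute force. Still, the core property being violated is easy to illustrate: the toy sketch below truncates MD5 to 3 bytes so that a generic "birthday" search finds two distinct inputs with identical outputs in a few thousand tries. The message format is an arbitrary choice for illustration.

```python
import hashlib
from itertools import count

def truncated_md5(data: bytes, nbytes: int = 3) -> bytes:
    """First `nbytes` bytes of the MD5 digest: a deliberately weak toy hash."""
    return hashlib.md5(data).digest()[:nbytes]

def find_collision(nbytes: int = 3) -> tuple[bytes, bytes]:
    """Birthday search: hash distinct messages until two share an output."""
    seen: dict[bytes, bytes] = {}
    for i in count():
        msg = f"msg-{i}".encode()
        digest = truncated_md5(msg, nbytes)
        if digest in seen:
            return seen[digest], msg  # two distinct inputs, one output
        seen[digest] = msg

a, b = find_collision()
print(a != b, truncated_md5(a) == truncated_md5(b))  # True True
```

A 24-bit output space needs only about 2^12 ≈ 4,096 attempts on average; against full 128-bit MD5, a generic birthday attack is infeasible, which is why the practical attacks target the algorithm's internal structure instead.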
Within four years, two other pieces of research further demonstrated the weakness of MD5. The latter used a cluster of 200 Sony PlayStation 3 consoles running for three days to mint a rogue certificate authority certificate trusted for TLS. Despite the fatal flaw being well known, a small part of Microsoft’s sprawling infrastructure still used the hash function.
Determined to keep a similar scenario from playing out again, organizations everywhere are rolling out new algorithms to replace RSA and elliptic curves. For more than three decades, the two public-key algorithms have been known to be vulnerable to Shor’s algorithm, a quantum algorithm that allows a quantum computer of sufficient strength to solve the mathematical problems underpinning these two algorithms in polynomial time, a dramatic speed-up from the exponential (or, for factoring, sub-exponential) time required by the best known classical attacks.
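Most of Shor's algorithm is actually classical; only one step, finding the period of modular exponentiation, requires a quantum computer. The sketch below (a standard textbook reduction, not production code) brute-forces that period classically for the tiny modulus 15 to show how a known period yields the factors.

```python
from math import gcd

def multiplicative_order(a: int, n: int) -> int:
    # Brute-force the period r with a**r = 1 (mod n). Finding r quickly
    # is the ONLY step Shor's algorithm delegates to a quantum computer;
    # everything else here is efficient classical post-processing.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> tuple[int, int]:
    r = multiplicative_order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        raise ValueError("unlucky base a; retry with another")
    # gcd(a**(r/2) - 1, n) shares a nontrivial factor with n
    p = gcd(pow(a, r // 2) - 1, n)
    return p, n // p

print(shor_factor(15, 7))  # (3, 5)
```

For cryptographic moduli the classical period search takes exponential time; a CRQC would replace it with quantum period finding, which is the entire threat.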
Earlier this month, both Google and Cloudflare bumped up their internal deadlines for PQC (post-quantum cryptography) readiness to 2029, an acceleration of roughly five years. The moves were largely prompted by two pieces of research suggesting that a CRQC (cryptographically relevant quantum computer) may arrive sooner than previously estimated.
While there’s little known evidence that a CRQC will emerge in the next four years, the revised deadlines set a good example for peers such as Amazon and Microsoft, whose timelines are two to six years longer. They also largely align with US government goals; the Defense Department is requiring all national security systems to use quantum-safe algorithms by December 31, 2031, and the National Institute of Standards and Technology is calling for the deprecation of vulnerable algorithms by 2035. While many experts strongly doubt CRQC will arrive by 2029, others say an industry-wide acceleration is necessary given the stakes and the difficulty of the work required to be ready.
“You have to remember that transitioning the Internet to post-quantum, especially for digital signatures, is a massive undertaking,” Dan Boneh, a computer scientist and cryptographer at Stanford University, said in an interview. “It would be amazing if the entire Internet can get it all done by 2029. By setting a 2029 goal, they are giving themselves some slack in case they fail to meet that deadline. If they target 2035 and miss by two to three years, we are getting uncomfortably close to the danger zone.”
Brian LaMacchia, a cryptography engineer who oversaw Microsoft’s post-quantum transition from 2015 to 2022 and now works at Farcaster Consulting Group, agreed.
PQC readiness “is mostly actuarial/risk management—even if the chance of building a CRQC by, say, 2030 is very low (say 5 percent), the downside risk is huge,” he explained. “Combine that with very long transition engineering times, and you should have started already.”
“Unless you believe that there is something fundamental in quantum physics that prevents the construction of a big enough quantum computer, that probability is greater than zero,” he added. “So that’s the risk we’re trying to mitigate with the PQC transition: the risk that the race to build a CRQC is successful and that capability lands in adversarial hands before we have upgraded all of our cryptographic systems to quantum-resistant algorithms.”
Until now, most of the attention paid to PQC has focused on the threat of using Shor’s algorithm to break RSA encryption, a feat estimated to be at least a decade or so out. That timeline has prompted security engineers to focus their preparations on warding off harvest-now-decrypt-later (HNDL, also called store now, decrypt later) threats, in which adversaries squirrel away encrypted data flowing over the Internet with the plan of decrypting it on Q-Day—the date a CRQC arrives.
Most of the preparation for this eventuality has involved updating vulnerable encryption to use the Module-Lattice-Based Key Encapsulation Mechanism—typically known as ML-KEM and standardized in FIPS 203—a PQC algorithm based on problems for which quantum computers have no known advantage over classical computing. Given the relatively low number of protocols involved, this work has been comparatively easy.
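In practice, ML-KEM is deployed in hybrid mode alongside a classical key exchange (for example, the X25519MLKEM768 group in TLS 1.3), so the session stays secure as long as either problem remains hard. The sketch below compresses that idea into a single hash call; the random byte strings are hypothetical stand-ins for the two real shared secrets, and real protocols use a full key schedule rather than one SHA-256 invocation.

```python
import hashlib
import secrets

# Hypothetical stand-ins: in a real handshake, classical_ss would come from
# an X25519 exchange and pq_ss from an ML-KEM encapsulation. Random bytes
# here merely model two independently derived 32-byte shared secrets.
classical_ss = secrets.token_bytes(32)
pq_ss = secrets.token_bytes(32)

# Hybrid construction: derive the session key from BOTH secrets, so an
# attacker must break BOTH the classical and the post-quantum scheme
# to recover it.
session_key = hashlib.sha256(classical_ss + pq_ss).digest()
print(len(session_key))  # 32
```

This belt-and-suspenders design hedges against both a future CRQC and the possibility that a newly standardized lattice scheme harbors an undiscovered classical weakness.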
The two recent research papers focused on something else: breaking ECC (elliptic curve cryptography) used for digital signatures. It’s hard to overstate the importance of such signatures and the number of applications using them. They verify the authenticity and integrity of messages, documents, software, hardware, remote SSH logins, and TLS certificates, to name just a few.
In one paper, researchers from the firm Oratomic showed that a relatively new approach for building a working quantum computer using neutral atoms could break ECC with as few as 10,000 physical qubits, orders of magnitude fewer than the most recent lowest-bound estimate. A qubit is the quantum equivalent of a bit in classical computing. While a bit is either a 1 or a 0, a qubit can be in a superposition of both states.
There are two varieties of qubits: (1) physical qubits, the error-prone hardware devices that actually make up a machine, and (2) logical qubits, error-corrected abstractions built out of many physical qubits so that naturally occurring errors don’t render computations useless. Typical estimates hold that each logical qubit requires 100 to 1,000 physical qubits.
Google showed in the second paper that two quantum circuits it developed needed only 1,200 logical qubits to break 256-bit ECC—which is used to secure blockchains for bitcoin and other cryptocurrencies—in just nine minutes, a period short enough for adversaries to spend other people’s funds in real time.
One such circuit required only 90 million Toffoli gates, a resource-intensive quantum operation that remains prohibitively difficult to execute at that scale. A second circuit needed fewer than 1,450 logical qubits and 70 million Toffoli gates. Google estimated that such systems would require 500,000 physical qubits, half of what the same team estimated last June was needed to break 2048-bit RSA.
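As a quick sanity check on the figures reported above, the implied error-correction overhead of Google's estimate falls comfortably inside the 100-to-1,000 physical-per-logical rule of thumb:

```python
# Figures from the Google paper as reported above.
logical_qubits = 1_450
physical_qubits = 500_000

# Implied error-correction overhead: physical qubits per logical qubit.
overhead = physical_qubits / logical_qubits
print(round(overhead))  # 345
```

About 345 physical qubits per logical qubit, meaning the estimate's aggressiveness lies in the total machine size and gate counts, not in any unusually optimistic error-correction assumption.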
In the weeks before and after the release of the papers, both Google and Cloudflare announced their accelerated deadlines for full quantum readiness to 2029. With ECC appearing likely to fall before RSA—and sooner than previously expected—both companies are now prioritizing quantum-proofing the ECC-based authentications that form the gates used to keep adversaries out of private networks, computers, and other critical systems.
“An imminent Q-Day flips the script: data leaks are severe, but broken authentication is catastrophic,” Bas Westerbaan, a Cloudflare principal researcher, wrote last week. “Any overlooked quantum-vulnerable remote-login key is an access point for an attacker to do as they wish, whether that’s to extort, take down, or snoop on your system. Any automatic software-update mechanism becomes a remote code execution vector. An active quantum attacker has it easy—they only need to find one trusted quantum-vulnerable key to get in.”
Google’s announcement and other company communications have also signaled a new priority on migrating to quantum-safe authentication schemes, lest adversaries use quantum computers in real-time attacks like the one Google demonstrated. Previously, with Q-Day estimates far off, the priority was augmenting existing vulnerable encryption with ML-KEM. On a shorter timeline, the push shifts to authentication, a much more resource-intensive undertaking.
“Unlike post-quantum encryption, which takes one big push, migrating to post-quantum authentication has a long dependency chain—not to mention third-party validation and fraud monitoring,” Westerbaan wrote. “This will take years, not months.”
Recent advances increase the likelihood that adversaries will be able to tamper with live connections, not just decrypt past communications, sooner than expected. Authentication mechanisms are used almost everywhere in the infrastructure of Cloudflare and just about everywhere else on the Internet. There are more third-party dependencies than anyone can count. TLS certificates and other forms of X.509 authentication are a prime example.
Once Q-Day arrives, any certificate based on ECC—and RSA, once it too can be attacked by CRQC—can be spoofed. That capability would allow adversaries to cryptographically impersonate an untold number of websites, email servers, and digital signing systems. Similar threats apply to SSH keys and other critical applications.
So far, Google and Cloudflare are the only two Big Tech players publicly calling for a 2029 deadline for full quantum readiness.
In an email, Matthew Campagna, senior principal engineer for cryptography at Amazon, said the company is on track to meet or beat the previously mentioned December 31, 2031, deadline set two years ago by the Defense Department. Campagna, who is also the chair of the Quantum-Safe Cryptography Working Group at the European Telecommunications Standards Institute, was reiterating a deadline he previously set around the same time that the Defense Department mandate arrived.
Interestingly, Amazon relies in part on SigV4, a symmetric-key request-signing protocol it developed in-house, to keep authentication quantum-safe.
“AWS limits the transmission of these secrets to the moment of generation,” Campagna wrote. “Once initially distributed, it is never re-sent to the customer. While we made this decision to operate at the massive scale of AWS, we avoided the need to migrate [to] a public-key based authentication solution.”
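SigV4's quantum resistance comes from its reliance on symmetric primitives: per AWS's published scheme, each request is signed with a short-lived key derived from the long-term secret through a chain of HMAC-SHA256 calls, with no public-key cryptography involved. The sketch below reproduces that documented key-derivation chain; the credential and date values are illustrative placeholders.

```python
import hashlib
import hmac

def _hmac(key: bytes, msg: str) -> bytes:
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def sigv4_signing_key(secret: str, date: str, region: str, service: str) -> bytes:
    # AWS SigV4 derives a per-date, per-region, per-service signing key
    # from the long-term secret via chained HMAC-SHA256 -- symmetric
    # operations with no known meaningful quantum speed-up, which is why
    # this path sidesteps the public-key migration entirely.
    k_date = _hmac(("AWS4" + secret).encode(), date)
    k_region = _hmac(k_date, region)
    k_service = _hmac(k_region, service)
    return _hmac(k_service, "aws4_request")

# Placeholder credential values for illustration only.
key = sigv4_signing_key("EXAMPLE-SECRET", "20291231", "us-east-1", "s3")
print(len(key))  # 32
```

The derived key then authenticates a canonicalized request via one more HMAC; because both parties already share the secret, no quantum-vulnerable signature ever crosses the wire.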
For customers who need long-lived roots of trust, Amazon uses its AWS Private CA (certificate authority) with KMS, a key management service that supports FIPS 204, the NIST standard for the ML-DSA post-quantum digital signature algorithm. Customer data at rest is encrypted and then stored using AES-256, a symmetric algorithm against which quantum computers offer at most a modest (quadratic) speed-up, leaving an ample security margin.
The most distant PQC readiness deadline is 2033 for Microsoft. Meta and Apple didn’t provide any date at all when asked earlier this week.
“Post-quantum cryptography (PQC) isn’t a flip-the-switch change,” Mark Russinovich, Azure CTO and deputy CISO and technical fellow at Microsoft, wrote in an email. “We have been at the forefront of PQC planning since 2014 as a founding member of the Open Quantum Safe project and a close collaborator with vendors, standards bodies, and government agencies.”
He added that Microsoft’s rollout is guided by three principles: “Prioritize standards—follow NIST, not proprietary crypto; avoid breaking global customers; and roll out in a platform-focused way, starting with Windows, Azure, and identity layers. This mirrors past transitions with SHA and TLS, but with greater urgency given quantum risk.” Note that Russinovich didn’t mention Microsoft’s migration off of MD5.
Meta, meanwhile, hasn’t publicly stated its deadline. On Thursday, the company published a post that mostly rehashed a previous one from two years ago. Neither set a deadline. Instead, Thursday’s post was aimed at advising the industry on key principles. It also introduced a taxonomy of “PQC maturity levels.” They are PQ hardened, PQ ready, PQ aware, and PQ unaware.
Meta’s Rafael Misoczki, Isaac Elbaz, and Forrest Mertens said PQ hardened is “the level at which full quantum protection is effectively achieved,” something they called the “platinum standard.” They advised all companies to aim for this level but didn’t outline a timeline. Meta didn’t answer questions about an internal deadline for the company to reach this milestone.
Apple representatives, meanwhile, didn’t respond to our email.
As the joke goes, CRQC has been 10 to 20 years away for the past three decades. While the recent research suggests that steady progress is being made, the chances of CRQCs arriving before 2035 remain slim. That means Big Tech companies’ roadmaps are likely adequate.
“There is a lot of work left to be done to build a Shor-size quantum computer,” Boneh, the Stanford computer scientist, said. “To get it done in only four years will likely take a Manhattan-size effort (speaking figuratively). That is unlikely to happen.”
That said, the 2010 mishap with MD5 is illustrative. LaMacchia, the former Microsoft engineer, said that Flame attacked an “old product’s PKI [public key infrastructure],” which he believed “was not centrally managed and failed to follow corporate guidance on migrating off MD5.”
Similar lapses are likely to occur in the PQC transition as well.
“Moving to PQC by 2029 is totally reasonable, especially in light of what we learned a couple weeks ago that moved the timelines forward,” Scott Aaronson, a computer scientist specializing in the computational resources required for a CRQC, said in an interview. “Of course, no one knows how long CRQC will take, but a lot of people aren’t even engaging with what’s happening on the ground, as if in denial.”
That denial, and the likelihood of forgotten software dependencies and legacy hardware, will further delay the transition and could doom the world to repeat the painful lapse from 2010.