Post-quantum in 2025: the easy half and the hard half
- Alexey
- 7 days ago
- 5 min read
The push towards quantum-safe algorithms shows a clear split. Key exchange is the more approachable step that helps defend against “harvest now, decrypt later” attacks, but digital signatures are where the real hard yards begin. What’s driving this split — and what challenges are holding the harder half back?

Migrating to post‑quantum cryptography (PQC) is becoming an unavoidable step for businesses looking to stay secure in the years ahead. PQC is a new generation of public‑key algorithms designed to replace the classical ones that cryptographic standardisation bodies (NIST, BSI, ANSSI, etc.) consider at risk from quantum computers. They perform the same jobs—establishing shared secrets and producing digital signatures—but use very different mathematics under the hood, mathematics that remains secure even in the face of quantum computers. There are two families of cryptographic schemes involved in the transition: key establishment (also called key encapsulation mechanisms, or KEMs) and digital signatures. KEM migration is essential to protect against “harvest‑now, decrypt‑later” attacks. Without PQ signatures, a future quantum-capable attacker could forge digital certificates and impersonate websites, services, or humans.
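To make the KEM half concrete, here is a toy sketch of the three-call API shape (key generation, encapsulation, decapsulation) that ML-KEM and other KEMs share. It is built on classic Diffie-Hellman purely for illustration; it is neither post-quantum nor secure, and the parameters are far too small for real use:

```python
import hashlib
import secrets

# Toy parameters: a Mersenne prime and a small generator. Fine for
# illustrating the API shape of a KEM; hopeless for actual security.
P = 2**127 - 1
G = 3

def keygen():
    """Return a (secret key, public key) pair."""
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return sk, pk

def encapsulate(pk):
    """Sender side: derive a fresh shared secret against a public key.

    Returns (ciphertext, shared_secret); only the ciphertext goes on the wire.
    """
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)  # an ephemeral DH share acting as the "ciphertext"
    ss = hashlib.sha256(pow(pk, r, P).to_bytes(16, "big")).digest()
    return ct, ss

def decapsulate(sk, ct):
    """Receiver side: recover the same shared secret from the ciphertext."""
    return hashlib.sha256(pow(ct, sk, P).to_bytes(16, "big")).digest()

sk, pk = keygen()
ct, ss_sender = encapsulate(pk)
ss_receiver = decapsulate(sk, ct)
assert ss_sender == ss_receiver  # both sides now hold the same 32-byte secret
```

Real KEMs such as ML-KEM expose exactly this interface, which is why swapping them into a protocol that already speaks "encapsulate/decapsulate" is comparatively painless.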
There’s a big practical difference between key agreement and digital signatures on the road to post‑quantum crypto. Key agreement—the step that sets up shared secrets—is mostly a drop‑in upgrade. The tricky parts aren’t about PQC itself so much as the plumbing: you need to move from TLS 1.2 to TLS 1.3, since TLS 1.3 is a prerequisite for PQC in HTTPS. Signatures, by contrast, are where the wheels wobble—because PQ signatures are bigger, costlier, and deeply embedded across the PKI.
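Since TLS 1.3 is the prerequisite, the first concrete step on the client side is simply refusing anything older. A minimal Python sketch of that plumbing step follows; note that whether a PQ hybrid group such as X25519MLKEM768 is then actually negotiated depends on the linked TLS library build and the peer, not on this snippet:

```python
import ssl

# Pin the minimum protocol version: PQC key exchange in HTTPS rides on
# TLS 1.3, so connections below that version are refused outright.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_3

# Any socket wrapped with this context will now only speak TLS 1.3; the
# key-exchange group itself (classical or PQ hybrid) is chosen during the
# handshake by the underlying TLS library when both endpoints support it.
```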
Cloudflare’s latest report (https://blog.cloudflare.com/pq-2025/) brings the data to life. Adoption of post-quantum KEMs is advancing, though unevenly—progress in HTTPS (TLS) is strong, while other protocols still face significant hurdles. Meanwhile, digital signature adoption has barely begun. As a bonus, the report offers an intriguing behind-the-scenes look at how Cloudflare is progressing with its own internal rollout.
Where post-quantum KEM adoption stands
Post‑quantum key agreement, particularly the KEM family, is rapidly being adopted across the web. Progress is visible both in public traffic and in business‑to‑business connections. The examples below illustrate where the industry stands on PQ KEM implementation and the challenges that still need to be solved before widespread adoption becomes reality.
To understand how PQC adoption is taking shape, it helps to break down KEM deployment into a few practical perspectives that show where progress is happening and where challenges remain. Think about KEM adoption through two simple lenses:
Browsers & user‑initiated traffic: Well over 50% of human web traffic to Cloudflare is now PQ‑protected. This is the fastest‑moving piece, driven by browsers (Chrome, Edge, Safari, Firefox, etc.) defaulting to PQC KEMs. On the server side, scans of the top 100,000 domains show that support for PQC KEMs rose from 28% to 39% over 2025—a strong step forward, even if there’s still more ground to cover.
B2B links: Cloudflare’s services rely on traffic flowing through its network before reaching the customer’s own server (usually called the origin). In its report, Cloudflare shared what this looks like from a B2B perspective: only about 3.5% of origin servers currently support PQC. That lag is largely due to the wide mix of software stacks and middleboxes still in use across enterprises. So if your B2B infrastructure isn’t yet migrated to PQC KEMs, you’re in good company—many others are still on that journey too.
Cloudflare’s comments about its own internal rollout are both revealing and fascinating. Every engineering team was asked to pause ongoing work and focus on upgrading internal connections to PQ KEMs. Most of these changes turned out to be simple software updates, yet the lingering “long tail” of unfinished work continues to pose challenges. First, asking engineers to pause other projects and enable PQC shows serious organisational commitment—what motivated that prioritisation? Second, it’s encouraging that most updates turned out to be simple software changes. And finally, the mention of a lingering “long tail” is a useful reminder that even in the relatively straightforward KEM space, completing the last bits of work often demands far more effort than expected.
Why post-quantum signatures are the hard part
The most technically challenging part of PQC adoption is the signature layer. PQ signatures are larger, slower, and harder to implement securely than key exchange mechanisms, making them a major bottleneck for real-world deployment. Without PQ signatures, digital communication will remain vulnerable to future quantum-capable attackers. That means certificates (public keys and signatures) also need upgrading—and that’s where the real difficulties begin.
NIST has finalised the first wave of PQC digital‑signature standards: ML‑DSA and SLH‑DSA (both published in 2024). Falcon (now FN‑DSA) is still tracking toward standardisation, and additional schemes are being researched. NIST standardised multiple signature families for a reason—none is a perfect drop‑in, and each involves trade‑offs in size, speed, security assumptions, and implementation safety.
A typical public‑web TLS handshake today effectively carries five signatures and two public keys. That was fine with tiny ECDSA signatures (the elliptic‑curve signatures in use today), but PQ signatures and keys are much larger. For instance, if we were to switch entirely to ML‑DSA‑44—the smallest and least secure variant of the ML‑DSA family—we’d be adding roughly 15 kB of extra data to each TLS handshake, sent from the server to the client. That’s a noticeable increase. Waiting for FN‑DSA‑512, which is more compact, would reduce that to about 7 kB. However, it comes with its own trade‑offs: it relies on floating‑point arithmetic and demands a highly complex, side‑channel‑resistant implementation, which raises the risk of timing attacks.
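Those figures can be sanity-checked with a quick back-of-the-envelope sum. The sketch below uses the published ML-DSA-44 and FN-DSA-512 parameter sizes and typical ECDSA P-256 encodings; it is a raw lower bound that ignores certificate and protocol encoding overhead, so real-world deltas run somewhat higher:

```python
# (public key, signature) sizes in bytes, from the published parameter
# sets; ECDSA P-256 is an uncompressed point plus a DER-encoded signature,
# and the FN-DSA-512 (Falcon-512) signature size is an average, since it varies.
SIZES = {
    "ECDSA P-256": (65, 72),
    "ML-DSA-44":   (1312, 2420),
    "FN-DSA-512":  (897, 666),
}

def handshake_bytes(scheme, n_sigs=5, n_keys=2):
    """Raw bytes for the 5 signatures + 2 public keys a handshake carries."""
    pk, sig = SIZES[scheme]
    return n_sigs * sig + n_keys * pk

baseline = handshake_bytes("ECDSA P-256")  # 490 bytes today
for scheme in ("ML-DSA-44", "FN-DSA-512"):
    extra = handshake_bytes(scheme) - baseline
    print(f"{scheme}: +{extra / 1024:.1f} kB per handshake")
# ML-DSA-44:  +13.9 kB
# FN-DSA-512: +4.5 kB
```

The raw sums land in the same ballpark as the figures quoted above; the exact published numbers depend on how each signature and key in the chain is counted and encoded.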
Adding just a few extra kilobytes per handshake can noticeably slow things down—especially on mobile networks or unreliable connections—leading to more dropped sessions. This may not matter much for B2B backend batch transfers, but it is critical in the consumer space, where page loads are tightly measured and optimised.
Real-world tests have shown that some clients or middleboxes simply can’t handle certificate chains larger than about 10 kB and will drop the connection altogether.
In TLS, where sessions often transfer hundreds of kilobytes or even megabytes, the performance hit from larger signatures is usually masked by the overall data volume. But not all connections are that heavy. For example, many QUIC connections are lightweight—Cloudflare reports that the median amount of data transferred from server to client is only around 4.4 kB—so with PQ signatures, the handshake itself could end up larger than the actual payload. That’s not a great look.
HTTPS (TLS) is not all we care about
Standards work beyond NIST’s algorithmic standardisation is equally vital. For some protocols—like KEMs in TLS—the change is little more than assigning a new algorithm ID. But others, including DNSSEC, IPsec, X.509, S/MIME, and OpenPGP, require much deeper re‑engineering to embed PQC properly. Beyond these, additional protocols such as SSH, ACME (for certificate issuance), JOSE/COSE (for API and IoT signing), MLS (for secure group messaging), and even FIDO2/WebAuthn are still in the early stages of PQ integration, each facing its own unique set of design and interoperability challenges (https://pqcc.org/heatmap-current-state-of-pqc-standards-and-adoption/).
I genuinely appreciate Cloudflare’s work and the insights they share. Much of this article draws on information from their publication (https://blog.cloudflare.com/pq-2025/), which offers an excellent overview of the state of post‑quantum cryptography. For anyone interested in going deeper, I highly recommend reading their post for further technical details, research references, and context behind the points discussed here. Where I’ve added my own commentary or additional data, I’ve included supporting references throughout the text.
