Reading between the lines of the IEEE's PQC algorithms benchmark publication
- Alexey

- Mar 27
- 3 min read
Updated: Sep 18
The IEEE's recent publication titled "Benchmarking Post-Quantum TLS" (https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10844321) presents a performance analysis of post-quantum cryptography (PQC) algorithms within Transport Layer Security (TLS) implementations. I'd like to offer additional context that may assist in interpreting the IEEE's findings, particularly regarding their broader implications and practical applicability.

Quantum risk affects Key Encapsulation Mechanisms (KEMs) and digital signatures, both of which fall under the broader category of public-key cryptography. This discussion, like the IEEE paper, centres on these two cryptographic primitives.
In short, quantum computing has undermined several classical cryptographic algorithms, as their foundational assumptions no longer hold against quantum-capable adversaries. In response, the U.S. National Institute of Standards and Technology (NIST) has standardized new PQC algorithms and set timelines for phasing out vulnerable legacy algorithms: deprecation by 2030 and disallowance by 2035. While PQC algorithms deliver the same cryptographic functionalities as their predecessors, they rely on different mathematical foundations that are resistant to quantum attacks. However, PQC algorithms require more computing resources and memory, which can cause compatibility challenges and necessitate system redesigns.
In many practical implementations of internet protocols and applications, a combination of cryptographic primitives (algorithms) is used to deliver the intended functionality securely. For example, in secure messaging protocols, symmetric encryption is often used to protect the actual message content, while a KEM (a sub-class of public-key cryptography) is employed to establish the secure session key. Similarly, digital signatures may be combined with hash functions to ensure both the integrity and authenticity of a message. These primitives, hash functions, digital signatures, and KEMs, are orchestrated together in protocols like TLS to provide confidentiality, integrity, authentication, and forward secrecy; the sketch below illustrates the idea.
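To make that orchestration concrete, here is a minimal Python sketch using the widely available `cryptography` package. Classical X25519 and Ed25519 stand in for their PQC counterparts (ML-KEM and ML-DSA), which this package is not assumed to ship; the structure of the handshake, not the specific algorithms, is the point.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# 1. Key establishment: both parties derive the same shared secret over an
#    open channel (X25519 here, standing in for a PQC KEM such as ML-KEM).
client, server = X25519PrivateKey.generate(), X25519PrivateKey.generate()
shared_secret = client.exchange(server.public_key())

# 2. A hash-based KDF turns the shared secret into a symmetric session key.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"demo handshake").derive(shared_secret)

# 3. Symmetric encryption protects the actual message content.
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"application data", None)

# 4. A digital signature (Ed25519 here, standing in for ML-DSA) lets the
#    peer verify origin and integrity; verify() raises on any tampering.
identity_key = Ed25519PrivateKey.generate()
signature = identity_key.sign(ciphertext)
identity_key.public_key().verify(signature, ciphertext)
```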
Turning to KEMs: they are used at the start of a secure data transmission session to agree on a secret key over an open channel. Although data is exchanged publicly, KEMs ensure that only the communicating parties can derive the shared secret, even if an adversary intercepts all traffic. Adopting PQC at the KEM level protects the master secret used for a session, mitigating the risk that traffic recorded today is decrypted once quantum attacks become feasible (so-called "harvest now, decrypt later" attacks). When considering KEM performance, an important observation is that the overhead is fixed per handshake, so its relative weight depends on the volume of data transferred: it is significant when the data exchanged is minimal, and relatively negligible for large transfers. Real-world network conditions also influence practical performance. Hardware configuration and capacity, coupled with other common issues like delays, routing problems, and packet drops, can affect PQC implementations outside laboratory conditions in unpredictable ways. Many organizations, for instance, employ Data Loss Prevention (DLP) systems that terminate and reestablish TLS sessions, introducing extra handshakes that degrade performance.
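Back-of-the-envelope arithmetic shows the amortization effect. The wire sizes below are the published ones for ML-KEM-768 (FIPS 203) and X25519; the payload figures are arbitrary examples, and framing and certificates are ignored.

```python
# ML-KEM-768 sends a 1184-byte encapsulation key plus a 1088-byte ciphertext,
# versus two 32-byte public keys for a classical X25519 exchange.
MLKEM768_BYTES = 1184 + 1088
X25519_BYTES = 32 + 32
extra = MLKEM768_BYTES - X25519_BYTES  # 2208 extra bytes per handshake

for payload in (1_000, 100_000, 100_000_000):  # 1 kB, 100 kB, 100 MB
    print(f"{payload:>11,} B payload -> +{extra / payload:8.2%} size overhead")
# 1 kB:   +220.80%  (handshake dominates)
# 100 kB: +  2.21%
# 100 MB: +  0.00%  (effectively free)
```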
Digital signatures are equally critical, ensuring data integrity and authenticity by confirming a message's origin and verifying that it has not been altered. However, transitioning to PQC digital signatures poses even greater computational and memory challenges than KEMs, complicating implementation significantly. Unlike KEMs, digital signatures typically do not face an immediate "harvest now, decrypt later" threat: a signature is verified at the time of use, so recording signed traffic today does not let a future quantum attacker retroactively forge past authentications. Yet the substantial increase in key and signature sizes, and the storage and bandwidth requirements that come with it, means proactive remediation is advisable despite that temporary relief.
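The size increase is easy to quantify from the published parameter sets. The figures below compare Ed25519 (RFC 8032) with ML-DSA-65 (FIPS 204, the former Dilithium3); the three-certificate chain depth is just an illustrative assumption.

```python
# Bytes per public key and signature (published parameter sizes)
ED25519 = {"public_key": 32, "signature": 64}
MLDSA65 = {"public_key": 1952, "signature": 3309}  # ML-DSA-65 / Dilithium3

CHAIN_DEPTH = 3  # hypothetical chain: leaf + intermediate + root
for name, s in (("Ed25519", ED25519), ("ML-DSA-65", MLDSA65)):
    per_chain = CHAIN_DEPTH * (s["public_key"] + s["signature"])
    print(f"{name:>9}: ~{per_chain:,} bytes of keys and signatures per chain")
# Ed25519:   ~288 bytes
# ML-DSA-65: ~15,783 bytes, roughly a 55x increase before any data is sent
```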
Although the IEEE results for KEMs and digital signatures appear promising, especially regarding execution times, it's crucial to interpret them cautiously and consider more realistic evaluation scenarios. Several factors limit the scope of the IEEE findings:
Sole Focus on TLS: TLS was chosen as a testbed to assess PQC primitives but is just one of hundreds of network protocols (https://en.wikipedia.org/wiki/Lists_of_network_protocols). Many other protocols will need separate PQC evaluations, as they do not rely on TLS for security. Additionally, KEM and digital signature primitives are widely applicable beyond networking—such as in secure storage, encrypted file sharing, and VPNs—where similar cryptographic operations are fundamental.
Lack of Real-World Deployment Scenarios: The IEEE paper simulates network delays and packet loss, but under idealized conditions. It excludes real-world constraints such as mobile devices, resource-limited hardware, actual internet-scale traffic patterns, mutual authentication scenarios, and heavily loaded web servers.
No Usability or Integration Testing: Compatibility with browsers or applications was not assessed, nor does the paper provide insight into post-quantum TLS performance within typical client-server environments, particularly those involving legacy infrastructure.
Limited Evaluation of Hybrid Approaches: Hybrid schemes, which combine classical and PQC algorithms, are recognized as a potential transitional strategy but are not benchmarked or deeply analyzed in the publication. (A minimal sketch of the hybrid idea follows this list.)
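For readers unfamiliar with the hybrid construction, its core is simply to feed both a classical and a post-quantum shared secret into one KDF, so the session stays secure as long as either scheme holds. In the sketch below the PQC half is a placeholder byte string, since the `cryptography` package used above is not assumed to ship a PQC KEM.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Classical half: a real X25519 exchange
a, b = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_ss = a.exchange(b.public_key())

# Post-quantum half: a stand-in for the ML-KEM shared secret that
# encapsulation/decapsulation would produce in a real deployment
pq_ss = os.urandom(32)  # placeholder, NOT an actual KEM output

# Concatenating both secrets into one KDF means an attacker must break
# BOTH underlying schemes to recover the session key
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"hybrid demo").derive(classical_ss + pq_ss)
```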
Cloudflare's blog on PQC deployment tests (https://blog.cloudflare.com/pq-2024/) could serve as a complementary resource to the IEEE publication, providing valuable practical insights into real-world implementations.



