The Quantum-Safe Future Will Be Built Together—Or Not at All
- Alexey
- Apr 24
- 5 min read
Quantum threats are real, coordination is critical—and the past is full of reminders of what happens when the industry gets it wrong. ETSI's new report sheds light but leaves gaps.

The European Telecommunications Standards Institute (ETSI) has published a broad overview of quantum security in its report, Preparing for a Quantum Secure Future. While the report highlights important challenges in the quantum transition, it stops short of offering the technical depth many had anticipated from a leading standards body. Missing are concrete implementation roadmaps, details on protocol limitations, and actionable guidance for migrating to post-quantum cryptography (PQC). Without these specifics, organisations risk being unable to build robust quantum-safe infrastructure in time.
In the early stages of telecom protocol development, the absence of coordination between ETSI and its overseas counterparts led to mismatched standards and costly hardware duplication across the industry. The concern now is that history may repeat itself in the realm of quantum security.
To be clear, progress is being made. Standards organisations such as NIST in the United States, KISA in South Korea, and CAICT in China, among others, have either finalised or are actively developing post-quantum cryptographic algorithms. However, cryptographic algorithms alone are not enough: they require practical implementation across infrastructure and software systems. And in a technology ecosystem dominated by complex, layered standards, implementing and adopting such changes can take years. As past experience has shown, the absence of aligned strategies and shared frameworks can undermine even the most promising technological advances.
The Windows Vista Fiasco
In the early 2000s, Microsoft set out to build what would become Windows Vista, a project that started under the codename Longhorn. What began as an ambitious overhaul of the Windows operating system quickly descended into confusion, delays, and eventual disaster: not because the ideas were bad, but because Microsoft's internal architecture teams failed to align on shared standards or, in other words, to coordinate their efforts.
At the heart of Longhorn was a vision to radically modernise the operating system. It was supposed to introduce a new graphics engine, improved security models, revamped file systems, and a unified user experience that could scale across consumer and enterprise use cases. But instead of building on a stable foundation, Microsoft allowed various teams to develop features in isolation, without enforcing a common integration framework or transparent interfaces. Each group essentially built their part of the house without consulting the architects—or even each other.
When the time came to stitch these components into a working whole, the engineering teams discovered an ugly truth: many of the parts couldn't be compiled together into a functional OS. The underlying standards for interfaces, data sharing, system calls, and even basic rendering expectations were either misaligned or entirely missing. Longhorn had become a Frankenstein of disconnected systems.
This internal lack of standardisation created cascading effects. Microsoft missed internal deadlines repeatedly, and by the time Vista was finally released in 2007, its performance and compatibility issues were so pronounced that consumers avoided upgrading en masse. Hardware vendors and IT departments continued to ship or request Windows XP, a six-year-old OS, well into the following decade. Vista became synonymous with bloat, instability, and poor user experience—despite some genuinely forward-looking innovations buried under the mess.
The Wireless "Holy War"
In the 1990s, as mobile phones began their global march toward ubiquity, the industry faced a defining moment: which wireless standard would rule the airwaves? Two major standards emerged for digital mobile communication: TDMA (Time Division Multiple Access) and CDMA (Code Division Multiple Access). TDMA was backed heavily by European regulators and manufacturers, particularly those aligned with the GSM (Global System for Mobile Communications) initiative. CDMA, on the other hand, was spearheaded by Qualcomm, a relatively small U.S.-based firm, and supported in a more decentralised fashion by American telecom carriers.
The European approach was rigid. ETSI and its political backers made the decision to enforce TDMA and GSM across the continent, locking vendors, governments, and operators into a single framework. In theory, this created a harmonised ecosystem with predictable interoperability. In practice, it imposed a top-down model that ignored CDMA's technical advantages, such as superior spectrum efficiency and capacity. In contrast, the United States did not enforce a national standard, leaving private companies to choose between TDMA, CDMA, and other experimental approaches. Many market participants saw more value in CDMA and adopted it.
The result was a fractured global system. Phones and networks in the U.S. were often incompatible with those in Europe and Asia. Consumers faced roaming limitations, manufacturers had to build multiple hardware variants, and developers struggled with interoperability issues. Billions of dollars were spent duplicating efforts, and global innovation slowed as companies fought standards battles instead of scaling solutions.
The industry limped toward convergence only with the later rise of LTE (4G)—a unifying standard built on lessons from this earlier war.
Quantum Safety and Coordination
Achieving quantum security requires coordinated effort at both the individual organisational level and across entire industries. Attempting to become quantum-safe independently is not just challenging; it is effectively impossible. Historical examples repeatedly show that isolated initiatives end in failure, delayed timelines, or costly overruns.
This complexity arises primarily because secure communication is inherently bidirectional: for quantum-safe algorithms to function properly, both communicating parties must adopt compatible standards and protocols.
For instance, when clients access an organisation's digital environment, both the client and the organisation must support quantum-safe cryptographic protocols to ensure secure data exchanges.
Similarly, hardware and software vendors must provide solutions that integrate quantum-safe capabilities from the outset, enabling seamless adoption across entire IT infrastructures.
Organisations exchanging sensitive information with counterparties, such as financial institutions executing transactions or healthcare providers sharing patient data, need mutual agreement on quantum-safe methods to ensure continuous protection.
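To make the failure mode concrete, consider a minimal negotiation sketch in the spirit of a TLS handshake. Everything here is illustrative: the function, the party names, and the algorithm lists are assumptions, not a real protocol stack.

```python
# Sketch of handshake-style algorithm negotiation. The names and
# algorithm lists below are illustrative, not a real TLS implementation.

def negotiate(client_algs: list[str], server_algs: list[str]) -> str:
    """Return the first client-preferred algorithm the server also supports."""
    for alg in client_algs:
        if alg in server_algs:
            return alg
    raise ValueError("no common algorithm; the connection cannot be secured")

# A migrated client offering hybrid and pure post-quantum key exchange.
client = ["X25519MLKEM768", "ML-KEM-768", "X25519"]

# It still interoperates with a legacy server via a classical fallback...
legacy_server = ["X25519", "P-256"]
print(negotiate(client, legacy_server))  # -> X25519

# ...but two parties that migrated to incompatible standards cannot talk.
pqc_only_server = ["ML-KEM-1024"]
try:
    negotiate(client, pqc_only_server)
except ValueError as exc:
    print(exc)  # -> no common algorithm; the connection cannot be secured
```

The point generalises: every pair of counterparties needs a non-empty intersection of supported algorithms, which is exactly what shared standards are meant to guarantee.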
Ultimately, quantum safety hinges on collaborative efforts and consistent standards adoption across all sectors.
ETSI Recommendations Summary
ETSI's recommendations align closely with prevailing industry strategies, reinforcing the broader direction of the quantum-safe transition. As a widely respected standards body, ETSI's voice carries significant weight—its guidance can help raise awareness within organisations and set the stage for meaningful discussions at the board level. This recognition often serves as a catalyst for prioritising quantum security at the highest tiers of strategic planning.
The report acknowledges that quantum computers are rapidly approaching capabilities powerful enough to break current public-key cryptographic systems. This represents a threat to information confidentiality, integrity, and authentication across industries. Critically, the threat is immediate rather than theoretical, given the "Harvest Now, Decrypt Later" approach, where data intercepted today can potentially be decrypted by future quantum computers.
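A common rule of thumb for this urgency is Mosca's inequality: if x, the number of years data must remain confidential, plus y, the number of years a migration will take, exceeds z, the number of years until a cryptographically relevant quantum computer arrives, then data harvested today is already exposed. A toy check, in which every number is an illustrative assumption rather than a forecast:

```python
# Mosca's inequality: data harvested today is at risk when x + y > z.
x = 10  # years the data must stay confidential (illustrative assumption)
y = 7   # years the migration is expected to take (illustrative assumption)
z = 12  # assumed years until a cryptographically relevant quantum computer

if x + y > z:
    print(f"At risk: data stays sensitive {x + y - z} year(s) past "
          "the assumed quantum break. Migration should already be underway.")
else:
    print("Within the safety margin, under these assumptions.")
```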
In response, quantum-safe algorithms are being standardised by the National Institute of Standards and Technology (NIST), which finalised three significant PQC algorithms in 2024. ETSI's working groups are actively involved in integrating these new standards into current infrastructure and communication protocols. Hybrid cryptographic solutions, which blend classical and quantum-safe algorithms, have already seen deployment, notably within cloud environments provided by industry leaders like AWS.
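To illustrate the hybrid idea, here is a minimal sketch, assuming the open-source liboqs-python bindings (`oqs`) and the `cryptography` package are installed. One session key is derived from both a classical X25519 exchange and an ML-KEM-768 encapsulation, so it remains secret as long as either component resists attack. This shows the general technique only; it is not the construction any particular vendor deploys.

```python
# Hybrid key agreement sketch: combine a classical X25519 secret with a
# post-quantum ML-KEM-768 secret. Assumes `pip install cryptography` and
# the liboqs-python bindings (`oqs`). Not production code.
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical component: an ordinary X25519 Diffie-Hellman exchange.
alice_ec, bob_ec = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_secret = alice_ec.exchange(bob_ec.public_key())

# Post-quantum component: Bob encapsulates a secret to Alice's ML-KEM key.
with oqs.KeyEncapsulation("ML-KEM-768") as alice_kem:
    kem_public = alice_kem.generate_keypair()
    with oqs.KeyEncapsulation("ML-KEM-768") as bob_kem:
        ciphertext, pq_secret_bob = bob_kem.encap_secret(kem_public)
    pq_secret_alice = alice_kem.decap_secret(ciphertext)
assert pq_secret_alice == pq_secret_bob

# Derive one session key from both secrets: recovering it would require
# breaking the classical AND the post-quantum component.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
).derive(classical_secret + pq_secret_alice)
print(session_key.hex())
```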
To manage this transition effectively, organisations are advised first to assess their cryptographic assets thoroughly, including infrastructure, dependencies, and vulnerabilities; a minimal inventory sketch follows below. Establishing clear leadership and assigning accountability to dedicated teams overseeing quantum transition strategies is equally essential. Organisations can adopt different migration approaches depending on their circumstances: parallel implementation, running classical and quantum-safe systems concurrently; phased migration, transitioning gradually while learning and adapting at each step; or full migration, which may be most appropriate for smaller or newer enterprises.
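As a first, very small step toward such an assessment, a script along the following lines can flag quantum-vulnerable public keys in a certificate store. This is a sketch using the open-source `cryptography` package; the directory path and the reporting policy are assumptions.

```python
# Certificate inventory sketch: flag quantum-vulnerable public keys.
# Assumes `pip install cryptography`; the path and policy are illustrative.
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

def scan_certs(cert_dir: str) -> None:
    """Print each certificate's subject and public-key algorithm."""
    for pem in sorted(Path(cert_dir).glob("*.pem")):
        cert = x509.load_pem_x509_certificate(pem.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            verdict = f"RSA-{key.key_size}: quantum-vulnerable (Shor)"
        elif isinstance(key, ec.EllipticCurvePublicKey):
            verdict = f"ECDSA/{key.curve.name}: quantum-vulnerable (Shor)"
        else:
            verdict = f"{type(key).__name__}: review manually"
        print(f"{pem.name}  {cert.subject.rfc4514_string()}  ->  {verdict}")

scan_certs("./certs")  # hypothetical directory of PEM-encoded certificates
```

A real inventory must also cover TLS configurations, code-signing keys, VPNs, and embedded devices, but even a simple scan like this makes the scale of the migration visible.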
Successful quantum-readiness extends beyond mere technical adjustments. It involves comprehensive preparation encompassing personnel, policy development, vendor coordination, legal considerations, and crucially, executive buy-in and cross-departmental collaboration. Many organisations lack a complete view of their cryptographic landscape, highlighting the importance of thorough internal reviews led by Chief Information Security Officers (CISOs).
Across various sectors, preparations are already underway. Cloud providers like AWS are proactively integrating PQC and advocating for wider industry adoption. The financial sector faces particularly high risks and demands urgent global coordination to prevent fragmented or delayed migrations. Meanwhile, the telecommunications industry is exploring Quantum Key Distribution (QKD), a physics-based alternative for secure key exchanges, with global network expansions already underway.
Overall, ETSI's report underscores the urgent need for preparedness against quantum threats, calling for comprehensive organisational readiness, clear leadership, and decisive industry-wide action.
