Lessons from Previous Hype Cycles: Why PQC Must Not Repeat the Mistakes of Cloud and Big Data
Introduction
Over the past 25 years, business and technology leaders have navigated repeated waves of disruption. From the rise of the internet and cloud computing to mobile, big data and now AI, each wave has combined genuine innovation with new risk and, all too often, innovation misused in ways that create further risk. The cycle then repeats.
Change is both inevitable and necessary. But one lesson has been remarkably consistent throughout each of these transitions: the disruption created by new technology is rarely solved simply by buying another platform, component or service, despite what the marketing soundbites suggest. Sustainable advantage comes from aligning technology decisions with risk, business strategy and operating models — not the other way around.
Post-quantum cryptography (PQC) is the next major test of this principle. Quantum computing promises enormous economic, scientific and societal benefits, but it also threatens to undermine the cryptography that underpins today’s digital trust. For organisations beginning to consider their response, the question is not whether to act, but how to act in a way that avoids the costly mistakes of previous technology transitions.
The Cloud Migration Lesson
During the rise of cloud computing, the dominant narrative was simple: cheaper, faster, more scalable. Those benefits were real, but only for organisations that fundamentally re-engineered how they designed, built and governed technology. In practice, realising the benefits of cloud depended on several fundamental shifts.
Organisations needed to redesign applications around cloud-native architectural principles, refactoring application monoliths into microservices, containers and loosely coupled components. They needed robust financial governance, cost visibility and cross-charging models. And they had to accept a structural shift of IT spend towards operational expenditure, with the corresponding change in financial metrics that brings.
Many organisations believed the marketing and migrated their IT estates to the cloud largely as-is, expecting the promised savings and agility to materialise automatically. The reality was very different. Cost overruns of 100 to 1,000 per cent were not uncommon, and a significant number of enterprises have since moved to hybrid models or repatriated workloads entirely due to cost, complexity or legacy constraints.
The cloud lesson is not that the technology was flawed. It is that the benefits were only available to organisations willing to change how they operated, not just what they purchased.
The Big Data Parallel
Big data followed a remarkably similar pattern. Centralised repositories enabled data to be pooled and analysed at scale, breaking down silos and unlocking powerful new insights. The business cases were compelling, and the potential for competitive advantage was clear.
But to realise these benefits, organisations had to address a range of challenges that were rarely factored into the original investment decisions. Information aggregation risk needed to be mitigated through enhanced monitoring, real-time alerting and tightened access controls across data repositories. Enterprise-wide classification and labelling standards had to be established to ensure consistent data handling and compliance. Context-aware access control models needed to be adopted, accounting for device posture, location, network context and user behaviour. Analytics use cases had to be reconciled with regulatory constraints, establishing clear thresholds and governance for sensitive data processing. And machine identities and AI agents acting on behalf of humans required proper governance throughout the identity lifecycle.
The result was often substantial, unplanned investment in identity, access management, monitoring and governance capabilities around the big data platforms — costs that were rarely factored into the original business case. For many organisations, the total cost of ownership bore little resemblance to the projections that secured initial approval.
The Technology-Focused Doom Loop
The core lesson from both cloud and big data is clear: technology-led change without a corresponding evolution in architecture, governance and operating model creates a “doom loop” of unplanned cost and risk. The technology decisions may have been sound in isolation, but without the surrounding organisational change, the expected value failed to materialise — and the unintended consequences were significant.
This doom loop follows a predictable pattern. Vendor marketing creates urgency. Urgency leads to procurement. Procurement without strategy leads to implementation challenges. Implementation challenges lead to further unplanned spend. And the cycle continues, with each iteration eroding confidence and consuming budget that could have been directed more effectively.
The AI transition is already showing early signs of following this pattern, with organisations investing heavily in platforms and tooling before fully understanding the data governance, ethical and operational implications. The risk is that PQC will follow the same trajectory unless a different approach is taken from the outset.
Why PQC Is at Risk of Repeating This Pattern
Once again, vendor marketing is framing the cryptographic response to quantum threats as primarily technical: new algorithms, upgraded products, crypto-agile platforms. The messaging is designed to create urgency that leads to procurement of more technology. Without a more balanced view, organisations risk treating PQC as a narrow algorithm swap rather than the multi-year enterprise transformation it actually represents.
International standardisation bodies and cybersecurity agencies are clear that PQC is both a technical and organisational challenge. NIST has run a multi-year process to select and standardise quantum-resistant public-key algorithms, recently approving the first set of PQC standards. NSA’s Commercial National Security Algorithm Suite 2.0 explicitly urges owners of national security systems to plan, prepare and budget now for migration, recognising the scale and duration of the transition. And ETSI and ENISA stress that quantum-safe cryptography is not just about choosing new algorithms; it requires redesigning protocols, integrating PQC into existing systems and building crypto-agility into architectures from the outset.
The message from these bodies is consistent: PQC is not a product you buy. It is a transformation you plan, govern and execute over time.
What Makes the PQC Transition Particularly Complex
There are several factors that make the PQC transition more complex than previous technology shifts. First, most enterprises have PKI environments that have evolved organically over many years. Although a future target architecture may be easy to conceptualise, understanding the true starting point is far more challenging. Many organisations lack visibility of how their PKI is implemented, what dependencies exist and where cryptographic services are embedded across their estate.
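The discovery problem can be made concrete with a small sketch. Assuming an organisation can extract even a rough inventory of systems and the public-key algorithms they rely on (the system names and inventory format below are purely illustrative; in practice this data would come from certificate lifecycle tooling, TLS scans or CMDB exports), a first triage into quantum-vulnerable, quantum-resistant and unknown is straightforward:

```python
# Illustrative triage of a cryptographic inventory by quantum risk.
# All system names and the inventory format are hypothetical examples.

# Classical public-key algorithms broken by Shor's algorithm on a
# cryptographically relevant quantum computer.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "DSA", "DH", "ECDH"}

# Algorithms from the first set of NIST PQC standards (FIPS 203-205).
QUANTUM_RESISTANT = {"ML-KEM", "ML-DSA", "SLH-DSA"}

def triage(inventory):
    """Split (system, algorithm) pairs into vulnerable, resistant and unknown."""
    vulnerable, resistant, unknown = [], [], []
    for system, algorithm in inventory:
        if algorithm in QUANTUM_VULNERABLE:
            vulnerable.append(system)
        elif algorithm in QUANTUM_RESISTANT:
            resistant.append(system)
        else:
            # Unknowns are often the real finding: they need investigation
            # before any migration sequencing is possible.
            unknown.append(system)
    return vulnerable, resistant, unknown

inventory = [
    ("payments-gateway", "RSA"),
    ("internal-ca", "ECDSA"),
    ("pqc-pilot-service", "ML-DSA"),
    ("legacy-mainframe", "proprietary"),
]
vulnerable, resistant, unknown = triage(inventory)
print("Quantum-vulnerable:", vulnerable)
print("Needs investigation:", unknown)
```

The point of the sketch is not the code but the prerequisite: none of this is possible until the inventory itself exists, which is exactly the visibility most organisations currently lack.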
Second, data is frequently fragmented across systems and environments, with limited understanding of its sensitivity, classification, value or retention requirements. This makes it difficult to identify priority use cases or determine which systems should be addressed first in any PQC transformation.
Third, there is a growing misalignment in government and regulatory guidance worldwide. UK government guidance currently indicates that PQC implementation activity should begin around 2028, whereas Canadian and EU guidance recommends beginning materially earlier, with some sectors encouraged to start adoption from 2025. This inconsistency leaves multinational organisations navigating conflicting timelines, expectations and levels of urgency.
And finally, much of the discourse surrounding PQC is still framed in highly technical terminology. When risk is articulated in technical rather than business language, senior decision-makers are left without the clarity they need to prioritise investment or weigh strategic options. This ambiguity slows progress and increases the likelihood of reactive, fragmented and inefficient responses.
Taking a Different Approach
To avoid repeating past mistakes, boards need vendor-agnostic, business- and risk-focused perspectives that complement the technical standards. The goal should be to ensure that the response to quantum, from exploiting opportunities to addressing threats and mitigating risks, is approached as an enterprise transformation, not a point solution.
Encouragingly, the initial steps towards PQC readiness do not require major investment. They require a shift in focus, governance and planning discipline. Organisations should be using architecture governance to embed PQC considerations into procurement and refresh cycles. They should be developing a clear understanding of where their critical data resides, its sensitivity and how it flows through business systems. And they should be grounding every decision in a clear, shared understanding of risk — expressed in language that business leaders understand.
These are not technology projects. They are strategic planning activities that create the foundation for any meaningful PQC action, regardless of which vendor platforms or algorithms are ultimately selected.
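One widely used way to ground that shared risk conversation in business language is Mosca's inequality: if the number of years data must remain confidential (its shelf life) plus the number of years a migration will take exceeds the number of years until a cryptographically relevant quantum computer exists, the organisation is already exposed. A minimal sketch, using purely illustrative planning figures rather than predictions:

```python
# Mosca's inequality: data is exposed when shelf_life + migration_time
# exceeds the time until a cryptographically relevant quantum computer.
# The numbers below are illustrative planning assumptions, not forecasts.

def years_exposed(shelf_life_years, migration_years, years_to_quantum):
    """Return the years of exposure under Mosca's inequality (0 if none)."""
    return max(0, (shelf_life_years + migration_years) - years_to_quantum)

# Example: records must stay confidential for 10 years, migration is
# estimated at 7 years, and a relevant quantum computer is assumed
# to arrive in 12 years: 10 + 7 - 12 = 5 years of exposure.
exposure = years_exposed(shelf_life_years=10,
                         migration_years=7,
                         years_to_quantum=12)
print(f"Years of exposure: {exposure}")
```

The arithmetic is trivial by design; its value is that each input is a business judgement (how long data matters, how long change really takes) rather than a technical one, which is precisely the framing boards need.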
How Unsung Can Help
At Unsung, we help organisations take this measured, strategic approach to post-quantum readiness. As a vendor-neutral PKI consultancy, our role is to provide independent guidance that bridges the gap between technical complexity and boardroom decision-making. We work with C-suite leaders, CISOs and technology teams to ensure that PQC readiness is built on solid foundations — grounded in risk, aligned to business priorities and designed to avoid the doom loop that has characterised previous technology transitions.
If your organisation is beginning to consider its approach to post-quantum cryptography, we would welcome the opportunity to discuss how we can support your planning. The conversation does not need to start with technology. It needs to start with your business.
Want to explore this topic further?
This blog is part of a series drawn from our strategic whitepaper, Post-Quantum Cryptography: A Strategic Whitepaper for the C-Suite. It provides vendor-neutral, business-focused guidance on navigating the quantum era — covering the threats already in play, lessons from previous hype cycles, and practical steps your organisation can take today. Download your copy here: https://2f4v3l.share-eu1.hsforms.com/20qJjHSynQkuJKhI_xq9Msg

