Harvest vs. Forge – Twin Quantum Threats to Security
What is “Trust Now, Forge Later” (TNFL)?
Most discussions about quantum computing threats focus on “Harvest Now, Decrypt Later” (HNDL) – the idea that adversaries can collect encrypted data today and store it, hoping a future quantum computer will break the encryption and expose sensitive information. This risk is very real, especially for data that must remain confidential for decades (think government secrets, health records, long-term intellectual property). In essence, HNDL is a confidentiality threat: today’s intercepted secrets might be decrypted tomorrow.
Yet there’s another quantum-enabled danger that receives far less publicity – one that worries me even more. I previously termed it “Sign Today, Forge Tomorrow” (STFT), but more recently a better term has been gaining traction: “Trust Now, Forge Later” (TNFL) – the digital signature equivalent of HNDL. It means that we may trust a signature or certificate today, only for attackers to forge it once quantum machines arrive, undermining its validity in the future.
Digital signatures underpin the trustworthiness of everything from software updates and firmware integrity to identity documents and financial transactions. Today these signatures (using RSA or ECDSA algorithms) are considered secure, but a sufficiently powerful quantum computer running Shor’s algorithm could break those algorithms and allow attackers to forge signatures at will.
In other words, the moment quantum computing can crack current signature schemes, an adversary can make malicious software or documents appear authentic by faking the digital signatures we rely on for trust.
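To make the mechanism concrete, here is a toy sketch in Python. It uses textbook RSA with deliberately tiny primes that a laptop can factor instantly, standing in for what Shor’s algorithm would do to real 2048-bit keys; it is an illustration of the principle, not real cryptography. Once the attacker factors the modulus and recovers the private exponent, every “signature” they produce verifies perfectly:

```python
# Toy "textbook RSA" signatures with tiny primes -- for illustration only.
# Real keys use 2048+ bit moduli; Shor's algorithm would factor those the
# way trial division factors this one.

def sign(message: int, d: int, n: int) -> int:
    return pow(message, d, n)               # s = m^d mod n

def verify(message: int, signature: int, e: int, n: int) -> bool:
    return pow(signature, e, n) == message  # check s^e mod n == m

# Vendor's key pair (primes p=53, q=61 are kept secret).
n, e = 3233, 17
d = pow(e, -1, 52 * 60)                 # private exponent, known only to vendor

legit = sign(1234, d, n)
assert verify(1234, legit, e, n)        # devices trust this today

# "Forge later": attacker factors n (instant here; Shor's job for real keys)
p = next(k for k in range(2, n) if n % k == 0)
q = n // p
d_recovered = pow(e, -1, (p - 1) * (q - 1))

forged = sign(999, d_recovered, n)      # attacker signs a malicious payload
assert verify(999, forged, e, n)        # ...and it verifies perfectly
```

The verifier cannot distinguish the forged signature from the legitimate one, which is precisely why the compromise is invisible.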
Put simply, systems that cannot be retrofitted with quantum-safe signatures are ticking time bombs of trust.
Why Forged Signatures Could Be an Even Bigger Threat
Why do I consider “trust now, forge later” potentially more dangerous than the more oft-cited HNDL risk? The core reason is that TNFL attacks undermine integrity and authenticity, which in many domains can be more critical than confidentiality. Let’s break down the key differences:
Immediate and Invisible Compromise
A stolen encrypted file under HNDL can only be read once a future quantum computer succeeds in decrypting it – a process that will demand significant time and energy for each individual file. The impact, though serious, is delayed and limited to data disclosure. It will feel like a slow burn.
By contrast, a forged digital signature yields an instant and undetectable compromise. A malicious software update signed with a single, well-chosen recovered private key would be accepted without question – it would “install smoothly – exactly as if it came from the vendor,” with users or systems none the wiser. It could enable a supply chain attack – think of the SolarWinds hack – with potentially massive consequences.
Q-Day for signatures will feel like a cliff, not a slow burn.
Integrity (and Safety) vs. Confidentiality
Harvesting encrypted data endangers privacy and secrecy.
But forging signatures strikes at the integrity of systems and the safety of operations. Digital signatures form the root of trust for software updates, device identities, communications protocols, and more. If that trust is broken, attackers can impersonate legitimate services, push fake firmware, issue bogus certificates, inject malicious commands, or alter records with impunity.
Many industries value integrity over secrecy – as I often say, losing trust in your system can be far costlier than losing the confidentiality of some data. An encrypted message decrypted by an adversary is a serious breach, but a forged command accepted by a safety-critical system could be catastrophic.
Cyber-Kinetic Impact
Perhaps most importantly, compromised signatures can lead directly to physical consequences. A forged code-signing certificate could allow an adversary to load malicious firmware into a factory controller or power grid device, causing equipment malfunctions. Malicious commands could be injected into operational technology (OT) systems under the guise of valid, signed instructions.
In critical infrastructure, this is not just an IT security problem – it’s a safety problem. When quantum attackers can fake the “proof of authenticity” that keeps bad commands out, the result might be equipment damage, environmental disasters, or even loss of life. In short, HNDL threatens data privacy, but TNFL threatens the reliable functioning of our machines and the safety of people.
Another way to look at it: HNDL is about future decryption of past secrets, whereas TNFL is about future forgery of present trust. The former undermines confidentiality, the latter undermines the very foundation of digital trust (integrity and authenticity). In many scenarios, the collapse of trust is even more dire.
It’s telling that integrity attacks have long been feared by those of us in security, even more so than data theft in certain contexts. As I wrote in my earlier article, once adversaries can forge signatures freely, they can “pass off malicious software as authentic, impersonate legitimate services, issue bogus certificates, and even rewrite historical records” – essentially, they hold the keys to the kingdom.
Cyber-Kinetic Fallout: From Data Breaches to Physical Blasts
My perspective on this issue is shaped by over two decades of work on cyber-kinetic risks – the intersection of cybersecurity and physical safety. Since the mid-1990s, I’ve been warning that digital threats don’t just jeopardize data; they can hijack machines and endanger lives. Back then, many in the industrial security community subscribed to an “A-I-C” hierarchy (Availability, Integrity, Confidentiality), prioritizing keeping systems running – even though the CIA triad was never meant to imply a strict hierarchy. I’ve long argued for an “I-A-C” ordering instead, putting Integrity first, because if an attacker can tamper with commands or data in a control system, availability and safety are moot. In most cases, a system faithfully doing the wrong thing (due to forged instructions) is far more dangerous than one that simply stops working.
Cyber-kinetic attacks refer to cyber intrusions that cause tangible physical effects – think of hacking a chemical plant to cause an explosion, or manipulating a medical device to harm a patient.
These are not science fiction scenarios; they’ve been demonstrated in the real world. Stuxnet, for example, famously sabotaged Iranian nuclear centrifuges via a digital attack. Researchers have shown cars being remotely commandeered and pacemakers being hacked. I used to track relevant research and incidents until about 2017, when they became too numerous. Cyber-kinetic incidents remain underreported – unlike data breaches (which often must be disclosed by law), many attacks on industrial systems are kept quiet or chalked up to “industrial accidents”. But the risk is very real. My definition sums it up: cyber-kinetic attacks are cyber attacks aimed at causing direct physical damage, injury or even death by exploiting vulnerabilities in cyber-physical systems.
It’s exactly this kind of nightmare scenario that quantum-enabled signature forgery could facilitate. Consider an industrial control system that only accepts firmware updates or remote commands if they are digitally signed by the trusted vendor or operator. This is a common safeguard in factories, power grids, transportation systems, etc. Now imagine a future quantum attacker who can break that signature scheme: they could sign a malicious firmware update that disables safety interlocks in a factory, or forge a certificate to masquerade as a trusted operator on a power grid network. The devices would trust now – they’d see the signature and assume everything is in order – but the attacker can forge later, turning that trust against us when quantum hacking capability matures. In critical infrastructure, that could mean turning off alarms, overriding controls, and causing dangerous physical states without any immediate warning.
From my research and experience, the integrity of control systems is absolutely paramount. If an outsider can falsify the digital signals that tell a chemical valve to open or a medical pump to dose medicine, the consequences are far worse than a data leak – we’re talking about potential kinetic damage, environmental harm, even casualties. That’s why I view Trust Now, Forge Later as not just a cryptographic vulnerability, but a public safety issue. It bridges the digital and physical realms in the worst way possible: by making it trivial for an attacker to bypass trust and directly manipulate the devices that run our world.
The Operational Technology (OT) Challenge – Quantum Readiness is Crucial (and Complex)
All of this underscores why quantum readiness for operational technology is so critical. We often focus on IT systems, but OT systems (industrial control systems, IoT devices, embedded systems in critical infrastructure) are on the front lines of this TNFL threat. Unfortunately, making OT quantum-safe is much harder than doing so in IT environments. There are several reasons for this complexity:
- Long Lifecycles: Industrial and IoT devices often remain in service for 15-30 years or more. A piece of equipment installed today might still be running in the 2040s. That’s problematic because many current devices use classical cryptography (RSA, ECC) baked into their hardware/firmware. If those algorithms break in, say, 5 years, millions of devices in the field could instantly become insecure. Replacing or retrofitting them is costly and slow.
- Limited Patching Windows: In many industrial settings, updating firmware or software isn’t as simple as clicking “update.” It can require plant shutdowns or maintenance outages, which might happen only once or twice a year. Utilities, factories, or hospitals can’t just take systems offline frequently. This means critical security upgrades (like swapping in post-quantum algorithms) might be delayed or deferred – increasing the window of exposure.
- Hardware/Protocol Constraints: Many OT devices are resource-constrained (limited CPU, memory) and use specialized or legacy protocols. Post-quantum algorithms typically have larger key sizes and heavier processing requirements. Some old controllers literally can’t handle a quantum-safe algorithm without hardware changes. Additionally, some communication protocols in OT might not easily support new cryptographic methods. This lack of “crypto agility” in legacy systems is a huge hurdle.
- Governance and Safety Requirements: Any change in OT, especially in sectors like energy or healthcare, needs to consider safety certifications, regulatory approvals, and extensive testing. You can’t just patch a nuclear plant’s control system with an experimental algorithm. The migration to PQC (post-quantum cryptography) in OT must be done extremely carefully, often requiring parallel runs, fail-safes, and coordination with equipment vendors and regulators.
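The “larger keys, heavier processing” point is easy to quantify. A rough comparison of signature sizes, using ballpark figures drawn from the published NIST parameter sets (check the standards themselves for exact values), shows why a controller that budgets a few hundred bytes per message struggles:

```python
# Approximate public-key and signature sizes in bytes (values taken from the
# relevant NIST parameter sets; treat them as ballpark figures, not gospel).
SIG_SIZES = {
    "ECDSA P-256 (classical)":      {"public_key": 64,   "signature": 64},
    "RSA-2048 (classical)":         {"public_key": 256,  "signature": 256},
    "ML-DSA-65 (FIPS 204)":         {"public_key": 1952, "signature": 3309},
    "SLH-DSA-SHA2-128s (FIPS 205)": {"public_key": 32,   "signature": 7856},
}

baseline = SIG_SIZES["ECDSA P-256 (classical)"]["signature"]
for name, sizes in SIG_SIZES.items():
    growth = sizes["signature"] / baseline
    print(f"{name:<30} sig {sizes['signature']:>5} B  ({growth:5.1f}x ECDSA)")
```

A signature that grows from 64 bytes to several kilobytes can overflow fixed-size message buffers in legacy protocols, which is exactly the crypto-agility problem described above.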
In short, for organizations that run OT, upgrading those systems to PQC is the largest, most complex cryptographic overhaul many enterprises will ever undertake. It’s akin to a “Y2K” for cryptography – except the challenge spans every device, not just software, and the deadline (the arrival of a cryptographically relevant quantum computer) is uncertain. As discussed, broken crypto in OT can directly impact physical operations and put lives and the environment at risk. Tabletop exercises and risk assessments I’ve been running lately show truly sobering outcomes if we don’t address this in time.
The silver lining is that awareness is growing. Standards bodies like NIST have finalized new quantum-resistant signature algorithms, and forward-thinking vendors are already adding PQC support to hardware security modules and firmware signing tools. The key is to start planning and migrating now.
Organizations should inventory where they use digital signatures and certificates (you might be surprised how many places – from the secure boot code in a smart meter to the VPN that connects a remote operator). Where possible, build in crypto-agility – the ability to swap out algorithms – so that as PQC algorithms mature, you can adopt them with minimal disruption.
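One common way to build in that crypto-agility is to dispatch verification through a registry keyed by algorithm identifier, so no caller ever hard-codes an algorithm. The sketch below is a minimal illustration with invented names and a toy checksum standing in for a real verifier; adopting a PQC algorithm later means registering one new entry, not touching every caller:

```python
# Minimal crypto-agility sketch: verification dispatched through a registry
# keyed by algorithm id. All names are illustrative, not a real API, and the
# "toy-checksum" verifier is a stand-in for a real signature check.
from typing import Callable, Dict

VERIFIERS: Dict[str, Callable[[bytes, bytes, bytes], bool]] = {}

def register(alg_id: str):
    def wrap(fn):
        VERIFIERS[alg_id] = fn
        return fn
    return wrap

@register("toy-checksum")  # stand-in for today's ECDSA verifier
def verify_checksum(pub: bytes, msg: bytes, sig: bytes) -> bool:
    return sig == bytes([sum(msg) % 256])

def verify_envelope(envelope: dict) -> bool:
    """The envelope names its own algorithm; swapping in a PQC verifier
    later is one new @register entry, with no changes to callers."""
    fn = VERIFIERS.get(envelope["alg"])
    return fn is not None and fn(envelope["pub"], envelope["msg"], envelope["sig"])

msg = b"open valve 7"
env = {"alg": "toy-checksum", "pub": b"", "msg": msg,
       "sig": bytes([sum(msg) % 256])}
print(verify_envelope(env))  # True
```

The design choice doing the work here is that the algorithm identifier travels with the message, so old and new schemes can coexist during a migration.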
And for devices that absolutely cannot be upgraded, strategize mitigations: for instance, placing quantum-safe gateways in front of legacy devices to validate commands, or accelerating the replacement of equipment that has unfixable cryptographic weaknesses.
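The gateway idea can be sketched as well. In the toy below, an unpatchable legacy controller sits behind a gateway that authenticates every command and enforces an allow-list before forwarding anything; all names are hypothetical, and the authenticator is stubbed with a shared-secret HMAC purely for illustration (a real deployment would verify a quantum-safe signature instead):

```python
# Sketch of a "quantum-safe gateway" in front of a legacy controller that
# understands no modern cryptography. Names are hypothetical; the HMAC check
# is a stand-in for a real PQC signature verification.
import hmac, hashlib

GATEWAY_KEY = b"provisioned-out-of-band"  # would be a PQC public key in practice
ALLOWED = {"READ_STATUS", "SET_POINT"}    # defense in depth: command allow-list

def legacy_controller(command: str) -> str:
    return f"executed {command}"          # stands in for the unpatchable device

def gateway(command: str, tag: bytes) -> str:
    expected = hmac.new(GATEWAY_KEY, command.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return "rejected: bad authenticator"
    if command.split()[0] not in ALLOWED:
        return "rejected: command not allowed"
    return legacy_controller(command)

tag = hmac.new(GATEWAY_KEY, b"SET_POINT 42", hashlib.sha256).digest()
print(gateway("SET_POINT 42", tag))        # forwarded to the legacy device
print(gateway("DISABLE_INTERLOCKS", b""))  # dropped at the gateway
```

Note the allow-list: even with cryptography in place, the gateway can refuse command classes the legacy device should never receive, which limits the blast radius if a key is ever compromised.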
Conclusion
The catchy slogan “Harvest Now, Decrypt Later” has succeeded in raising awareness about future quantum threats to confidentiality. But the “Trust Now, Forge Later” risk is its darker twin – one that could undermine digital trust, break supply chains, and endanger lives. We must not let it fly under the radar. The threat of forged signatures is not hypothetical or far-fetched; it stems from the same well-understood quantum algorithms that threaten encryption. The difference is that it strikes at the very heart of security: the integrity and authenticity of our systems. Especially for operational and cyber-physical systems, the stakes cannot be overstated – a quantum-enabled forgery in those environments could translate to physical catastrophe in the real world.
© 2025 Applied Quantum. All rights reserved