a16z Long Article: What Risks Does Quantum Computing Pose to Crypto?
"When Will Quantum Computers That Pose a Real Threat to Cryptography Arrive? Which Scenarios Are Applicable for HNDL Attacks? What Unique Challenges Does Bitcoin Face? a16z Provides an In-Depth Analysis of the Actual Impact of Quantum Threats on Blockchain and Countermeasures. This article is based on an a16z original piece, organized, translated, and written by Wu Blockchain. (Previous summary: In-depth analysis: Are we overestimating the security threats posed by quantum computers?) (Background supplement: a16z: 17 Major Emerging Trends in Cryptocurrency in 2026)
Regarding “When will quantum computers capable of posing a real threat to current cryptosystems arrive,” people often make exaggerated time estimates—prompting calls for immediate, large-scale migration to post-quantum cryptography.
However, these calls often overlook the costs and risks of premature migration, as well as the vastly different threat profiles faced by various cryptographic primitives:
For encryption, post-quantum cryptography must be deployed now even though it is costly: "harvest now, decrypt later" (HNDL) attacks are already happening, because sensitive data encrypted today will still hold value when a quantum computer arrives, likely decades from now. Despite the performance overhead and implementation risks of post-quantum cryptography, HNDL attacks leave no choice for data that requires long-term confidentiality.
The consideration for post-quantum signatures is entirely different. They are not affected by HNDL attacks; their costs and risks (larger size, performance overhead, immature implementation, potential vulnerabilities) suggest that migration should proceed cautiously, not immediately.
These distinctions are crucial. Misunderstandings can distort cost-benefit analyses, causing teams to overlook more critical security risks—such as vulnerabilities themselves.
The real challenge in transitioning to a post-quantum cryptosystem is aligning “urgency” with “actual threat.” Below, I will clarify common misconceptions about quantum threats and their effects on cryptography—including encryption, signatures, and zero-knowledge proofs—and especially focus on their impact on blockchain.
What is our current point in time?
In the 2020s, the likelihood of “cryptographically relevant quantum computers” (CRQC) emerging remains extremely low, despite some high-profile claims that have attracted attention.
Note: throughout this article, "cryptographically relevant quantum computer" is abbreviated as CRQC.
By “cryptographically relevant quantum computer,” we mean a fault-tolerant, error-corrected quantum computer capable of running Shor’s algorithm on a sufficient scale to attack elliptic curve cryptography or RSA (e.g., breaking secp256k1 or RSA-2048 within roughly one month of continuous operation).
Based on publicly available milestones and resource estimates, we are still very far from such a quantum computer. Although some companies claim CRQC might appear before 2030 or even 2035, publicly available progress does not support these claims.
Background: Currently, no quantum computing platform—be it ion traps, superconducting qubits, or neutral atom systems—is close to the tens of thousands to millions of physical qubits needed to run Shor’s algorithm against RSA-2048 or secp256k1 (the exact number depends on error rates and error correction schemes).
The limiting factors are not just the number of qubits, but also gate fidelity, qubit connectivity, and the depth of error-corrected circuits necessary to run large-scale algorithms. While some systems now exceed 1,000 physical qubits, quantity alone can be misleading: these systems lack the connectivity and gate fidelity required for cryptographic computations.
Recent systems approach physical error thresholds where quantum error correction becomes feasible, but no one has yet demonstrated more than a handful of logical qubits with sustainable error correction circuits—the thousands of high-fidelity, deep, fault-tolerant logical qubits needed for running Shor’s algorithm are still out of reach. Theoretically, there remains a huge gap between feasible quantum error correction and the scale needed for effective cryptanalysis.
In short: unless both qubit count and fidelity improve by several orders of magnitude simultaneously, “cryptographically relevant quantum computers” remain distant.
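To make this gap concrete, here is a rough back-of-envelope sketch in Python. The logical-qubit count, code distance, and surface-code overhead formula are illustrative assumptions (loosely in line with published resource estimates such as Gidney and Ekerå, 2019), not figures taken from this article.

```python
# Back-of-envelope sketch of the logical-vs-physical qubit gap.
# All numbers below are illustrative assumptions; real overheads depend on
# error rates, code distance, connectivity, and architecture.

def physical_per_logical(code_distance: int) -> int:
    """Rough surface-code cost: about 2*d^2 physical qubits per logical qubit."""
    return 2 * code_distance ** 2

logical_qubits_needed = 6_000   # assumed order of magnitude for RSA-2048 / secp256k1
code_distance = 27              # assumed distance for a sufficiently low logical error rate

physical = logical_qubits_needed * physical_per_logical(code_distance)
print(f"~{physical:,} physical qubits for the computation alone")  # ~8.7 million
print("Plus magic-state distillation and routing, published estimates land in the")
print("tens of millions, versus ~1,000 noisy physical qubits on today's largest devices.")
```

Even this optimistic arithmetic lands several orders of magnitude above anything demonstrated so far.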
However, media reports and corporate press releases often foster misunderstandings. Common misconceptions include:
Claims of "quantum supremacy" demonstrations, which often target artificially constructed problems. These problems are chosen not because they are practical, but because they can run on current hardware while appearing to show a large quantum speedup; how artificial the benchmark is tends to be downplayed in the publicity.
Claims of thousands of physical qubits, which usually refer to quantum annealers, not gate-based quantum computers capable of running Shor’s algorithm.
Casual use of the term “logical qubits” to refer to physical qubits. Physical qubits are noisy; logical qubits require error correction. Shor’s algorithm needs thousands of logical qubits; each logical qubit typically requires hundreds to thousands of physical qubits (depending on error rates). Some companies claim, absurdly, that a code with distance 2 (detects but does not correct errors) can realize 48 logical qubits with only two physical qubits per logical qubit—this is impossible, as such codes cannot correct errors. Truly fault-tolerant logical qubits for cryptanalysis require hundreds or thousands of physical qubits each, not just two.
More broadly, many quantum roadmaps refer to “logical qubits” supporting only Clifford gates, which can be efficiently simulated classically and are insufficient for running Shor’s algorithm. Shor’s algorithm requires thousands of fault-tolerant T gates (or more generally, non-Clifford gates).
Therefore, even if a roadmap claims “reaching X logical qubits by year Y,” it does not imply that the system can run Shor’s algorithm to break cryptography at that time.
These practices distort public (and even professional) understanding of how close we are to a truly cryptographically relevant quantum computer.
Nevertheless, some experts are optimistic. For example, Scott Aaronson recently wrote that, given the rapid pace of hardware development, he now believes it’s plausible that before the next US presidential election, we could have a fault-tolerant quantum computer capable of running Shor’s algorithm.
But he is clear that this does not mean a cryptographically relevant quantum computer: even a small-scale demonstration, such as factoring 15 = 3 × 5 with a fault-tolerant circuit, would meet his criterion. The bar is a small-scale Shor implementation, not large-scale cryptanalysis. Previous quantum factorizations of 15 used simplified circuits without full fault tolerance. The repeated focus on 15 is because arithmetic mod 15 is trivial; even a slightly larger number such as 21 is significantly harder, and experiments claiming to factor 21 often rely on hints or shortcuts.
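To see how little number theory is actually involved at this scale, here is a small Python sketch of the classical post-processing in Shor's algorithm, with the quantum period-finding step replaced by brute force. It is meant only to illustrate why factoring 15 is a weak benchmark, not to implement the algorithm; the function name and the choice a = 7 are my own.

```python
from math import gcd

def shor_classical_postprocess(N, a):
    """Classically find the order r of a mod N (the step a quantum computer
    would accelerate), then derive factors from gcd(a**(r//2) +/- 1, N)."""
    r, x = 1, a % N
    while x != 1:              # brute-force order finding: smallest r with a**r = 1 (mod N)
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None            # odd order: retry with a different a
    y = pow(a, r // 2, N)
    for f in (gcd(y - 1, N), gcd(y + 1, N)):
        if 1 < f < N:
            return f, N // f
    return None

# For N = 15 and a = 7: the order is r = 4, since 7**2 = 49 = 4 (mod 15),
# and gcd(3, 15) = 3, gcd(5, 15) = 5 recover 15 = 3 x 5.
print(shor_classical_postprocess(15, 7))   # -> (3, 5)
```

Running it prints (3, 5); the entire "hard" part for N = 15 is finding the order r = 4 of 7 mod 15, which a classical loop does instantly.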
In short: there is no public progress supporting the expectation that a quantum computer capable of breaking RSA-2048 or secp256k1 (the cryptosystems of real concern) will appear within 5 years. Even 10 years remains an aggressive estimate. Given how far we are from truly cryptographically relevant quantum computers, a timeline of well over a decade is entirely consistent with the evidence, even for those optimistic about progress.
So, what does the US government’s target of 2035 for migrating its systems entirely to post-quantum cryptography mean? I believe this is a reasonable schedule for such a large-scale migration. But it does not reflect an expectation that CRQC will appear by then.
What scenarios are suitable (and unsuitable) for HNDL attacks?
“Harvest now, decrypt later” (HNDL) attacks involve adversaries storing encrypted communications today, waiting for a future quantum computer capable of breaking current cryptography, then decrypting the stored data. It is certain that state-level actors are archiving US government communications at scale, with the hope of decrypting them once quantum computers arrive. That’s why cryptosystems must begin migrating today—at least for entities requiring confidentiality for 10–50+ years.
But digital signatures—used by all blockchains—are different from encryption: they do not have confidentiality that can be attacked retroactively.
In other words, when a quantum computer arrives, it will make forgery of digital signatures possible from that moment onward, but past signatures are not “hidden” secrets like encrypted messages. As long as one can verify that a signature was generated before the advent of CRQC, it cannot be forged.
Therefore, compared to encryption, the urgency of migrating to post-quantum signatures is lower.
Major platforms reflect this: Chrome and Cloudflare have deployed hybrid X25519+ML-KEM for TLS encryption. [For readability, I call these “cryptographic schemes,” though strictly speaking, TLS and similar protocols use key exchange or key encapsulation mechanisms, not public-key encryption.]
“Hybrid” here means simultaneously combining a post-quantum scheme (ML-KEM) with a classical one (X25519), providing security from both. This approach aims to prevent HNDL attacks via ML-KEM, while X25519 offers traditional security if ML-KEM is found vulnerable.
Apple’s iMessage also uses a similar hybrid post-quantum scheme in PQ3, and Signal has implemented this in PQXDH and SPQR protocols.
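The hybrid construction is easy to sketch: both shared secrets feed a single key-derivation step, so an attacker must break both components. Below is a minimal Python illustration using the `cryptography` package for X25519 and HKDF; since this article names no specific ML-KEM library, the post-quantum shared secret is a random stand-in, clearly marked as such.

```python
# Minimal sketch of the "hybrid" idea: derive one session key from BOTH a
# classical X25519 exchange and a post-quantum KEM shared secret, so the
# result stays secure if either component is broken.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Classical component: ordinary X25519 Diffie-Hellman.
client_x = X25519PrivateKey.generate()
server_x = X25519PrivateKey.generate()
ss_classical = client_x.exchange(server_x.public_key())

# Post-quantum component: stand-in for an ML-KEM-768 shared secret.
# A real deployment would use an actual ML-KEM implementation here.
ss_pq = os.urandom(32)

# Hybrid key: both secrets feed one KDF; an attacker must break both.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-handshake-sketch",
).derive(ss_classical + ss_pq)
print(session_key.hex())
```

Real deployments bind the two secrets into the handshake transcript as well; this sketch only shows the combine-then-derive shape.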
In contrast, migrating core web infrastructure to post-quantum digital signatures is being deferred until the real threat of a CRQC is closer, because current post-quantum signature schemes impose significant performance costs (discussed below).
zkSNARKs—zero-knowledge, succinct, non-interactive proofs—are central to blockchain scalability and privacy in the future. Their quantum threat is similar to signatures: some zkSNARKs are not post-quantum secure because they use the same elliptic curve cryptography as current schemes. But their zero-knowledge property remains post-quantum secure: they do not leak any secret information about witnesses, even against quantum adversaries.
The zero-knowledge property guarantees that the proof reveals nothing about the secret witness, even to quantum attackers, so there is no risk of prior “collection” and future decryption of secret data.
Therefore, zkSNARKs are unaffected by HNDL attacks. Just as today’s non-post-quantum signatures are safe if generated before CRQC, zkSNARKs created before the advent of CRQC are considered trustworthy (soundness is preserved). Only after CRQC could attackers forge “valid-looking but false” proofs.
What does this mean for blockchain?
Most blockchains are not vulnerable to HNDL attacks: the majority of non-privacy chains—like Bitcoin and Ethereum today—rely on non-post-quantum cryptography primarily in transaction authorization, i.e., they use digital signatures rather than encryption.
Again, digital signatures are not susceptible to HNDL: “Harvest now, decrypt later” attacks only apply to encrypted data. For example, Bitcoin’s blockchain is public; the threat is forging signatures (deriving private keys to steal funds), not decrypting publicly available transactions. This means HNDL attacks do not directly threaten current blockchain security.
Unfortunately, some trusted institutions (including the US Federal Reserve) have erroneously claimed that Bitcoin is vulnerable to HNDL attacks, exaggerating the urgency of post-quantum migration.
But “lower urgency” does not mean Bitcoin can wait indefinitely: protocol upgrades require significant social coordination. As discussed below, Bitcoin faces specific timing pressures—mainly due to slow governance and the large amount of potentially abandoned, quantum-vulnerable coins.
One current exception is privacy coins, which hide recipients and amounts via encryption or other means. This privacy data can be “harvested” early; once quantum computers can break elliptic curve cryptography, such coins could be de-anonymized post hoc.
The severity of such attacks depends on the design of each chain. For example, Monero’s ring signatures and key images—used to prevent double-spending—are based on elliptic curve cryptography; publicly available blockchain data could allow reconstruction of the entire transaction graph in the future. Other privacy coins may be less vulnerable. See also Sean Bowe’s research on Zcash cryptography.
If users consider it crucial that transactions remain confidential even against future decryption, then privacy chains should migrate promptly to post-quantum cryptography (or hybrid schemes), or adopt architectures that do not store decryptable secrets on-chain.
Bitcoin’s unique challenges: governance + abandoned coins
For Bitcoin, two practical considerations make moving toward post-quantum signatures urgent, unrelated to quantum technology itself. First, governance: Bitcoin’s development is very slow. Any contentious issue could trigger a hard fork if the community cannot reach consensus on an appropriate solution.
Second, transitioning to post-quantum signatures cannot be passive: holders must actively migrate their funds. That leaves unprotected any coins that have been abandoned yet remain exposed to quantum threats. It is estimated that hundreds of thousands of BTC, worth hundreds of billions of dollars at current prices (as of December 2025), may be both quantum-vulnerable and abandoned.
But quantum threats are unlikely to cause a sudden “catastrophic collapse” of Bitcoin—more likely a gradual, selective attack process. Quantum computers will not break all cryptography instantaneously: Shor’s algorithm must target each public key individually. Early quantum attacks will be costly and slow. Once a quantum computer can break a specific Bitcoin signature key, attackers will prioritize the highest-value wallets.
Additionally, if users avoid address reuse and do not use Taproot addresses (which reveal the public key on-chain), they can remain somewhat protected even before protocol upgrades: their public keys are hidden behind hashes until they make a transaction. When they finally broadcast a spending transaction, the public key is revealed, creating a short “race window”: honest users need their transactions confirmed quickly, while quantum attackers try to extract the private key before confirmation. The most vulnerable coins are those with public keys exposed for years—early P2PK outputs, reused addresses, and Taproot holdings.
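The "hidden behind hashes" point can be shown in a few lines. The sketch below computes the HASH160 commitment that a pay-to-public-key-hash output places on-chain; the 33-byte key is a made-up placeholder, and ripemd160 support in hashlib depends on the local OpenSSL build.

```python
import hashlib

pubkey = bytes.fromhex("02" + "11" * 32)   # made-up 33-byte compressed public key

# Bitcoin's P2PKH commits to HASH160(pubkey) = RIPEMD160(SHA256(pubkey)).
h160 = hashlib.new("ripemd160", hashlib.sha256(pubkey).digest()).digest()

print("visible on-chain before spending:", h160.hex())    # 20-byte hash only
print("revealed at spend time          :", pubkey.hex())  # the value Shor's algorithm needs
```

Until the spending transaction reveals the full public key, a quantum attacker has only the 20-byte hash to work with, which Shor's algorithm cannot use.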
There are no easy solutions for already abandoned, vulnerable coins. Options include:
The second option raises serious legal and security issues: using quantum computers to take control of funds without the private keys—even if claimed to be with legitimate ownership—would be considered theft or fraud in many jurisdictions.
Furthermore, “abandoned” is a questionable assumption: no one can be sure whether the private keys are truly lost. Even proof of prior ownership does not grant the right to break the cryptography and recover the coins. This legal ambiguity makes these abandoned, quantum-vulnerable coins highly susceptible to malicious attacks by actors ignoring legal boundaries.
Another issue is Bitcoin’s very low transaction throughput. Even after protocol upgrades, migrating all quantum-vulnerable funds to post-quantum addresses could take months at current transaction rates.
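A quick back-of-envelope calculation shows why "months" is plausible. All three inputs below are round, assumed numbers for illustration, not measurements from this article.

```python
# Rough feasibility check of a full migration of vulnerable outputs.
blocks_per_day = 144             # ~one block every 10 minutes
txs_per_block = 3_000            # assumed typical order of magnitude
vulnerable_outputs = 20_000_000  # assumed count of outputs that would need to move

days = vulnerable_outputs / (blocks_per_day * txs_per_block)
print(f"~{days:.0f} days even if EVERY transaction were a migration")  # ~46 days
# In practice migrations compete with normal traffic, stretching this to
# many months and driving up fees while the migration is under way.
```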
These challenges mean Bitcoin must start planning its post-quantum migration now—not because a CRQC will appear before 2030, but because coordination, consensus, and technical logistics will take years.
The quantum threat to Bitcoin is real, but the timing is driven by Bitcoin’s own governance and technical structure, not by the imminent arrival of quantum computers. Other blockchains face similar risks, but Bitcoin is particularly vulnerable: early transactions used P2PK outputs, directly exposing public keys, thus leaving a large proportion of BTC at risk. Its history, high value concentration, low throughput, and rigid governance make the problem especially severe.
Note that the vulnerability discussed only concerns the cryptographic security of Bitcoin signatures—not its economic security. Bitcoin’s economic security relies on proof-of-work (PoW) consensus, which is less susceptible to quantum attacks for three reasons:
PoW depends on hash functions, which Grover's algorithm speeds up only quadratically, not exponentially as Shor's algorithm does for factoring and discrete logarithms (a quick numerical comparison follows this list).
Implementing Grover search in practice is extremely costly; any quantum computer would be highly unlikely to gain meaningful advantage in Bitcoin’s PoW.
Even if quantum computers achieve significant speedups, the effect would mostly favor large mining pools with quantum capabilities, not undermine Bitcoin’s overall security model.
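To put the first point in numbers: a quadratic speedup shrinks the exponent by half, while Shor's algorithm collapses the problem entirely. The work factor of 2**78 hash evaluations per block below is an assumed, illustrative figure rather than a precise current-difficulty value.

```python
# Grover's quadratic speedup versus Shor's exponential one, in rough numbers.
from math import pi, sqrt

classical_hashes = 2 ** 78                              # assumed work to find one block
grover_iterations = (pi / 4) * sqrt(classical_hashes)   # ~2**39 sequential quantum steps

print(f"classical hash evaluations : {classical_hashes:.2e}  (2**78)")
print(f"Grover iterations          : {grover_iterations:.2e}  (~2**39, each far slower than one ASIC hash)")
print("Shor vs. a 256-bit EC key  : ~2**128 classical work drops to polynomial time")
```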
Costs and Risks of Post-Quantum Signatures
To understand why blockchains should not rush to deploy post-quantum signatures, we must weigh both the performance costs and how much confidence we have in the security of post-quantum schemes as they mature.
Most post-quantum cryptography relies on one of five approaches: hashing, error-correcting codes, lattices, multivariate quadratic equations (MQ), or isogenies.
Why five? Because the security of any post-quantum primitive depends on an assumption: that quantum computers cannot efficiently solve a particular mathematical problem. The more structured the problem, the more efficient the resulting scheme.
But this is a double-edged sword: more structure also means larger attack surfaces, and the algorithms may be more vulnerable. This creates a fundamental tension—stronger assumptions enable better efficiency, but at the potential cost of security if those assumptions are broken.
Overall, from a security perspective, hash-based schemes are the most conservative, as we are most confident that quantum computers cannot efficiently attack them. But their efficiency is also the worst: for example, NIST's standardized stateless hash-based signature scheme SLH-DSA (based on SPHINCS+) produces signatures of 7–8 KB even at its smallest parameter set. In contrast, elliptic curve signatures are about 64 bytes, roughly 100 times smaller.
Lattice schemes are the current focus for deployment. NIST has selected two lattice-based signature schemes: Dilithium (standardized as ML-DSA) and Falcon. Dilithium signatures are about 2.4 KB at the 128-bit security level and about 4.6 KB at the 256-bit level, roughly 40 to 70 times larger than current elliptic curve signatures. Falcon signatures are smaller (Falcon-512 is 666 bytes, Falcon-1024 about 1.3 KB) but rely on complex floating-point operations, making implementation challenging. One of Falcon's main designers, Thomas Pornin, calls it "the most complex cryptographic algorithm I have implemented so far."
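The size gap is easiest to see side by side. The byte counts below are the figures cited in this article plus standard parameter-set sizes (ML-DSA-44/-87, Falcon-512, SLH-DSA-128s); exact numbers vary with parameter set and encoding.

```python
# Signature size comparison, in bytes, relative to a secp256k1 ECDSA signature.
sizes_bytes = {
    "ECDSA (secp256k1)":          64,
    "Falcon-512":                666,
    "ML-DSA-44 (128-bit level)": 2420,
    "ML-DSA-87 (256-bit level)": 4627,
    "SLH-DSA-128s (smallest)":   7856,
}
baseline = sizes_bytes["ECDSA (secp256k1)"]
for name, size in sizes_bytes.items():
    print(f"{name:<26} {size:>5} B   ~{size / baseline:5.1f}x ECDSA")
```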
In implementation security, lattice signatures are more challenging than elliptic curve schemes: they involve more sensitive intermediate values and complex rejection sampling, requiring side-channel and fault attack resistance. Falcon’s constant-time floating-point operations have been successfully attacked in side-channel experiments, recovering private keys.
These issues are immediate risks, unlike the distant threat of a CRQC.
Caution is warranted with higher-performance post-quantum schemes. Past leading schemes such as Rainbow (a multivariate signature scheme) and SIKE/SIDH (isogeny-based encryption) have since been broken, and by classical algorithms, not quantum ones.
Both breaks came only after extensive standardization work by NIST. That is the scientific process working as intended, but it also shows how premature standardization and deployment can backfire.
As mentioned, network infrastructure is adopting a cautious approach to signature migration. This is important because cryptographic transitions typically take years. For example, hash functions like MD5 and SHA-1 have been officially deprecated for years, but their practical migration is ongoing, and some systems still use them. These algorithms are fully broken, not just “potentially breakable in the future.”
Blockchain vs. Network Infrastructure: Unique Challenges
Fortunately, open-source blockchains (like Ethereum, Solana) can upgrade faster than traditional network infrastructure. On the other hand, infrastructure benefits from frequent key rotations, which mitigate some risks. But blockchain does not have this advantage: addresses and keys may remain exposed for years or indefinitely. Nonetheless, blockchains should adopt a cautious, gradual approach similar to networks. Both are unaffected by signature-type HNDL attacks, but premature migration to immature schemes carries significant costs and risks.
Additionally, blockchains have specific challenges making early migration especially dangerous: they often require “fast aggregation of many signatures,” which favors schemes like BLS signatures. But BLS is not post-quantum secure. Research is ongoing into SNARK-based post-quantum signature aggregation, but it remains early.
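To see why BLS aggregation is so attractive (and so hard to give up), here is a small sketch assuming the py_ecc package and its IETF-style BLS interface; treat the exact module and function names as assumptions of this sketch rather than a vetted recipe. The pairing-friendly curves BLS relies on are exactly what Shor's algorithm breaks.

```python
# Sketch of BLS aggregation: many signatures compress into ONE group element
# that verifies against all signers at once. BLS is NOT post-quantum secure.
from py_ecc.bls import G2ProofOfPossession as bls

secret_keys = [11, 22, 33]                        # toy secret keys, for illustration only
public_keys = [bls.SkToPk(sk) for sk in secret_keys]
message = b"example block payload"

signatures = [bls.Sign(sk, message) for sk in secret_keys]
aggregate = bls.Aggregate(signatures)             # one ~96-byte signature for all signers

# A single pairing-based check covers every signer of this message.
assert bls.FastAggregateVerify(public_keys, message, aggregate)
print("aggregate signature verified for", len(public_keys), "signers")
```

A single aggregate of roughly 96 bytes can stand in for thousands of individual signatures, which is the property the SNARK-based research mentioned above is trying to recover in a post-quantum setting.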