I’ve been reading up on both quantum computing (especially recent advances) and cryptocurrency, and it seems there’s growing concern about how future quantum computers could break current cryptographic methods—like ECDSA, which underpins Bitcoin and Ethereum wallets.

    How might quantum computing realistically impact cryptocurrencies like Bitcoin and Ethereum in the next 10–15 years? Are current protocols truly “quantum-resistant”?
    by u/mmmilanista in r/CryptoTechnology

    1 Comment

    1. Yes. Barring flaws in the algorithms exploitable by classical computers, it appears increasingly likely that they will remain secure against quantum computing until the heat death of the universe.

      Given enough time and resources to throw at the problem, subtle algorithmic exploits seem the more likely eventual attack path. But not quantum.

      Cracking AES-256, for example, would require *billions* of coherent, entangled physical qubits (mostly for error correction). And even then the runtime would be astronomical, as the rough sketch below illustrates.
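
      A back-of-envelope sketch (my own illustrative numbers, not figures from any paper): even granting Grover's quadratic speedup, brute-forcing a 256-bit key still takes on the order of 2^128 sequential oracle calls, and Grover iterations cannot be usefully parallelized the way a classical keysearch can.

      ```python
      # Back-of-envelope: Grover's algorithm against AES-256.
      # Every constant here is an illustrative assumption, not a measured value.

      KEY_BITS = 256
      grover_iterations = 2 ** (KEY_BITS // 2)  # ~sqrt(2^256) = 2^128 oracle calls

      # Assumption: one error-corrected Grover iteration (an AES evaluation run
      # as a quantum oracle) costs ~1 microsecond of logical-gate time.
      seconds_per_iteration = 1e-6

      total_seconds = grover_iterations * seconds_per_iteration
      total_years = total_seconds / (60 * 60 * 24 * 365.25)

      print(f"Grover iterations: 2^128 = {grover_iterations:.3e}")
      print(f"Estimated runtime: {total_years:.3e} years")
      # ~1e25 years, versus roughly 1.4e10 years since the Big Bang.
      ```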

      While there are several large and fairly blatant quantum-computing seed-funding scams going on, plus big companies getting in the game just to maintain the illusion of tech superiority and placate shareholders (and further the FUD), the fact is that quantum computing does not follow Moore's Law: scaling gets exponentially harder.

      In the end, according to a growing number of experts speaking up in the field, it may not be physically possible to isolate enough coherent qubits from the unsilenceable background noise of the universe (quantum fields persist even in a perfect vacuum arbitrarily close to absolute zero) to perform useful calculations at scale, at least for answering non-quantum questions, even if we had better algorithms to deploy.

      TLDR: While no one (certainly not me) can say absolutely for sure *yet*, the scientific community seems to be getting ever closer to being able to say that useful quantum computing for most classically hard problems is fundamentally not possible in this universe, possibly ever. (The exceptions are certain domains where multiple inherently fuzzy outcomes without error correction are acceptable, like simulating quantum mechanics; factoring a large integer into two primes, by contrast, demands massive error correction to arrive at one certain answer.) Shor's algorithm did demonstrate that quantum computing can attack intermediate steps of some classical problems in a bigger way than mere parallelization (e.g. via the quantum Fourier transform), but that is not enough to overcome the limits on the required number of coherent, entangled qubits. A toy sketch of Shor's structure follows.
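
      To make this concrete, here is that toy sketch (mine, purely illustrative): the reduction from factoring to period finding is entirely classical; only the period-finding step is quantum, and replacing it below with classical brute force hides exactly the part that would demand an enormous error-corrected QFT.

      ```python
      # Toy illustration of the structure of Shor's algorithm on a tiny modulus.
      # The quantum Fourier transform is stood in for by classical brute force.

      from math import gcd
      from random import randrange

      def find_period(a: int, n: int) -> int:
          """Smallest r > 0 with a^r = 1 (mod n); brute force replaces the QFT."""
          r, x = 1, a % n
          while x != 1:
              x = (x * a) % n
              r += 1
          return r

      def shor_factor(n: int) -> int:
          """Nontrivial factor of an odd composite n that is not a prime power."""
          while True:
              a = randrange(2, n)
              d = gcd(a, n)
              if d > 1:
                  return d              # lucky guess: a already shares a factor
              r = find_period(a, n)
              if r % 2 == 1:
                  continue              # need an even period
              y = pow(a, r // 2, n)
              if y == n - 1:
                  continue              # trivial square root of 1; retry
              return gcd(y - 1, n)      # nontrivial factor is guaranteed here

      print(shor_factor(15))   # 3 or 5
      print(shor_factor(21))   # 3 or 7
      ```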
