
Comment by fooker

3 days ago

No.

The long explanations boil down to this: quantum computers (so far, and only given a million qubits) are better than classical computers at problems that are, in disguise, simulations of quantum computers.

  given a million qubits ...

Also, last time I checked the record was 80 qubits, and with every doubling of the qubit count the complexity of the system, the impurities, and the noise all increase. So it's questionable whether there will ever be useful quantum computers.
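
A back-of-the-envelope sketch of why classical simulation hits that wall (my numbers, assuming a dense state-vector simulator with 16 bytes per complex amplitude):

  # Simulating n qubits with a dense state vector means storing 2**n
  # complex amplitudes, so memory doubles with every added qubit.
  BYTES_PER_AMPLITUDE = 16  # one complex128 number

  for n in (30, 40, 50, 80):
      gib = (2 ** n) * BYTES_PER_AMPLITUDE / 2 ** 30
      print(f"{n} qubits: {gib:,.0f} GiB of amplitudes")

  # 30 qubits already needs 16 GiB; 50 needs ~16 million GiB; 80 is
  # astronomical, and a million qubits is hopeless for any classical machine.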

  • Microsoft Research's entire point is that their approach will allow

      "fault-tolerant quantum computing architecture based on noise-resilient, topologically protected Majorana-based qubits."
    

    Roadmap to fault tolerant quantum computation using topological qubit arrays https://arxiv.org/abs/2502.12252

    • Usually when people try to explain something about quantum computers, it feels like someone is trying to teach me what a monad is using the infamous example from some old Haskell docs.

      I'm not proud of my ignorance, and I hope that if I eventually get it, it'll be very useful to me. At least it worked out that way with monads.

      2 replies →

  • The issue isn't really impurities and noise; quantum error correction solves that problem (toy sketch below). The real issue is that the supporting technologies don't scale well. Superconducting-qubit computers like Google's have a bunch of fancy wires coming out of the top, basically one for each qubit. You can't have a million wires that size, or even a smaller size, so the RF circuitry that sends signals down those wires needs to be miniaturized and designed to operate near 0 K so it can live inside the dilution refrigerator, which is not easy.

    Microsoft's technology is well behind in qubit count, but its scaling limitations are less severe and the error-correction overhead is either eliminated or much smaller.
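
    To make "error correction solves that" concrete, here's the classical core of the 3-qubit bit-flip repetition code (a toy sketch, not anything Google or Microsoft actually runs): each physical bit flips with probability p, a majority vote decodes, and the logical error rate drops to roughly 3p^2.

      import random

      def logical_error_rate(p: float, trials: int = 500_000) -> float:
          """Estimate how often majority-vote decoding fails."""
          failures = 0
          for _ in range(trials):
              flips = sum(random.random() < p for _ in range(3))
              if flips >= 2:  # two or more flips fool the majority vote
                  failures += 1
          return failures / trials

      for p in (0.2, 0.05, 0.01):
          print(f"physical p = {p}: logical ~ {logical_error_rate(p):.1e} "
                f"(theory 3p^2 - 2p^3 = {3*p**2 - 2*p**3:.1e})")

    The catch is the overhead: three physical bits per logical bit even in this toy, and real quantum codes need far more physical qubits per logical qubit.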

  • Based on what I've read, a lot of algorithmic work is required to even make them useful. New algorithms have to be discovered, and even then they will only solve a special class of problems. They can't do classical computing, so your NVIDIA GPU will probably never be replaced by a quantum GPU.

    • I wouldn't worry too much about finding new algorithms. The sheer power of QC parallelism will attract enough talent to convert any useful classical algorithm to QC.

      It's a bit like the invention of the fast Fourier transform (which was reinvented several times...): O(n log n) is so much better than O(n^2) that many problems in science and technology use an FFT somewhere in their pipeline, even when they have nothing to do with signal processing, just because it's so powerful. For example, multiplication of very large numbers uses an FFT (?!); see the toy version below.
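
      A toy version of that claim (floating-point FFT over base-10 digits; real bignum libraries use exact number-theoretic transforms, and the function name here is mine):

        import numpy as np

        def fft_multiply(a: int, b: int) -> int:
            """Multiply integers by FFT-convolving their digit arrays."""
            da = [int(d) for d in str(a)][::-1]  # least-significant first
            db = [int(d) for d in str(b)][::-1]
            n = 1
            while n < len(da) + len(db):
                n *= 2  # FFT length: next power of two >= result length
            conv = np.fft.irfft(np.fft.rfft(da, n) * np.fft.rfft(db, n), n)
            carry, out = 0, []
            for d in [int(round(x)) for x in conv]:  # propagate carries
                carry, digit = divmod(d + carry, 10)
                out.append(digit)
            while carry:
                carry, digit = divmod(carry, 10)
                out.append(digit)
            return int("".join(map(str, reversed(out))).lstrip("0") or "0")

        x, y = 123456789, 987654321
        assert fft_multiply(x, y) == x * y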

    • Quantum computing is a generalization of classical computing, so quantum computers CAN do classical computing; a toy illustration follows below. In practice, though, it will be slower, more error-prone, and far more costly.

      2 replies →
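
      The toy illustration (a hand-rolled state-vector simulator, not a real QC library): classical gates are permutation matrices on basis states, so NOT and AND embed directly as the X and Toffoli gates.

        import numpy as np

        def basis(bits: str) -> np.ndarray:
            """State vector for a computational-basis state like '110'."""
            v = np.zeros(2 ** len(bits))
            v[int(bits, 2)] = 1.0
            return v

        X = np.array([[0.0, 1.0], [1.0, 0.0]])  # NOT on a single qubit
        TOFFOLI = np.eye(8)
        TOFFOLI[[6, 7]] = TOFFOLI[[7, 6]]       # flip target iff both controls are 1

        print(np.argmax(X @ basis("0")))        # NOT 0 -> 1

        for a in "01":
            for b in "01":
                out = TOFFOLI @ basis(a + b + "0")  # target qubit starts at 0
                print(a, "AND", b, "=", format(int(np.argmax(out)), "03b")[-1])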

    • Maybe we'll end up with a state where typical computers have a CPU, a GPU, and a QPU to solve different problems.

  • Hopefully not. Besides quantum-physics simulations, the only problems they solve are ones that should remain unsolved if we're to trust the integrity of existing systems.

    As soon as the first practical quantum computer is available, a huge amount of recorded TLS-encrypted data is going to be turned into plaintext, probably destroying millions of people's lives. I hope everyone working in quantum research is aware of what their work is leading towards; they're not much better than arms manufacturers working on the next nuke.
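
    For context on why recorded RSA-based TLS traffic is the worry: Shor's algorithm only speeds up the period-finding step of factoring; the rest is easy classical post-processing. A toy sketch with brute force standing in for the quantum part:

      from math import gcd
      from random import randrange

      def find_period(a: int, n: int) -> int:
          """Smallest r > 0 with a**r == 1 (mod n) -- the only step
          a quantum computer actually speeds up."""
          r, val = 1, a % n
          while val != 1:
              val = (val * a) % n
              r += 1
          return r

      def shor_classical(n: int) -> int:
          """Factor n via period finding (toy sizes only)."""
          while True:
              a = randrange(2, n)
              if gcd(a, n) > 1:
                  return gcd(a, n)           # lucky guess shares a factor
              r = find_period(a, n)
              if r % 2 or pow(a, r // 2, n) == n - 1:
                  continue                   # bad choice of a, retry
              return gcd(pow(a, r // 2, n) - 1, n)

      n = 3233  # = 61 * 53, a toy "RSA modulus"
      p = shor_classical(n)
      print(p, n // p)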

    • This got me wondering how much of Tor, i2p, etc the NSA has archived. Or privacy coins like XMR.

      I'm also curious: if you don't capture the key exchange but only a piece of ciphertext, is there a lower limit on the sample size required to attack the key? It feels like there must be; one classical way to frame it is sketched below.

      1 reply →
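
      The classical framing is Shannon's unicity distance: key entropy divided by the plaintext's per-character redundancy. The numbers below are textbook estimates (and in practice the real barrier is computational, not informational):

        from math import log2

        # Below this many ciphertext characters, many different keys
        # decrypt to plausible-looking plaintext, so the key is not
        # uniquely determined even in principle.
        key_bits = 128                  # e.g. an AES-128 key (assumed)
        redundancy = log2(26) - 1.5     # English: ~1.5 bits/char of content
        print(f"~{key_bits / redundancy:.0f} characters")  # roughly 40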

  • I vaguely remember reading an article about solving the correlation between quantum decoherence and the scaling of qubit numbers. I don't understand quantum computers, so take it with a grain of salt.

    But here’s what Perplexity says: “Exponential Error Reduction: Willow demonstrates a scalable quantum error correction method, achieving an exponential reduction in error rates as the number of qubits increases. This is crucial because qubits are prone to errors due to their sensitivity to environmental factors.” A rough model of that scaling is sketched below.
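
    That quote matches the standard surface-code picture: once the physical error rate is below a threshold, each step up in code distance suppresses the logical error rate multiplicatively. A rough model with assumed numbers (not Willow's actual data):

      # Textbook scaling: p_logical ~ A * (p / p_th) ** ((d + 1) // 2)
      p, p_th, A = 0.001, 0.01, 0.1   # assumed error rate, threshold, prefactor

      for d in (3, 5, 7, 9, 11):      # larger distance = more physical qubits
          p_logical = A * (p / p_th) ** ((d + 1) // 2)
          print(f"distance {d:>2}: logical error ~ {p_logical:.0e}")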

  • > last time I checked the record was 80 qubits

    It has progressed since: IBM Condor (demonstrated in December 2023) has 1,121 qubits.

    • Which is totally out of touch with the reality of actually making use of the extra qubits they slapped onto the chip just to hit a big number.

Just like fusion energy, it is pointless and you are not allowed to be excited about it, because some anonymous stranger on HN said so.

https://news.ycombinator.com/item?id=43093939#43094339

  • You're certainly allowed to get excited about it as long as you're patient and don't wildly overinflate the realistic timeline to net energy production. Similarly, nobody will stop you from hyping up quantum computation as long as you're not bullshitting use cases or lying about qubit scaling.

    In the wake of cryptocurrency and AI failing to live up to their outrageous levels of hype, many people on this site worry that the "feel the AGI" crowd might accidentally start feeling some other, seemingly profitable vaporware to overhype and pump.