Comment by adastra22
2 days ago
The whole point though is that they are a step-function improvement over traditional qubits, to the degree that a direct comparison is simply a type error.
The utility of traditional qubits depends entirely on how reliable and long-lived they are, and how well they can scale to larger numbers of qubits. These topological qubits are effectively 100% reliable, effectively infinite in duration, and scale like semiconductors. According to the marketing literature, at least…
There are caveats there too. Generally, topological qubits can be immune to all kinds of noise (i.e. built-in error correction), but Majorana zero modes aren't exactly the right kind of topological for that to be true. They only enjoy protection on most operations, not all. So there is still a need for error correction here (and all the complication that entails); it is just hopefully less onerous, since essentially only one operation requires it.
All the other qubits scaled the same way when they were in a simulator, too. When they actually hit reality, they all had huge problems.
Other qubits in general do not scale the same way. Some, for example, do not allow arbitrary point-to-point interactions, which means doubling your physical qubits doesn't double your number of logical qubits. There are other ways in which scaling was sometimes nonlinear as well.
Note also that this isn’t a simulated result. Microsoft has an 8-qubit chip they are making available on Azure.
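To make the nonlinear-scaling point concrete, here is a rough back-of-envelope sketch in Python. It assumes a surface-code-style overhead of roughly 2d² physical qubits per logical qubit at code distance d (an illustrative assumption on my part; real overheads depend on the architecture, and these numbers are not from Microsoft or IBM):

```python
def logical_qubits(physical: int, distance: int) -> int:
    """Logical qubits available from `physical` qubits at code distance d.

    Assumes each logical qubit is a surface-code patch costing roughly
    2*d**2 physical qubits -- an illustrative approximation, not a spec.
    """
    per_logical = 2 * distance ** 2
    return physical // per_logical

# At a FIXED code distance, doubling physical qubits doubles logical qubits:
assert logical_qubits(24200, 11) * 2 == logical_qubits(48400, 11)  # 100 -> 200

# But a bigger machine typically targets a lower logical error rate, which
# forces a larger code distance, so the scaling is sublinear in practice:
small = logical_qubits(24200, 11)  # 100 logical qubits
large = logical_qubits(48400, 15)  # 107 logical qubits, not 200
```

The point being that the physical-to-logical conversion rate itself degrades as you scale up, which is one of the ways "doubling your qubits" fails to double your useful compute.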
I am well aware of how other qubits scale, but I am also aware that the physicists who created them didn't expect decoherence to scale this rapidly at the time they took that approach.
IBM sells you 400 qubits with huge coherence problems. When IBM had an 8-qubit chip, they were also pretty stable.