Comment by pankajdoharey

4 days ago

Based on what I read, it seems a lot of algorithmic work is required to even make them useful. New algorithms have to be discovered, and even then they will only solve a special class of problems. They can't do classical computing, so your NVIDIA GPU will probably never be replaced by a quantum GPU.

I wouldn't worry too much about finding new algorithms. The sheer power of QC parallelism will attract enough talent to convert any useful classical algorithm to QC.

It's a bit similar to the invention of the fast Fourier transform (which was reinvented several times...): O(n log n) is so much better than O(n²) that many problems in science and technology use FFT somewhere in their pipeline just because it's so powerful, even when unrelated to signal processing. For example, multiplication of very large numbers uses FFT (?!).
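To make the large-number example concrete, here's a minimal sketch of FFT-based integer multiplication: treat each number as a polynomial in its digits, multiply the polynomials via FFT convolution, then propagate carries. (This is a toy illustration of the idea behind Schönhage–Strassen-style algorithms, not a production implementation; real ones use number-theoretic transforms to avoid floating-point rounding.)

```python
# Sketch: multiplying big integers via FFT convolution of their digit arrays.
import numpy as np

def fft_multiply(a: int, b: int) -> int:
    # Base-10 digits, least-significant first.
    da = [int(d) for d in str(a)][::-1]
    db = [int(d) for d in str(b)][::-1]
    # Zero-pad to a power of two large enough for the full linear convolution.
    n = 1
    while n < len(da) + len(db):
        n *= 2
    # Pointwise multiplication in the frequency domain = convolution in time:
    # this is the O(n log n) step that replaces the O(n^2) schoolbook method.
    conv = np.fft.ifft(np.fft.fft(da, n) * np.fft.fft(db, n)).real
    digits = [int(round(x)) for x in conv]
    # Summing digit*10^i lets Python's bigints handle carry propagation.
    return sum(d * 10**i for i, d in enumerate(digits))

print(fft_multiply(123456789, 987654321))  # same as 123456789 * 987654321
```

Floating-point FFT only works here while the convolution entries stay small enough to round back to exact integers; that's the limitation the integer (NTT) variants remove.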

Quantum computing is a generalization of classical computing, so quantum computers CAN do classical computing. In practice, though, it would be slower, more error-prone, and far more expensive.

  • > Quantum computing is a generalization of classical computing

    Can you explain more or share some resources?

• Any basic operation you can do with reversible computing on bits can be done with qubits.

Any basic operation you can do with normal computing on bits can be done with reversible computing on bits, provided you have enough ancillary bits to store the information that the irreversible version of the operation would normally delete.
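The standard illustration of the point above is the Toffoli (CCNOT) gate, which is reversible yet universal for classical logic: with an ancilla bit initialized to 0, it computes AND without discarding any information. A minimal sketch on plain bits:

```python
# Sketch: the reversible Toffoli (CCNOT) gate on classical bits.
def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Flips the target c iff both controls a and b are 1. Self-inverse."""
    return a, b, c ^ (a & b)

# Irreversible AND deletes information (you can't recover a, b from a AND b).
# With an ancilla bit set to 0, Toffoli computes AND while keeping the inputs:
a, b = 1, 1
_, _, and_result = toffoli(a, b, 0)
print(and_result)  # 1, i.e. a AND b

# Reversibility: applying the gate twice restores any input state.
state = (1, 0, 1)
assert toffoli(*toffoli(*state)) == state
```

Since Toffoli can build AND (and, with constant inputs, NOT), any Boolean circuit can be rebuilt reversibly at the cost of extra ancilla bits, and the same gate acts on qubits in a quantum computer.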

Maybe we'll end up with a state where typical computers have a CPU, a GPU, and a QPU to solve different problems.