Comment by moi2388

8 hours ago

What about Shor’s algorithm?

That's on the useful end, but I don't think any QC has gone beyond factoring 15 or something in that neighborhood. Realistically we'd need a few thousand logical qubits to factor anything of cryptographic size, and current QCs have a dozen or so qubits that actually work.
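A rough back-of-envelope for the "few thousand qubits" claim, using the well-known 2n+3 logical-qubit estimate for Beauregard's Shor circuit (this ignores error-correction overhead, which multiplies the physical qubit count by orders of magnitude):

```python
# Logical qubits needed by Beauregard's circuit for Shor's
# algorithm on an n-bit number: roughly 2n + 3.
def logical_qubits(n_bits: int) -> int:
    return 2 * n_bits + 3

# 4 bits covers toy numbers like 15; 2048 bits is RSA-2048.
for bits in (4, 1024, 2048):
    print(f"{bits}-bit modulus: ~{logical_qubits(bits)} logical qubits")
```

Even before error correction, RSA-2048 needs thousands of logical qubits, versus the handful of usable qubits on current hardware.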

  • no QC has gone beyond being able to factor 1.

    The "factorizations" done with quantum computers involved cherry-picking special numbers so that a special "compiled" circuit could be used instead of the full algorithm; constructing that circuit requires already knowing the answer. That makes the semantics of the executed program "slightly" different.

    What the claims say: factor(a,b)

    What the implementation does: println("3").
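The gap between the two can be made concrete with a classical sketch (the function and variable names here are illustrative, not from any real experiment). Genuine Shor's algorithm reduces factoring to finding the multiplicative order r of a random a mod N, which is the only step a quantum computer speeds up; a "compiled" demo, by contrast, is semantically just a lookup of the precomputed answer:

```python
from math import gcd

def order(a, N):
    # Classical stand-in for the quantum order-finding subroutine --
    # the one step Shor's algorithm accelerates. Exponentially slow here.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_reduction(N, a):
    """Shor's classical reduction: from the order of a mod N to a factor of N."""
    g = gcd(a, N)
    if g != 1:
        return g                  # lucky pick already shares a factor
    r = order(a, N)
    if r % 2 == 1:
        return None               # odd order: retry with another a
    f = gcd(pow(a, r // 2) - 1, N)
    return f if 1 < f < N else None

def compiled_demo(N):
    # What the criticized experiments amount to: the circuit was
    # simplified using the already-known factorization, so the
    # program effectively emits a hardcoded answer.
    return {15: 3, 21: 3}.get(N)

print(shor_reduction(15, 7))   # honest reduction, order-finding done classically
print(compiled_demo(15))       # "println" of the known factor
```

Both print a factor of 15, but only the first does any factoring; the second is the println("3") the parent comment describes.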