That's on the useful end, but I don't think any QC has gone beyond being able to factor something like 15 or 21. Realistically we'd need a few thousand logical, error-corrected qubits to factor anything reasonable, and current QCs have a dozen or so qubits that work.
The "factorization" done with quantum computers involved cherry picking special numbers so that a special "compiled" circuit (knowledge of the answer is required in order to do this) can be used instead of the full thing. That makes the semantics of the executed program "slightly" different.
On the one hand you have strong and persistent claims about quantum factoring of large numbers.
On the other hand you have:
https://algassert.com/post/2500
> That's on the useful end, but I don't think any QC has gone beyond being able to factor something like 15 or 21.
No QC has gone beyond being able to factor 1.
The "factorization" done with quantum computers involved cherry picking special numbers so that a special "compiled" circuit (knowledge of the answer is required in order to do this) can be used instead of the full thing. That makes the semantics of the executed program "slightly" different.
What the claims say: factor(a,b)
What the implementation does: println("3").
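In the same spirit, an illustrative Python sketch (mine, not anyone's actual software) of the gap between what is claimed and what a compiled experiment effectively executes:

    def factor(n):
        # What the claims describe: works for an arbitrary n, with the hard
        # order-finding step delegated to quantum hardware.
        raise NotImplementedError("needs a large, fault-tolerant QC")

    def compiled_demo():
        # What the compiled demo amounts to: n = 15 and the base were chosen
        # knowing that 15 = 3 * 5, so the "result" is fixed before it runs.
        print("3")

    compiled_demo()  # prints 3, exactly as hard-coded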