We know that they do not reason because we know the algorithm behind the curtain. The model is generating the next token via model weights and some randomness. That’s all. It’s not reasoning. Sometimes it has the appearance of reasoning, but not if you know how it works. It doesn’t matter that the model manufacturer’s marketing department slaps a “Reasoning!” sticker on the side of the model. It’s not actually doing that. As an analogy, sometimes a stage magician in Las Vegas makes it seem that he’s making a woman disappear and a tiger appear in her place, but we all know that’s not what is really happening; it’s just a clever trick.
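The mechanism the comment above describes, picking the next token from model-produced scores plus some randomness, can be sketched in a few lines. This is a minimal illustration of temperature-scaled softmax sampling, not any particular vendor's implementation; the function name and the toy three-token vocabulary are made up for the example.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Pick a next-token id from raw scores (logits) by
    temperature-scaled softmax sampling -- the 'weights plus
    some randomness' step described above."""
    rng = rng or random.Random()
    scaled = [l / temperature for l in logits]
    m = max(scaled)                        # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]      # softmax: a probability per vocabulary entry
    r = rng.random()                       # the randomness
    cum = 0.0
    for token_id, p in enumerate(probs):
        cum += p
        if r < cum:
            return token_id
    return len(probs) - 1                  # guard against float rounding

# Toy usage: a three-token vocabulary. Lower temperature concentrates
# probability mass on the highest-scoring token.
print(sample_next_token([2.0, 1.0, 0.1], temperature=0.5, rng=random.Random(0)))
```

Whether repeating this step billions of times constitutes "reasoning" is exactly the question the rest of the thread takes up.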
Well, could you define what reasoning actually means? What would an AI need to do to be considered capable of reasoning? What is the core difference between what we do that is considered reasoning versus what AI currently does that is not considered reasoning?
To be clear, I am not making a statement as to whether AI reasons or not. It's just slippery to say something isn't or can't do X when we can't really define X. Perhaps we could pin it down as an outcome rather than as a characteristic of a thing, which in my opinion is currently impossible to define accurately.
What is reasoning?