Comment by mmoll
1 day ago
> It doesn’t mean that these “thoughts” influenced their final decision the way they would in humans. An LLM will tell you a lot of things it “considered” and its final output might still be completely independent of that.
Its output quite literally is not independent: the tokens of the final answer attend to the "thinking tokens" through the attention mechanism, so every later prediction is conditioned on them.
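A minimal numpy sketch of that point (illustrative toy code, not any particular model's implementation): under a causal attention mask, the position that produces the answer attends to everything before it, so swapping out the "thinking" tokens changes the answer position's representation.

```python
import numpy as np

def causal_attention(X):
    """Single-head causal self-attention over token vectors X of shape (T, d).
    For simplicity, queries, keys, and values are all X itself."""
    T, d = X.shape
    scores = X @ X.T / np.sqrt(d)
    mask = np.tril(np.ones((T, T), dtype=bool))          # position t sees only <= t
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X                                   # (T, d) attended outputs

rng = np.random.default_rng(0)
prompt  = rng.normal(size=(3, 8))   # hypothetical "prompt" token vectors
answer  = rng.normal(size=(1, 8))   # the position that emits the final answer
think_a = rng.normal(size=(4, 8))   # one chain of "thinking" tokens
think_b = rng.normal(size=(4, 8))   # a different chain

out_a = causal_attention(np.vstack([prompt, think_a, answer]))[-1]
out_b = causal_attention(np.vstack([prompt, think_b, answer]))[-1]

# Same prompt, same answer-position input vector, different thinking tokens:
# the answer position's attended representation differs, i.e. the output
# is conditioned on the thinking tokens.
print(np.allclose(out_a, out_b))  # False
```

Whether the *content* of the thinking tokens steers the answer the way human deliberation does is a separate empirical question, but "statistically independent" they are not.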