Comment by 0x457
6 months ago
Well, it outputs a chain of thought that is later used to produce a better prediction. It produces that chain of thought much the way you would think through a problem out loud. It's more verbose than what you would do, but you always have some ambient context that the LLM lacks.
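A minimal sketch of the idea, with a hypothetical `generate()` standing in for whatever model call you use: the reasoning tokens the model emits are appended back into the context, and the final answer is conditioned on them.

```python
# Chain-of-thought sketch. `generate(prompt)` is a hypothetical placeholder
# for an LLM completion call; it returns canned text here for illustration.

def generate(prompt: str) -> str:
    """Stand-in for a real model call."""
    if "step by step" in prompt:
        return ("There are 3 boxes with 4 apples each, so 3 * 4 = 12. "
                "Adding the 2 loose apples gives 14.")
    return "14"

question = "A crate holds 3 boxes of 4 apples plus 2 loose apples. How many apples?"

# 1. Ask the model to think out loud.
cot_prompt = f"{question}\nLet's think step by step."
reasoning = generate(cot_prompt)

# 2. The emitted reasoning becomes part of the context that conditions
#    the final answer, which is why the later prediction tends to be better.
answer_prompt = f"{cot_prompt}\n{reasoning}\nTherefore, the answer is"
final_answer = generate(answer_prompt)

print(reasoning)
print(final_answer)
```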