Comment by binarymax
5 hours ago
LLMs don’t learn. They’re static. You could try to fine-tune, or keep adding longer and longer context, but in the end you hit a wall.