Comment by nl
3 months ago
Modern LLMs are post-trained for tasks other than next-word prediction.
They still output words, though (except for multi-modal LLMs), so that does involve next-word generation.
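For concreteness, here is a minimal sketch of the autoregressive next-token loop the comment is pointing at; the model choice (gpt2) and greedy decoding are illustrative assumptions, not anything nl specified. Post-training changes what the model prefers to say, but at inference it still emits one token at a time.

```python
# Minimal sketch: greedy next-token generation with a small causal LM.
# "gpt2" and the 10-token limit are illustrative choices only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("Modern LLMs are", return_tensors="pt").input_ids
for _ in range(10):
    with torch.no_grad():
        logits = model(ids).logits                      # [batch, seq_len, vocab]
    next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)  # pick most likely next token
    ids = torch.cat([ids, next_id], dim=-1)             # append and repeat

print(tokenizer.decode(ids[0]))
```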