Comment by nl
1 day ago
Modern LLMs are post-trained for tasks other than next-word prediction.
They still output words, though (except for multi-modal LLMs), so that does involve next-word generation.