Comment by furyofantares
19 days ago
The article does demonstrate how bad it is in context.
Context has a lot of big advantages over training too, though; it's not one-sided. Upfront cost and time are the obvious ones, but context also works better than training on small amounts of data, and it's easier to delete or modify.
Like even for a big product like Claude Code, from someone who controls the model: although I'm sure they do a lot of training to make the product better, they're not gonna rely entirely on training and ship a nearly blank system prompt.
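The "easier to delete or modify" point can be made concrete with a minimal sketch. All names here (`build_prompt`, `facts`) are hypothetical illustrations, not anything from the article or from Claude Code itself:

```python
# Hypothetical sketch: updating knowledge supplied in context vs. in weights.

def build_prompt(facts: list[str], question: str) -> str:
    """Knowledge supplied in context is just text: editing it is editing a list."""
    context = "\n".join(f"- {f}" for f in facts)
    return f"Known facts:\n{context}\n\nQuestion: {question}"

facts = ["The retry limit is 3."]
prompt_v1 = build_prompt(facts, "What is the retry limit?")

# Deleting or modifying in-context knowledge is immediate and cheap:
facts = ["The retry limit is 5."]  # one-line change, no retraining
prompt_v2 = build_prompt(facts, "What is the retry limit?")

# By contrast, a fact baked into the weights needs another training run to
# change, and reliably *removing* it (unlearning) is harder still.
```

The same asymmetry drives the cost point: swapping a line of context costs nothing, while a fine-tuning pass costs compute and turnaround time even for a tiny change.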