Comment by otabdeveloper4

19 hours ago

BERT is one example of a language model that solved specific language tasks very well, and it existed before LLMs.

We had very good language models for decades. The problem was that they needed task-specific training, which LLMs mostly don't. You can now solve a language task with just some system prompt manipulation.

(And honestly, typing in system prompts by hand feels like a task that should definitely be automated. I'm waiting for "soft prompting" to become a thing so we can come full circle and just feed the LLM an example set.)
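A minimal sketch of the "feed the LLM an example set" idea: instead of hand-writing a system prompt, format labeled examples into a few-shot prompt mechanically. The function name and the sentiment task here are hypothetical, just for illustration.

```python
def build_few_shot_prompt(task, examples, query):
    """Format a task description and labeled (input, output) pairs
    into a single few-shot prompt string."""
    lines = [task, ""]
    for inp, out in examples:
        lines.append(f"Input: {inp}")
        lines.append(f"Output: {out}")
        lines.append("")
    # Leave the final Output: blank for the model to complete.
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

examples = [
    ("The movie was fantastic!", "positive"),
    ("Total waste of time.", "negative"),
]
prompt = build_few_shot_prompt(
    "Classify the sentiment of each input as positive or negative.",
    examples,
    "I'd watch it again.",
)
print(prompt)
```

The point is that the "prompt engineering" step collapses into plain data plumbing: swap in a different example set and you get a different task, no retraining involved.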