Comment by theshrike79

13 hours ago

Should be possible with optimised models: just drop all the "generic" stuff and focus on coding performance.

There's no reason for a coding model to contain all of ao3 and wikipedia =)

There is: It works (even if we can't explain why right now).

If we knew how to create a SOTA coding model by just putting coding stuff in there, that is how we would build SOTA coding models.

I think I like coding models that know a lot about the world. They can disambiguate my requirements and build better products.

  • I generally prefer a coding model that can google for the docs, but having separate models for /plan and /build is also a thing.

That's what Meta thought initially too, training Code Llama and the chat Llama models separately, and then they realized they'd been idiots and that adding the other half of the data vastly improves both models. As long as it's quality data, more of it doesn't do harm.

Besides, programming is far from just knowing how to autocomplete syntax; you need a model that's proficient in the field the automation is placed in, otherwise it'll be no help in actually automating it.

  • But as far as I know, that was way before tool calling was a thing.

    I'm more bullish about small and medium-sized models + efficient tool calling than I am about LLMs too large to run at home without $20k of hardware.

    The model doesn't need to have the full knowledge of everything built into it when it has the toolset to fetch, cache and read any information available.
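
As a rough illustration of that point, here is a minimal sketch (not from the thread) of the kind of "fetch, cache and read" tool a small model could call instead of carrying the documentation in its weights. The function name, cache location, and the generic JSON-style tool schema are assumptions for illustration, not tied to any particular LLM API.

```python
import hashlib
import urllib.request
from pathlib import Path

# Hypothetical on-disk cache location for fetched documentation pages.
CACHE_DIR = Path.home() / ".doc_tool_cache"

def fetch_docs(url: str, timeout: int = 10) -> str:
    """Fetch a documentation page, caching it on disk so repeated
    tool calls during a session don't re-download the same page."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    cache_file = CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()
    if cache_file.exists():
        return cache_file.read_text(encoding="utf-8")
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    cache_file.write_text(text, encoding="utf-8")
    return text

# How the tool might be described to the model: a generic JSON-style
# schema, illustrative only and not specific to any vendor's API.
FETCH_DOCS_TOOL = {
    "name": "fetch_docs",
    "description": "Fetch and cache a documentation page by URL, returning its raw text.",
    "parameters": {
        "type": "object",
        "properties": {"url": {"type": "string", "description": "Page to fetch"}},
        "required": ["url"],
    },
}
```

The model then only needs to know that documentation exists and how to ask for it; the freshness and breadth of knowledge live in the tool and its cache rather than in the weights.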

Now I wonder how strong the correlation between coding performance and ao3 knowledge is in human programmers. Maybe we are on to something here /s