Comment by lukan
3 days ago
You can partly solve those issues by changing the prompt to tell it to be concise and not explain its code.
But nothing will make them stick to the one API version I use.
> But nothing will make them stick to the one API version I use.
Models trained for tool use can do that. When I use Codex for some Rust stuff, for example, it can grep the source files in the directory where dependencies are stored, so looking up the current APIs is trivial for it. The same works for JavaScript and a bunch of other languages too, as long as the sources are accessible somewhere via the tools it has available.
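To make that concrete, here's a minimal sketch of the kind of lookup a tool-using agent can do: instead of recalling an API from its training data, it greps the actual vendored sources on disk. The directory layout, crate name, and function are all made up for this demo; with Cargo the real sources would typically live under ~/.cargo/registry/src or a vendor/ directory.

```shell
# Set up a fake vendored dependency to grep against (demo only).
mkdir -p /tmp/demo_deps/somecrate-0.2.0/src
cat > /tmp/demo_deps/somecrate-0.2.0/src/lib.rs <<'EOF'
pub fn connect(addr: &str, timeout_ms: u64) -> Result<Conn, Error> {
    // ...
}
EOF

# What the agent effectively does: look up the *current* signature
# from the sources in use, rather than guessing from memory.
grep -rn "pub fn connect" /tmp/demo_deps/
```

The point is that the answer comes from the exact dependency version checked out in the project, so it can't drift to some other API version the model happened to see during training.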
Hm, I've never tried Codex so far, but I have tried quite a few other tools and models, and none could help me in a consistent way. And I am sceptical, because even if I tell them explicitly to only use one specific version, they might or might not actually use it, depending on their training corpus and temperature, I assume.
The less verbosity you allow, the dumber the LLM is. It thinks in tokens, and if you keep it from using tokens, it's lobotomized.
It can think as much as it wants and still return just code in the end.