Comment by dgb23

8 hours ago

If you're looking at open models, check out Pi; it's very extensible and comes with sane defaults. You could even roll your own.

Most harnesses (claude, codex, opencode, etc.) assume you're using a cloud model. There's no real support for optimization or finer-grained control.

I've been running pi.dev + codex (GPT-5.4) for a long time for my workhorse stuff.

Actually tried caveman mode yesterday and it made everything SO MUCH BETTER. GPT-5.4 has a habit of being ridiculously verbose; it's like it's writing a report for a CTO, padding everything as much as possible to sound smart.

With caveman it just gives me lists of stuff in a compact format. Perfect.

  • Let's be clear(er) here: you like caveman's format and output, so that's one value of "better". I've seen at least a few tests (n < 10, so there's that) on token use, and for THAT value of "better" it's not much different.