Comment by petcat

9 hours ago

If you tell a human junior developer just "fix this" then they will spend a week on a wild-goose chase with nothing to show for it.

At least the LLM will only take 5 minutes to tell you they don't know what to do.

Do they? I’ve never gotten a response saying that something was impossible, or stupid. If LLMs don’t know how to fix something, they are happy to verify that a no-op does nothing. They’d rather make something useless than really tackle a problem, if that way they can make the tests green or claim that something “works”.

And I’ve never asked Claude Code for something that is really impossible, or even really difficult.

  • Claude Code will happily tell me my ideas are stupid, but I think that's because I nest my ideas between other alternative ideas and ask for an evaluation of all of them. This effectively combats the sycophantic tendencies.

    Still, sometimes Claude will tell me off even when I don't give it alternatives. Last night I told it to use luasocket from an mpv userscript to connect to a zeromq Unix socket (and also to implement zmq in pure Lua) connected to an ffmpeg zmq filter, to change filter parameters on the fly. Claude Code all but called me stupid and told me to just reload the filter graph through normal mpv means when I make a change. Which was a good call, but I told it to do the thing anyway and it ended up working well, so what does it really know... Anyway, I like that it pushes back, but agrees to commit when I insist.

    • After such hard-won wins, ask the AI to save what it learned during the session to a Markdown file.
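The setup in the anecdote above can be sketched without the pure-Lua part: ffmpeg's zmq filter accepts commands formatted as `TARGET COMMAND ARG` over a ZeroMQ REQ/REP connection, so any language with a ZeroMQ binding can drive it. Below is a minimal Python sketch using pyzmq; the socket address and the `Parsed_eq_1` target name are illustrative assumptions, not details from the thread.

```python
import zmq  # pyzmq; assumed available


def send_filter_command(addr, target, command, arg):
    """Send one "TARGET COMMAND ARG" message, the format ffmpeg's zmq
    filter expects, over a ZeroMQ REQ socket and return the reply.

    addr is a ZeroMQ endpoint, e.g. "ipc:///tmp/ffzmq" (hypothetical).
    """
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.connect(addr)
    try:
        sock.send_string(f"{target} {command} {arg}")
        # ffmpeg replies with a status line such as "0 Success".
        return sock.recv_string()
    finally:
        sock.close()
```

A call like `send_filter_command("ipc:///tmp/ffzmq", "Parsed_eq_1", "contrast", "1.2")` (target name hypothetical) would then adjust a running filter without reloading the graph.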

To be fair, that happening feels more like poor management and mentorship than "juniors are scatterbrained".

Over time, you build up the right reflexes that avoid a one-week goose chase with them. Heck, since we're working with people, you don't just say "fix this"; you earmark time to make sure everyone is aligned on what needs to be done and what the plan is.

> At least the LLM will only take 5 minutes to tell you they don't know what to do.

In my experience, the LLM will happily try the wrong thing over and over for hours. It will rarely say it doesn’t know.

  • Don’t ask it to make changes off the bat, then - ask it to make a plan. Then inspect the plan, change it if necessary, and go from there.