
Comment by zozbot234

17 hours ago

> It goes into long exploration loops for 5+ minutes even when I point it to the exact files to inspect.

Give it a custom sandbox and context for the work, so it has no opportunity to roam around when not required. AI agentic coding is hugely wasteful of context and tokens in general (compared to generic chat, which is how most people use AI), there's a whole lot of scope for improvement there.
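One way to do what the comment suggests is to assemble the context yourself from an explicit allowlist of files instead of letting the agent roam. A minimal sketch, with hypothetical names (`build_context` is not a real agent API, just an illustration of the idea):

```python
from pathlib import Path

def build_context(allowed: list[str], root: str = ".") -> str:
    """Concatenate only the explicitly named files into the prompt
    context, so the agent has nothing else to explore."""
    base = Path(root).resolve()
    parts = []
    for name in allowed:
        path = (base / name).resolve()
        # Refuse anything outside the sandbox root (e.g. "../secrets").
        if base not in path.parents and path != base:
            raise PermissionError(f"{name} is outside the sandbox root")
        parts.append(f"### {name}\n{path.read_text()}")
    return "\n\n".join(parts)
```

The point is the tight leash: the agent sees exactly the files you named and a path-escape attempt fails loudly rather than silently widening the context.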

But the problem is that it didn't need any of that before. These days, you have to think twice before you summon a subagent.

  • > But the problem is that it didn't need any of that before. These days, you have to think twice before you summon a subagent.

    This is exactly what I (and many others) kept trying to tell the pro-AI folk 18 months ago: there is no value to jumping on the product early because any "experience" you have with it is easily gained by newcomers, and anything you learned can easily be swapped out from under you anyway.

    • The value is all the things I built with it. Admittedly, this constant change degrades the experience, but to be clear, here we're nitpicking the experience, not questioning the value.

      I also don't understand the "pro-AI" phrase. It's a tool, it brings results. I'm not pro-car when I drive to work.


The sandbox is fine, but if the parent has given explicit instructions about which files to inspect, why doesn't it focus there? Is the recent breakage that the base prompt now makes it always explore for more context, even when you try to focus it?

  • Because the "explicit instruction" you give AI is not deterministic, as it would be in a normal computer program. It's a complete black box, and the context is also most likely polluted by all sorts of weird stuff. Putting it on as tight a leash as possible should be seen as normal.

  • They changed plan mode so that it's instructed to follow a multi-step plan, the first step being to explore the code base. When you tell it to focus, it gets contradictory instructions from plan mode versus your prompt, and it's essentially a coin flip which one it follows.

    It does seem like a cynical attempt to make more money.