Comment by theshrike79
7 hours ago
> I have found that they can get into "death spirals," where their suggestions keep getting worse and worse. I've learned to just walk away, and try something else, when that happens. I shudder to think of junior engineers, implementing the code that comes from these.
The death spirals (great term btw) are caused by context pollution. Basically, once the LLM gets something wrong in its head (the context), it can't move away from it: the mere fact that the mistake is still sitting in the context nudges every subsequent decision towards it.
This is where you need to ask it to summarise where you are into a markdown document or similar (the kids call it a "memory file"). Then clear the context (start a new chat) and use the markdown file to bootstrap your progress.
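If you're driving the model through an API rather than a chat UI, the same trick looks roughly like this. This is just a sketch using the OpenAI Python client; the model name, file name, and prompts are my own placeholders, not anything specific:

```python
# Rough sketch of the "memory file" workflow against an API instead of
# a chat UI. Model name, file name, and prompts are all illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
MEMORY_FILE = "memory.md"  # the "memory file"

def checkpoint(history: list[dict]) -> None:
    """Ask the model to distill the session into markdown, then save it.
    Do this *before* the context degrades too far to summarise cleanly."""
    resp = client.chat.completions.create(
        model="gpt-4o",  # placeholder model
        messages=history + [{
            "role": "user",
            "content": "Summarise where we are as a markdown document: "
                       "goal, decisions taken, current state, next steps. "
                       "Leave out approaches we ruled out.",
        }],
    )
    with open(MEMORY_FILE, "w") as f:
        f.write(resp.choices[0].message.content)

def fresh_start() -> list[dict]:
    """Bootstrap a brand-new conversation from the memory file alone,
    so none of the polluted context carries over."""
    with open(MEMORY_FILE) as f:
        memory = f.read()
    return [{"role": "user",
             "content": "Here is where we left off:\n\n" + memory
                        + "\n\nPick up from there."}]
```

The point is that the new session sees only the distilled summary, never the raw transcript, so whatever went wrong in the old context is gone.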
Thanks for the tip!
It made me think of this: https://theonion.com/sam-altman-places-gun-to-head-after-new...