Comment by sunir

3 days ago

We certainly will; they can't replace humans in most language tasks without having a human-like emotional model. I have a whole therapy set of agents to debug neurotic, long-lived agents with memory.

Ok, call me crazy, but I don't actually think there's any technical reason that a theoretical code generation robot needs emotions that are as fickle and difficult to manage as a human's.

It's just that we designed this iteration of the technology foundationally on people's fickle and emotional Reddit posts, among other things.

It's a designed-in limitation, and it's kind of a happy accident that it's capable of writing code at all. And it clearly carries forward a lot of baggage...

  • If you can find enough training data that does human-like things without having human-like qualities, we are all ears.

    • It can simultaneously be the best we have and well short of the best we want. It can be a remarkable achievement and still fall short of the perceived goals.

      That's fine.

      Perhaps we can RL away some of this, or perhaps there's something else we need. Idk, but this is the problem when engineers are the customer, the designer, and the target audience.

  • Maybe. I use QWAN (Christopher Alexander's "quality without a name") frequently when working with the coding agents. That requires an LLM equivalent of interoception to recognize when the model's understanding is scrambled or "aligned with itself", which is what QWAN is.

What use on God's green Earth could the CEO of a no-name B2B SaaS have for long-running agents?

Either your business isn't successful, so you're coding when you shouldn't be, or you're cosplaying coding with Claude, or you're lying, or you're telling us about your expensive and unproductive hobby.

How much do you spend on AI? What's your annual profit?

edit: oh, cosplaying as a CEO. I see. Nice WPEngine landing page, Mr. AppBind.com CEO. Better have Claude fix your website! I guess that agent needs therapy...