Comment by jvanderbot

3 days ago

This is so interesting but it reads like satire. I'm sure folks who love persuading and teaching and marshalling groups are going to do very well in SWEng.

According to this, we'll all be reading the feelings journals of our LLM children and scolding them for cheating on our carefully crafted exams instead of, you know, making things. We'll read psychology books, apparently.

I like reading and tinkering directly. If this is real, the field is going to leave that behind.

We certainly will; they can’t replace humans in most language tasks without a human-like emotional model. I have a whole set of therapy agents for debugging neurotic, long-lived agents with memory.
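
Roughly, the shape of it, stripped way down (a sketch only, with a made-up memory format; the llm() helper is a stand-in for whatever model call you actually use, not a real API):

    def llm(prompt: str) -> str:
        # Stand-in for whatever model API you actually call.
        raise NotImplementedError

    def therapy_pass(memory_log: list[dict]) -> str:
        # A second "therapist" agent reads the long-lived agent's memory and
        # flags contradictions, fixations, and repeated failed strategies.
        transcript = "\n".join(f"[{m['when']}] {m['note']}" for m in memory_log)
        prompt = (
            "You are reviewing another agent's long-term memory.\n"
            "List contradictions, repeated failed strategies, and beliefs it\n"
            "keeps asserting without evidence; suggest which entries to prune.\n\n"
            + transcript
        )
        return llm(prompt)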

  • Ok, call me crazy, but I don't actually think there's any technical reason that a theoretical code-generation robot needs emotions as fickle and difficult to manage as a human's.

    It's just that we built this iteration of the technology foundationally on people's fickle and emotional Reddit posts, among other things.

    It's a designed-in limitation, and it's kind of a happy accident that it can write code at all. And it clearly carries forward a lot of baggage...

    • Maybe. I use QWAN (the “quality without a name”) frequently when working with coding agents. That requires an LLM equivalent of interoception: recognizing when the model's understanding is scrambled versus “aligned with itself,” which is what QWAN is.
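
      Concretely, the crudest version of that interoception check looks something like this (just a sketch; llm() stands in for whatever model call you use, not a real API):

          def llm(prompt: str) -> str:
              # Stand-in for whatever model API you actually call.
              raise NotImplementedError

          def feels_aligned(context: str) -> bool:
              # Ask the model to restate its understanding twice, then ask whether
              # the two restatements describe the same plan. Disagreement is a
              # rough proxy for "the context is scrambled; stop and rebuild it."
              ask = "In two sentences, restate your current understanding of the task:\n" + context
              a, b = llm(ask), llm(ask)
              verdict = llm(
                  "Do these two summaries describe the same plan? Answer YES or NO.\n\n"
                  f"A: {a}\n\nB: {b}"
              )
              return verdict.strip().upper().startswith("YES")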

  • What on God's green Earth could the CEO of a no-name B2B SaaS have a use for long-running agents?

    Either your business isn't successful and you're coding when you shouldn't be, or you're cosplaying coding with Claude, or you're lying, or you're telling us about an expensive and unproductive hobby.

    How much do you spend on AI? What's your annual profit?

    edit: oh, cosplaying as a CEO. I see. Nice WPEngine landing page, Mr. AppBind.com CEO. Better have Claude fix your website! I guess that agent needs therapy...