Comment by dabaja
3 days ago
Interesting framing, but I think emotions are a proxy for something more tractable: loss functions over time. Engineers remember bad hygiene because they've felt the cost. You can approximate this for agents by logging friction: how many iterations did a task take, how many reverts, how much human correction. Then weight memory retrieval by past-friction-on-similar-tasks. It's crude, but it lets the agent "learn" that certain shortcuts are expensive without needing emotions. The hard part is defining similarity well enough that the signal transfers. Still early, but directionally this has reduced repeat mistakes in our pipeline more than static rules did.
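To make the idea concrete, here is a minimal sketch of friction logging plus friction-weighted retrieval. Everything in it is assumed for illustration: the `FrictionRecord` and `FrictionWeightedMemory` names, the 1/2/3 cost weights, and the word-overlap similarity (a real pipeline would use embeddings or some other learned similarity, which is exactly the "hard part" mentioned above).

```python
from dataclasses import dataclass, field


@dataclass
class FrictionRecord:
    """One completed task plus how much friction it caused."""
    task_description: str
    iterations: int    # attempts before the task was done
    reverts: int       # changes that had to be rolled back
    corrections: int   # human interventions required

    def friction(self) -> float:
        # Crude scalar cost; the weights are illustrative, not tuned.
        return self.iterations + 2.0 * self.reverts + 3.0 * self.corrections


def similarity(a: str, b: str) -> float:
    # Placeholder similarity: Jaccard overlap of words.
    # Stand-in for an embedding-based measure.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if (wa | wb) else 0.0


@dataclass
class FrictionWeightedMemory:
    records: list = field(default_factory=list)

    def log(self, record: FrictionRecord) -> None:
        self.records.append(record)

    def retrieval_weight(self, new_task: str) -> float:
        # Similarity-weighted average of past friction: a high value means
        # "tasks like this were expensive before", which can be used to
        # boost retrieval of the relevant cautionary memories.
        scores = [(similarity(new_task, r.task_description), r.friction())
                  for r in self.records]
        total_sim = sum(s for s, _ in scores)
        if total_sim == 0:
            return 0.0
        return sum(s * f for s, f in scores) / total_sim


if __name__ == "__main__":
    memory = FrictionWeightedMemory()
    memory.log(FrictionRecord("refactor auth module without tests", 7, 2, 3))
    memory.log(FrictionRecord("add logging to payment service", 2, 0, 0))

    print(memory.retrieval_weight("refactor payment module without tests"))
```

The weight is only a retrieval bias, not a decision rule: the agent still has to interpret what the retrieved memories say, and how well the signal transfers depends entirely on how good the similarity function is.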
How do you choose which loss-over-time signal to pursue?