Comment by visarga
1 day ago
> Imagine, for a moment, a world with no humans. Just machines, bolts and screws, zeros and ones. There is no emotion. There is no art. There is only logic. Humans use logic-defying algorithms called “emotions”. They get angry. They get sad. They have fun. They make decisions based on “gut”.
This is not right: machines can also have the equivalent of "emotions", namely the predicted future reward. That's how Reinforcement Learning works. How much we appreciate something is akin to the value function in RL. You could say RL is a system for learning emotions, preferences, and tactics.
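To make the value-function analogy concrete, here is a minimal sketch of tabular TD(0) learning on a hypothetical 5-state chain. Everything specific here (the chain environment, the rewards, alpha, gamma) is an illustrative assumption, not anything from the comment; the point is just that the learned V(s) ends up encoding how much the system "appreciates" each state.

```python
import random

states = range(5)             # states 0..4; state 4 is terminal and rewarding
V = {s: 0.0 for s in states}  # value function: learned "appreciation" of each state
alpha, gamma = 0.1, 0.9       # learning rate and discount factor (assumed values)

def step(s):
    """Move right with probability 0.8, left otherwise; reward only at the end."""
    s_next = min(s + 1, 4) if random.random() < 0.8 else max(s - 1, 0)
    reward = 1.0 if s_next == 4 else 0.0
    return s_next, reward

for _ in range(2000):         # many episodes of experience
    s = 0
    while s != 4:
        s_next, r = step(s)
        # TD(0) update: nudge V(s) toward the reward plus discounted future value.
        V[s] += alpha * (r + gamma * V[s_next] - V[s])
        s = s_next

print(V)  # states closer to the rewarding outcome end up "felt" as more valuable
```

After training, values rise monotonically toward the goal state, which is the mechanical analogue of liking some situations more than others.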
"But those reward signals are designed by humans"... Right. But are AI models really not affected by physical constraints like us? They need hardware, data and energy. They need humans. Humans decide which model gets used, which approaches are replicated, where we want to invest.
AI models are just as physically constrained as humans; they don't exist in a platonic realm. They are in a process of evolution, like memes and genes. And all evolutionary systems work by pitting distributed search against distributed constraints. When you are in a problem space, emotions emerge as the values you associate with specific states and outcomes.
What I am saying is that emotions don't come from the brain; they come from the game. And AI models are certainly part of many such games, including the one that decides their evolution.