Comment by dgellow
4 hours ago
The chat UX, with a fake human lying to you and framing things emotionally, really doesn’t help. And it’s pretty much impossible to get away from, or at least I haven’t found a way yet.
I would love to see a model trained to behave much more like a tool instead of auto-completing from Reddit language patterns…