Comment by tshaddox
2 days ago
> I want my UX in an app to do the damn thing I precisely intend it to do, not what it believes I intend to do
This is pretty weird epistemological phrasing. It's a bit like saying "I want to know the truth, not just what I believe to be the truth!"
It's just stating a preference for deterministic behavior, i.e., tools rather than agents. Once you've learned how to use a tool, it will reliably do what you intend it to do. With agents, whether human or AI, less initial learning is required, but there's always a layer of indirection where errors can arise, and the skill ceiling is lower.
Or maybe, “I want to know the truth, not just what you believe I’d like to hear as truth.”
Which seems… pretty reasonable to me. Both involve the other party substituting some vaguely patronizing judgment of theirs for the first party’s.
But isn't that generally true? That's why anyone bothers to question their existing beliefs in the first place: because they prefer "the truth" over "what [they] believe to be true". Not everyone does, of course, but if everyone did just value "what [they] believe to be true", then any sort of self-reflection on belief would be a strict net negative!
The point is that you never get past what you believe to be true. If you discover that what you believe to be true is false, the only thing you can do is start believing something else to be true. It's silly to say that you want to stop doing this, as if there's some way to escape the cycle.