
Comment by adastra22

10 hours ago

We're talking about artificial intelligence: making computers think the way people do. People are notoriously miscalibrated on their own self-assessed probabilities too.

Finding a way to objectively calibrate a sense of "how confident do I feel about this?" would be fantastic. But let's not move the goalposts. It would still be incredibly useful to have a machine that merely matches the statement of confidence or uncertainty that a human would assign to their mental model, even if badly calibrated.

IMO it is you who are moving the goalposts, most likely in an attempt to hide the fact that you were unaware of calibration before this discussion.

> It would still be incredibly useful to have a machine that merely matches the statement of confidence or uncertainty that a human would assign to their mental model, even if badly calibrated.

If human feelings are badly calibrated, they are useless here too, so no, I don't agree. Things like "confidence" only matter if they are actually tied to real outcomes in a consistent way, and that means calibration.
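To make that last point concrete: calibration is checked by comparing reported confidence against empirical accuracy over many statements. A minimal sketch in Python (the predictor, confidence levels, and accuracy numbers are invented purely for illustration):

```python
import random

random.seed(0)

def simulate(n, reported_conf, true_accuracy):
    """Simulate n statements made at one reported confidence level.

    Each statement is actually correct with probability true_accuracy,
    which may differ from what the predictor claims.
    """
    return [(reported_conf, random.random() < true_accuracy) for _ in range(n)]

def empirical_accuracy(outcomes):
    """Fraction of statements that turned out to be correct."""
    return sum(correct for _, correct in outcomes) / len(outcomes)

# Well calibrated: says "80% sure" and is right about 80% of the time.
calibrated = simulate(10_000, 0.8, 0.8)

# Overconfident: says "90% sure" but is right only about 60% of the time.
overconfident = simulate(10_000, 0.9, 0.6)

print(f"reported 0.80 -> empirical {empirical_accuracy(calibrated):.2f}")
print(f"reported 0.90 -> empirical {empirical_accuracy(overconfident):.2f}")
```

The second predictor's "90%" tells you almost nothing about outcomes, which is the sense in which an uncalibrated confidence signal fails to be useful.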