Comment by flanked-evergl

1 year ago

> The failure is in how you're using it.

People, for the most part, know what they know and don't know. I am certain that the distance between the earth and the sun varies, but I'm equally certain that I don't know that distance, at least not with better precision than about a light week.

This is going to have to be fixed somehow for LLMs to progress past where they are now. Maybe expecting an LLM to have this capability is wrong, and perhaps it can never have it, but expecting it is not wrong, and LLM vendors have somewhat implied that their models do have it by saying they won't hallucinate, or that hallucinations have been reduced.

> the distance from the earth to the sun, at least not with better precision than about a light week

The sun is eight light minutes away.
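For anyone who wants to sanity-check that figure rather than memorize it, it falls straight out of two standard constants (the defined astronomical unit and the speed of light):

```python
# Light travel time from the sun, from standard constants.
AU_M = 149_597_870_700  # 1 astronomical unit in meters (IAU definition)
C_M_S = 299_792_458     # speed of light in meters per second (exact)

seconds = AU_M / C_M_S
minutes = seconds / 60
print(f"{minutes:.1f} light-minutes")  # about 8.3
```

So "eight light minutes" is right to the nearest minute, and the earth-sun distance only varies by a few percent over the year, nowhere near a light week.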

  • Thanks, I wasn't sure whether it was light hours or light minutes away, but I knew for sure it's not light weeks (emphasis on the plural) away. I'll probably forget again in a couple of years.

Empirically, they have reduced hallucinations. Where do OpenAI / Anthropic claim that their models won't hallucinate?