
Comment by notachatbot123

8 days ago

> > It made up a lot of stuff despite it being very clear in the prompt that it should not do so.
>
> LLMs are not sentient. They are designed to make stuff up based on probability.

I love this turn of phrase. It quite nicely evokes the difference between how the reader thinks and how the LLM operates.
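To make the mechanics concrete, here is a toy Python sketch of next-token sampling. The tokens and logits are invented for illustration (real models work over vocabularies of tens of thousands of tokens), but the point stands: the model draws from a probability distribution rather than consulting a store of facts, so a fluent-but-wrong continuation is always a possible outcome.

```python
import math
import random

def sample_next_token(logits: dict[str, float], temperature: float = 1.0) -> str:
    """Sample one token from a softmax over toy logits.

    Nothing here checks truth; the draw only reflects probability mass.
    """
    scaled = {tok: v / temperature for tok, v in logits.items()}
    max_v = max(scaled.values())  # subtract the max for numerical stability
    weights = {tok: math.exp(v - max_v) for tok, v in scaled.items()}
    tokens = list(weights)
    return random.choices(tokens, weights=[weights[t] for t in tokens], k=1)[0]

# Invented distribution for the prompt "The capital of Australia is":
logits = {"Canberra": 2.1, "Sydney": 1.8, "Melbourne": 0.9}
print(sample_next_token(logits))  # sometimes prints the wrong city
```

With these made-up numbers, "Sydney" comes out roughly a third of the time, phrased just as confidently as the right answer.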

It also invites reflection on what “sentience” means. In my experience (make of it what you will), correct fact retrieval is neither necessary nor sufficient for a lived, first-person experience.

Making stuff up is not actually an issue. What matters is how you present it. If I were less sure about this, I would write: “Making stuff up might not be an issue. It could be that how you present it is more important.” Even less sure: “Perhaps it would help if it didn’t sound equally confident about everything?”
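As a rough sketch of that last version in Python: some APIs expose per-token log-probabilities (for example, the logprobs option in OpenAI-style chat APIs), and one could map them to hedged phrasing. The thresholds and hedge phrases below are arbitrary placeholders, and mean token probability is only a crude proxy for real calibration.

```python
import math

def hedge(answer: str, token_logprobs: list[float]) -> str:
    """Prefix an answer with wording that reflects the model's own
    token probabilities, so it does not sound equally confident
    about everything. Thresholds are arbitrary, for illustration."""
    # Geometric mean of token probabilities, via the mean log-probability.
    mean_prob = math.exp(sum(token_logprobs) / len(token_logprobs))
    if mean_prob > 0.9:
        return answer
    if mean_prob > 0.5:
        return f"Probably: {answer}"
    return f"I am not sure, but perhaps: {answer}"

# Invented logprobs for two answers of differing confidence:
print(hedge("Canberra", [-0.05, -0.02]))  # -> "Canberra"
print(hedge("Sydney", [-1.2, -0.9]))      # -> "I am not sure, but perhaps: Sydney"
```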

Why would sentience be required for logically sound reasoning (or the reverse, for that matter)?