Comment by usefulcat
9 days ago
How is knowing what word is most likely to come next in a series of words remotely the same as having "the concept of truth and facts"?
How would you prove that a human has it?
Humans update their model of the world as they receive new information.
LLMs have static weights, so they cannot have a concept of truth. If the world changes, they keep insisting on the information that was in their training data. There is nothing that forces an LLM to follow reality.
What about a person with short-term memory loss?
Whataboutism is almost never a compelling argument, and this case is no exception.
ETA:
To elaborate a bit: based on your response, it seems like you don't think my question is a valid one.
If you don't think it's a valid question, I'm curious to know why not.
If you do think it's a valid question, I'm curious to know your answer.
It's not whataboutism; I'm simply asking how you would perform the same test on a human. Then we can see whether it applies to ChatGPT or not.