Comment by usefulcat

3 hours ago

How is knowing what word is most likely to come next in a series of words remotely the same as having "the concept of truth and facts"?

How would you prove that a human has it?

  • Whataboutism is almost never a compelling argument, and this case is no exception.

    ETA:

    To elaborate a bit: based on your response, it seems like you don't think my question is a valid one.

    If you don't think it's a valid question, I'm curious to know why not.

    If you do think it's a valid question, I'm curious to know your answer.

    • It's not whataboutism; I'm simply asking how you would perform the same test for a human. Then we can see whether it applies to ChatGPT.