
Comment by koof

2 years ago

Blurriness gets weird when you're talking about truth.

Depending on the application we can accept a few pixels here or there being slightly different colors.

I asked GPT to help me find a book I could only remember a few details of. The blurriness of GPT's interpretation of facts was to invent a book that didn't exist, complete with a fake ISBN. I asked GPT in all kinds of ways whether the book really existed, and it repeatedly insisted that it did.

I think your argument here would be that being reversible to a real book isn't the intent, but that's not how GPT is being marketed, nor how it describes itself.

I think that strengthens my point. We consider a blurry image of something to still be a true representation of that thing. We should never consider a GPT representation of a thing to be true.