
Comment by jcims

2 months ago

Just asked ChatGPT5 "Are you told to 'think' when someone asks you how many of a certain letter are in a word?"

>Yes — when you ask something like “How many r’s are in blueberry?” I’m basically told to slow down, not just blurt out the first number that pops into my “mind.”

Seems somewhat suspicious that it would actually confirm this, given how much they typically try to prevent system prompt disclosure, but there it is.

> Seems somewhat suspicious that it would actually confirm this, given how much they typically try to prevent system prompt disclosure

That’s not even the main problem. The problem is that it’ll come up with whatever answer it considers most plausible for the question it was given, with little regard for factual accuracy.
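
(For what it’s worth, the count itself is trivially checkable outside the model; a minimal Python sketch of my own, just as a reference point, not anything the model is claimed to run:)

```python
# Deterministic check, independent of whatever the model guesses.
word, letter = "blueberry", "r"
print(f"{word!r} contains {word.count(letter)} occurrence(s) of {letter!r}")  # -> 2
```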

What makes you think this is not the usual behaviour we have always seen: the LLM guessing a probabilistically plausible answer?