Comment by gowld

1 month ago

Are you saying that needless sentences don't count as needless words?

As GPT would say, "You've hit upon a crucial point underlying the entire situation!"

I mean, it's usually wrong in its rhetoric, and the writing isn't "good", but it's technically well constructed, and well constructed in a way that "Hemingway" doesn't reject.

Like, if I ask GPT5 to convert 75f to celsius, it will say "OK, here's the tight answer. No fluff. Just the actual result you need to know." and then in a new graf say "It's 23.8c." (or whatever).

  • It already bugs me when ChatGPT describes how it is going to answer before answering, but it's 10x more annoying when I'm asking for a concise response without filler etc.

    As an aside, I've noticed the self-description happens even more often when extended thinking mode is being used. My unverified intuition is that it references my custom instructions and memory more than once during the thinking process, as it then seems more primed than usual to mimic vocabulary from any saved text like that.

    • Right, it is currently incapable of providing a straight answer without clearing its throat and selling the answer first. It reminds me of those recipe blogs that just can't get to the fucking recipe. It's bad writing! But it's not bad technically, in a style-guide kind of way.