Comment by jacobgkau

7 days ago

> Biased how?

Biased as in I'm pretty sure he didn't write an AI prompt that was the "opposite" of what he wanted.

And generalizing something that "might" happen into something that "will" happen is not actually an "opposite," so calling it that (and then basing your assumption about that person's prompt-writing on that characterization) was a stretch.

This honestly feels like a diversion from the actual point, which you proved: for some class of issues with LLMs, the underlying problem is learning how to use the tool effectively.

If you really need me to educate you on the meaning of opposite...

"contrary to one another or to a thing specified"

or

"diametrically different (as in nature or character)"

are two relevant definitions here.

Saying something will 100% happen and saying something will sometimes happen are diametrically opposed statements, contrary to each other. A concept can (and often will) have multiple opposites.

-

But again, I'm not even holding them to that literal of a meaning.

If you told me that even half the time you use an LLM, the result is that it solves a completely different but simpler version of what you asked, my advice would still be to brush up on how to work with LLMs before diving in.

I'm really not sure why that's such a point of contention.

  • > Saying something will 100% happen, and saying something will sometimes happen are diametrically opposed statements and contrary to each other.

    No. Saying something will 100% happen and saying something will 100% not happen are diametrically opposed. You can't just call every non-equal statement "diametrically opposed" on the basis that they aren't equal. That ignores the "diametrically" part.

    If you wanted to say "I use words that mean what I intend to convey, not words that mean something similar," that would've been fair. Instead, you brought the word "opposite" in, misrepresenting what had been said and suggesting you'll stretch the truth to make your point. That's where the sense of bias came from. (You also pointlessly left "what I intend to convey" in to try to make your argument appear softer, when the entire point you're making is that "what you intend" isn't good enough and one apparently needs to be exact instead.)

    • This word soup doesn't get to redefine the word opposite, but you're free to keep trying.

      Cute that you've now written at least 200 words trying to divert the conversation though, and not a single word to actually address your demonstration of the opposite of understanding how the tools you use work.