Comment by hackyhacky

19 days ago

> you lack actual understanding of how compilers work

My brother in Christ, please get off your condescending horse. I have written compilers. I know how they work. And also you've apparently never heard of undefined behavior.

The point is that the output is different at the assembly level, but that doesn't matter to the user. Just as output from one LLM may differ from another's, but the user doesn't care.

Undefined behavior is an edge case in C. Other programming languages (like JavaScript) go to great lengths to define their standards such that it is almost impossible to write code with undefined behavior. The vast majority of code written out there has no undefined behavior. I think it is safe to assume that everyone here (except you) is talking about C code without undefined behavior when we say that the same code produces the same results regardless of the compiler (as long as the compiler is standards conforming).

  • Language-based undefined behavior is just one kind of nondeterminism that programmers deal with every day. There are other examples, such as concurrency. So the claim that using LLMs isn't programming because of "nondeterminism" makes no sense.
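    A hedged sketch of the concurrency point (the thread count and iteration count are arbitrary choices of mine): the scheduler interleaves the two workers nondeterministically from run to run, yet programmers routinely recover a deterministic result through synchronization; drop the mutex and the data race is itself undefined behavior.

    ```c
    #include <pthread.h>
    #include <stdio.h>

    #define ITERS 100000

    static long count = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    /* Each worker increments the shared counter under the lock.
       The interleaving of the two threads is nondeterministic,
       but the final total is always exactly 2 * ITERS. */
    static void *worker(void *arg) {
        (void)arg;
        for (int i = 0; i < ITERS; i++) {
            pthread_mutex_lock(&lock);
            count++;
            pthread_mutex_unlock(&lock);
        }
        return NULL;
    }

    int main(void) {
        pthread_t a, b;
        pthread_create(&a, NULL, worker, NULL);
        pthread_create(&b, NULL, worker, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        printf("%ld\n", count);  /* deterministic despite scheduling */
        return 0;
    }
    ```

    Compile with `-pthread`. The point: dealing with nondeterminism and taming it is ordinary programming work, not a disqualifier.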

    • > Language-based undefined behavior is just one kind of nondeterminism that programmers deal with every day.

      Every day, you say. I program every day, and I have never, in my 20 years of programming, deliberately written undefined behavior. I think you may be exaggerating a bit here.

      I mean, sure, some leet programmers do dabble in undefined behavior, they may even rely on some compiler bug for some extreme edge case during code golf. Whatever. However, it is not uncommon that when enough programmers start relying on undefined behavior behaving a certain way, it later becomes part of the standard and is therefore no longer “undefined behavior”.

      Like I said in a different thread, I suspect you may be willfully ignorant about this. I suspect you actually know the difference between:

      a) written instructions compiled into machine code for the machine to perform, and,

      b) output of a statistical model, which may or may not include the written instructions of (a).

      There are a million reasons to claim (a) is not like (b), the fact that (a) is (mostly; or rather desirably) deterministic, while (b) is stochastic is only one (albeit a very good) reason.

You don't sound like you have written any code at all, actually. What you do sound like is someone who is pretending they know what it means to program, which happens a lot on the internet.

  • > You don't sound like you have written any code at all actually

    Well, you sound like an ignorant troll who came here to insult people and start fights. Which also happens a lot on the internet.

    Take your abrasive ego somewhere else. HN is not for you.

    • I don't care what I sound like to people who front about their programming skills. I'm not here to impress people like you.