Comment by neilwilson
24 days ago
That’s what a C compiler does when generating a binary.
There was a time when you had to know ‘as’, ‘ld’ and maybe even ‘ar’ to get an executable.
In the early days of g++, there was no guarantee the object code worked as intended. But it was fun working that out and filing the bug reports.
This new tool is just a different sort of transpiler and optimiser.
Treat it as such.
> There was a time when you had to know ‘as’, ‘ld’ and maybe even ‘ar’ to get an executable.
No, there wasn't: you could just run the shell script, or (a bit later) the makefile. But there were benefits to knowing as, ld and ar, and there still are today.
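For anyone who never did it by hand, here is a minimal sketch of that manual pipeline (file names are illustrative, and a real ld invocation also needs the platform's crt startup objects and default library paths):

    cc -S main.c -o main.s              # compile C down to assembly
    as -o main.o main.s                 # assemble it with 'as' (likewise for util.o)
    ar rcs libutil.a util.o             # pack objects into a static library with 'ar'
    ld -o prog main.o -L. -lutil -lc    # link with 'ld'; real invocations also need crt0/crt1 and friends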
> But there were benefits to knowing as, ld and ar, and there still are today.
This is trivially true. The constraint on anything you do in life is the time it takes to learn it.
So the far more interesting question is: at what level do you want to solve problems? And is it likely that you'll need knowledge of as, ld and ar more than anything else you could learn instead?
Knowledge of as, ld, ar, cc, etc. is only needed when setting up (or modifying) your build toolchain, and in practice you can just copy-paste the build script from some other, similar project. For day-to-day development, knowledge of these tools has never been needed.
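The script that gets copy-pasted is rarely more than a few lines anyway. A hypothetical sketch (file names and flags are illustrative):

    #!/bin/sh
    # hypothetical build.sh of the sort that travels between projects
    set -e
    CC=${CC:-cc}
    $CC -Wall -O2 -c main.c util.c
    $CC -o prog main.o util.o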
If you don't see a difference between a compiler and a probabilistic token generator, I don't know what to tell you.
And, yes, I'm aware that most compilers are not entirely deterministic either, but LLMs are inherently nondeterministic. And I'm also aware that you can tweak LLMs to be more deterministic, but in practice they're never deployed like that.
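To be concrete about the "tweak" part: greedy decoding with a pinned seed is about as deterministic as it gets. A sketch assuming llama.cpp's llama-cli (exact flag names vary by version):

    # temperature 0 + fixed seed: repeatable output,
    # but hosted LLMs are almost never served this way
    llama-cli -m model.gguf --temp 0 --seed 42 -p "a portable C hello world"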
Besides, creating software via natural language is an entirely different exercise than using a structured language purposely built for that.
We're talking about two entirely different ways of creating software, and any comparison between them is completely absurd.
They are 100% different and yet kind-of-the-same.
They can function kind-of-the-same in the sense that they can both translate something written in a higher-level language into a lower-level one.
They're 100% different in every other way, but for coding, in some circumstances, if we treat them as black boxes, LLMs can turn higher-level pseudocode into lower-level code (inaccurately), or even transpile.
Kind of like how email and the postal service can be kind of the same if you look at it from a certain angle.
> Kind of like how email and the postal service can be kind of the same if you look at it from a certain angle.
But they're not the same at all, except somewhat by their end result, in that they are both ways of transmitting information. That similarity is so vague that comparing them doesn't make sense for any practical purpose. You might as well compare them to smoke signals at that point.
It's the same with LLMs and programming. They're both ways of producing software, but the process of doing that and even the end result is completely different. This entire argument that LLMs are just another level of abstraction is absurd. Low-Code/No-Code tools, traditional code generators, meta programming, etc., are another level of abstraction on top of programming. LLMs generate code via pattern matching and statistics. It couldn't be more different.
People downvoting your comment are just "engineers" doomed to fail sooner or later.
Meanwhile, 9front users have at least read the plan9 intro and know about nm, 1-9c, 1-9l and the like. Vibe coders will be put in their place sooner or later. It's just a matter of time.
Competent C programmers know about nm, as, ld and a bunch of other binary utilities in order to understand issues and debug them properly.
Everyone else is deluding themselves. Even the 9front intro requires you to know at least the basics of nm and friends.
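For instance, on a typical Unix toolchain, nm is the first stop when a link fails (file names here are illustrative):

    nm -u main.o                  # undefined symbols the linker still has to resolve
    nm -C libutil.a | grep ' T '  # functions a library actually defines (-C demangles C++ names)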