Comment by cogman10
3 days ago
I'm going to be pretty blunt. Carmack gets worshiped when he shouldn't be. He has several bad takes in terms of software. Further, he's frankly behind the times when it comes to the current state of the software ecosystem.
I get it, he's legendary for the work he did at id Software. But this is the guy who only became convinced, like 5 years ago, that static analysis was actually a good thing for code.
He seems to have a view of the state of software that's frozen in time: interpreted stuff is slow, networks are slow, databases are slow, and everyone is working with Pentium 1s and 2MB of RAM.
None of these are what he thinks they are. CPUs are wicked fast. Interpreted languages are now within a single digit multiple of natively compiled languages. RAM is cheap and plentiful. Databases and networks are insanely fast.
Good on him for sharing his takes, but really, he shouldn't be considered a "thought leader". I've noticed his takes have been outdated for over a decade.
I'm sure he's a nice guy, but I believe he's fallen into a trap that many older devs do. He's overestimating what the costs of things are because his mental model of computing is dated.
> But this is the guy who only like 5 years ago was convinced that static analysis was actually a good thing for code.
Why isn't it?
> Interpreted stuff is slow
Well, it is. You can immediately tell the difference between most C/C++/Rust/... programs and Python/Ruby/... ones. Whether that's because they're inherently faster (nature) or because they foster an environment where performance matters (nurture) doesn't matter; the end result (the adult) is what counts.
> networks are slow
Networks are fast(er), but they're still slow for most stuff. Gmail is super nice, but it's slower than almost any desktop email program that doesn't have legacy baggage stretching back 2-3 decades.
> Why isn't it?
I didn't say it isn't good. I'm saying that Carmack wasn't convinced of its utility until much later, after the entire industry had adopted it. 5 years is wrong (time flies); it was 2011 when he made statements about how it's actually a good thing.
> Well, it is.
Not something I'm disputing. I'm disputing the degree of slowness, particularly in languages with good JITs such as Java and JavaScript. There's an overestimation of how much the language matters.
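To illustrate what I mean, here's a minimal JVM sketch (my own illustrative example, not a rigorous benchmark; a serious measurement would use a harness like JMH). The same hot loop gets dramatically faster once the JIT compiles it:

    public class JitWarmup {
        static double work(double[] data) {
            double sum = 0;
            for (double d : data) sum += d * d;   // hot loop the JIT will compile
            return sum;
        }

        public static void main(String[] args) {
            double[] data = new double[1_000_000];
            java.util.Arrays.fill(data, 1.5);
            for (int round = 0; round < 5; round++) {
                long t0 = System.nanoTime();
                double s = work(data);
                long t1 = System.nanoTime();
                System.out.printf("round %d: %.2f ms (sum=%.1f)%n",
                        round, (t1 - t0) / 1e6, s);
            }
            // Early rounds run interpreted or lightly compiled; later rounds
            // run fully JIT-compiled machine code and are typically much faster.
        }
    }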
> Gmail is super nice, but it's slower than almost desktop email program
Now that's a weird comparison.
A huge portion of what makes Gmail slow is that it's a gigantic and, at this point, somewhat dated JavaScript application.
Look, I'm not here to defend JS as the UX standard of modern app dev; I don't love it.
What I'm talking about with slow networks is mainly backend servers talking to one another. Because of the speed of light, there's always going to be some major delay moving data out of the datacenter to a consumer device. Within the datacenter, however, things will be pretty dang nippy.
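The back-of-the-envelope numbers behind that (the distances are my own assumptions, and real RTTs are higher still because paths aren't straight and switches add latency):

    public class LightSpeedFloor {
        public static void main(String[] args) {
            double cFiberKmPerSec = 200_000;   // approx. speed of light in fiber (~2/3 c)
            double crossCountryKm = 4_000;     // e.g. coast-to-coast (assumption)
            double withinDcKm = 0.5;           // across a datacenter (assumption)
            // Round-trip physical floor = 2 * distance / speed
            System.out.printf("cross-country RTT floor: %.1f ms%n",
                    2 * crossCountryKm / cFiberKmPerSec * 1000);   // ~40 ms
            System.out.printf("within-DC RTT floor:     %.4f ms%n",
                    2 * withinDcKm / cFiberKmPerSec * 1000);       // ~0.005 ms
        }
    }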
[flagged]
Wrongo.
I'm primarily a backend dev doing Java. I've had my opinion about Carmack for a while.
You're just hurt that I criticized someone you like.
> Interpreted languages are now within a single digit multiple of natively compiled languages.
You have to be either clueless or delusional if you really believe that.
Let me specify that what I'm calling interpreted (and I'm sure Carmack agrees) is languages with a VM and a JIT.
The JVM and JavaScript both fall into this category.
The proof is in the pudding. [1]
The JS version that ran in 8.54 seconds [2] did not use any sort of fancy escape hatches to get there. It's effectively the naive solution.
But if you look at the winning C version, you'll note that it went all out, pulling every single SIMD trick in the book to win [3]. And with all that, the JS version is still only ~4x slower (a single digit multiple).
And the C++ version [4], which is a near direct translation and isn't using all the SIMD tricks in the book, ran in 5.15 seconds, bringing the multiple down to ~1.7x.
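For a sense of what "effectively the naive solution" means here, this is a sketch of the classic n-body advance step (the benchmark named downthread) as straight-line scalar Java. It's my own illustrative reconstruction, not the actual benchmark entry:

    final class Body { double x, y, z, vx, vy, vz, mass; }

    final class NBody {
        // One time step of pairwise gravity: no SIMD, no memory-layout
        // tricks, just the obvious double-precision loops.
        static void advance(Body[] bodies, double dt) {
            for (int i = 0; i < bodies.length; i++) {
                Body a = bodies[i];
                for (int j = i + 1; j < bodies.length; j++) {
                    Body b = bodies[j];
                    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
                    double d2 = dx * dx + dy * dy + dz * dz;
                    double mag = dt / (d2 * Math.sqrt(d2));
                    a.vx -= dx * b.mass * mag; a.vy -= dy * b.mass * mag; a.vz -= dz * b.mass * mag;
                    b.vx += dx * a.mass * mag; b.vy += dy * a.mass * mag; b.vz += dz * a.mass * mag;
                }
            }
            for (Body a : bodies) { a.x += dt * a.vx; a.y += dt * a.vy; a.z += dt * a.vz; }
        }
    }

A JIT turns code like this into machine code not far off what a C compiler emits; the remaining gap is mostly the hand-vectorization the winning C entry does.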
Perhaps you weren't thinking of these JIT languages as being interpreted. That's fair. But if you did, you need to adjust your mental model of what's slow. JITs have come a VERY long way in the last 20 years.
I will say that languages like python remain slow. That wasn't what I was thinking of when I said "interpreted". It's definitely more than fair to call it an interpreted language.
[1] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
[2] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
[3] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
[4] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
And here we get into the absurdity of microbenchmarks like these.
Yes, you can get the JVM to crunch some numbers relatively fast, particularly if the operations are repetitive enough for the JIT to work its tricks.
Now try to run something that looks a bit more like an actual application and less like a cherry-picked example: as soon as you start moving memory around, the gap jumps to orders of magnitude.
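To make that concrete, here's a sketch (my own illustrative example) contrasting a flat-array sum with chasing pointers through scattered heap objects. The second is dominated by cache misses, no matter how clever the JIT's arithmetic is:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class MemoryBound {
        static final class Node { Node next; long value; }

        public static void main(String[] args) {
            int n = 5_000_000;
            // Link the nodes in a shuffled order so each hop lands
            // somewhere unpredictable on the heap.
            List<Node> nodes = new ArrayList<>(n);
            for (int i = 0; i < n; i++) { Node node = new Node(); node.value = i; nodes.add(node); }
            Collections.shuffle(nodes);
            for (int i = 0; i < n - 1; i++) nodes.get(i).next = nodes.get(i + 1);

            long[] flat = new long[n];
            for (int i = 0; i < n; i++) flat[i] = i;

            long t0 = System.nanoTime();
            long sum1 = 0;
            for (Node cur = nodes.get(0); cur != null; cur = cur.next) sum1 += cur.value;
            long t1 = System.nanoTime();
            long sum2 = 0;
            for (long v : flat) sum2 += v;
            long t2 = System.nanoTime();

            System.out.printf("pointer chase: %d ms, flat array: %d ms (sums %d / %d)%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, sum1, sum2);
        }
    }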
fwiw There are a few naive un-optimised single-thread #8 n-body programs, transliterated line-by-line literal style into different programming languages from the same original. [1]
> a single digit multiple
By which you mean < 10× ?
So not those Java -Xint, PHP, Ruby, Python 3 programs?
> interpreted
Roberto Ierusalimschy said "the distinguishing feature of interpreted languages is not that they are not compiled, but that any eventual compiler is part of the language runtime and that, therefore, it is possible (and easy) to execute code generated on the fly." [2]
[1] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
[2] "Programming in Lua" 1st ed p57
A simple do-nothing for loop in JavaScript via my browser's web console will run at hundreds of MHz. Single-threaded, implicitly working in floating-point (JavaScript being what it is) and on 2014 hardware (3GHz CPU).
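For comparison, a sketch of the same experiment on the JVM (illustrative only; the JIT may well optimize the loop further, which is rather the point):

    public class LoopSpeed {
        public static void main(String[] args) {
            long n = 1_000_000_000L;
            double acc = 0;   // accumulate in floating point, mirroring JS number semantics
            long t0 = System.nanoTime();
            for (long i = 0; i < n; i++) {
                acc += 1.0;
            }
            long t1 = System.nanoTime();
            double seconds = (t1 - t0) / 1e9;
            // Printing acc keeps the loop from being eliminated as dead code.
            System.out.printf("%.0f adds in %.3f s (~%.0f M iterations/s)%n",
                    acc, seconds, n / seconds / 1e6);
        }
    }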