Comment by NoMoreNicksLeft

2 months ago

>There's a guy who makes them in his garage.

Savant-tier, obsessive, dedicates-his-life-to-it "guy" does it in his garage over a period of how many years, and how far has he gotten? Has he managed even a single little 8-bit, or even 4-bit, CPU? I'm cheering that guy on, you know, but he's hardly cranking out next-gen GPUs.

>the market would remember

Markets don't remember squat. The market might try to re-discover, but this shit's path dependent. Re-discovery isn't guaranteed, and it's even less likely when a civilization desperate to have previously-manufacturable technology can't afford to dump trillions of dollars of research into it, because its inability to manufacture those things has also left it poor.

You don't need trillions of dollars to start making tubes again. And it wouldn't be that one guy doing it for funsies, would it? If the question was "can one hobbyist bootstrap everything on his own" then I would agree. Maybe you completely lose even the insight that a small electric current can be used to switch or modulate a larger one. But if you're also losing mid-high-school physics knowledge, that's a different issue.

As I said, you probably won't ever get back to where we are now with the technology, but then again, something like 99.999% of computing power is wasted on gimmicks and inefficiency. Probably more these days. You could certainly run a vaguely modern society on only electromechanical and thermionic gear - you have power switching with things like thyratrons, obviously radios, and there were computers made that way, such as the Harwell WITCH, which first ran in 1951.

Maybe you don't get 4K AI video generation or petabyte-scale advertising analytics but you could have quite a lot.

  • Looking at the Ryzen 7 9800X3D running at 5.2 GHz, if you chopped off 99.999% of that, you'd get a 52 kHz CPU, with 6.6 megaflops vs the original 6.6 gigaflops.

    For reference, the original Intel 4004 CPU from 1971 ran at 740 kHz, so 52 kHz isn't even enough compute to do a secure TLS web connection without an excessively long wait. The 4004 did not do floating point, however, and it wouldn't be until between the 486 (1989) and the Pentium (1993) that we saw 5-10 MFLOPS of performance.

    • > 6.6 megaflops vs the original 6.6 gigaflops.

      Hmm... I think the 9800X3D should be able to do at least 32 FLOPS per cycle per core. So ~1.33 TFLOPS is the ceiling for the CPU. 1/100,000 of that leaves you... about 13 MFLOPS.

      Then there's the iGPU for even more FLOPS.

    • 99.999 may be an ass-pull of a figure, but I was thinking more in terms of having whole datacentres screaming along doing crypto, billions of cat videos, Big Data on "this guy bought a dishwasher, give him more dishwasher adverts", spinning up a whole virtual server to compile a million-line codebase on every change, and AI services for pictures of a chipmunk wearing sunglasses. There's a good chunk of computation that we as a society could just go without. I know of embedded systems that run at hundreds of MHz and could be replaced by no CPU at all and still fulfill the main task to some extent, because early models indeed used no CPU. Many fewer functions, but they still fundamentally worked.

      Many things we now take for granted would indeed be impossible. I suppose the good news is that in some electropunk timeline where everyone had to use tubes, your TLS connection might not be practical, but the NSA datacentre would be even less practical. On the other hand, there'd be huge pressure on efficiency in code and hardware use. Just before transistorisation, amazing things were done with tubes or electromechanically, and if that had been at the forefront of research for the last 70 years, who knows what the state of the art would look like. Strowger switches would look like Duplo.

      Probably there would still be a lot of physical paperwork, though!
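      The back-of-envelope scaling in this thread can be sanity-checked with a short script; the 5.2 GHz boost clock, 8-core count, and 32 FP64 FLOPS/cycle/core figure are the assumptions taken from the comments above, not measured numbers:

      ```python
      # Re-running the thread's arithmetic. Assumed figures: 5.2 GHz boost
      # clock, 8 cores, 32 FP64 FLOPS/cycle/core (AVX-512: 2 FMA units
      # x 8 doubles x 2 ops each).
      CLOCK_HZ = 5.2e9
      CORES = 8
      FLOPS_PER_CYCLE = 32
      CUT = 1e-5  # keep only 0.001% of the original machine

      peak_flops = CLOCK_HZ * CORES * FLOPS_PER_CYCLE
      print(f"peak:         {peak_flops / 1e12:.2f} TFLOPS")      # 1.33 TFLOPS
      print(f"scaled clock: {CLOCK_HZ * CUT / 1e3:.0f} kHz")      # 52 kHz
      print(f"scaled peak:  {peak_flops * CUT / 1e6:.1f} MFLOPS") # 13.3 MFLOPS
      ```

      Which lands closer to 13 MFLOPS than 12 - still roughly Pentium-class floating point by the figures upthread.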
