What if humanity forgot how to make CPUs?

2 months ago (twitter.com)

This has a ton of holes:

> Z-Day + 15Yrs

> The “Internet” no longer exists as a single fabric. The privileged fall back to private peering or Sat links.

If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

> Z-Day + 30Yrs

> Long-term storage has shifted completely to optical media. Only vintage compute survives at the consumer level.

You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

> The large node sizes of old hardware make them extremely resistant to electromigration, Motorola 68000s have modeled gate wear beyond 10k years! Gameboys, Macintosh SEs, Commodore 64s resist the no new silicon future the best.

Some quick Googling shows the first IC was created in 1960 and the 68000 was released in 1979. That's 19 years. The first transistor was created in 1947, so that's a 32-year span to the 68k. If people have the capacity and need to jump through hoops to keep old computers running to maintain a semblance of current-day technology, they're definitely f-ing going to be able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

  • > If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

    Storage. You only need a few hundred working systems to keep a backbone alive. Electromigration doesn't kill transistors if they're off and in a closet.

    > You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

    You don't need to make new drives; there are already millions of DVD/Blu-ray devices available. The small microcontrollers on optical drives are built on wide node sizes, which also makes them more resilient to degradation.

    > they're definitely f-ing going to have been able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroy all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

    If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

    • > If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

      This kind of just breaks the thought experiment, because unless the "why?" is at least vaguely answered, it makes no sense. How do you game out a thought experiment that starts with the assumption that humanity just randomly stops being humanity in this one particular way? What other weird assumptions are we meant to make?

    • OK, no silicon. But we might be just fine after all. Just yesterday we had a story about bismuth transistors that are better in every way than silicon ones. Maybe a tad more expensive. There are plenty of other semiconductors out there too. We'd have to adjust manufacturing, but it would probably cost just one skipped upgrade cycle. Even with a complete mind-wipe it's still not that bad if only silicon is out.

  • Surely knowing something is possible would speed up the process. Transistors had to go from a neat lab idea to more and more incremental use cases, eventually snowballing into modern chips. If you know from the beginning that computers are a neat idea, surely that would warrant more focused R&D.

  • There's a lot we could still do.

    Let's assume we go back to the pre-transistor era—1946 and earlier. The world then was a very different place, but it was still very modern.

    It's too involved to list in detail, but just take a look at what was achieved during WWII. The organization and manufacturing were truly phenomenal. Aircraft production alone during the War was over 800,000 aircraft; manufacturing aircraft at that rate has never been equalled since, same with ships.

    We developed a huge amount of new tech during the War, including the remarkably complex atomic bomb and much, much more.

    And we did all this without the transistor, integrated circuit, CPUs, internet and even smartphones!

    Now consider the planning and organizational difficulties of D-Day—probably the most complex and logistically difficult undertaking ever—without the aid of modern communications, the internet and smartphones, etc.—all of which depend on CPUs. Right, that happened too, and it was a total success.

    I wonder how a generation brought up during the post-silicon era would cope if all that were no longer available. It could happen if we had another Carrington Event, or an even bigger one (such events have occurred), or, say, a nuclear EMP event.

    WWII Aircraft production https://en.m.wikipedia.org/wiki/World_War_II_aircraft_produc...

    WWII Military production: https://en.m.wikipedia.org/wiki/Military_production_during_W...

If humans forgot how to make new CPUs, it might finally be the incentive we need to make more efficient software. No more relying on faster chips to bail out lazy coding; apps would have to run lean. Picture programmers sweating over every byte like it's 1980 again.

  • Probably not. Devices would run out within a generation.

    It ain't ever going to happen, because people can write these things called books. And computer organization and architecture books already exist; there are many tens of thousands of copies of them. What should be captured in modern computer organization books are the applied-science aspects of the field's history so far and the tricks that made Apple's ARM series so excellent. The other thing is that TSMC needs to document its fab process engineering. Without capturing niche, essential knowledge, these become strategic single points of failure. Leadership and logic dictate not allowing this kind of vulnerability to fester too deeply or too long.

    • The essential tacit knowledge can't be captured in books. It has to be learned by experience, participating in (and/or developing) a functioning organization that's creating the technology.

  • Programmers haven't been able to rely on CPUs getting faster for the last decade. Speeds used to double every 1.5 years or so. Now they increase 50% per core and double the number of cores... every 10 years (see the rough arithmetic sketched after this thread). GPU performance has increased at a faster pace, but ultimately also stagnated, except for the addition of tensor cores.

  • Ah, the good old days again, what a beautiful vision. Decadence and laziness begone! Good luck running your bloated CI pipelines and test suites on megahertz hardware! /s
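
To put rough numbers on the CPU-scaling comment above, here is a minimal back-of-the-envelope sketch (Python, purely illustrative; the 1.5-year doubling and the 50%-per-core-plus-doubled-core-count figures are that commenter's own rough estimates, not measured data):

```python
# Back-of-the-envelope comparison of the two growth regimes described above.
# Figures are the commenter's rough estimates, not benchmark data.

years = 10

# Old regime: single-thread speed doubles every ~1.5 years.
old_gain = 2 ** (years / 1.5)   # roughly 100x over a decade

# Claimed current regime: +50% per core and 2x the core count per decade,
# assuming a perfectly parallel workload.
new_gain = 1.5 * 2              # roughly 3x over a decade

print(f"Old pace over {years} years: ~{old_gain:.0f}x")
print(f"New pace over {years} years: ~{new_gain:.0f}x")
```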

There is a bit of an issue that almost all the know-how exists within a couple of private companies, and if the industry took a downturn (such as a crash of an AI bubble causing a many-year lull), giant companies could fail and take that knowledge and scale with them. Some other business would presumably buy the facilities and hire the people, but maybe not. It's one of the problems with so much of science and engineering happening privately: we can't replicate the results easily.

  • This isn't unique to semiconductors.

    If you turn off any manufacturing line, your company forgets really quickly how to make what that line made. GE discovered this when they tried to restart a water heater line in Appliance Park.

    • We as a global civilization are close to forgetting how to make CRTs. There's like one company that still makes them, but only for military or major industrial applications (fighter-jet HUDs and the like), at call-for-pricing prices. The major manufacturers, like Sony, all shut down their production lines, never to be restarted, because the knowledge of how to make them dissipated with those lines. If you're an enthusiast who wants to experience retro video games as they appeared back in the day, your only option is to scavenge an old TV from somewhere.

    • Heck, the US had this problem when they needed to renew/refurbish nuclear weapons due to more or less 'forgetting' how to make Fogbank.

    • Yup.

      Remington apparently has no idea what bluing formula they used on their original 1911s.

      Colt lost the ability to handfit revolvers.

It would be a great tragedy if that ever became a reality in the near future. The bigger question is: what if you forgot how to make the machines that make the CPUs? That is the bigger challenge to overcome in this crisis. Only one company (ASML) specializes in the lithography machines that give big companies like TSMC the ability to manufacture great CPUs. The trick is to recreate the machines that make them, at 10nm down to 2nm capabilities, and go from there.

I’m a little puzzled how “forgot how to make CPUs” also included “forgot how to make the mechanical part of hard drives, how to make flash memory, and how to make other chips”. I guess I don’t think of a 74xx series chip as a “CPU”?

  • I read it as: we have millions of hard drives and flash drives with a dead controller chip, so we harvest their other parts as spares. We still know how to make the spare parts from scratch, but we have so many for free.

We would go back to the 6502; it would be fine. Just more time spent optimizing the code.

  • The stuff people write for old consoles and computers is pretty amazing. Computers definitely evolved faster than they needed to for the general public. All of these industries were built around taking advantage of Moore's Law instead of getting the most bang out of existing limitations.

We're toast should we ever lose the ability to make CPUs.

Perhaps there should be more research into how to make small runs of chips cheaply and with simple inputs. That'd also be useful if we manage to colonize other planets.

  • We as in civilization? We made it at least a few thousand years without it.

    Or do you mean the circumstances that would lead to this (nuclear war, perhaps) would make us toast?

    • > We as in civilization? We made it at least a few thousand years without it.

      Civilization is a continuity of discrete points of time.

      We were able to enter (so-called) Dark Ages where things were forgotten (e.g., concrete) and still continue, because things were often not very 'advanced': with the decline of Rome there were other stores of knowledge, and with the Black Death society wasn't much beyond blacksmithing and so was able to keep those basic skills.

      But we're beyond that.

      First off, modern society is highly dependent on low-cost energy, and this was kicked off by the Industrial Revolution and easily accessible coal. Coal is much depleted (often needing deeper mines). The next phase was oil, and many of the easy deposits have been used up (it used to bubble up out of the ground in the US).

      So depending on how bad any collapse is, getting things back up without easily accessible fossil fuels may be more of a challenge.

    • >We made it at least a few thousand years without it.

      We did that during a period of peculiar circumstances that won't ever be replicated. Relatively large, distributed population with many different ecological environments that we were already pre-adapted to. A far smaller single-point-failure population that can't just go out and hunt for its food among the vast wildlife might have it pretty rough if industrial civilization were to falter.

    • I just learned about the Haber process. This guy, Fritz Haber, realized we could suck nitrogen out of the literal air and make soil fertilizer with it. The population is like 4 times higher than it would be without it.

      Scary how high up this tightrope is.

  • Be more concerned about whatever nuclear war or social breakdown led to that point. Massive industrial manufacturing systems don’t shut down for nothing.

    • Zero effort to reproduce even 70s-era silicon technology for 30 years implies something really bad happened; if the entire chain has been knocked out to that level, I doubt "silicon chip fabrication" would really be a worry for anyone during that time.

  • Eh, there are plenty of small fabs globally that do "nowhere near cutting edge" (180nm or so) runs - you can make a pretty decent processor on that sort of tech.

    It would be a pretty solid intermediate step to bootstrap automation and expansion in cases where the supply of the "best" fabs is removed (like in a disaster, or when the framework to support that level of manufacturing isn't available, as in your colony example).

> … no further silicon designs ever get manufactured

The problem wouldn't be missing CPUs but infrastructure. Power would be the big one: generators, substations, those sorts of things. Then manufacturing; a lot of chips go there. Then there is all of healthcare.

Lots of important chips everywhere that aren’t CPUs.

  • We had power for a very long time before silicon.

    Generators are just big coils of copper. Substations too. Solar won't work without silicon, but anything with a spinning coil of copper would. Voltmeters would need replacing with the old analog versions and humans would need to manually push switches to keep power levels constant, just like in the '50s.

    • And a spring together with an electromagnet can be made into a relay. They're big and slow of course, but they do the same thing as a transistor. If you can make metal into a wire you can make them. In the 1940s, computers were electromechanical.
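
To make the relay point above concrete, here is a minimal sketch (Python; a hypothetical switch model for illustration, not a circuit simulator) of how contacts wired in series and parallel give you the same Boolean building blocks a transistor does, which is all a 1940s-style electromechanical computer needs:

```python
# Minimal sketch: a relay modeled as a controlled switch.
# A normally-open contact follows the coil; a normally-closed contact inverts it.

def relay_no(coil: bool) -> bool:
    """Normally-open contact: closes (True) when the coil is energized."""
    return coil

def relay_nc(coil: bool) -> bool:
    """Normally-closed contact: opens (False) when the coil is energized."""
    return not coil

# Contacts in series act as AND, contacts in parallel act as OR,
# and a normally-closed contact acts as NOT.
def AND(a, b): return relay_no(a) and relay_no(b)   # two contacts in series
def OR(a, b):  return relay_no(a) or relay_no(b)    # two contacts in parallel
def NOT(a):    return relay_nc(a)

def half_adder(a, b):
    """Sum and carry of two bits, built only from the gates above."""
    carry = AND(a, b)
    total = AND(OR(a, b), NOT(carry))  # XOR composed from AND/OR/NOT
    return total, carry

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"{int(a)} + {int(b)} = sum {int(s)}, carry {int(c)}")
```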

A fun read, but I do find it a bit odd that in 30 years the author doesn't think that we would have reverse-engineered making CPUs, or at least gotten as far as the mid-70s in terms of CPU production capabilities.

Also, the 10k years lifespan for MC68000 processors seems suspect. As far as I can see, the 10,000 figure is a general statement on the modelled failure of ICs from the 60s and 70s, not in particular for the MC68000 (which is at the tail end of that period). There are also plenty of ICs (some MOS (the company, not the transistor structure) chips come to mind) with known-poor lifespans (though that doesn't reflect on the MC68000).

  • Agreed. It is a whole lot easier to recreate something you know is possible than to create something you don’t know is possible.

The author's a little bit too optimistic about the longevity of old consumer-market computers: having collected vintage compact Macs, you become keenly aware of all the possible points of failure, like CRT displays, storage devices, and even fragile plastics. We may have to go back to much more analog forms of I/O: typewriter-style teletypes with decreasing levels of logic integration, random-access DECtape-style magnetic tape, etc.

We'd still be able to make relays, and that's enough to do computing. If not that, then mechanical computer systems could be constructed to process data.

There's enough information on machine tools and the working of iron to make all the tooling and machinery required to start an assembly line somewhere.

After all, there was an assembly workshop turning out the Antikythera mechanism, and there was a user guide on it. Obviously it wasn't the only one produced at the time.

  • > Obviously it wasn't the only one produced at the time.

    It is not obvious at all to me. Where are the others like it found?

So, taking this as the thought experiment it is, what I'm struck by is that seemingly most things will completely deteriorate in the first 10-15 years. Is that accurate? Would switches mostly fail by the 10-year mark if not replaced? I've been looking at buying a switch for my house; should I expect it not to last more than 10 years? I have a 10-year-old TV; should I expect it to start failing soon?

  • My experience with retro computers is that things start to fail from around the 10-15 year mark, yes. Some things are still good after 30 years, maybe more, but... capacitors leak, resistors go out of spec, etc., and that means voltages drift, and soon enough you burn something out.

    You can replace known likely culprits preemptively, assuming you can get parts. But dendritic growths aren’t yet a problem for most old stuff because the feature sizes are still large enough. No one really knows what the lifetime of modern 5/4/3nm chips is going to be.

  • There's a hardware law that hardware past its half-life often lives for an excessively long time.

    Really depends on brand and purpose but consumer hardware switches do die pretty frequently.

    But if you bought something like a C2960 fanless switch I would expect it to outlive me.

  • I have a 10+ year old Cisco 2960G and a pair of 10+ year old Dell R620s in my homelab, still humming happily along.

    So, no.

It would be a bad decade, but someone would figure out how to get older microcontroller-class chip production going pretty fast, because $$$.

Even if humanity forgot, most of the process is automated, so it shouldn't be too hard to figure out how to keep a factory running.

  • It would be extremely difficult to keep the factory from destroying itself.

    I work in a medical lab. The company bought a new automated coagulation analyzer. The old machine was shut down (proper shutdown procedure) and kept in storage in case it was needed. They should have replaced the wash fluid with water; that step isn't documented because nobody expects that kind of machine to just sit unused for a long time. After a few months we needed to start it again (I can't remember why; I think there was a delivery problem with the reagents for the new analyzer). We couldn't. The washing fluid had dried, and the detergents and other chemicals it contained had solidified inside the valves, just like in an inkjet printer left unused. They were all stuck. Repair would have been too expensive, and it was scrapped.

    I saw this happen with a haematology analyzer too. It was kept as a backup but wasn't needed for a few months. I was able to resurrect that one after several hours of repeated washing.

    An electrolyte analyzer is even worse. Keep it turned off for only a few hours and the electrodes will need to be replaced.

    I don't think any other advanced industrial machine is any different. They can't just be left unused for a while and then be expected to work. It's even more problematic if the shutdown procedure isn't done right (going beyond the documented requirements) or isn't done at all.

    • That’s what killed my suspension of disbelief watching Idiocracy. Most of the automation would have broken down in less than a day.

Honestly, probably not much would happen.

My daily driver laptop is a 2012 ThinkPad I literally pulled out of a scrap heap at my local university, but it refuses to die. Moore's law has slowed enough that old hardware is slow but still perfectly capable of running 99% of existing software.

Already-existing machines would give us at least one or two decades to restart manufacturing from zero, and that is more than enough time to avoid existential problems.

And most computers are under-utilized. The average gaming PC is powerful enough to run the infrastructure for a bunch of small companies if put to work as a server.

This doesn't make a ton of sense to me. In what situation would everyone lose the ability to make any CPU, worldwide, and we don't have a much, much bigger problem than how to run AWS?

It's a bit like trying to censor an LLM: to delete such an interconnected piece of information as "everything about making CPUs", you have to alter the LLM so significantly that you lobotomize it.

CPUs exist at the center of such a deeply connected mesh of other technologies that the knowledge could be recreated (if needed) from looking at the surrounding tech: all the compiled code out there as sequences of instructions, all the documentation of what instructions do and of pipelining, all the lithography guides and die shots on rando blogs, info in books still sitting on shelves in public libraries... I mean, come on.
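
As a toy illustration of how much of an ISA is frozen into surviving binaries and reference manuals: a handful of table entries is enough to start decoding machine code for a well-documented chip like the 6502 (the opcode values below are the published 6502 encodings; the decoder itself and the sample bytes are just an illustrative sketch):

```python
# Toy sketch: recovering meaning from raw machine code with a tiny opcode table.
# Opcode values are the documented MOS 6502 encodings; everything else here is
# only illustrative of how ISA knowledge survives in binaries plus manuals.

OPCODES = {
    0xA9: ("LDA #imm", 2),  # load accumulator with an immediate operand
    0x8D: ("STA abs", 3),   # store accumulator at an absolute 16-bit address
    0x4C: ("JMP abs", 3),   # jump to an absolute 16-bit address
    0xEA: ("NOP", 1),       # no operation
    0x00: ("BRK", 1),       # software interrupt / break
}

def disassemble(code: bytes) -> None:
    i = 0
    while i < len(code):
        name, size = OPCODES.get(code[i], (f".byte ${code[i]:02X}", 1))
        operand = code[i + 1:i + size].hex().upper()  # raw operand bytes, little-endian
        print(f"{i:04X}  {name:10s} {operand}")
        i += size

# Sample bytes: LDA #$01; STA $0200; BRK
disassemble(bytes([0xA9, 0x01, 0x8D, 0x00, 0x02, 0x00]))
```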

Each to their own!

Thankfully, the way capitalism works, we would quickly reinvent and remake them, and the companies that did so would make a decent profit.

Generally, the true problems in life aren't forgetting how to manufacture products that are key to human life.