
Comment by TimorousBestie

2 months ago

I kinda doubt it. The theoretical knowledge is there, but there’s a huge gulf between that and all the practical knowledge/trade secrets held by TSMC.

Another view on this topic is https://gwern.net/slowing-moores-law

The stuff that really matters is mostly on microcontrollers.

The few industries that push computing out of need would suffer: certain kinds of research, 3D modeling, and the like.

But most of what we use computers for in offices and our day-to-day should work about as well on slightly beefed up (say, dual or quad CPU) typical early ‘90s gear.

We’re using 30 years of hardware advancements to run JavaScript instead of doing new, helpful stuff. Thirty years of hardware development to let businesses save a little on software development while pushing a cost several times larger onto users.

  • > slightly beefed up (say, dual or quad CPU) typical early ‘90s gear.

    Early-’90s Intel was the 486 at 33 MHz. It barely had enough performance to run a TCP/IP stack at a few hundred KB/sec, using all of the CPU just for that task. I think you forgot how slow it was. The Pentium II, in the late ’90s, is where things start to feel reasonably modern; the Pentium Pro (1995) was Intel’s first with multiprocessor support. Hardware was moving so fast back then that comparing early-, mid-, and late-’90s machines was like comparing hardware decades apart at today’s pace of improvement.

    • A 166 MHz Pentium with 128 MB (not a typo, kids!) of memory felt luxuriously snappy and spacious, including with tabbed web browsing in Phoenix/Firebird/Firefox… running BeOS or QNX/Photon. Not so much under Linux or Windows.

      Not so far removed from a multi-CPU Pentium at 90 or 100 MHz, from the very early Pentium days.

      I guess what I had in mind was first-gen Pentiums. They’re solidly in the first half of the ‘90s but “early 90s” does cover a broader period, and yeah, 486 wouldn’t quite cut it. They’re the oldest machines I can recall multitasking very comfortably on… given the right software.

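As a rough sanity check on the 486 throughput claim above, here is a back-of-envelope calculation (the ~300 KB/s figure is an assumption, picked from the middle of the quoted "few hundred KB/sec" range):

```python
# Back-of-envelope check of the 486DX-33 TCP/IP claim above.
# Assumption: "a few hundred KB/sec" taken as ~300 KB/s.
clock_hz = 33_000_000          # 486DX at 33 MHz
throughput_bps = 300 * 1024    # assumed sustained throughput, bytes/sec
cycles_per_byte = clock_hz / throughput_bps
print(round(cycles_per_byte))  # prints 107
```

So even with the whole CPU dedicated to networking, the machine has only on the order of a hundred cycles per byte for checksums, copies, and protocol handling, which is consistent with the stack saturating the processor at those rates.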

If you wanna make something that's competitive with the latest and greatest, sure. But there's literally thousands of fabs that can make _a_ CPU, and hundreds that can make something that is usable in a PC, even if not very fast. There's a huge span of semiconductor fabrication beyond the bleeding edge of digital logic.

  • One thing the linked piece did mention, though, was that the high end would quickly become worth its weight in gold.

    The Nvidia DGX B200 is already selling for half a million dollars. The nearest non-TSMC-produced competitor doesn’t come close. Imagine no more supply!

I might be biased, being an insider of the semiconductor industry, but I think the gulf isn’t that huge. Virtually everything is known down to, what, 28 nm or so? That’s still a fairly good process and pretty impossible to forget.

There are other semiconductor manufacturers, right? Certainly it would be catastrophic to the industry, and would likely set progress back a while, but it would hardly be insurmountable. This discussion also assumes TSMC wouldn’t sell or trade its knowledge and processes, and that they wouldn’t be stolen, which wouldn’t be crazy given the hypothetical war in the region.

Intel still knows 14nm quite well, and would likely sell access to the line if asked.

If Taiwan ceased to exist, that would put us a decade back.

  • Samsung is just a tiny bit behind TSMC.

    The gap isn't a decade, more like 12-18 months.

    Also, TSMC has 5 nm production in the US. There are people in the US with actual know-how of this process.

    • Their leading-edge photolithography equipment is manufactured in the Netherlands by ASML

      https://www.asml.com/en

      Other companies (Samsung and Intel) use the same equipment, but TSMC has deeper expertise and has gotten more out of it so far.

TSMC’s factories use ASML hardware (designed and built in the Netherlands) to actually produce the chips.

https://www.asml.com/en

TSMC is running a successful business but they're not the only customers of ASML.