Comment by timschmidt

5 days ago

SGI and 3Dfx made high-end simulators for aerospace in the beginning. Gaming grew out of that. Even Intel's first GPU (the i740) came from GE Aerospace.

Flight simulators just had more cash for more advanced chips, but arcade games like the Sega Model 1 (Virtua Racing) were, via Virtua Fighter, an inspiration for the PlayStation, and before that there were crude 3D games on both PC and Amiga.

Games were always going to go 3D sooner or later; the real pressure of the high-volume competitive market got us more and more capable chips until they were capable enough for the kind of computation needed for neural networks, faster than a slow-moving specialty market could have.

  • > Flight simulators just had more cash for more advanced chips

    Yes. That is my point. The customers willing to pay the high initial R&D costs opened up the potential for wider adoption. This is always the case.

    Even the gaming GPUs which have grown in popularity with consumers are derivatives of larger designs intended for research clusters, datacenters, aerospace, and military applications.

    No question that chip companies are happy to take consumers' money. But I struggle to think of an example of a new technology which was invented and marketed to consumers first.

    • Computers themselves were non-consumer to begin with, but the personal computer broke the technology moat to consumers before anything else, and once that barrier had fallen, the rest was mostly a matter of time, imho.

      Many 3D games like Doom, Quake, Flight Unlimited, etc. ran purely on software rendering, since CPUs were already providing enough oomph to render fairly useful 3D graphics in the mid-90s. CPU power was sufficient, but consoles and arcades showed that there was more to be gotten (nothing hindered games at that point, though).

      And even then, the capital investment for game consoles (Atari, NES, SNES, PS1, PS2, etc.) and arcade games (like the 3D titles mentioned above) was big enough to fund custom chipsets not used or purposed for anything else. (I also think that in the 80s/90s the barrier to entry for making competitive custom chips was a tad lower; just consider the Cambrian explosion of firms during the 90s making x86 and later ARM chips.)

      Yes, there were vendors that focused on high-end commercial customers, and yes, many alumni of those firms contributed a ton of expertise towards what we have today.

      But if you look at which companies survived and pushed the envelope in the longer run, it was almost always companies that competed in the consumer market, and it was only when those consumer chips needed even more advanced processing that we reached the point where the chips became capable of running neural networks.

      In fact, I'd say that had the likes of SGI prevailed, we would have had to wait longer for our GPU revolution. Flight simulators and the like were often focused on larger, more detailed worlds; PS2-era chips with higher polycounts and more memory would have satisfied simulator developers for a long time (since more detail in a military scenario would have been fine).

      Leisure games have always craved fidelity on a more "human" level. To implement "hacks" for things like custom dynamic lighting models, then global illumination, subsurface scattering, etc., we've needed arbitrary programmability, since the raw power wasn't there (the most modern raytracing chips are _starting_ to approach that level without too-ugly hacks).

Wolfenstein 3D was released before 3dfx existed, was purely CPU-rendered, and is generally considered the father of modern 3D shooters. Even without the scientific computing angle, GPUs would have been developed for gaming simply because it was a good idea that clearly had a big market.

3dfx didn't. They had a subsidiary (or spinoff?) called Quantum3D that reused 3dfx commodity chips to build cards for simulators.