Comment by mrweasel
14 hours ago
It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.
At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.
Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.
https://en.wikipedia.org/wiki/S3_Texture_Compression
Loads of games from the era roundtripped their textures through lossy S3/DXT compression and then stored them as uncompressed RGB or RGBA.
I know this because I wrote an Unreal Engine texture repacking tool with a "DXT detection" feature, so that I wouldn't be responsible for losing DXT compression on a texture which had already paid the quality price, only to find that this situation was already hyperabundant in the ecosystem.
Many Unreal Engine games of the day could have their size robotically halved just by re-enabling DXT compression in any case where this would cause zero pixel difference. This was at a time before Steam, when game downloads routinely took a day, so I was very excited about this discovery. Unfortunately, the first few developers I emailed all reacted with hostility to an unsolicited tip from what I'm sure they saw as a hacker, so I lost interest in pushing and it went nowhere. Ah well.
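For the curious, the detection boiled down to something like exploiting DXT1's block structure: a texture that has been through DXT1 can contain at most 4 distinct colors per 4x4 block, derived from two RGB565 endpoints. A simplified sketch of that check (not the actual tool; it assumes numpy, an (H, W, 3) uint8 RGB image, and a decoder that expanded the 565 endpoints back to 8-bit by bit replication):

```python
import numpy as np


def rgb565_roundtrip(color):
    """Quantize an 8-bit RGB triple to RGB565 and expand it back to 8-bit."""
    r, g, b = int(color[0]), int(color[1]), int(color[2])
    r5, g6, b5 = r >> 3, g >> 2, b >> 3
    return ((r5 << 3) | (r5 >> 2), (g6 << 2) | (g6 >> 4), (b5 << 3) | (b5 >> 2))


def block_looks_dxt1(block):
    """True if a 4x4 RGB block is consistent with DXT1's 4-color palette."""
    colors = np.unique(block.reshape(-1, 3), axis=0)
    if len(colors) > 4:
        return False        # DXT1 can never produce more than 4 colors per block
    if len(colors) <= 2:
        return True         # flat or two-color blocks are trivially representable
    # The extreme colors of the block should be the 565-quantized endpoints;
    # use luminance to pick the extremes along the palette line.
    luma = colors @ np.array([0.299, 0.587, 0.114])
    lo, hi = colors[luma.argmin()], colors[luma.argmax()]
    return (tuple(lo) == rgb565_roundtrip(lo)) and (tuple(hi) == rgb565_roundtrip(hi))


def texture_looks_dxt1(img, tolerance=0.98):
    """Flag a texture as 'probably already DXT1-compressed once' if nearly every
    4x4 block passes the block test (a few blocks may defeat the heuristic,
    e.g. ones an encoder produced in DXT1's 3-color mode)."""
    h, w, _ = img.shape
    blocks = passed = 0
    for y in range(0, h - h % 4, 4):
        for x in range(0, w - w % 4, 4):
            blocks += 1
            passed += block_looks_dxt1(img[y:y + 4, x:x + 4])
    return blocks > 0 and passed / blocks >= tolerance
```

Textures that pass this can be re-encoded to DXT with zero pixel difference, which is where the "free" halving of install size came from.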
The article blew a huge opportunity to showcase the great diversity of “Pioneering Era” 3D accelerators (they weren’t called GPUs until later). But instead they just pretended it was always NVIDIA vs ATI, and threw in a few Voodoos.
It was only 3dfx and NVIDIA (since the TNT) that mattered in the 1990s though. All the other 3D accelerators were only barely better than software rasterization, if at all.
Seeing Quake II run butter smooth on a Riva TNT at 1024x768 for the first time was like witnessing the second coming of Christ ;)
And they say that Nvidia coined the term GPU, but I recall that Sony used it earlier, for the original PlayStation's graphics chip... not that it really matters.
+1 to that. When I first saw Unreal Tournament with the add-on compressed texture pack, it was a real WOW moment.
Yeah, it also lacked driver support. But for a very brief moment it was the king of the hill.
My contributions: the Matrox Parhelia as the first card supporting triple monitors, and the ATI All-in-Wonder, which did TV out back when media centre TVs weren't really a thing.
The big feature of the All-in-Wonder was TV in. You could record TV in glorious analog detail that would quickly use up your entire hard drive.
I can remember using an AiW card to play PS2 on my computer screen when my TV died. The latency wasn’t great but we still had fun.
I remember there was a kernel module for the Matrox/MPlayer combination; it gave you a new device that MPlayer could use. You got `-vo mga` for the console and `-vo xmga` for X11; you couldn't tell the difference, and both produced high-quality hardware YUV output.
For a moment, a Matrox G400 DualHead was THE card to have for a multi-monitor setup.
This was a very sweet video card.
Recency bias, probably. IIRC the 3000 and 4000 series did make significant improvements in RTX performance, so compared to the 2000 series they're far more useful today.
The 4000 series certainly did; "shader execution reordering" gave a meaningful uplift to tasks that "underutilized warp units due to scattered useful pixels".
It seems to have helped path tracing by a lot.
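To unpack that a bit: every distinct shader present in a 32-thread warp costs an extra serialized pass, so incoherent hits hurt, and SER lets the hardware regroup threads by what they are about to run. A toy CPU-side analogy in numpy (not the real hardware mechanism or any NVIDIA API; the material counts and ray counts are made up for illustration):

```python
import numpy as np

WARP = 32
rng = np.random.default_rng(0)

# Pretend a million rays each hit one of 16 materials, in incoherent screen order.
materials = rng.integers(0, 16, size=1_000_000)

def warp_cost(keys):
    """Sum, over 32-wide groups, of how many distinct shaders each group must run."""
    n = len(keys) - len(keys) % WARP          # drop the ragged tail
    groups = keys[:n].reshape(-1, WARP)
    return sum(len(np.unique(g)) for g in groups)

print("unsorted warp passes: ", warp_cost(materials))           # roughly 16 per warp
print("reordered warp passes:", warp_cost(np.sort(materials)))  # roughly 1 per warp
```

Sorting the pending hits by material key before shading is the whole trick: the work is the same, but each warp now mostly runs a single shader.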
I think their point is that RTX is not useful.
> S3 ViRGE and the Matrox G200
Both were only really famous for how terrible they were, though. I think the S3 ViRGE might even qualify as a 3D decelerator ;)
Matrox was really half-hearted with game support. They seemed far more interested in corporate customers, heavily advertising stuff like "VR" conference calls that nobody wanted. They were early with multi-monitor support, back when monitors were big, heavy, and expensive. I had a G200 that was the last video card I've ever seen where you could expand the VRAM by slotting in a SODIMM. It also had composite out so you could hook it up to a TV. I played a lot of games on it, up until Return to Castle Wolfenstein, which was almost playable, but the low-res textures looked real bad and the framerate would drop precipitously at critical times, like when a bunch of Nazis rushed into the room and started shooting.
Last time I saw a Matrox chip it was on a server, and somehow they had cut it down even more than the one I had used over a decade earlier. As I recall it couldn't handle a framebuffer larger than 800x600, which was sometimes a problem when people wanted to install and configure Windows Server.
The only thing the ViRGE was good for was passing through to a Voodoo2
But it WAS ultra popular with OEMs. If you had embedded video, there was a huge chance that was it.
Matrox G200 GPUs came integrated with servers for absolute ages, like past the 2010s.
The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago would ship a G200 implementation or at least something pretending to be a G200 card as part of their BMC for network KVM.
Like virtualized NICs pretending to be an NE2000? That's interesting; do you know why they'd use a G200 and not something like an older ATI chip?
They were probably forced to update when they dropped older buses. Without a PCI or AGP bus on there, they had to find something that could hang off of a PCIe lane.
The ATi Rage 128 was used in everything short of toasters for a long time too. I assume that the drivers are part of what made it obsolete.
Probably started out as a real G200 chip, which might've been the cheapest and easiest to integrate in the 2000s? Or it had the needed I/O features to support KVM (since this would've involved reading the framebuffer from the BMC side), or Matrox was amenable to adding that.
Drivers, probably.
Even current Dell servers less than a year old ship with G200 graphics. If it works, why change it? A 1998 ASIC can be put in the corner of a modern chipset for pennies or less.
From memory, the cards that stood out were:
Nvidia 6xxx series, which were the first Nvidia cards to support SLI. I remember my gaming PC in college with a 6xxx series card, and being able to get another card and an SLI bridge that increased performance in some games.
Nvidia GeForce 900 series, which had the Titan X with 12 GB; the first card, IIRC, able to support higher-resolution gaming.
Nvidia RTX series, which started with the 20xx I think; the first cards to come with 24 GB of RAM (the Titan RTX).
And then the modern 4xxx series, which used to fry power cables.
This is an ad from a viral marketing company, and everyone here is falling for it.
>This is an ad from a viral marketing company
They aren't a marketing company:
"Dashboards, CRMs, automations. We're a small consulting team that turns your messy spreadsheets into systems that run your business."
They are: https://sheets.works/data-viz/hire
What are they advertising? Nvidia graphics cards?
Yes. They are likely also advertising themselves, given how viral their ads are. The article is featured on their website.
>S3 ViRGE
decelerator?
>Matrox G200
Because it never got an OpenGL driver? Because it was 2x slower than even the Savage3D? The Nvidia TNT released a month later, offering 2x the speed at a lower price.
https://www.tomshardware.com/reviews/3d-chips,83-7.html
Truly a graphics card that mattered! :)