Comment by formerly_proven

11 hours ago

The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago would ship a G200 implementation, or at least something pretending to be a G200, as part of their BMC for networked KVM.

Like virtualized NICs pretending to be an NE2000? That's interesting; do you know why they'd use a G200 and not something like an older ATI chip?

  • They were probably forced to update when they dropped older buses. Without a PCI or AGP bus on there, they have to find something that can hang off a PCIe lane.

  • The ATi Rage 128 was used in everything short of toasters for a long time too. I assume that the drivers are part of what made it obsolete.

    • I remember having a ton of servers with cut-down Mach64 chips. They were so bad that you would get horizontal lines flickering across the screen while text scrolled in an 80x25 text console. I don't know why server manufacturers go to so much effort to make the console as terrible as possible. Are they nostalgic for the 8-bit ISA graphics from the original 5150? They seem offended at the idea that someone might hook a crash cart directly up to their precious hardware.

  • Probably started out as a real G200 chip, which might’ve been the cheapest and easiest thing to integrate in the 2000s? Or it had the I/O features needed to support KVM (since this would’ve involved reading the framebuffer from the BMC side), or Matrox was amenable to adding them.

Even current Dell servers less than a year old ship with G200 graphics. If it works, why change it? A 1998-era ASIC can be put in the corner of a modern chipset for pennies or less.