
Comment by iancmceachern

1 day ago

A GPU from 8 years ago is cost-competitive, efficient, and "worth using" for modern tasks?

I don't want to be picky, but there is still a lot of value left in "not modern" tasks, like video encoding/transcoding. If the trickle-down effect is real anywhere, it's in computing hardware. Take Hetzner's server auction: if the hardware is physically deployed and running, you just need to find appropriate payloads/customers. https://www.hetzner.com/sb/

  • We have a box at work where employees can bring in hardware they're getting rid of, along with hardware we're throwing out that we don't need anymore.

    It has a pile of GPUs that are completely obsolete for any task: they use way too much power, have a large form factor that takes up a PCIe x16 slot, are loud, some need extra power cables, they lack driver support on modern operating systems, and in return for all that they don't deliver as much performance as something much better you could get for $100.

    Value on eBay seems to be about $10-$15, mostly for people with a retro computing hobby or people removing semiconductor components for other purposes.

    An obsolete data centre isn't worth much either. (We have a small one made from equipment liquidated from local data centres that have been upgraded.) The power consumption is too high, and it isn't set up for efficient HVAC for modern ultra-high-power-draw workloads.

    • The key is to run the numbers and go for mainstream hardware. If you are a hosting company with diversified use cases, you have plenty of room to downcycle hardware until it breaks. If you are operating with limited space in a field with bleeding-edge performance targets, doing this is not viable. There are many solution providers that will buy your outdated equipment. I'm not saying that old hardware is great or a cash cow in general, but its lifetime can usually be doubled or tripled if the right use case can be found and you are the owner (a rough break-even sketch follows after this comment).

      Side quest: Virtualized instances at cloud providers never get upgraded unless recreated. I bet there are millions of VMs running for years on specs and prices of 2018-2020.
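To make the "run the numbers" point above concrete, here is a minimal break-even sketch for keeping an old power-hungry card running 24/7 versus replacing it with the "$100 for something much better" option mentioned up-thread. The wattages and electricity price are illustrative assumptions, not figures from the thread.

```python
# Rough break-even sketch: keep running an old GPU vs. buying a cheap modern one.
# All numbers below are illustrative assumptions, not measured figures.

HOURS_PER_YEAR = 24 * 365
ELECTRICITY_PRICE = 0.30   # assumed $/kWh

old_gpu_watts = 250        # assumed draw of an older card under load
new_gpu_watts = 75         # assumed draw of a modern low-power replacement
new_gpu_price = 100.0      # the "$100 for something much better" from the thread

def yearly_power_cost(watts: float) -> float:
    """Electricity cost of running a card 24/7 for one year."""
    return watts / 1000 * HOURS_PER_YEAR * ELECTRICITY_PRICE

savings_per_year = yearly_power_cost(old_gpu_watts) - yearly_power_cost(new_gpu_watts)
break_even_years = new_gpu_price / savings_per_year

print(f"Old card power cost/year: ${yearly_power_cost(old_gpu_watts):.0f}")
print(f"New card power cost/year: ${yearly_power_cost(new_gpu_watts):.0f}")
print(f"Replacement pays for itself in ~{break_even_years:.2f} years")
```

Under these assumptions the replacement pays for itself in a few months; this is the kind of arithmetic that decides whether downcycling old hardware is worth it at all.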

The V100 is ~8 years old and AFAIK mostly not that common anymore, but the A100 is ~5.5 years old now and is still very commonly used; it's maybe the most common HPC cluster GPU. On the consumer side, 3090s are still very popular, representing a good balance between cost, performance, and efficiency (this is mostly due to 4090s and 5090s being much more expensive).
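To put that balance in rough numbers, here is a small sketch comparing price per GB of VRAM, one of the axes that keeps used 3090s attractive for local ML work. The street prices below are assumptions for illustration, not quotes from the thread; the VRAM sizes are the published specs.

```python
# Toy comparison of consumer cards on price per GB of VRAM.
# Prices are assumed used/street figures for illustration only.

cards = {
    "RTX 3090 (used)": {"price": 800,  "vram_gb": 24},
    "RTX 4090":        {"price": 1800, "vram_gb": 24},
    "RTX 5090":        {"price": 2500, "vram_gb": 32},
}

for name, c in cards.items():
    per_gb = c["price"] / c["vram_gb"]
    print(f"{name:<18} ${c['price']:>5}  {c['vram_gb']} GB  ~${per_gb:.0f}/GB")
```

With numbers in that ballpark, a used 3090 lands at roughly half the cost per GB of VRAM of the newer cards, which is the cost/performance trade-off the comment is describing.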

The GPUs have a much shorter lifecycle, on the order of ~3 years.

  • Exactly. I'm a mechanical engineer, and I still have tools given to me by my machinist great-uncle from WWII that are not only functional, they're identical to a new tool I'd buy today for that purpose, from the same manufacturer. This is the difference the OP was highlighting.

    • We've also been doing machining in the modern sense for at least a hundred and fifty years. The GPU as a concept is about 30 years old, and in the modern sense much younger than that.

      Innovation occurs on a sigmoid curve; we're still very early in the sigmoid for software and computer hardware, and very late in the sigmoid for machining, unless you include CNC, in which case we're back to software and computer hardware being the new parts.

      A better example would be the tape-outs and lifetimes of semiconductor fabs: the industry is only about 70 years old, and fabs have lifetimes measured in decades.


    • Are those tools functional? Have you ever checked? I'm not sure which tools you are talking about, but likely some of them are measurement tools, and those can seem to work perfectly while giving the wrong measurement. Others might be cutting tools that still cut, but they're a bit dull, and if you don't know how to check you won't realize the cuts are not as good as new anymore (or maybe you have sharpened them and they now cut the wrong profile...). There are many ways a tool can seem functional but be wrong.


  • No, they don't. The 3-year number came from some random person on the internet who claimed to be a Google employee, and it was denied by Google, as you can see in any of the articles about this claim:

    > Recent purported comments about Nvidia GPU hardware utilization and service life expressed by an “unnamed source” were inaccurate, do not represent how we utilize Nvidia’s technology, and do not represent our experience.