Comment by arjie

15 hours ago

Absolute nostalgia fever. About a month ago, I dug up an old desktop in the corner, took the drives out and gave away the machine. It felt like putting a racehorse to pasture: i7-4790k, 1080 Ti. It was my dream machine when I got it. Dual-boot (as we did back in the old days when Proton wasn't here) to Ubuntu, then Elementary, then Arch. By the time I gave it away it wasn't worth the power cost.

And that brought to mind my older dream machine, an 8800 GT from generations past. Before that we made do with a VIA UniChrome that worked well enough on the OpenChrome driver that I could edit open-source games (Freespace only needed a few constants changed) so they would render (some of the image was smeared and so on, but I could play!).

I'm still rocking a Z97, i7-4790k and a 980 Ti :) I'm still waiting until I need an upgrade. DDR3 is still performing well enough for the games I run.

  • I was running a 970ti for the longest time; it was only when I wanted to get into some VR gaming that it felt like time for an upgrade.

It is interesting, the consumer high you get from buying things. I remember being in a Microsoft Store about 12 years ago and wanting a Surface laptop, holding it in my hands, but I couldn't afford it. Now I have a Surface Book 3 and it's still cool, but it's not the same experience as when it was a new flagship.

Still, there are a lot of laptops I'd like to try when they get cheaper. As far as GPUs go, I like Nvidia's Founders Edition designs; it was a while before I got a 3080 Ti FE, which I ended up having to sell at a loss when I didn't have a job, and that was sad. I have a 4070 Founders Edition now, which does struggle in certain games at 1440p, but I'm going to use it to run local LLMs.

My current machine is an i5-3570k with a 1070Ti...

The old CPU is actually more of an issue. I couldn't run Civ 7 because the game (probably the DRM) uses some instructions that aren't implemented on that CPU. Other than that I bet it would run just fine.
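For what it's worth, on Linux you can see which instruction-set extensions a CPU advertises by reading /proc/cpuinfo. A minimal sketch (the exact instruction Civ 7 trips on isn't stated here; AVX2 is a common requirement that Ivy Bridge-era chips like the i5-3570k lack):

```python
import pathlib

def cpu_has_flag(flag: str, cpuinfo_text: str) -> bool:
    """Return True if `flag` appears on the 'flags' line of /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return flag in line.split(":", 1)[1].split()
    return False

if __name__ == "__main__":
    info = pathlib.Path("/proc/cpuinfo")
    if info.exists():  # Linux only
        text = info.read_text()
        for f in ("sse4_2", "avx", "avx2", "avx512f"):
            print(f, cpu_has_flag(f, text))
```

On Windows the same information is exposed through CPUID rather than a proc file, so a tool like CPU-Z is the easier route there.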

I was just about to upgrade before hardware prices went through the roof. Now I'm just holding on until some semblance of sanity returns, hoping every day that the bubble pops and loads of gently loved hardware starts appearing on the secondary market. Also, the way Nvidia has been skimping on memory for all but the most outrageously expensive chips has grated on me. I was really hoping they would buck the trend with the 5xxx generation, but nope, and with RAM prices the way they are I have little hope for the 6xxx generation. My current card is close to a decade old and has 8GB of VRAM. I'm not upgrading to a card with 8GB of VRAM, or even 12GB. That 8GB was crucial in future-proofing the original card; none of its 4GB contemporaries are of much use today.

I also have that exact setup sitting around, but am just using my ryzen laptop now.

Hey, I could have used that i7-4790k!

I've been running the worst gaming setup I can get away with, which atm is a 3080 10GB, using random DDR3 RAM, a budget WD 512GB SSD, and an i5 on the same socket as the i7-4790k that doesn't even support hyper-threading and can't do more than 4 tasks in parallel.

It's absolutely laughable at this point, but I'm unironically looking for a deal on that CPU lmao, it would be a huge upgrade.

I used my 1080 Ti for about eight years. The successor GPU is in some ways way faster (raytracing, AI features, etc.), but in others really quite stagnant considering the huge stretch of time that passed between them. ~10 years for 2-3x performance in GPUs, at higher nominal and real price points, shows how slow silicon advances have been compared to the 90s and 2000s. The same period from 2000 to 2010 would've seen 1000x performance, if not more. The difference between a 1080 Ti and a more expensive RTX 50 card is that the RTX can render ideally triple the frames in synthetic benchmarks, double the frames in some rasterizing games (most games won't see gains that high), and do a few relatively tame raytracing tricks at performance that's still not really good. At the same throughput it consumes maybe half the power or a bit less. The difference between a GeForce 2 and, e.g., a Radeon HD 4000-series card is several planes of existence.
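To put those figures in perspective with a quick back-of-the-envelope (using the ~3x-over-10-years and ~1000x-over-10-years numbers above, treated as rough compound annual growth):

```python
# Rough per-year multiplier implied by a total gain over a span of years.
def annual_factor(total_gain: float, years: float) -> float:
    """Per-year multiplier that compounds to `total_gain` over `years` years."""
    return total_gain ** (1 / years)

# ~3x over ten years vs ~1000x over ten years
print(f"recent GPUs:    ~{annual_factor(3, 10):.2f}x per year")    # ~1.12x/yr
print(f"2000-2010 GPUs: ~{annual_factor(1000, 10):.2f}x per year") # ~2.00x/yr
```

So the earlier era compounded at roughly doubling per year, while the recent decade managed only ~12% per year, which is why the gap feels so stark.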

  • My 1080 Ti is still working away in my kid's PC. If you connect a 1080p monitor, it will still hit 60fps in almost everything.

    The only thing that holds this card back now is a handful of titles that will not run unless ray-tracing support is present on the card; Indiana Jones and the Great Circle springs to mind.

    I am very likely going to get a decade of use out of it across three different builds, one of the best technology investments I've ever made.

    • It really is an impressive bit of hardware. I finally pulled it out of my last system a year ago, but it was definitely holding its own up until that point.

  • Well, the 5090 is significantly faster than a 1080 Ti: 92B transistors vs 12B. That's the 10-year difference you mention. Ten years before the 1080 Ti we had the 8800 Ultra with 600M transistors. So yeah, you are a bit right. But stacked transistors might become reality in the future and enable transistor counts to increase again.

    • A 5090 is more than twice as expensive as a 1080 Ti in real MSRP terms, and way more than that in actual street-price terms, since the 1080 Ti was available for some time below MSRP while the 5090 realistically never was and usually goes for 50-100% above MSRP. So I don't think these can be compared. It's basically a similar story with the 5080: significantly more expensive in real terms (and about 2x in nominal terms).

      The 5070 Ti would be in the same spot.

      If you compare those, the RTX 50 card has a bit higher TDP (which it will usually not reach due to clock limits), a roughly 100 mm² smaller die with around 4x the transistors, and about 3x the compute (since much more of the chip is disabled than on the 1080 Ti's chip). It has 5 GB more memory (11 GB -> 16 GB) and a lot more bandwidth.