Comment by caycep, 5 days ago

It's a sad trend for "the rest of us" and history in general. The economic boom of the 80's thru the 2010s has been a vast democratization of computation - hardware became more powerful and affordable, and algorithms (at least broadly if not individually) became more efficient. We all had supercomputers in our pockets. This AI movement seems to move things in the opposite direction, in that us plebeians have less and less access to RAM, computing power and food and...uh...GPUs to play Cyberpunk; and are dependent on Altermanic aristocracy to dribble compute onto us at their leisure and for a hefty tithe.

I am hoping some of that Clayton Christensen disruption the tech theocracy keep preaching about comes along with some O(N) decrease in transformer/cDNN complexity that disrupts the massive server farms required for this AI boom/bubble thing.

> This AI movement seems to move things in the opposite direction, in that us plebeians have less and less access to RAM, computing power and food and...uh...GPUs to play Cyberpunk; and are dependent on Altermanic aristocracy to dribble compute onto us at their leisure and for a hefty tithe.

Compute is cheaper than ever. The ceiling is just higher for what you can buy.

Yes, we have $2000 GPUs now. You don't have to buy it. You probably shouldn't buy it. Most people would be more than fine with the $200-400 models, honestly. Yet the fact that you could buy a $2000 GPU makes some people irrationally angry.

This is like the guy I know who complains that pickup trucks are unfairly priced because a Ford F-150 has an MSRP of $80,000. It doesn't matter how many times you point out that the $80K price tag only applies to the luxury flagship model, he anchors his idea of how much a pickup truck costs to the highest number he can see.

Computing is cheaper than ever. The power level is increasing rapidly, too. The massive AI investments and datacenter advancements are pulling hardware development forward at an incredible rate and we're winning across the board as consumers. You don't have to buy that top of the line GPU nor do you have to max out the RAM on your computer.

Sometimes I think people with this mentality would be happier if the top-of-the-line GPU models were never released. If Nvidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away, even though we're not actually better off with fewer options.

  • > Yes, we have $2000 GPUs now. You don't have to buy it. You probably shouldn't buy it. Most people would be more than fine with the $200-400 models, honestly. Yet the fact that you could buy a $2000 GPU makes some people irrationally angry.

    This is missing the forest for the trees quite badly. The $2000 GPUs are what would previously have been $600-700, and the $200-400 GPUs are now $600-700. Consumers got the shit end of the deal when crypto caused GPU prices to spike, and now they're getting another shitty deal with RAM prices. And even if you want mid-range stuff, it's harder and harder to buy because of how fucked the market is.

    It would be like if, in your example, companies literally only sold F-150s and stopped selling budget models at all. There isn't even budget stock to buy.

  • > Sometimes I think people with this mentality would be happier if the top-of-the-line GPU models were never released. If Nvidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away, even though we're not actually better off with fewer options.

    If the result was that games were made and optimised for mid-range cards, maybe regular folks actually would be better off.

  • The thing about being annoyed at the top-of-the-range prices, for me, is that it irritates because it feels like it drags the lower models' prices upwards.

    • It does. If the top of the range is $80k, you'll feel you're getting a deal at $40k.

      So no one makes a $25k model.

    • But it’s not like the lower priced models are subsidizing the high-end models (probably the opposite; the high-end ones have greater margins).

  • The problem is the VRAM segmentation.

    A GTX 1080 came out in the first half of 2016. It had 8 GB of VRAM and cost $599 with a TDP of 180W.

    A GTX 1080 Ti came out in 2017 and had 11 GB of VRAM at $699.

    In 2025 you can get the RTX 5070 with 12 GB of VRAM. They say the price is $549, but good luck finding them at that price.

    And the thing with VRAM is that if you run out of it, performance drops off a cliff. Nothing can make up for it short of getting a higher-VRAM model.

  • I would take this argument more seriously if it weren't for the fact that:

    - the whole reason the GPU is $2000 is said AI bubble sucking up wafers at TSMC or elsewhere, with a soupçon of Jensen's perceived monopoly status...

    - for a good part of the year, you could not actually buy said $2000 GPU (I assume you are referring to the 5090), also because of said AI bubble

    (granted, while Jensen does not want to sell me his GPU, I would like to point out that Tim Cook has no problem taking my money).

    On that point, I can go and buy a Ford F-150 tomorrow. Apparently, per the article, I would have problems buying bog-standard DDR5 DIMMs to build my computer.

One can see it that way, granted. When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc. All our beloved consumer tech started out as absurdly high-priced niche stuff for them. We've been sold the overflow capacity and binned parts. And that seems to be a more-or-less natural consequence of large purchasers signing large checks and entering predictable contracts. Individual consumers are very price-sensitive and fickle by comparison. From that perspective, anything that increases overall capacity should also increase the supply of binned parts and overflow, which will eventually benefit consumers, though the intervening market adjustment period may be painful (as we are seeing). Consumers have also benefited greatly from the shrinking of component sizes, as this has had the effect of increasing production capacity with fixed wafer volume.

  • > When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc.

    Perfectly stated. I think comments like the one above come from a mentality that the individual consumer should be the center of the computing universe and big purchasers should be forced to live with the leftovers.

    What's really happening is the big companies are doing R&D at incredible rates and we're getting huge benefits by drafting along as consumers. We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.

    • The iPhone wasn't designed for or marketed to large corporations. 3dfx didn't invent the Voodoo for B2B sales. IBM didn't branch out from international business machines to the personal computer for business sales. The compact disc wasn't invented for corporate storage.

      Computing didn't take off until it shrank from the giant, unreliable beasts of machines owned by a small number of big corporations to the home computers of the 70s.

      There's a lot more of us than them.

      There's a gold rush market for GPUs and DRAM. It won't last forever, but while it does high volume sales at high margins will dominate supply. GPUs are still inflated from the crypto rush, too.

    • Advances in video cards and graphics tech were overwhelmingly driven by video games. John Carmack, for instance, was directly involved in these processes, and 'back in the day' it wasn't uncommon for games, particularly his, to be developed to run on tech that did not yet exist, in collaboration with the hardware guys. Your desktop was outdated after a year and obsolete after two, so it was a very different time from today, where your example is not only completely accurate but really understating it: a good computer from 10 years ago can still do 99.9% of what people need, and even things like high-end gaming are perfectly viable with fairly dated cards.

    • > We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.

      Arguably we don't. Most of the improvements these days seem to be on the GPGPU side, with very little gain in raster performance this decade.

  • Gaming drove the development of GPUs which led to the current AI boom. Smartphones drove small process nodes for power efficiency.

    • SGI and 3Dfx made high-end simulators for aerospace in the beginning. Gaming grew out of that. Even Intel's first GPU (the i740) came from GE Aerospace.

  • 100%. We’ve seen crazy swings in RAM prices before.

    A colleague who worked with me about 10 years ago on a VDI project ran some numbers and showed that if a Time Machine were available, we could have brought back like 4 loaded MacBook Pros and replaced a $1M HP 3PAR SSD array :)

> We all had supercomputers in our pockets.

You still do. There is no "AI movement" you need to participate in. You can grab a copy of SICP and a banged-up ten-year-old ThinkPad and compute away; your brain will thank you. It's like when people complain that culture is unaffordable because the newest Marvel movie tickets cost 50 bucks: go to the library or standardebooks.org, and the entire Western canon is free.

Well put. Since the 1980s the consumer has been driving the segment. Even supercomputers were built out of higher-end consumer hardware (or PlayStations, in one example).

The move to cloud computing and now AI mean that we're back in the mainframe days.

True, it is reminiscent of a time before mine, when people were lucky to have mainframe access through a university. To be fair, this was a long time in the making, given the similarly aggressive move to cloud computing. While I don't mind having access to free AI tools, they also seem to be starting to take possession of the content.

It's not like you need 64GB to have "democratized computation". We used to have 64MB and that was plenty. Unfortunately, software got slower more quickly than hardware got quicker.

  • > Unfortunately, software got slower more quickly than hardware got quicker.

    Hard disagree. A $600 Mac Mini with 16GB of RAM runs everything insanely faster than even my $5000 company-purchased developer laptops from 10 years ago. And yes, even when I run Slack, Visual Studio Code, Spotify, and a gazillion Chrome tabs.

    The HN rhetoric about modern computing being slow is getting strangely disconnected from the real world. Cheap computers are super fast like they've never been before, even with modern software.

    • You brought up a light computing load that a laptop from like 2005 wouldn't struggle with?

      People ran multiple browser windows, a 3D video game, IRC (chat), TeamSpeak/Ventrilo (voice chat), and Winamp (music) all at once back in the early 2000s. This is something an 8-year-old phone can do these days.

  • It is pretty important if you are doing things like 3D animation, video editing, or advanced CAD work. Plus, software in general has ballooned its memory requirements and expectations. Even my 11-year-old PC had to have a RAM upgrade a few years ago just because software updates suck up so much extra memory, and there is almost nothing consumers can do about it.

    • At any point in the 1990s, it was generally unfathomable to be using an 11-year-old PC for any modern purpose.

      That an 11-year-old PC can keep up today (with or without an upgrade) is evidence that systems are keeping up with software bloat just fine. :)

  • > We used to have 64MB and that was plenty.

    Bullshit. It was cramped, and I wasn't able to do half of what I actually wanted to do. Maybe it was plenty for your use cases, but such a small amount of memory was limiting for my needs in the late 90s and 2000s. 64MB desktops struggled to handle the photo manipulations I wanted to do with scanned images. Trying to do something like edit video on a home PC was near impossible with that limited amount of memory. I was so happy when we managed to get a 512MB machine a few years later; it made a lot of my home multimedia work much better.

    • There are some use cases that simply require a lot of memory because they do, but I'm talking in general. Software without a good excuse used to run in 64MB and now runs in 64GB.

      Besides, you just said you only needed 512MB, which is still nothing these days.
