I'm kind of nostalgic for the Golden Age of graphics chip manufacturers 25 years ago, where we still had NVIDIA and ATI, but also 3DFX, S3, Matrox, PowerVR, and even smaller players, all doing their own thing and there were so many options.
This is just DRAM hysteria spiraling out to other kinds of hardware, will age like fine milk just like the rest of the "gaming PC market will never be the same" stuff. Nvidia has Amazon, Google, and others trying to compete with them in the data center. No one is seriously trying to beat their gaming chips. Wouldn't make any sense to give it up.
It's not related to the DRAM shortage. Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI and there was controversy years before that about most "gaming" GPUs going to crypto miners. They won't exit the gaming market but from a shareholder perspective it does look like a good idea.
> Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI
Well, actually it's that the AI business made NVidia 10x bigger. NVidia now has a market cap of $4.4 trillion. That's more than sixty times bigger than General Motors, bigger than Apple, and the largest market cap in the world. For a GPU maker.
Took what, four years for PC cases to get back to reasonable prices after COVID? And that’s a relatively low-tech field that (therefore) admits new entrants. I don’t know, I’m not feeling much optimism right now (haven’t at any point after the crypto boom), perhaps because I’ve always leaned towards stocking up on (main) RAM as a cheap way to improve a PC’s performance.
Huh? The only thing I noticed during COVID was a few GPUs and some HDDs getting insanely expensive. Surprise surprise, not even 18 months later (I think more like 14 on my local market) the sellers finally got it through their thick skulls that the economy is not what you see in simulations and that 95% of people absolutely will not pay ~$4000 for an RTX 3090 / 4090. And similarly for the 18TB HDDs.
I am not saying you are wrong but here in Eastern Europe, while we did suffer the price hikes (and are suffering those of the DDR5 RAM now as well), the impact was minimal. People just holed up, said "the market's crazy right now, let's wait it out", and shrugged. And lo and behold, successfully wait it out they did.
As I mentioned in another comment in this thread, I highly doubt high RAM prices will survive even to 2027. Sure a good amount of stores and suppliers will try to hold on to the prices with a death grip... but many stores and vendors and suppliers hate stale stock. And some other new tech will start coming out. They would not be able to tolerate shelves full of merchandise at prices people don't want to buy at.
They absolutely _will_ budge.
I predict that by July/August 2026 some price decreases will start. They are likely to be small -- no more than 15% IMO -- but they will start.
The current craze of "let's only produce and sell to the big boys" has happened before, happens now, and will happen again. I and many others don't believe the hysterical "the market will never be the same again" narrative either.
The fact that a case is literally a sheet-metal box and can cost $150 is so bewildering to me. In all those $400 NAS builds, roughly 25% of the cost is just the case. The CPU might be only $25, and that requires an advanced fab, not just someone with a laser cutter.
If Nvidia did drop their gaming GPU lineup, it would be a huge re-shuffling in the market: AMD's market share would 10x over night, and it would open a very rare opportunity for minority (or brand-new?) players to get a foothold.
What happens then if the AI bubble crashes? Nvidia has given up their dominant position in the gaming market and made room for competitors to eat some (most?) of their pie, possibly even created an ultra-rare opportunity for a new competitor to pop up. That seems like a very short-sighted decision.
I think that we will instead see Nvidia abusing their dominant position to re-allocate DRAM away from gaming, as a sector-wide thing. They'll reduce gaming GPU production while simultaneously trying to prevent AMD or Intel from ramping up their own production.
It makes sense for them to retain their huge gaming GPU market share, because it's excellent insurance against an AI bust.
Yeah, sure, every tech company now acts like a craven monopolist hellbent on destroying everything that isn't corporate-driven AI computing, but not this time! This time will be different!
I guess time will tell, won't it? Your sarcastic remark does not bear any prophetic value.
They can act as monopolist as they want. They can try anything and cackle maniacally at their amazing business acumen all they want.
Turns out, total addressable market is not infinite. Turns out people don't want to spend on RAM as much as they would on a used car. How shocking! And I am completely sure that yet again the MBAs would be unprepared for these grand revelations, like they are, EVERY time.
Still, let us wait and see. Most of us are not in a rush to build a gaming machine or a workstation next month or else puppies will start dying.
I am pretty sure the same people now rubbing hands and believing they have found the eternal gold, will come back begging and pleading for our measly customer dollars before not too long.
What I don't really understand is why the big data centre operators destroy their old cards instead of selling them off. What are the downsides for them? Apart from the obvious, i.e. it would bring in money, would it not also drive down the cost for brand new cards? I.e. Nvidia can currently overcharge dramatically because there is such a shortage. If the data centre operators would dump large numbers of used cards on the market would that not increase supply and drive down cost?
Most data center GPUs don't have display outputs and some use exotic connectors (not PCIe x16 slots), making them worth next to nothing as conventional graphics cards.
You don't "write off" the full value, you fiddle the amortization so they go to zero accounting value exactly when you want them to. They're playing this exact game with datacentre GPUs right now.
You can still have a tax implication when you sell the fully depreciated item but in theory it should only be a benefit unless your company has a 100% marginal tax rate somehow.
Of course, it can cost more to store the goods and administer the sale than you recoup. And the manufacturer may do, or even require, a buyback to prevent the second-hand market undercutting their sales. Or you may be disinclined to provide cheap hardware to your competitors.
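As a minimal sketch of the "fiddle the amortization" point above, here's how a straight-line schedule changes with the assumed useful life; the purchase price and lifetimes are made-up illustrative figures, not anyone's real accounting:

```python
# Minimal sketch: straight-line depreciation of a hypothetical $30,000 GPU
# under two different assumed useful lives. All figures are illustrative.

def straight_line_schedule(cost: float, salvage: float, years: int) -> list[float]:
    """Book value at the end of each year under straight-line depreciation."""
    annual = (cost - salvage) / years
    return [round(cost - annual * (y + 1), 2) for y in range(years)]

if __name__ == "__main__":
    cost = 30_000.0   # hypothetical purchase price per accelerator
    salvage = 0.0     # assume no residual value on the books

    # A 3-year life writes the card down to zero quickly...
    print("3-year life:", straight_line_schedule(cost, salvage, 3))
    # ...while a 6-year life keeps book value (and reported profit) higher for longer.
    print("6-year life:", straight_line_schedule(cost, salvage, 6))
```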
If a $100B/year company closes a $1B/year division, they are doing something much worse than losing $1B/year: they are giving $1B in funding to a new competitor which can grow to threaten the main business.
I switched away from Nvidia around 2008 due to poor Linux support. Been on AMD ever since (now running the flagship model from a year ago, the 7900 XTX or whatever it's called).
Won't personally miss Nvidia, but we need competition in the space to keep prices 'reasonable' (although they haven't been reasonable for some years), and to push for further innovation.
I don't think they will. There is a reason why every GPU they make (gaming or not) supports CUDA. Future gamers are future CUDA developers. Taking that away would be an own goal.
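To illustrate the "future gamers are future CUDA developers" point: the same CUDA stack used in data centers is exposed on consumer GeForce cards. A small check, assuming PyTorch with CUDA support is installed (the library choice is mine, not the commenter's):

```python
# Quick check that the CUDA stack is usable on whatever consumer GPU is installed.
import torch

if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"CUDA device: {name} (compute capability {major}.{minor})")

    # The same kernels/libraries run here as on datacenter parts.
    a = torch.randn(1024, 1024, device="cuda")
    b = torch.randn(1024, 1024, device="cuda")
    c = a @ b  # matrix multiply dispatched to the GPU
    print("Matmul result shape:", tuple(c.shape))
else:
    print("No CUDA-capable GPU detected.")
```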
On graphics: there is a threshold where realistic graphics make the difference.
Not all games need to be that, but Ghost of Tsushima in GBA Pokemon style is not the same game at all. And is it badly designed? I also don't think so. Same for many VR games which make immersion meaningful in itself.
We can all come up with a litany of bad games, AAA or indie, but as long as there's a set of games fully pushing the envelope and bringing new things to the table, better hardware will be worth it IMHO.
I can't name one game in the last 5 years that has been "pushing the envelope" in a way that actually wowed me. And the ones that did wow me did it with art style, not the sheer number of polygons pushed to the screen.
VR, sure: you want a lot of frames on two screens, and that requires beefy hardware, so visual fidelity on the same GPU will be worse than on a monitor. But other than that, if anything, the graphical side of games has flatlined for me.
Also, putting the money literally anywhere else is going to have better results quality-wise. I want better stories and more complex, interesting systems, not a few more animated hairs.
PC gaming will be fine even without 8K 120fps raytracing. It will be fine even if limited to iGPUs. Maybe even better off if it means new titles are actually playable on an average new miniPC.
More realistically I guess we get an AMD/Intel duopoly looking quite similar instead.
It will probably be a bigger blow to people who want to run LLMs at home.
That doesn't seem very plausible; how many people are driven away from Counter-Strike or League of Legends because the graphics weren't as good as Cyberpunk or whatever?
There's a LOT of games that compete with AAA-massive-budget games in aggregate, like Dwarf Fortress, CS, League, Fortnite. People are still playing Arma 2, DayZ, Rust, etc. Rainbow Six: Siege still has adherents and even cash-payout tournaments. EVE Online, Ultima Online, Runescape, still goin'.
These games have like no advertising and are still moneymakers. Eve and UO are like 20 and 30 years old. Heck, Classic WoW!
CoD is currently 300GB on disk due to all the textures. I suspect a lot of players would be happy with a modest regression in fidelity if it means the game can run smoothly on affordable devices and leave some room for their other games too.
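Some back-of-the-envelope texture math shows how install sizes balloon; the texture counts and formats below are illustrative assumptions, not CoD's actual asset pipeline:

```python
# Back-of-the-envelope texture math (illustrative numbers only).
def texture_size_mib(width: int, height: int, bytes_per_pixel: float, mip_chain: bool = True) -> float:
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly 1/3 on top of the base level.
    total = base * (4 / 3 if mip_chain else 1)
    return total / (1024 ** 2)

# One uncompressed 4K RGBA texture:
print(f"4K RGBA: {texture_size_mib(4096, 4096, 4):.1f} MiB")
# The same texture block-compressed (BC7 is 1 byte per pixel):
print(f"4K BC7:  {texture_size_mib(4096, 4096, 1):.1f} MiB")
# Ten thousand such compressed textures is on the order of:
print(f"10k textures: {10_000 * texture_size_mib(4096, 4096, 1) / 1024:.0f} GiB")
```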
Huh? No? It means that the overall platform is already at a 'good enough' level. There can always be an improvement, but in terms of pure visuals we are already at a point where some studios choose simple representations (see some 2D platformers) as a stylistic choice.
Man, I remember playing UT GOTYE back in the 00s and the graphics blew us away when we fired it up and then Return to Castle Wolfenstein made my brother cry from the "realistic" zombies (on a CRT even!). It's amazing what you can take for granted when even a fraction of a modern card would have been called "photorealistic" back then.
But if this does happen it will be in my opinion the start of a slow death of the democratization of tech.
At best it means we're going to be relegated to last-gen tech, if even that, as this isn't a case of SAS vs SATA or U.2 vs M.2, but the very raw tech (chips).
I've heard good things about Moore Threads. Who knows, maybe the consumer GPU market is not a duopoly after all; Nvidia exiting the market could be a good thing longer term by introducing more competition.
My general impression is that US technology companies either treat competition from China seriously and actively engage, or Chinese tech companies will slowly but surely eat their lunch.
There are numerous examples: the recent bankruptcy of iRobot, the 3D printer market dominated by Bambu Lab, and the mini PC market where Chinese brands dominate.
Is there any path for Microsoft and NVIDIA to work together and resurrect some sort of transparent SLI layer for consumer workloads? It’d take the pressure off the high end of the market a little and also help old cards hold value for longer, which would be a boon if, for example, your entire economy happened to be balanced on top of a series of risky loans against that hardware.
If NVIDIA exits the market, there is still AMD, Intel and PowerVR (Imagination Technologies is back at making discrete PC GPUs, although currently only in China).
Is that due to some kind of issue with the architecture, or just a matter of software support?
In the latter case, I'd expect patches for AMD or Intel to become a priority pretty quickly. After all, they need their products to run on systems that customers can buy.
I don't understand why most people in this thread think that this would be such a big deal. It will not change the market in significant negative or positive ways. AMD has been at their heels for a couple of decades and is more competitive than ever; they will simply fill their shoes. Most game consoles have been AMD-centric for a long time regardless, they've always been fairly dominant in the mid range, and they have a longstanding reputation of having the best price/performance value for gamers.
Overall, I think that AMD is more focused and energetic than their competitors now. They are very close to overtaking Intel in their long CPU race, both in the datacenter and consumer segments, and Nvidia might be next in the coming 5 years, depending on how the AI bubble develops.
Why would AMD shareholders tolerate them engaging in the gaming PC market if Nvidia drops out? They might last a couple of years, but the writing will be on the wall for them to chase enterprise sales and abandon gamers. Especially when game console manufacturers would prefer you spend $25 a month for life instead of buying a $500 console every 8 years. There won't be an Xbox or PlayStation in your house before long.
Sure the gaming hardware landscape is likely to change, but it won’t be up to Nvidia’s decision. If it remains a viable business, it will thrive whether Nvidia is in it or not.
They’ve become a big name now with AI, but they were never the only game in town in their home markets. They had an edge on the high-end so their name had some prestige, but market share wise it was quite even. Even with AI, they have a temporary head start but I wouldn’t be surprised if they get crowded in the coming years, what they do is not magic.
Qualcomm, before they made all the chips they do today, ran a pretty popular and successful email client called Eudora.
Doing one thing well can lead to doing bigger things well.
More realistically, if the top end chips go towards the most demanding work, there might be more than enough lower grade silicon that can easily keep the gaming world going.
Plus, gamers rarely stop thinking in terms of gaming, and those insights helped develop GPUs into what they are today, and may have some more light to shine in the future. Where we see gaming and AI coming together, whether it's in completely and actually immersive worlds, etc, is pretty interesting.
NVIDIA would still have service contract obligations to fulfil, and would provide support for its existing products for a period of time.
Don’t worry about Nintendo. Their pockets are deep and they are creative enough to pivot. They would retool their stack to support another ARM chip, or another arch entirely.
What goes into a Nintendo console is not prime silicon. When it's time to design the next console, I am sure Nvidia will still be more than happy to give them a design that they have laying around somewhere in a drawer if it means they ship 100M units.
Then Intel and AMD carry on; tbh, having sewn up handhelds and consoles and made gaming on integrated graphics mainstream, many won't notice.
An AI bubble burst leaving loads of GPU laden datacenters is much more likely to hasten cloud gaming.
If you look at AMD's CPUs there's indications they do that. When Zen1/1+/2 came out they were priced below intel's products as they needed to rebuild mindshare with their promising new chips, from Zen3 onwards where they started building a performance lead in many categories as well as core count they jacked the prices up because they could demand it.
That way they will not only burn the most good will but will also get themselves entangled even more into the AI bubble - hopefully enough to go down with it.
Looks like a hit piece to trigger some people into dumping their $NVDA stock. They worked phrases like "abandon" and "AI Bubble" into the title/subtitle. The author's other articles look like clickbait crap: https://www.pcworld.com/author/jon-martindale
They probably won't. They'll just change things so their hardware becomes a subscription-style model rather than proper outright ownership by the purchaser, which is to a limited degree the case when it comes to their hardware drivers anyway.
Game graphics are still a high margin silicon business. Someone will do it.
Frankly, the graphics chops are plenty strong for a decade of excellent games. The big push in the next couple decades will probably be AI generated content to make games bigger and more detailed and more immersive
Most of the consumer market computes through their smartphones. The PC is a niche market now, and PC enthusiasts/gamers are a niche of a niche.
Any manufacturing capacity which NVIDIA or Micron devote to niche markets is capacity they can't use serving their most profitable market: enterprises and especially AI companies.
PCs are becoming terminals to cloud services, much like smartphones already are. Gaming PCs might still be a thing, but they'll be soldered together unexpandable black boxes. You want to run the latest games that go beyond your PC's meager capacity? Cloud stream them.
I know, I know. "Nothing is inevitable." But let's be real: one thing I've learned is that angry nerds can't change shit. Not when there's billions or trillions of dollars riding on the other side.
Gaming got me into coding. Messing with the Warcraft 3 world editor back when there were a lot of DotA clones. Good times. I think Blizzard had their own language, JASS, which was very Lua-like.
I was tempted to respond with an offhand comment about the size of the industry or similar, but what axe do you have to grind about PC gaming? You'd prefer folks go to the far more injurious mobile gaming space?
Really, I have been gaming since before even getting my Timex 2068 in the mid 80's, starting with Game & Watch handhelds, and I don't get the "build your aquarium" culture of many PC gamers nowadays.
It is so bad that it is almost impossible to buy a traditional desktop in regular computer stores; there are only fish tanks with rainbows on sale.
Unfortunately you’re right re:dating prospects but that’s mostly because game devs haven’t been able to reproduce the insane success of valorant at getting the better gender to want to play hardcore games.
I'm not sure it would matter. It doesn't seem that graphics are the limiting factor in games anymore. Plenty of popular games use variations on cartoon-style graphics, for example - Fortnite, Overwatch, Valorant, etc. Seems gameplay, creativity, and player community are more determining factors.
That said, things like improved environmental physics and NPC/enemy AI might enable new and novel game mechanics and creative game design. But that can come from AMD and others too.
Notably the games you listed are all f2p/esports games, and that does matter in terms of how much budget developers have to polish a realistic look vs ship a cartoon and call it the "art style".
I just upgraded to a 9070 XT to play ARC Raiders and it's absolutely a feast for the eyes, while also pioneering on several fronts, especially around the bot movement and intelligence.
If they do, it'll likely be part of an industry-wide push to kill off the home-built PC market. It's no secret that MS and others want the kind of ecosystem Apple has, and governments want more backdoor access to tech. And which manufacturer wouldn't want to eliminate partial upgrades/repairs? Imagine that the only PC you could buy one day has everything tightly integrated with no user serviceable or replaceable parts without a high-end soldering lab. Now, since it's impractical to build your own, they can raise the purchase price beyond the reach of most people, and the PC makers succeed in their rental-PC aspirations.
You're not thinking big enough. Their ultimate goal is gaming (or any computing really) available only in the cloud.
That may or may not be an INTERNAL NVIDIA goal, or even a goal for multiple companies; however, that is NOT how the situation will play out.
The ecosystem isn't closed. TSMC doesn't exist in a vacuum. They may be the most advanced; however, there are a few reasons this will never work:
1) older fabs can be built in a garage by a smart person (it's been done a few times, I'd link the articles, but I don't have them handy)
2) Indie devs exist and often provide better gaming experience than AAA developers.
3) Old hardware/consoles exist, and will continue to exist for many decades to come (my Atari 2600 still works, as an example, and it is older than I am)
Sure, they MAY attempt to grab the market in this way. The attempt will backfire. Companies will go under, possibly including team green if they actually do exit the gaming market (because let's be real, at least in the U.S. a full-blown depression is coming. When? No idea. However, yes, it's coming unless folks vote out the garbage.), and the player that doesn't give in, or possibly a Chinese player that has yet to enter the market, will take over.
Yeah, they want a return to the TV era where censors curtail content.
Everyone will own a presentation layer device. Anyone who can only afford the 1GB model can only get SNES quality visuals.
Snow Crash and Neuromancer have displaced the Bible as the cognitive framework for the tech rich.
I'm working on an app that generates and syncs keys 1:1 over local BT and then syncs them again to a home PC (if desired). The idea being to cut out internet middlemen and go back to something like IRC direct connect, something that also requires real-world "touch grass" effort to complicate things for greedy data collectors.
Testing now by sharing IP over Signal and then 1:1'ing over whatever app. Can just scaffold all new protocols on top of TCP/IP again.
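A minimal sketch of that "scaffold new protocols on top of TCP/IP" idea: a direct TCP link to a peer whose IP was shared out-of-band (e.g., over Signal), authenticated with a pre-shared key. The port, key handling, and protocol here are hypothetical illustrations, not the actual app:

```python
# Minimal sketch of a direct peer-to-peer TCP link, assuming the peer's IP was
# shared out-of-band (e.g., over Signal) and both sides hold a pre-shared key.
# Illustrative only -- a real app would use proper authenticated encryption.
import socket
import hmac
import hashlib

PSK = b"pre-shared-key-exchanged-in-person"  # hypothetical key from the BT sync step
PORT = 40123                                 # arbitrary port for the example

def serve() -> None:
    with socket.create_server(("0.0.0.0", PORT)) as srv:
        conn, addr = srv.accept()
        with conn:
            challenge = conn.recv(32)
            # Prove we hold the same key without sending it.
            conn.sendall(hmac.new(PSK, challenge, hashlib.sha256).digest())
            conn.sendall(b"hello from the home PC\n")

def connect(peer_ip: str) -> None:
    challenge = b"0" * 32  # a real client would use os.urandom(32)
    with socket.create_connection((peer_ip, PORT)) as sock:
        sock.sendall(challenge)
        tag = sock.recv(32)
        if hmac.compare_digest(tag, hmac.new(PSK, challenge, hashlib.sha256).digest()):
            print(sock.recv(1024).decode())
        else:
            print("peer failed authentication")
```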
See, I wrote that out but then I thought, “Nah, that’s too conspiracy for this crowd.” But lo! Yeah. Not excited about the emerging status quo.
Even bigger than that, it’s all a slow march to a sort of narcissistic feudalism.
This. Your home PC is just another appliance now. Welcome to your Copilot future.
I figured this out about 5 years ago. It's why each of my kids and my wife all have decent-spec desktop PCs, and half of us use Linux (I'll migrate the others later).
Maybe then the year of Linux (or OpenBSD?) on the desktop would finally arrive. Maybe anti-trust could get used. Maybe parts could get scrapped from data centres.
Interesting times they would be!
Stadia worked when conditions were good, and GeForce Now exists. No cheaters in multiplayer (though there are always new ones); it's a way to go. They're even doing a thing with cellphones as merely devices playing a full-screen video stream that you can interact with.
> Imagine that the only PC you could buy one day has everything tightly integrated with no user serviceable or replaceable parts without a high-end soldering lab.
So.. a smart phone?
If they do it, all gamers will boycott LLMs. Which would be a godsend. Decades trying to save power, moving to LEDs, trying to improve efficiency everywhere, and now... we are wasting terawatts on digital parrots.
You are vastly overestimating the standards of the median gamer.
I think China will then try to sell their own PC parts instead, their semiconductor industry is catching up so who knows in a decade.
But then the US will probably reply with tariffs on those PC parts (or even ban them!), which is slowly becoming the norm for US economic policy, and which won't reverse even after Trump.
US already banned Chinese EVs, even though, from what I've heard, they're excellent
There is definitely a part of me which feels like, with the increasing RAM prices and similar, it's hard for people to have a home lab.
What it also feels like to me is that there is now more friction in the already really competitive and high-friction business of building a cloud.
With increasing RAM prices, which (from my knowledge) will only decrease in 2027-2028 or when this bubble pops, it would be extremely expensive for a new cloud provider to enter this space.
When I mention cloud providers, I don't mean the trifecta of AWS, Azure, or GCP, but rather all the other providers who bought their own hardware, are co-locating it in a datacenter, and are selling services targeted at low/mid-range VPS/VDS servers.
I had previously thought about building a cloud, but in this economy and the current situation, I'd much rather wait.
The best bet right now for most people building a cloud / providing such services is probably white-labeling some other brand and providing services on top that make you special.
The servers are still rather cheap, but the mood I can see among providers right now is that they are willing to absorb the costs for some time so as not to create a frenzy (so they still have low prices), while cautiously watching the whole situation. If recent developments keep going this way, I wouldn't be surprised if server providers raise some prices, because the underlying hardware's RAM prices increased too.
Feel the same way here. Can't help but get the vibe that big tech wants to lock consumers out, eliminate the ability to have personal computing/self-hosted computing. Maybe in tandem with governments, not sure, but it's certainly appetizing to them from a profit perspective.
The end goal is the elimination of personal ownership over any tech. They want us to have to rent everything.
That might be a bit on the paranoid side. It could just be that it's far more profitable right now for companies to sell only to data centres. That way, they don't need to spend money on advertising, or share their revenues with third-party sellers.
I doubt that this would ever happen. But...
If it does, I think it would be a good thing.
The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother anymore to optimize for the low end and thus they end up gatekeeping games and excluding millions of devices because for recent games, a discrete GPU is required even for the lowest settings.
They're not targeting high-end PCs. They're targeting current-generation consoles, specifically the PS5 + 1080p. It just turns out that when you take those system requirements and put them on a PC, especially a PC with a 1440p or 2160p ultrawide, you end up needing pretty top-of-the-line stuff. Particularly if, as a PC gamer, you expect to run it at 90fps and not the 30-40 that is typical for consoles.
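To put rough numbers on that gap, here is a minimal sketch; the console target (1080p at 30fps) and the PC expectation (a 3440x1440 ultrawide at 90fps) are illustrative assumptions, not measurements:

```python
# Rough arithmetic behind the "console target vs. PC expectations" gap.
console_pixels = 1920 * 1080
console_fps = 30

pc_pixels = 3440 * 1440
pc_fps = 90

multiplier = (pc_pixels / console_pixels) * (pc_fps / console_fps)
print(f"Pixel ratio:      {pc_pixels / console_pixels:.2f}x")
print(f"Frame-rate ratio: {pc_fps / console_fps:.2f}x")
print(f"Raw throughput multiplier: {multiplier:.1f}x")  # roughly 7x the console workload
```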
Without disagreeing with the broad strokes of your comment, it feels like 4K should be considered standard for consoles nowadays - a very usable 4K HDR TV can be had for $150-500.
1440p and 2160p is a total waste of pixels, when 1080p is already at the level of human visual acuity. You can argue that 1440p is a genuine (slight) improvement for super crisp text, but not for a game. HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
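For what it's worth, whether 1080p actually reaches the acuity limit depends on screen size and viewing distance. A quick pixels-per-degree check, assuming a 27-inch 16:9 monitor viewed at 60 cm and the ~60 ppd figure usually cited for 20/20 vision (both are my assumptions, not numbers from the thread):

```python
# Pixels-per-degree check for the acuity claim (illustrative viewing setup).
import math

def pixels_per_degree(h_pixels: int, screen_width_cm: float, distance_cm: float) -> float:
    pixel_width = screen_width_cm / h_pixels
    # Angle subtended by one pixel, in degrees.
    deg_per_pixel = math.degrees(2 * math.atan(pixel_width / (2 * distance_cm)))
    return 1 / deg_per_pixel

# A 27" 16:9 monitor is ~59.8 cm wide; assume desktop viewing at ~60 cm.
print(f"1080p @ 27in, 60cm: {pixels_per_degree(1920, 59.8, 60):.0f} ppd")
print(f"1440p @ 27in, 60cm: {pixels_per_degree(2560, 59.8, 60):.0f} ppd")
print(f"2160p @ 27in, 60cm: {pixels_per_degree(3840, 59.8, 60):.0f} ppd")
```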
You wish. Games will just be published cloud-only and you can only play them via thin clients.
It's pretty consistently been shown that this just can't provide low-enough latency for gamers to be comfortable with it. Every attempt at providing this experience has failed. There are few games where this can even theoretically be viable.
The economics of it also have issues, as now you have to run a bunch more datacenters full of GPUs, with an inconsistent usage curve that leaves a bunch of them idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.
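A rough motion-to-photon budget makes the latency point concrete; every number below is an illustrative assumption rather than a measurement:

```python
# Rough motion-to-photon latency budget for cloud streaming vs. local rendering.
# All figures (in milliseconds) are illustrative assumptions.
local = {
    "input sampling": 4,
    "render (60 fps frame)": 17,
    "display scanout": 8,
}
cloud = {
    "input sampling": 4,
    "uplink to datacenter": 15,
    "render (60 fps frame)": 17,
    "encode": 5,
    "downlink + jitter buffer": 20,
    "decode": 5,
    "display scanout": 8,
}

for name, budget in (("local", local), ("cloud", cloud)):
    total = sum(budget.values())
    print(f"{name:>5}: {total} ms total")
```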
More like they wish. That would require globally good internet infrastructure, which is absolutely never happening.
This hurt my soul. Kudos.
True. Optimization is completely dead. Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.
Nowadays a game is only poorly optimized if it's literally unplayable or laggy, and you're forced to constantly upgrade your hardware with no discernible performance gain otherwise.
Crazy take. In the late 90s/early 00s your GPU could be obsolete 9 months after buying it. The "optimisation" you talk about was that the CPU in the PS4 generation was so weak, and tech was moving so fast, that any PC bought from 2015 onwards could easily brute-force its way past anything that had been built for that generation.
A lot of people who were good at optimizing games have aged out and/or 'got theirs' and retired early, or just got out of the demanding job and secured a better-paying job in a sector with more economic upside and less churn. On the other side there's an unending, almost exponential group of newcomers into the industry who believe the hype given by engine makers, who hide the true cost of optimal game making and sell on 'ease'.
That's not how it ACTUALLY worked. How it actually worked is that top video card manufacturers would make multi-million dollar bids to the devs of the three or four AAA games that were predicted to be best-sellers, in order to get the devs to optimize their rendering for whatever this year's top video adapter was going to be. And nobody really cared if it didn't run on your crappy old last-year's card, because everybody understood that the vast majority of games revenue comes from people who have just bought expensive new systems. (Inside experience, I lived it.)
I don't think it has ever been the case that this year's AAA games play well on last year's video cards.
> Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.
DOOM and Battlefield 6 are praised for being surprisingly well optimized for the graphics they offer, and some people bought these games for that reason alone. But I guess in the good old days good optimization would be the norm, not the exception.
This is such a funny take. I remember all throughout the 90s and 00s (and maybe even the 10s, I wasn't playing much by then) you often could play new games on acceptable settings with a 1-2 year old high-spec machine; in fact, to play at the highest settings you often needed ridiculously spec'ed machines. Now you can play the biggest titles (CP77, BG3, ...) on 5-10 year old hardware (not even top spec), with no or minimal performance/quality impact. I mean, I've been playing BG3 and CP77 on the highest settings on a PC that I bought 2 years ago used for $600 (BG3 I was playing when it had just come out).
I feel like Steam Deck support is making developers optimize again.
One wonders what would happen in an SHTF situation, or if someone stubs their toe on the demolition-charges switch at TSMC and all the TwinScans get minced.
Would there be a huge drive towards debloating software to run again on random old computers people find in cupboards?
Until we end up spending trillions recreating the fab capacity of TSMC, they don't have a full monopoly (yet).
Consoles and their install base set the target performance envelope. If your machine can't keep up with a 5 year old console then you should lower expectations.
And like, when have onboard GPUs ever been good? The fact that they're even feasible these days should be praised but you're imagining some past where devs left them behind.
> The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
They'll just move to remote rendering you'll have to subscribe to. Computers will stagnate as they are, and all new improvements will be reserved for the cloud providers. All hail our gracious overlords "donating" their compute time to the unwashed masses.
Hopefully AMD and Intel would still try. But I fear they'd probably follow Nvidia's lead.
Is remote rendering a thing? I would have imagined the lag would make something like that impractical.
> I think it would be a good thing.
This is an insane thing to say.
> Game and engine devs simply don't bother anymore to optimize for the low end
All games carefully consider the total addressable market. You can build a low end game that runs great on total ass garbage onboard GPU. Suffice to say these gamers are not an audience that spend a lot of money on games.
It’s totally fine and good to build premium content that requires premium hardware.
It’s also good to run on low-end hardware to increase the TAM. But there are limits. Building a modern game and targeting a 486 is a wee bit silly.
If Nvidia gamer GPUs disappear and devs were forced to build games that are capable of running on shit ass hardware the net benefit to gamers would be very minimal.
What would actually benefit gamers is making good hardware available at an affordable price!
Everything about your comment screams “tall poppy syndrome”. </rant>
> This is an insane thing to say.
I don't think it's insane. In that hypothetical case, it would be a slightly painful experience for some people that the top end is a bit curtailed for a few years while game developers learn to target other cards, hopefully in some more portable way. But feeling hard done by because your graphics hardware is stuck at 2025 levels for a bit is not that much of a hardship really, is it? In fact, if more time is spent optimising for non-premium cards, perhaps the premium card that you already have will work better than the next upgrade would have.
It's not inconceivable that the overall result is a better computing ecosystem in the long run. The open source space in particular, where Nvidia has long been problematic. Or maybe it'll be a multi-decade gaming winter, but unless gamers stop being willing to throw large amounts of money chasing the top end, someone will want that money even if Nvidia doesn't.
I wonder what Balatro does that wouldn't be possible on a 486.
I always chuckle when I see an entitled online rant from a gamer. Nothing against them, it's just humorous. In this one, we have hard-nosed defense of free market principles in the first part worthy of Reagan himself, followed by a Marxist appeal for someone (who?) to "make hardware available at an affordable price!".
I agree re "optimizations", but I don't think there should be compromises on quality (if set to max/ultra settings).
Gaming performance is so much more than hardware specs. Thinking that game devs optimizing their games on their own could fundamentally change the gaming experience is delusional.
And anyone who knows just a tiny bit of history of nvidia would know how much investment they have put into gaming and the technology they pioneered.
I haven't been on HN even 60 seconds this morning and I've already found a pro-monopoly take. Delightful.
I fail to see how my comment could be construed as being pro-monopoly.
There are a huge number of onboard GPUs being left out of even minimum requirements for most recent games. I'm just saying that maybe this situation could led game devs to finally consider such devices as legitimate targets and thus make their games playable on such devices. This is by no means a pro-monopolistic take.
I have a 9 year old gaming PC with an RX480 and it is only now starting to not be able to run certain games at all (recent ones that require ray tracing). It can play Cyberpunk and Starfield on low settings very acceptably.
I don’t think they can.
NVIDIA, like everyone else on a bleeding edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.
The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
Nvidia doesn't share dies between their high-end datacenter products like B200 and consumer products. The high-end consumer dies have many more SMs than a corresponding datacenter die. Each has functionality that the other does not within an SM/TPC, nevermind the very different fabric and memory subsystem (with much higher bandwidth/SM on the datacenter parts). They run at very different clock frequencies. It just wouldn't make sense to share the dies under these constraints, especially when GPUs already present a fairly obvious yield recovery strategy.
You can't turn a GB200 into a GB202 (which I assume is what you meant since GP102 is from 2016), they are completely different designs. That kind of salvage happens between variants of the same design, for example the RTX Pro 6000 and RTX 5090 both use GB202 in different configurations, and chips which don't make the cut for the former get used for the latter.
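The binning discussion above rests on yield dropping fast with die area. A minimal sketch using a simple Poisson defect model, with an assumed defect density (not a figure from any foundry):

```python
# Simple Poisson yield model: fraction of dies with zero defects falls off
# quickly with die area. The defect density is an assumed illustrative value.
import math

def poisson_yield(area_mm2: float, defects_per_cm2: float) -> float:
    area_cm2 = area_mm2 / 100.0
    return math.exp(-area_cm2 * defects_per_cm2)

D0 = 0.1  # assumed defects per cm^2 on a mature node

for area in (100, 300, 600, 800):  # die areas in mm^2, small chip up to reticle-sized GPU
    print(f"{area:>3} mm^2 die: {poisson_yield(area, D0):.0%} defect-free")
```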
> So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.
B200 doesn't have any graphics capabilities. The datacenter chips don't have any graphical units, it's just wasted die space.
As long as gaming GPUs compete for the same wafer space that AI chips use, the AI chips will be far more profitable to NVIDIA.
Why don't they sell these to datacenters as well, which could run a "low core section" with reduced power and cooling?
Well, the good thing for the NVIDIA AI business is that most of the chips can sit unused in warehouses and they still get rich. 6 million H100s sold, but infrastructure (water-cooled DCs) for only a third of them exists in the world.
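Quick power arithmetic behind that claim; the 6 million figure is the parent comment's, 700 W is the published H100 SXM TDP, and the facility overhead factor is an assumption:

```python
# Quick power arithmetic behind the "sold more than can be powered/cooled" point.
h100_count = 6_000_000   # figure from the parent comment
tdp_watts = 700          # H100 SXM TDP
pue = 1.2                # assumed datacenter overhead (cooling, power delivery)

it_load_gw = h100_count * tdp_watts / 1e9
facility_gw = it_load_gw * pue
print(f"Accelerator load: {it_load_gw:.1f} GW")
print(f"With facility overhead: {facility_gw:.1f} GW")  # several large power plants' worth
```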
AMD will be very happy when they do. They already make great cards; I'm currently running an RX 7800 XT (or something like that), and it's amazing. Linux support is great too.
I got an.. AMD (even today I still almost say “ATI” every time) RX6600 XT I think, a couple years ago? It’s been great. I switched over to Linux back in the spring and yes the compatibility has been fine and caused no issues. Still amazed I can run “AAA” games, published by Microsoft even, under Linux.
My very gaming experienced and data oriented 13 year old wants to switch from Nvidia to AMD. I don’t understand all his reasons/numbers but I suppose that’s as good an endorsement as any for AMDs GPUs.
It's similar with Nvidia: you've got to consider which partner companies AMD likes working with. AMD/Nvidia design chips, contract TSMC to make them, then sell the chips to the likes of ASUS/MSI/Gigabyte/etc. to put on the cards the consumer buys. The other market AMD serves is Sony/MS for their consoles, and I'd argue those are a major motivator driving Radeon development, since they pay up-front for custom APU chips, and there's synergy there with Zen and, more recently, the AI demand. Ever since ATI bought up the company (ArtX) that made the GameCube GPU, it seems to me that the PC side keeps the motor running in between console contracts as far as gaming goes; given their low market share, they definitely don't seem to prioritize it or depend on it to thrive.
RX 7900gre, can confirm as much.
Wow, yeah, I picked up one of these a few months before the new generation came out for $350. Everything shot up after that.
My son is using that card today, and I'm amazed at everything it can still power. I had a 5080, and just comparing a few games, I found that if he uses Super Resolution correctly, he can set the other game settings the same as mine and his frame rate isn't far off (for things like Fortnite, not Cyberpunk 2077).
There are many caveats there, of course. AMD's biggest problem is in the drivers/implementation for that card. Unlike NVidia's similar technology, it requires setting the game at a lower resolution which it then "fixes" and it tends to produce artifacts depending on the game/how high those settings go. It's a lot harder to juggle the settings between the driver and the game than it should be.
1 reply →
AMD will certainly be very happy to raise prices significantly when they have a de facto monopoly over the market segment, alright.
I keep hearing this, and yet history has shown, time and again, that an overly greedy monopolist ends up undermining its own monopoly.
Don't get too worried. People still can and do vote with their wallets. Another check on greedy capitalists is the fact that the economy is not doing great either.
They cannot increase prices too much.
I also predict that the DDR5 RAM price hikes will not last until 2027 or even 2028 as many others think. I give it maximum one year, I'd even think the prices will start slightly coming down during summer 2026.
Reading about and understanding the economy is neat and all, but in the modern age some people forget that the total addressable market is not infinite and that regular customers have relatively tight budgets.
1 reply →
If it’s too expensive, I will play on my phone or my macbook instead of a gaming pc. They can’t increase the prices too much.
It would be great if more GPU competition would enter the field instead of less. The current duopoly is pretty boring and stagnant, with prices high and each company sorta-kinda doing the same thing and milking their market.
I'm kind of nostalgic for the Golden Age of graphics chip manufacturers 25 years ago, where we still had NVIDIA and ATI, but also 3DFX, S3, Matrox, PowerVR, and even smaller players, all doing their own thing and there were so many options.
there's intel and they're not bad
pretty sure i heard that they were stepping down from doing discrete graphics in the next year or two
1 reply →
We'd need our government to actually enforce antitrust laws that have been on the books for about a century. Good luck.
This is just DRAM hysteria spiraling out to other kinds of hardware, will age like fine milk just like the rest of the "gaming PC market will never be the same" stuff. Nvidia has Amazon, Google, and others trying to compete with them in the data center. No one is seriously trying to beat their gaming chips. Wouldn't make any sense to give it up.
It's not related to the DRAM shortage. Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI and there was controversy years before that about most "gaming" GPUs going to crypto miners. They won't exit the gaming market but from a shareholder perspective it does look like a good idea.
> Gaming dropped to ~10% of Nvidia's revenue a year or two ago due to AI
Well, actually it's that the AI business made NVidia 10x bigger. NVidia now has a market cap of $4.4 trillion. That's six times bigger than General Motors, bigger than Apple, and the largest market cap in the world. For a GPU maker.
1 reply →
yet another reason to not listen to your shareholders.
if it were up to them, cuda would be a money losing initiative that was killed in 2009
2 replies →
Took what, four years for PC cases to get back to reasonable prices after COVID? And that’s a relatively low-tech field that (therefore) admits new entrants. I don’t know, I’m not feeling much optimism right now (haven’t at any point after the crypto boom), perhaps because I’ve always leaned towards stocking up on (main) RAM as a cheap way to improve a PC’s performance.
Huh? The only thing I noticed during COVID was a few GPUs and some HDDs getting insanely expensive. Surprise surprise, not even 18 months later (I think more like 14 on my local market), the sellers finally got it through their thick skulls that the economy is not what you see in simulations and that 95% of people absolutely will not pay ~$4000 for an RTX 3090 / 4090. Similar story for the 18TB HDDs.
I am not saying you are wrong but here in Eastern Europe, while we did suffer the price hikes (and are suffering those of the DDR5 RAM now as well), the impact was minimal. People just holed up, said "the market's crazy right now, let's wait it out", and shrugged. And lo and behold, successfully wait it out they did.
As I mentioned in another comment in this thread, I highly doubt high RAM prices will survive even into 2027. Sure, a good number of stores and suppliers will try to hold on to the prices with a death grip... but many stores, vendors, and suppliers hate stale stock. And some other new tech will start coming out. They won't be able to tolerate shelves full of merchandise at prices people don't want to buy at.
They absolutely _will_ budge.
I predict that by July/August 2026 some price decreases will start. They are likely to be small -- no more than 15% IMO -- but they will start.
The current craze of "let's only produce and sell to the big boys" has happened before, is happening now, and will happen again. I and many others don't believe the hysterical "the market will never be the same again" narrative either.
The fact that a case is literally a sheet-metal box and can cost $150 is so bewildering to me. In all those $400 NAS builds, roughly 25% of the cost is just the case, while the CPU might be only $25, and that's the part that requires an advanced fab, not just someone with a laser cutter.
If Nvidia did drop their gaming GPU lineup, it would be a huge re-shuffling in the market: AMD's market share would 10x over night, and it would open a very rare opportunity for minority (or brand-new?) players to get a foothold.
What happens then if the AI bubble crashes? Nvidia has given up their dominant position in the gaming market and made room for competitors to eat some (most?) of their pie, possibly even created an ultra-rare opportunity for a new competitor to pop up. That seems like a very short-sighted decision.
I think that we will instead see Nvidia abusing their dominant position to re-allocate DRAM away from gaming, as a sector-wide thing. They'll reduce gaming GPU production while simultaneously trying to prevent AMD or Intel from ramping up their own production.
It makes sense for them to retain their huge gaming GPU market share, because it's excellent insurance against an AI bust.
Yeah, sure, every tech company now acts like a craven monopolist hellbent on destroying everything that isn't corporate-driven AI computing, but not this time! This time will be different!
I guess time will tell, won't it? Your sarcastic remark doesn't carry any prophetic value.
They can act as monopolist as they want. They can try anything and cackle maniacally at their amazing business acumen all they want.
Turns out the total addressable market is not infinite. Turns out people don't want to spend as much on RAM as they would on a used car. How shocking! And I am completely sure that yet again the MBAs will be unprepared for these grand revelations, like they are EVERY time.
Still, let us wait and see. Most of us are not in a rush to build a gaming machine or a workstation next month or else puppies will start dying.
I am pretty sure the same people now rubbing their hands and believing they have found eternal gold will come back begging and pleading for our measly customer dollars before too long.
What I don't really understand is why the big data centre operators destroy their old cards instead of selling them off. What are the downsides for them? Apart from the obvious, i.e. it would bring in money, would it not also drive down the cost for brand new cards? I.e. Nvidia can currently overcharge dramatically because there is such a shortage. If the data centre operators would dump large numbers of used cards on the market would that not increase supply and drive down cost?
Most data center GPUs don't have display outputs and some use exotic connectors (not PCIe x16 slots), making them worth next to nothing as conventional graphics cards.
Who is buying the old cards? They can't be used for gaming, if there was money to be made I think they would be doing it.
It's just easier to write off the full value as e-waste than to try to turn a profit selling a meager amount of used hardware.
You don't "write off" the full value, you fiddle the amortization so they go to zero accounting value exactly when you want them to. They're playing this exact game with datacentre GPUs right now.
You can still have a tax implication when you sell the fully depreciated item but in theory it should only be a benefit unless your company has a 100% marginal tax rate somehow.
Of course, it can cost more to store the goods and administer the sale than you recoup. And the manufacturer may offer, or even require, a buyback to prevent the second-hand market from undercutting their sales. Or you may be disinclined to provide cheap hardware to your competitors.
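As a rough illustration of the depreciation point above (a toy sketch with invented numbers, not accounting or tax advice): once an asset is fully depreciated its book value is zero, so the entire sale price shows up as a taxable gain, which is still a net positive unless the marginal tax rate is somehow 100%.

    # Toy straight-line depreciation example -- all figures assumed for illustration.
    purchase_price = 30_000.0   # assumed cost of one accelerator
    useful_life_years = 3       # assumed depreciation schedule
    salvage_sale_price = 2_000.0
    marginal_tax_rate = 0.25    # assumed

    annual_depreciation = purchase_price / useful_life_years
    book_value_after_life = purchase_price - annual_depreciation * useful_life_years  # 0.0

    # Selling a fully depreciated asset: the whole sale price is a taxable gain.
    taxable_gain = salvage_sale_price - book_value_after_life
    tax_owed = taxable_gain * marginal_tax_rate
    net_cash = salvage_sale_price - tax_owed

    print(f"book value after {useful_life_years} years: {book_value_after_life:.2f}")
    print(f"net cash from selling: {net_cash:.2f}  (positive unless the tax rate is 100%)")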
If a $100B/year company closes a $1B/year division, they are doing something much worse than losing $1B/year: they are giving $1B in funding to a new competitor which can grow to threaten the major part.
I've switched away from Nvidia around 2008 due to poor Linux support. Been on AMD ever since (now running the flagship model from a year ago, the 7900xtx or whatever it's called).
Won't personally miss Nvidia, but we need competition in the space to keep prices 'reasonable' (although they haven't been reasonable for some years), and to push for further innovation.
I don't think they will. There is a reason why every GPU they make (gaming or not) supports CUDA. Future gamers are future CUDA developers. Taking that away would be an own goal.
It would leave a vacuum for Chinese companies to grab the whole market share.
It means people get to enjoy more indie games with good designs, instead of having FOMO for cool graphics without substance.
On graphics: there is a threshold where realistic graphics make the difference.
Not all games need to be that, but Ghost of Tsushima in GBA Pokemon style is not the same game at all. And is it badly designed? I don't think so either. Same for many VR games, which make immersion meaningful in itself.
We can all come up with a litany of bad games, AAA or indie, but as long as there's a set of games fully pushing the envelope and bringing new things to the table, better hardware will be worth it IMHO.
I can't name one game in the last five years that has been "pushing the envelope" in a way that actually wowed me. And the ones that did, did it through art style, not the sheer number of polygons pushed to the screen.
VR, sure: you want a lot of frames on two screens, which requires beefy hardware, so visual fidelity on the same GPU will be worse than on a flat screen. Other than that, though, the graphical side of games has flatlined for me.
Also, putting the money literally anywhere else is going to have better results game-quality-wise. I want better stories and more complex/interesting systems, not a few more animated hairs.
3 replies →
Sure, but would Ghost of Tsushima be any less immersive with PS4 graphics? Even max PS3 graphics?
5 replies →
Daikatana in GBC style turned into a good game, lol.
It means lots of people will give up the hobby.
Let's be real, the twitch FPS CoD players aren't going to give that up and play a boring life simulator.
This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
PC gaming will be fine even without 8K 120fps raytracing. It will be fine even if limited to iGPUs. Maybe even better off if it means new titles are actually playable on an average new miniPC. More realistically I guess we get an AMD/Intel duopoly looking quite similar instead.
It will probably be a bigger blow to people who want to run LLMs at home.
That doesn't seem very plausible. How many people are driven away from Counter-Strike or League of Legends because the graphics aren't as good as Cyberpunk or whatever?
There's a LOT of games that compete with AAA massive-budget games in aggregate, like Dwarf Fortress, CS, League, Fortnite. People are still playing Arma 2, DayZ, Rust, etc. Rainbow Six: Siege still has adherents and even cash-payout tournaments. EVE Online, Ultima Online, RuneScape: still goin'.
These games have like no advertising and are still moneymakers. Eve and UO are like 20 and 30 years old. Heck, Classic WoW!
5 replies →
CoD is currently 300GB on disk due to all the textures. I suspect a lot of players would be happy with a modest regression in fidelity if it means the game can run smoothly on affordable devices and leave some room for their other games too.
Most CoD players are on console or mobile, not PC
Huh? No? It means that the overall platform is already at a "good enough" level. There can always be improvement, but in terms of pure visuals we are already at a point where some studios choose simple representations (see some 2D platformers) as a stylistic choice.
It's gonna be ok.
10 replies →
Cod devs aren't stupid. They will design a game for the hardware their target market can get their hands on.
Let's be real, CoD only appeals to a small community relative to the whole planet.
8 replies →
>It means lots of people will give up the hobby.
Oh, we can only hope!
>This has the potential to harm a lot of businesses from hardware to software companies, and change the lives of millions of people.
Including millions of gamers, but for the better.
10 replies →
Man, I remember playing UT GOTYE back in the 00s and the graphics blew us away when we fired it up and then Return to Castle Wolfenstein made my brother cry from the "realistic" zombies (on a CRT even!). It's amazing what you can take for granted when even a fraction of a modern card would have been called "photorealistic" back then.
What about Radeon cards, or consoles?
It remains to be seen to be fair.
But if this does happen, it will in my opinion be the start of a slow death for the democratization of tech.
At best it means we're going to be relegated to last-generation tech, if even that, since this isn't a case of SAS vs SATA or U.2 vs M.2, but the raw tech itself (the chips).
I've heard good things about Moore Threads. Who knows, maybe the consumer GPU market is not a duopoly after all, Nvidia exiting the market would be a good thing longer term by introducing more competitions.
My general impression is that US technology companies must either take competition from China seriously and actively engage, or Chinese tech companies will slowly but surely eat their lunch.
There are numerous examples: the recent bankruptcy of iRobot, the 3D printer market dominated by Bambu Labs, the mini PC market where Chinese brands dominate.
Is there any path for Microsoft and NVIDIA to work together and resurrect some sort of transparent SLI layer for consumer workloads? It’d take the pressure off the high end of the market a little and also help old cards hold value for longer, which would be a boon if, for example, your entire economy happened to be balanced on top of a series of risky loans against that hardware.
If NVIDIA exits the market, there is still AMD, Intel and PowerVR (Imagination Technologies is back at making discrete PC GPUs, although currently only in China).
Unfortunately none of those are any use for video work.
Is that due to some kind of issue with the architecture, or just a matter of software support?
In the latter case, I'd expect patches for AMD or Intel to become a priority pretty quickly. After all, they need their products to run on systems that customers can buy.
5 replies →
I don't understand why most people in this thread think this would be such a big deal. It will not change the market in significantly negative or positive ways. AMD has been at their heels for a couple of decades and is more competitive than ever; they will simply fill their shoes. Most game consoles have been AMD-centric for a long time anyway, AMD has always been fairly dominant in the mid-range, and they have a longstanding reputation for the best price/performance value for gamers.
Overall, I think that AMD is more focused and energetic than their competitors now. They are very close to overtaking Intel in their long CPU race, both in the datacenter and consumer segments, and Nvidia might be next in the coming 5 years, depending on how the AI bubble develops.
Why would AMD shareholders tolerate them staying in the gaming PC market if Nvidia drops out? They might last a couple of years, but the writing will be on the wall for them to chase enterprise sales and abandon gamers. Especially when game console manufacturers would prefer you spend $25 a month for life instead of buying a $500 console every 8 years. There won't be an Xbox or PlayStation in your house before long.
Sure the gaming hardware landscape is likely to change, but it won’t be up to Nvidia’s decision. If it remains a viable business, it will thrive whether Nvidia is in it or not.
They’ve become a big name now with AI, but they were never the only game in town in their home markets. They had an edge on the high-end so their name had some prestige, but market share wise it was quite even. Even with AI, they have a temporary head start but I wouldn’t be surprised if they get crowded in the coming years, what they do is not magic.
Is it better to go short on them or buy AMD?
Betting on AMD's continued success in CPUs is far safer than Nvidia's demise.
AMD would do the same thing as Nvidia but $50 cheaper.
It wouldn't be unheard of.
Before Qualcomm made all the chips they do today, they maintained a pretty popular and successful email client called Eudora.
Doing one thing well can lead to doing bigger things well.
More realistically, if the top end chips go towards the most demanding work, there might be more than enough lower grade silicon that can easily keep the gaming world going.
Plus, gamers rarely stop thinking in terms of gaming, and those insights helped develop GPUs into what they are today and may have some more light to shine in the future. Where gaming and AI come together, whether in fully immersive worlds or elsewhere, is pretty interesting.
Update: Adding https://en.wikipedia.org/wiki/Eudora_(email_client)
Mac Eudora was the best email client ever. If it had got UTF8 support I'd probably still be running it in an emulator.
I just learned today that there has been some efforts underway: https://hermes.cx/
I had completely forgotten about the existence of Eudora. Thanks friend, that lead me down a mental rabbit hole.
Glad you enjoyed it. Steve Wozniak's review of audits was fun to read.
I'm also curious what this could mean for Nintendo.
NVIDIA would still have service contract obligations to fulfil, and would provide support for its existing products for a period of time.
Don’t worry about Nintendo. Their pockets are deep and they are creative enough to pivot. They would retool their stack to support another ARM chip, or another arch entirely.
What goes into a Nintendo console is not prime silicon. When it's time to design the next console, I am sure Nvidia will still be more than happy to give them a design that they have laying around somewhere in a drawer if it means they ship 100M units.
Then Intel and AMD carry on. TBH, having sewn up handhelds and consoles and made gaming on integrated graphics mainstream, many won't notice. An AI bubble burst leaving loads of GPU-laden datacenters is much more likely to hasten cloud gaming.
I’ve always been an AMD customer because I’ve despised Nvidia’s business practices for 10+ years.
It would still suck if they left the market, because who does AMD have left to compete with? Intel? LOL
Increased prices for everyone. Lovely. I can’t despise AI enough.
> I’ve always been an AMD customer because I’ve despised Nvidia’s business practices for 10+ years.
I am 100% sure AMD would have done the exact same thing as NVIDIA does right now, given the chance.
Are you saying they wouldn't have milked the market to the last drop? Do you really believe it?
If you look at AMD's CPUs, there are indications they would. When Zen 1/1+/2 came out, they were priced below Intel's products because AMD needed to rebuild mindshare with their promising new chips; from Zen 3 onwards, once they started building a performance lead in many categories as well as in core count, they jacked prices up because they could command it.
Of course they would have milked us that’s why I want Nvidia to stick around, to keep AMD in check.
Then Intel and AMD take what NVIDIA won’t.
Some of the pro-monopoly takes in this thread are mindblowing. We get precisely what we deserve.
[dead]
We can really hope they do it and fast!
That way they will not only burn the most goodwill but will also get themselves entangled even more deeply in the AI bubble, hopefully enough to go down with it.
the pc gaming market is a hobbyist niche compared to the ongoing infrastructure projects.
i predict that the "pc" is going to be slowly but surely eaten bottom-up by increasingly powerful SoCs.
From the article: "(NVidia) AI data center revenue reached $51.2 billion versus just $4.3 billion from gaming in Q3 2025."
Moore Threads in China just announced a new GPU.[1] Announced, not shipped.
[1] https://wccftech.com/moore-threads-lushan-gaming-huashan-ai-...
Looks like a hit piece meant to trigger some people into dumping their $NVDA stock. They worked phrases like "abandon" and "AI bubble" into the title/subtitle. The author's other articles look like clickbait crap: https://www.pcworld.com/author/jon-martindale
They probably won't. They'll just change things so their hardware becomes a subscription-style model rather than proper outright ownership by the purchaser, which is to a limited degree the case when it comes to their hardware drivers anyway.
Fuck this future.
More children born?
"If the AI bubble doesn't burst" is carrying an awful lot of water there…
Game graphics are still a high margin silicon business. Someone will do it.
Frankly, today's graphics chops are plenty strong for a decade of excellent games. The big push over the next couple of decades will probably be AI-generated content to make games bigger, more detailed, and more immersive.
Look.
Most of the consumer market computes through their smartphones. The PC is a niche market now, and PC enthusiasts/gamers are a niche of a niche.
Any manufacturing capacity which NVIDIA or Micron devote to niche markets is capacity they can't use serving their most profitable market: enterprises and especially AI companies.
PCs are becoming terminals to cloud services, much like smartphones already are. Gaming PCs might still be a thing, but they'll be soldered together unexpandable black boxes. You want to run the latest games that go beyond your PC's meager capacity? Cloud stream them.
I know, I know. "Nothing is inevitable." But let's be real: one thing I've learned is that angry nerds can't change shit. Not when there's billions or trillions of dollars riding on the other side.
Shrug and buy the next best thing?
[flagged]
Gaming got me into coding: messing with the Warcraft 3 World Editor back when there were a lot of DotA clones. Good times. I think Blizzard had their own language, JASS, that was very Lua-like.
Such nice memories. The same happened to me, but I was writing macros and hotkeys for Ragnarok Online (in Pascal/Delphi and VB6).
I was tempted to respond with an offhand comment about the size of the industry or similar, but what axe do you have to grind about PC gaming? You'd prefer folks go to the far more injurious mobile gaming space?
>You'd prefer folks go to the far more injurious mobile gaming space?
No, I prefer them touching grass and talking to some people, or getting a less addictive and time-wasting hobby.
2 replies →
Really, I have been gaming since before even getting my Timex 2068 in the mid '80s, starting with Game & Watch handhelds, and I don't get the "build your aquarium" culture of many PC gamers nowadays.
It is so bad that it is almost impossible to buy a traditional desktop in regular computer stores; there are only fish tanks with rainbows on sale.
Luckily, what's valuable and what's not is not on you to judge.
Unfortunately you’re right re:dating prospects but that’s mostly because game devs haven’t been able to reproduce the insane success of valorant at getting the better gender to want to play hardcore games.
I'm not sure it would matter. It doesn't seem that graphics are the limiting factor in games anymore. Plenty of popular games use variations on cartoon-style graphics, for example Fortnite, Overwatch, Valorant, etc. Seems gameplay, creativity, and player community are more determining factors.
That said, things like improved environmental physics and NPC/enemy AI might enable new and novel game mechanics and creative game design. But that can come from AMD and others too.
Notably the games you listed are all f2p/esports games, and that does matter in terms of how much budget developers have to polish a realistic look vs ship a cartoon and call it the "art style".
I just upgraded to a 9070 XT to play ARC Raiders and it's absolutely a feast for the eyes, while also pioneering on several fronts, especially around bot movement and intelligence.
> It doesn't seem that graphics are the limiting factor in games anymore.
Have you seen the GTA VI trailer?