This article goes more into the technical analysis of the stock rather than the underlying business fundamentals that would lead to a stock dump.
My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.
The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used, so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years, and we're already seeing this as Google and others are extending their depreciation of GPUs to something like 5-7 years.
I hear your argument, but short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon. Of course I could easily be wrong, but regardless, I think the most predictable cause of a drop in the NVIDIA price would be that the CHIPS Act and recent decisions by the CCP lead a Chinese firm to bring to market a CUDA-compatible and reliable GPU at a fraction of the cost. It should be remembered that NVIDIA's /current/ value is based on their being locked out of their second-largest market (China), with no investor expectation of that changing in the future. Given the current geopolitical landscape, in the hypothetical case where a Chinese firm markets such a chip, we should expect that US firms would be prohibited from purchasing them, while it's less clear that Europeans or Saudis would be. Even so, if NVIDIA were not to lower their prices at all, US firms would be at a tremendous cost disadvantage, while their competitors would no longer have one with respect to compute.
All hypothetical, of course, but to me that's the most convincing bear case I've heard for NVIDIA.
I suspect major algorithmic breakthroughs would accelerate the demand for GPUs instead of making it fall off, since the cost to apply LLMs would go down.
People will want more GPUs, but will they be able to fund them? At what point does the venture capital and the loans run out? People will not keep pouring hundreds of billions into this if the returns don't start coming.
Doesn't even necessarily need to be CUDA compatible... there's OpenCL and Vulkan as well, and China will likely throw enough resources at the problem to bring various libraries into closer alignment and ease use and development.
I do think China is still 3-5 years from being really competitive, but even if they hit 40-50% of NVidia's performance then, depending on pricing and energy costs, they could still make significant inroads, even with legal pressure/bans, etc.
NVIDIA stock tanked in 2025 when people learned that Google used TPUs to train Gemini, something everyone in the community has known since at least 2021. So I think it's very likely that NVIDIA stock could crash for non-rational reasons.
Google did not use TPUs for literally every bit of compute that led to Gemini. GCP has millions of high end Nvidia GPUs and programming for them is an order of magnitude easier, even for googlers.
Any claim from Google that all of Gemini (including previous experiments) was trained entirely on TPUs would be false. What they are truthfully saying is that the final training run was done entirely on TPUs. The market shouldn't react heavily to this, but should instead react positively to the fact that Google is now finally selling TPUs externally and their fab yields are better than expected.
I really don't understand the argument that nvidia GPUs only work for 1-3 years. I am currently using A100s and H100s every day. Those aren't exactly new anymore.
It’s not that they don’t work. It’s how businesses handle hardware.
I worked at a few data centers on and off in my career. I got lots of hardware for free or on the cheap simply because the hardware was considered “EOL” after about 3 years, often when support contracts with the vendor ends.
There are a few things to consider.
Hardware that ages produces more errors, and those errors cost, one way or another.
Rack space is limited. A perfectly fine machine that consumes 2x the power for half the output costs you money. It's cheaper to upgrade a perfectly fine working system simply because the replacement performs better per watt in the same space.
Lastly, there are tax implications in buying new hardware that can often favor replacement.
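The perf-per-watt point can be made concrete with a back-of-envelope sketch. Every number here is made up for illustration: when the rack's power budget is the binding constraint, doubling performance per watt doubles the output of the same rack, which is what justifies retiring hardware that still works.

```python
# All numbers are hypothetical, chosen only to illustrate the rack-economics argument.
RACK_POWER_KW = 20.0        # fixed power budget per rack
old_perf_per_kw = 1.0       # normalized throughput per kW, older generation
new_perf_per_kw = 2.0       # newer generation at an assumed 2x perf/watt

old_throughput = RACK_POWER_KW * old_perf_per_kw
new_throughput = RACK_POWER_KW * new_perf_per_kw

# Same rack, same power bill, twice the output:
print(new_throughput / old_throughput)  # -> 2.0
```

The same fixed rack and electricity bill now produce twice the work, so the "perfectly fine" old box is the expensive option.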
The common factoid raised in financial reports is that GPUs used in model training degrade thermally due to their sustained high utilization. The GPUs ostensibly fail. I have heard anecdotal reports of GPUs used for cryptocurrency mining showing similar wear patterns.
I have not seen hard data, so this could be an oft-repeated, but false fact.
1-3 years is too short, but they aren't making new A100s anymore. There are 8 in a server, and when one goes bad, what do you do? You won't be able to renew a support contract. If you want to DIY, you eventually have to start consolidating pick-and-pulls. Maybe the vendors will buy them back from people who want to upgrade and resell them. This is the issue we are seeing with A100s, and we are trying to see what our vendor will offer for support.
(1) We simply don't know what the useful life is going to be, because AI-focused GPUs used for training and inference are such a new development.
(2) Warranties and service. Most enterprise hardware has service contracts tied to purchases. I haven't seen anything publicly disclosed about what these contracts look like, but the speculation is that they are much more aggressive (3 years or less) than typical enterprise hardware contracts (Dell, HP, etc.). Once hardware ages past those contracts, extended support typically gets really pricey.
(3) Power efficiency. If new GPUs are more power efficient, the energy savings could be huge and could justify upgrades.
If power is the bottleneck, it may make business sense to rotate to a GPU that better utilizes the same power if the newer generation gives you a significant advantage.
From an accounting standpoint, it probably makes sense to have their depreciation be 3 years. But yeah, my understanding is that either they have long service lives, or the customers sell them back to the distributor so they can buy the latest and greatest. (The distributor would sell them as refurbished)
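The accounting effect of stretching a depreciation schedule can be sketched in a few lines. The fleet cost and the 3-vs-6-year schedules here are assumptions for illustration, not any company's actual figures:

```python
def annual_straight_line_depreciation(cost: float, useful_life_years: float) -> float:
    """Yearly depreciation expense under a simple straight-line schedule."""
    return cost / useful_life_years

fleet_cost = 30e9  # hypothetical $30B of GPUs on the books

expense_3yr = annual_straight_line_depreciation(fleet_cost, 3)  # $10B/yr
expense_6yr = annual_straight_line_depreciation(fleet_cost, 6)  # $5B/yr

# Stretching the schedule from 3 to 6 years halves the annual expense,
# adding $5B/yr to reported operating income without changing any cash flows.
print(expense_3yr - expense_6yr)  # -> 5000000000.0
```

That's why extending depreciation is both a signal that the hardware is expected to stay useful longer and a lever that flatters reported earnings.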
I think the story is less about the GPUs themselves, and more about the interconnects for building massive GPU clusters. Nvidia just announced a massive switch for linking GPUs inside a rack. So the next couple of generations of GPU clusters will be capable of things that were previously impossible or impractical.
This doesn't mean much for inference, but for training, it is going to be huge.
This seems to take for granted that China and their foundries and engineering teams will never catch up. This seems foolish. I'm working under the assumption that sometime in the next ten years some Chinese company will have a breakthrough and either meet Nvidia's level or leapfrog them. Then the market will flood with great, cheap chips.
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.
Their stock trajectory started with one boom (cryptocurrencies) and then seamlessly progressed to another (AI). You're basically looking at a decade of "number goes up". So yeah, it will probably come down eventually (or the inflation will catch up), but it's a poor argument for betting against them right now.
Meanwhile, the investors who were "wrong" anticipating a cryptocurrency revolution and who bought NVDA don't have much to complain about today.
Personally, I wonder whether, even if the LLM hype dies down, we'll get a new boom in AI for robotics and the "digital twin" technology Nvidia has been hyping up to train robots. That's going to need GPUs for both the ML component and the 3D visualization. Robots haven't yet had their SD 1.1 or GPT-3 moment; we're still in the early days of Pythia, GPT-J, AI Dungeon, etc., in LLM speak.
That's the rub: it's clearly overvalued and will readjust... the question is when. If you can figure out precisely when, then you've won the lottery; for everyone else it's a game of chicken where, for "a while", money you put into it will have a good return. Everyone would love it if that lasted forever, so there is strong momentum preventing that market correction.
Crypto & AI can both be linked to part of a broader trend though, that we need processors capable of running compute on massive sets of data quickly. I don't think that will ever go down, whether some new tech emerges or we just continue shoveling LLMs into everything. Imagine the compute needed to allow every person on earth to run a couple million tokens through a model like Anthropic Opus every day.
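The scale in that last sentence can be roughed out. Every input below is an assumption for illustration, including the per-device throughput, which varies enormously by model and hardware:

```python
# Back-of-envelope only; all inputs are assumptions.
people = 8e9                        # rough world population
tokens_per_person_per_day = 2e6     # "a couple million tokens" per person
device_tokens_per_sec = 1e4         # assumed per-accelerator inference throughput

tokens_per_day = people * tokens_per_person_per_day   # 1.6e16 tokens/day
seconds_per_day = 24 * 60 * 60

required_tokens_per_sec = tokens_per_day / seconds_per_day
accelerators_needed = required_tokens_per_sec / device_tokens_per_sec

print(f"{accelerators_needed:,.0f} accelerators running flat out")  # ~18.5 million
```

Even with generous throughput assumptions, universal heavy usage implies tens of millions of accelerators running continuously, which is the bull case for compute demand in a nutshell.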
Agree on looking at the company-behind-the-numbers. Though presumably you're aware of the Efficient Market Hypothesis. Shouldn't "slowed down datacenter growth" be baked into the stock price already?
If I'm understanding your prediction correctly, you're asserting that the market thinks datacenter spending will continue at this pace indefinitely, and you yourself uniquely believe that to be not true. Right? I wonder why the market (including hedge fund analysis _much_ more sophisticated than us) should be so misinformed.
Presumably the market knows that the whole earth can't be covered in datacenters, and thus has baked that into the price, no?
I saw a $100 bill on the ground. I nearly picked it up before I stopped myself. I realised that if it was a genuine currency note, the Efficient Market would have picked it up already.
I'll also point out there were insane takes a few years ago, before nVidia's run-up, based on similar technical analysis and very limited-scope fundamental analysis.
Technical analysis fails completely when there's an underlying shift that moves the line. You can't look at the past and say "nvidia is clearly overvalued at $10 because it was $3 for years earlier" when they suddenly and repeatedly 10x earnings over many quarters.
I couldn't get through to the idiots on reddit.com/r/stocks about this when there was non-stop negativity on nvidia based on technical analysis and very narrow-scoped fundamental analysis. They showed a 12x gain in quarterly earnings at the time, but the PE (which looks only at past quarters) was 260x due to this sudden change in earnings, and pretty much all of reddit couldn't get past this.
I did well on this yet there were endless posts of "Nvidia is the easiest short ever" when it was ~$40 pre-split.
The large API/token providers and the large consumers are all investing in their own hardware. So NVIDIA is in an interesting position where the market is growing, and NVIDIA is taking the lion's share of enterprise, but is shrinking on the hyperscaler side (Google is a good example as it shifts more and more compute to TPUs). So they have a shrinking market share, but it's not super visible.
I’m sad about Grok going to them, because the market needs the competition. But ASIC inference seems to require a simpler design than training does, so it’s easier for multiple companies to enter. It seems inevitable that competition emerges. And eg a Chinese company will not be sold to Nvidia.
What’s wrong with this logic? Any insiders willing to weigh in?
I'm not an insider, but ASICs come with their own suite of issues and might be obsolete if a different architecture becomes popular. They'll have a much shorter lifespan than Nvidia hardware in all likelihood, and will probably struggle to find fab capacity that puts them on equal footing in performance. For example, look at the GPU shortage that hit crypto despite hundreds of ASIC designs existing.
The industry badly needs to cooperate on an actual competitor to CUDA, and unfortunately they're more hostile to each other today than they were 10 years ago.
> The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years
Isn’t this entirely dependent on the economic value of the AI workloads? It all depends on whether AI work is more valuable than that cost. I can easily see arguments why it won’t be that valuable, but if it is, then that cost will be sustainable.
100% this. All of this spending is predicated on a stratospheric ROI on AI investments at the proposed investment levels. If that doesn't pan out, we'll see a lot of people left holding the bag, including chip fabs, designers like Nvidia, and of course anyone that ponied up for that much compute.
I'm no AI fanboy at all. I think there won't be AGI anytime soon.
However, it’s beyond my comprehension how anyone would think that we will see a decline in demand growth for compute.
AI will conquer the world like software or the smartphone did. It’ll get implemented everywhere, more people will use it. We’re super early in the penetration so far.
At this point computation is in essence a commodity, and commodities have demand cycles. If other economic factors slow down or companies go out of business, they stop using compute or launch fewer new products that use compute. Thus it is entirely realistic to me that demand for compute might go down, or that we are simply over-provisioning compute in the short or medium term right now.
What if its penetration ends up being on the same level as modern crypto? The average person doesn't seem to particularly care about meme coins or bitcoin; it is not being actively used in day-to-day settings, and there are no signs of this status improving.
Doesn't mean that crypto is not being used, of course. Plenty of people do use things like USDT, gamble on bitcoin or try to scam people with new meme coins, but this is far from what crypto enthusiasts and NFT moguls promised us in their feverish posts back in the middle of 2010s.
So imagine that AI is here to stay, but the absolutely unhinged hype train will slow down and we will settle in some kind of equilibrium of practical use.
I think the way to think about the AI bubble is that we're somewhere in 97-99 right now, heading toward the dotcom crash. The dotcom crash didn't kill the web, it kept growing in the decades that followed, influencing society more and more. But the era where tons of investments were uncritically thrown at anything to do with the web ended with a bang.
When the AI bubble bursts, it won't stop the development of AI as a technology. Or its impact on society. But it will end the era of uncritically throwing investments at anyone that works "AI" into their pitch deck. And so too will it end the era of Nvidia selling pickaxes to the miners and being able to reach soaring heights of profitability born on wings of pretty much all investment capital in the world at the moment.
Bubble or not, it's simply strange to me that people confidently put a timeline on it. Naming the phases of the bubble and calling when they will collapse just seems counterintuitive to what a bubble is. Brad Gerstner was the first "influencer" I heard making these claims of a bubble timeline. It just seems downright absurd.
> This article goes more into the technical analysis of the stock rather than the underlying business fundamentals that would lead to a stock dump.
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.
Actually "technical analysis" (TA) has a very specific meaning in trading: TA is using past prices, volume of trading and price movements to, hopefully, give probabilities about future price moves.
But TFA doesn't do that at all: it goes in detail into one pricing model formula/method for options pricing. In the typical options pricing model all you're using is the current price (of the underlying, say NVDA), the strike price (of the option), the expiration date, the current interest rate, and IV (implied volatility: influenced by recent price movements but independent of any technical analysis).
Be it Black-Scholes-Merton (european-style options), Bjerksund-Stensland (american-style options), binomial as in TFA, or other open options pricing model: none of these use technical analysis.
Here's an example (for european-style options) where one can see the parameters:
You can literally compute entire options chains with these parameters.
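Concretely, a European-style call can be priced from just those inputs; here's a minimal, self-contained sketch (the NVDA-like numbers are illustrative, not real quotes):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF, built from erf so no external libraries are needed."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """Black-Scholes-Merton price of a European call.
    S: spot, K: strike, T: years to expiry, r: risk-free rate, sigma: implied vol."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative inputs only, not live NVDA quotes:
for K in (150, 180, 210):
    print(K, round(bs_call(S=180, K=K, T=0.5, r=0.04, sigma=0.45), 2))
```

Sweeping the strike K reproduces an entire call chain from five inputs; note that nothing in the formula looks at past price patterns, which is the point being made about TA.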
Now, it's known for a fact that many professional trading firms have their own options pricing methods and will arb when they think they've found incorrectly priced options. I don't know whether some of them mix actual forms of TA with their options pricing models or not.
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.
Whether you're right or not, I'd argue you're doing what's called fundamental analysis (but I may be wrong).
P.S.: I'm not debating the merits of TA and whether it's reading tea leaves or not. What I'm saying is that options pricing using the binomial method cannot be called "technical analysis", for TA is something else.
Fundamental analysis is great! But I have trouble answering concrete questions of probability with it.
How do you use fundamental analysis to assign a probability to Nvidia closing under $100 this year, and what probability do you assign to that outcome?
I'd love to hear your reasoning around specifics to get better at it.
I think the idea of fundamental analysis is that you focus on return on equity and see if that valuation is appreciably more than the current price (as opposed to assigning a probability).
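One way to put a number on that question, under heavy assumptions (a lognormal Black-Scholes world, with a volatility you'd read off the options market), is the risk-neutral probability N(-d2). The inputs below are made up for illustration, not a forecast:

```python
from math import log, sqrt, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via erf."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def risk_neutral_prob_below(spot: float, level: float, T: float,
                            r: float, sigma: float) -> float:
    """P(S_T < level) under the lognormal assumption: N(-d2) with strike = level."""
    d2 = (log(spot / level) + (r - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return norm_cdf(-d2)

# Illustrative inputs only, not live market data:
print(risk_neutral_prob_below(spot=180, level=100, T=1.0, r=0.04, sigma=0.5))  # ~0.16
```

Note this is a risk-neutral probability, not a real-world one, and it inherits whatever the implied volatility says; fundamental analysis, by contrast, gives you a valuation gap rather than a number like this.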
Well, not to be too egregiously reductive… but when the M2 money supply spiked in the 2020 to 2022 timespan, a lot of new money entered the middle class. That money was then funneled back into the hands of the rich through “inflation”. That left the rich with a lot of spare capital to invest in finding the next boom. Then AI came along.
Once the money dries up, a new bubble will be invented to capture the middle class income, like NFTs and crypto before that, and commissionless stocks, etc etc
It’s not all pump-and-dump. Again, this is a pretty reductive take on market forces. I’m just saying I don’t think it’s quite as unsustainable as you might think.
Add in the fact that companies seriously invested in AI (and similar workloads typically reliant on GPUs) are also investing more into bespoke accelerators, and the math for nVidia looks particularly grim. Google's TPUs set them apart from the competition, as does Apple's NPU; it's reasonable to assume firms like Anthropic or OpenAI are also investigating or investing in similar hardware accelerators. After all, it's easier to lock in customers if your models cannot run on "standard" kit like GPUs and servers, even if it's also incredibly wasteful.
The math looks bad regardless of which way the industry goes, too. A successful AI industry has a vested interest in bespoke hardware to build better models, faster. A stalled AI industry would want custom hardware to bring down costs and reduce external reliance on competitors. A failed AI industry needs no GPUs at all, and an inference-focused industry definitely wants custom hardware, not general-purpose GPUs.
So nVidia is capitalizing on a bubble, which you could argue is the right move under such market conditions. The problem is that they're also alienating their core customer base (smaller datacenters, HPC, the gaming market) in the present, which will impact future growth.

Their GPUs are scarce and overpriced relative to performance, which itself has remained a near-direct function of increased power input rather than efficiency or meaningful improvements. Their software solutions (DLSS frame generation, ray reconstruction, etc.) are locked to their cards, but competitors can and have made equivalent-performing solutions of their own with varying degrees of success. This means it's no longer necessary to have an nVidia GPU to, say, crunch scientific workloads or render UHD game experiences, which in turn means we can utilize cheaper hardware for similar results. Rubbing salt in the wound, they're making cards even more expensive by unbundling memory and clamping down on AIB designs.

Their competition - Intel and AMD primarily - are happily enjoying the scarcity of nVidia cards and reaping the fiscal rewards, however meager they are compared to AI at present. AMD in particular is sitting pretty, powering four of the five present-gen consoles, the Steam Deck (and copycats), and the Steam Machine, not to mention outfits like Framework; if you need a smol but capable boxen on the (relative) cheap, what used to be nVidia + ARM is now just AMD (and soon, Intel, if they can stick the landing with their new iGPUs).
The business fundamentals paint a picture of cannibalizing one's evergreen customers in favor of repeated fads (crypto and AI), and years of doing so has left those customer markets devastated and bitter at nVidia's antics. Short of a new series of GPUs with immense performance gains at lower price and power points, with availability to meet demand, my personal read is that this is merely Jensen Huang's explosive send-off before handing the bag over to some new sap (and shareholders) once the party inevitably ends, one way or another.
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.
Exactly, it is currently priced as though infinite GPUs are required indefinitely. Eventually most of the data centres and the gamers will have their GPUs, and demand will certainly decrease.
Before that, though, the data centres will likely fail to be built in full. Investors will eventually figure out that LLMs are still not profitable, no matter how many data centres you build. People are only interested in the derivative products at prices lower than what they cost to run. The math ain't mathin'.
The longer it takes to get them all built, the more exposed they all are. Even if it turns out to be profitable, taking three years to build a data centre rather than one year is significant, as profit for these high-tech components falls off over time. And how many AI data centres do we really need?
I would go further and say that these long and complex supply chains are quite brittle. In 2019, a 13 minute power cut caused a loss of 10 weeks of memory stock [1]. Normally, the shops and warehouses act as a capacitor and can absorb small supply chain ripples. But now these components are being piped straight to data centres, they are far more sensitive to blips. What about a small issue in the silicon that means you damage large amounts of your stock trying to run it at full power through something like electromigration [2]. Or a random war...?
> The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years and we're already seeing this as Google and others are extending their depreciation of GPU's to something like 5-7 years.
Yep. Nothing about this adds up. Existing data centres with proper infrastructure are being forced to extend use for previously uneconomical hardware because new data centres currently building infrastructure have run the price up so high. If Google really thought this new hardware was going to be so profitable, they would have bought it all up.
According to nvidia’s 2025 annual report [1], 34% of their sales for 2025 comes from just 3 customers.
Additionally, they mentioned that customers can cancel purchases with little to no penalty and notice [2].
This is not unique among hardware companies, but consider that it takes just one of these customers walking away to cut their sales by ~12% ($14B).
To cut to the point, my guess is that nvidia is not sustainable, and at some point one or more of these big customers won't be able to keep up with the big orders, which will cause nvidia to miss their earnings, and then it will burst. But maybe I'm wrong here.
[2] same, page 116:
> Because most of our sales are made on a purchase order basis, our customers can generally cancel, change, or delay product purchase commitments with little notice to us and without penalty.
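The concentration arithmetic above is worth making explicit. The revenue figure is approximate and an equal split among the three customers is assumed:

```python
# Figures as cited in the comment above, rounded; equal split assumed.
total_revenue = 130e9      # approximate FY2025 revenue
top3_share = 0.34          # 34% of sales from three customers

one_customer = total_revenue * top3_share / 3
print(f"${one_customer / 1e9:.1f}B per top customer")          # ~ $14.7B
print(f"{one_customer / total_revenue:.1%} of sales at risk")  # ~ 11-12% per cancellation
```

With cancel-without-penalty purchase orders, each of those three customers is a ~$14B single point of failure for the top line.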
I have lots of skepticism about everything involved in this, but on this particular point:
It's a bit like TSMC: you couldn't buy space on $latestGen fab because Apple had already bought it all. Many companies would have very much liked to order H200s and weren't able to, as they were all pre-sold to hyperscalers. If one of them stopped buying, it's very likely they could sell to other customers, though there might be more administrative overhead?
Now there are some interesting questions about Nvidia creating demand by investing huge amounts of money in cloud providers that will order nv hardware, but that's a different issue.
It's probably not very likely that NVIDIA could just sell to other customers if a large buyer pulled out. If a large buyer pulls out, that's a massive signal to everyone else to begin cutting costs as well. The large buyer either knows something everyone else doesn't, or knows something that everyone else has already figured out. Either way, the large buyer pulling out signals "I don't think the overall market is large enough to support this amount of compute at these prices at current interest rates", and everybody else is doing the same math too.
None of those customers can afford to cancel their orders. OpenAI, Google and Meta cannot afford to get cheap on GPUs when they presumably believe AGI is around the corner. The first company to achieve AGI will win, because at that point all gains will become exponential.
All the AI companies are locked in a death loop where they must spend as much money as possible, otherwise everything they invested will immediately become zero. No one is going to pay for an LLM when a competitor has AGI. So it's a death loop for everyone that has become involved in this race.
He's answering the question "How should options be priced?"
Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.
Whether Nvidia specifically takes a big dive depends much more on whether they continue to meet growth estimates than general volatility. If they miss earnings estimates in a meaningful way the market is going to take the stock behind the shed and shoot it. If they continue to exceed estimates the stock will probably go up or at least keep its present valuation.
I've been selling options for almost a decade now, including running trading algorithms, and was laughing a bit to myself because it was basically just the math in an everyday option chain. As you already know, anyone can look at the strike they are talking about, with the IV already cooked into it, on platforms like Think Or Swim or even Yahoo Finance. Some of the stuff can be pretty useful though in backtesting and exploration.
All that aside, I'm impressed it made it to the HN front page.
> Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.
Other way around: if NVidia sinks, it likely takes a bunch of dependent companies with it, because the likely causes of NVidia sinking all tell us that there was indeed an AI bubble and it is popping.
Indeed, the market as a whole would be affected. But isn't NVIDIA more of a software company than a hardware one? This bugs the shit out of me.
They are maintaining this astronomical growth through data center margins derived from the design of their chips, and all of it started from graphics for video games.
I'm surprised more people are not talking about the fact that the two best models in the world, Gemini 3 and Claude 4.5 Opus, were both trained on Google TPU clusters.
Presumably, inference can be done on TPUs, Nvidia chips, or, in Anthropic's case, new stuff like Trainium.
Google is a direct competitor to many LARGE buyers of GPUs and therefore a non starter from a business perspective. In addition, many companies cannot single source due to risk considerations. Hardware is different because buying GPUs is a capital investment. You own the asset and revisit the supplier only at the next refresh cycle, not continuously as with rented compute.
It doesn't go to nearly zero. TSMC has a large fab in Arizona and they are continuing to expand it. They also have a fab in Washington, and one in Japan. [1]
The fab in Washington is very old (notice it's still equipped for 8 inch wafers) and so pretty irrelevant to Nvidia's business.
I'm not quite sure what process they run there but I believe it was an acquisition 10+ years ago, not built from the ground up by them.
Edit: their Japan fab is also a mature node so not very relevant here.
And their Arizona fab is a very very small portion of their volume and with far worse margin.
I agree. It's funny that this is one of the cited reasons for the (relative) value suppression of TSMC, but the same factors should apply to Nvidia too.
Going to zero is one potential outcome. Equally plausible is it goes up 10% in a relatively quick battle or diplomatic outcome which ends the geopolitical uncertainty.
I think they are already hedging for Taiwan. 1. They just pseudo-acquired Groq, fully made in USA (GlobalFoundries) and with a diversified supply chain. 2. And they just announced they will be re-introducing RTX 3090 made in Korea (Samsung). 3. And they plan to produce chips in Intel's new US fabs soon.
I think the bigger problems of the AI bubble are energy and that it's gaining a terrible reputation for being the excuse for mass layoffs while suffocating the Internet with slop/brainrot content. All while depending on government funding to grow.
Yes, lots of other companies would be affected to a greater or lesser extent (even non-tech stocks), but specifically any company that relies on manufacturing all their product in Taiwan will be affected most of all.
The whole economy will crash. Probably won't be due to China invading Taiwan though. More likely because the president decided to delete their country's world reserve currency status (which is another word for a trade deficit).
Well, the reality is that most people don't want a bloodbath and it's increasingly looking like external support won't come, so what you gonna do... life is a very complex chess game, gotta play your pieces right.
At this rate, even if they can't get the Taiwanese population to consent, it probably makes more sense to wait anyway to see how low America can sink. The lower America goes, the better their chance for success.
Where do you see the pro china side getting more and more support? As far as I can tell it's sharply swung towards maintaining independence in the past decade or two with single digit support of unification with the mainland.
An EU-type agreement would keep the peace for some time: remove all trade barriers between the two countries, have a treaty preventing either side from being used militarily by a third party, no attacking each other, and free movement of all vessels through each other's seas. Maybe a few more.
I think Taiwanese elites can be bought, they say they can’t but I think that’s just part of the bargaining for a higher price. The overtures towards a costly and destructive invasion is Chinas attempt at lowering that price. As is the strategy of building up an indigenous chip manufacturing industry. The aggressive rhetoric from China has the added benefit of keeping the US on a self sabotaging aggressive posture.
Arizona fabs don't work without Taiwan's many sole-source suppliers of fab consumables. They'll likely grind to a halt a few months after stock runs out. All the dollar shuffling is not going to replace a supply chain that will take (generously) years to build, if ever.
China invading Taiwan makes zero sense, they just flex those muscles for domestic consumption. They will probably take over Taiwan, but they'll do it how modern major powers do anything: propaganda, influence campaigns, and soft power.
Russia invading Ukraine also made zero sense, given their actual capabilities and the likely (now realized) consequences. The leader doesn't always have the best information, it turns out.
Either that, or the leader does have access to the best information, and they just DGAF. That condition seems to be going around too.
They're enjoying massive demand for GPUs due to AI blowing up, at a time when there isn't much competition. Yet the technology is already plateauing, there are similar offerings from AMD, not to mention proven training and inference chips from Google and AWS, plus the Chinese national strategy of prioritizing domestic chips.
The only way the stock could remain at its current price or grow (which is why you'd hold it) is if demand just kept going up (with the same lifecycle as current GPUs) and there were no competition, the latter of which is, to me, just never going to happen.
Investors are convinced that Nvidia can maintain its lead because they have the "software" side, i.e. CUDA, which to me is ridiculous: with the kind of capital being deployed into these datacenters, you could fit your models into other software stacks just by hiring people.
As others have noted, the article is analysing the actual financial markets angle.
For my two cents on the technical side, it is likely that any Western-origin shakiness will come from Apple and how it manages to land the Gemini deal and Apple Intelligence v2. There is an astounding amount of edge inference sitting in people’s phones and laptops that only slightly got cracked open with Apple Intelligence.
Data centre buildouts will get corrected when the numbers come in from Apple: how large of a share in tokens used by the average consumer can be fulfilled with lightweight models and Google searches of the open internet. This will serve as a guiding principle for any future buildout and heavyweight inference cards that Nvidia is supplying. The 2-5 year moat top providers have with the largest models will get chomped at by the leisure/hobby/educational use cases that lightweight models capably handle. Small language and visual models are already amazing. The next crack will appear when the past gen cards (if they survive the around the clock operation) get bought up by second hand operators that can provide capable inference of even current gen models.
If past knowledge of DC operators holds (e.g. Google and its aging TPUs that still get use), the providers with the resources to buy new space for newer gens will keep accumulating hardware, while the providers without them will need to continuously shave off the financial hit that comes with running less efficient older cards.
I’m excited to see future blogs about hardware geeks buying used inference stacks and repurposing them for home use :)
>when the numbers come in from Apple: how large of a share in tokens used by the average consumer can be fulfilled with lightweight models and Google searches of the open internet
is there any reason to expect that this information will ever be known outside of apple?
Accuracy-wise we won't know the exact numbers, but insiders and industry experts are usually able to find ballpark figures that they share with the press. The alternative is the usual route of inferring estimates from competitors' lost MAU numbers in apps like ChatGPT for iOS.
It's entirely possible it will crash, but I also don't think it'll go bankrupt or anything.
I don't typically buy stock to flip it right away; I have some Nvidia stock that I bought the day after ChatGPT was launched, and I bought a bit more when it was $90/share about a year ago. If it drops to $100, I'll still be in the black, but even if it drops to $50, I'm not going to worry, because I figure I can just hold onto it until another upswing.
Nvidia has been around long enough and has enough market penetration in datacenters and gaming that I don't think it's going to go bust, and I figure that it will eventually appreciate again just due to inflation.
> Nvidia has been around long enough and has enough market penetration in datacenters and gaming that I don't think it's going to go bust, and I figure that it will eventually appreciate again just due to inflation.
I don't dispute that all. I'm just ok with that being the outcome; if it keeps up with inflation then it's still better than storing it in a bank, and the thing that would upset me more than "not gaining" would be "actively losing".
Now obviously, if it drops below what I paid for it and then needs inflation to catch back up, then yeah, that's definitely "lost money", but that's just the risk of the stock market, especially with individual stocks. I also think that if it crashes, Nvidia might still have another surge eventually, even if it doesn't get back to its full glory.
I definitely would not buy new Nvidia stock at its current price though.
Everyone is saying data center build outs are the main thing to look out for. But those data centers with all those gpus will need to replace those gpus right? Nvidia will come up with better, faster, more efficient gpus.
LLM usage won't crash either; it might decline or taper off, but it's here to stay.
My concern is better models that won't need as much GPU, or China coming up with its own foundries and competitive GPUs. There is also the strategy question: can Nvidia's leadership think globally enough? Will they start pursuing data centers in Europe, Latam, Asia? Can they make GPUs cheap enough to compete in those regions?
The way things are, lots of countries want this tech local, but they can't deny the demand either.
Europe for example might not want anything to do with American AI companies, but they still need GPUs for their own models. But can Nvidia rebrand itself as a not-so-american-but-also-american company? Like Coca Cola for example. i.e.: not just operate in europe but have an HQ in europe that has half their execs working from there, and the rest from california. Or perhaps asia is better (doubt)? either way, they can't live off of US demand forever, or ignore geopolitics.
There is one thing everybody forgets when making such predictions: companies don't stand still. Nvidia and every other tech business is constantly exploring new options, taking over competitors, buying startups with novel technologies, etc. Nvidia is no slouch in that regard, and their recent quasi-acquisition of Groq is just one example of this. So, when attempting to make predictions, we're looking at a moving target, not systems set in stone. If the people at the helm are smart (and they are), you can expect lots of action and ups and downs - especially in the AI sphere.
My personal opinion, having witnessed first hand nearly 40 years of tech evolution, is that this AI revolution is different. We're at the very beginning of a true paradigm shift: the commoditization of intelligence. If that's not enough to make people think twice before betting against it, I don't know what is. And it's not just computing that is going to change. Everything is about to change, for better or worse.
I would be wary of taking analysis like this website at face value unless you know enough about quant finance to check some of the working for yourself. Just a skim shows a few statements that are questionable at best. Eg
> the theory of unbiased random walks assumes constant volatility throughout the year
No. I’m pretty sure it doesn’t. If you assume a Brownian motion with constant volatility as your stochastic process for computing the walk, then of course vol is constant by definition, but you can use a local vol process, a stochastic vol process (eg Heston), one with jumps, or even an SVJJ process to compute the walk[2] if you want to. As long as you don’t have a drift term and the jumps are symmetrical, the process will still (I think) be unbiased.
There are technical reasons why it may or may not be important to use stochastic vol, but if I recall correctly, it only really matters if you care about “forward volatility” (eg the volatility of Nvidia one year from some future point in time) which you would if pricing something that uses forward-starting options. Then the term structure of the volatility surface at a future date is important so you need a stochastic vol model. If you care about the price evolution but not the future volatility then you can validly make the simplifying assumption that jumps will cancel each other out over time and that volatility is a locally deterministic function of time and price (if not constant, which it obviously is not) and use something like a Dupire model.[3]
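To see the unbiasedness point concretely, here's a toy Monte Carlo sketch (every number is hypothetical). The vol changes period to period, but because there's no drift term the terminal price still averages out to the starting spot:

```python
import math
import random

def terminal_prices(spot, vol_schedule, n_paths=20000, seed=42):
    """Driftless multiplicative walk with a different vol each period.
    The -0.5*v*v correction makes each step's expected multiplier exactly 1,
    so E[terminal price] == spot no matter how vol varies over time."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_paths):
        s = spot
        for v in vol_schedule:
            s *= math.exp(-0.5 * v * v + v * rng.gauss(0.0, 1.0))
        out.append(s)
    return out

# Hypothetical per-period vols: three calm quarters and one turbulent one
prices = terminal_prices(180.0, [0.15, 0.15, 0.60, 0.15])
mean_terminal = sum(prices) / len(prices)  # stays close to 180: unbiased
```

Swap the fixed schedule for a randomly drawn one (Heston-style) and the same property holds, which is the point above.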
More significantly, implied volatility is just the market price of a particular option expressed in terms of volatility. This is convenient for traders so they can compare option prices on a like for like basis between underlyers without constantly having to adjust for differences in the underlying price, strike and time. Implied volatility is not actually the overall expected volatility of the underlying instrument. For that, you would have to fit one of the models above to market prices and calculate the expectation over all strikes and times. And that still is just the market’s opinion of the volatility, not an actual probability even if you apply the BoE adjustment thing he does in the article.
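For what it's worth, the mechanical extraction an article like this performs looks roughly like the following, under plain Black-Scholes rather than any of the richer models above. All inputs here are illustrative made-up numbers, not market data:

```python
from math import log, sqrt
from statistics import NormalDist

def prob_below_strike(spot, strike, t_years, iv, r=0.0):
    """Black-Scholes risk-neutral probability P(S_T < K), i.e. N(-d2).
    This is the market-implied number, not an actual real-world probability."""
    d2 = (log(spot / strike) + (r - 0.5 * iv * iv) * t_years) / (iv * sqrt(t_years))
    return NormalDist().cdf(-d2)

# Illustrative inputs only: spot $180, strike $100, one year out,
# 55% implied vol, zero rates
p = prob_below_strike(180.0, 100.0, 1.0, 0.55)  # roughly a 1-in-5 chance
```

Note how sensitive the output is to the vol input: rerun it with iv=0.30 and the implied chance collapses to a few percent, which is exactly why reading one strike's implied vol as "the" probability is shaky.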
[2] “SVJ” means stochastic vol with jumps (ie discontinuities) in the underlying price evolution. SVJJ means stochastic vol with jumps both in the price of the underlying and in the volatility. An example of this is the Matytsin model, which everyone just calls “SVJJ” but it’s not the only possible SVJJ model https://www.maplesoft.com/support/help/maple/view.aspx?path=...
How much of their turnover is financed, directly or indirectly, by themselves, then leveraged further by their 'customers' to collateralize further investments?
Are they already "too big to fail"? For better or worse, they are 'all in' on AI.
Predicting any stock will crash, be it from a technical analysis or from looking at fundamentals, is a fool's game. As Keynes allegedly said, the market can stay irrational longer than you can remain solvent.
The poster child for this is Tesla. Nothing fundamental justifies Tesla's valuation.
IMHO the only rational way to look at the future of AI and the companies that profit from it is to look at geopolitics.
The market seems to have decided there's going to be one winner of the AI race. I actually don't think that'll be OpenAI. I think it'll be Google or Nvidia of the companies currently in the race. But I also don't think it'll be either of them.
The magic of software is that it is infinitely reproducible. That makes it difficult to build a wall around it. Microsoft, Facebook, Apple and Google have built moats around their software very successfully in spite of this. Google's big advantage in the AI race is their ability to build and manage data centers, and that they'll probably end up relying on their own hardware rather than NVidia.
I think China will be the AI winner or they'll make sure there is no winner. It's simply too important to them. For me, DeepSeek was a shot across the bow that they were going to commoditize these models.
The US blocked the export of the best lithography machines AND the best chips to China. IMHO this was a mistake. Being unable to import chips meant Chinese companies had no choice but to make their own. This created a captive market for China recreating EUV technology. Chinese companies have no choice but to buy Chinese chips.
The Chinese government has the patience and infrastructure for recreating ASML's technology and it's an issue of national security. And really all it takes is hiring a few key people to recreate that technology. So Western governments and analysts who said China will take 20+ years to catch up (if they ever do) simply don't understand China or the market they're talking about.
They sound exactly like post-WW2 generals and politicians who thought the USSR would take 20+ years to copy the atomic bomb. It took 4 years. And hydrogen bombs came even quicker.
There's a story that should get more attention: China has reportedly refused a deal for NVidia's latest chips [1]. If true, why do you think they're doing that? Because they don't want to be reliant on foreign chips. They're going to make their own.
Sure you can predict the market. Making money off of it beyond the regular risk-adjusted return is what's hard. (And the prediction of this article is indeed based on that assumption.)
This is fun math to play with, but it completely misses the point of how and why options are priced the way they are. Think of horse racing: when a horse is at 1000-to-1, the actual odds of that horse winning are much, much lower. The odds are non-zero, but the oddsmakers are considering who the other side is and why they are buying that ticket.
Most options are actually used to hedge large positions and are rolled over well before the "due date". YOLOing calls and puts is a Robin Hood phenomenon and the odds of "fair pricing" are heavily affected by these big players, so using that data as some sort of price discovery is flawed from the get go.
Are you saying options are not priced at the cost of hedging them? That implies a lot of money could be made by arbitraging between the hedge and the option.
That sounds like an egregious statement. Markets don't have simple persistent arbitrage opportunities like that, do they?
This is a "market can stay irrational" problem. Modern compute infrastructure, from phones and 5G to data centers and LLMs, has its energy economics exactly backwards, which, combined with plastic waste, is causing a massive global economic distortion that will correct itself, because physics doesn't care about the White House, Vanguard, TSMC or anyone else. How long we can borrow from the future and put stress on the poor to prop up this insanely wasteful system, who can say.
Its forward-looking P/E is 24-26. That doesn't suggest a huge crash is coming. It could come down a bit, but they print money. They also have the potential car market and robots coming.
This kind of prediction is hard because it depends on when AI companies will crash. The market would need to lose confidence in AI, which would stop data-centre creation and then impact nVidia.
There's a bet here on profitability and it needs to play out.
How long do investors normally wait to see if a bet on new technology is a winner? I imagine that's quite arbitrary?
Worth noting that the implied volatility extracted here is largely a function of how far OTM the strike is relative to current spot, not some market-specific view on $100. If NVDA were trading at $250 today, the options chain would reprice and you'd extract similar vol for whatever strike was ~45% below. The analysis answers "what's the probability of a near-halving from here" more than "what's special about $100." Still useful for the prediction contest, but the framing makes it sound like the market is specifically opining on that price level.
Them being far above the median PE ratio for the S&P 500 tells you that a future correction would be a discount and you should buy? Please walk me through your logic on this one.
While I am no fan of NVIDIA, they effectively have a monopoly on CUDA GPUs.
This means that cash revenue will likely remain high long after the LLM hype bubble undergoes correction. The market will eventually saturate as incremental product improvements stall, and demand rolls off rather than implodes. =3
Most people buy low-strike puts as insurance against catastrophic market events.
Since catastrophic crises are rare, the price of these puts is quite low. But since many people fear a crisis, the price is still very inflated relative to the actual probabilities. Which is why there are lots of people selling those puts as a business. These guys will bite the dust in case of a major crisis, but will make a ton if the market stays afloat.
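Back-of-the-envelope, the put seller's book looks something like this. Every number here is hypothetical, purely to show how a premium can be "inflated" relative to the actuarial cost:

```python
# Hypothetical: sell a $100-strike put for a $4.00 premium per share.
# Suppose the "true" chance of finishing below $100 is 3%, with an average
# payout of $30 per share when it happens.
premium = 4.00
p_crash = 0.03
avg_payout = 30.00

expected_payout = p_crash * avg_payout   # $0.90 actuarial cost per share
seller_edge = premium - expected_payout  # $3.10 expected profit per share
```

The seller harvests that edge year after year, until the rare crash wipes out many years of premiums at once, which is the "bite the dust" scenario above.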
Realistically, the current US government is so obsessed with its image that it will do everything to avoid a market crash during its term. The president has been pushing for lower rates for a while, and he's likely going to succeed in removing the head of the Fed and do just that. Lowering interest rates is just another way of pumping investment.
NVidia is definitely not going below $100 in 2026.
I’m more curious how these "futures" contracts will work out. Supposedly, a bunch of RAM has been paid for and allocated that isn’t even made yet. If the bubble ever pops, the collateral damage is going to be on the order of the 2007 subprime mortgage crisis.
Since there's such an interdependence between Nvidia and the other companies involved in AI, to the point that if one fails they all fail, shouldn't the analysis focus on the weakest link in the AI circle jerk?
Nvidia is the biggest link, however, I'd wager OpenAI and the likes are big enough to make a significant dent in the mammoth. So yeah, this analysis is sort of a spherical cows in a vacuum situation.
Still, it's interesting the probability is so high while ignoring real-world factors. I'd expect it to be much higher due to:
- another adjacent company dipping
- some earnings target not being met
- china/taiwan
- just the AI craze slowing down
Yeah, no signs that the AI craze is slowing down, other than all the stories of it not living up to the hype and creating more jobs rather than replacing people and not doing what it says on the tin and the various security issues and that whole they can't expand for the next decade because they need more power plants thing.
I mean common sense reasoning tells me that if OpenAI has decided to turn into an ad business, the actual return expected from investing into compute isn't going to be nearly as great as advertised.
You have it turned upside down. The analysis is of people's beliefs. In other words, the underlying data is created from the beliefs of the people who trade it, and the analysis is taking those beliefs and applying it to a specific question.
Great question! The fine print of the text on Metaculus says the outcome will be judged as if there were no stock split. The question is essentially about the valuation of the company but phrased operationally about the stock price.
It's easy to predict that a bubble will pop, but there's a variance in the timing of approximately half a human lifetime, and if you don't guess that correctly, you throw away yours.
Everything that can't go on forever will eventually stop. But when?
This isn't technical analysis, this is an article on how to use the options market's price discovery mechanism to understand what the discovered price implies about the collective belief about the future price of the underlying.
This article goes more into the technical analysis of the stock rather than the underlying business fundamentals that would lead to a stock dump.
My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.
The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used, so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to spend that much every 2-3 years, and we're already seeing this as Google and others extend their depreciation of GPUs to something like 5-7 years.
I hear your argument, but short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon. Of course I could easily be wrong, but regardless I think the most predictable cause for a drop in the NVIDIA price would be the CHIPS Act/recent decisions by the CCP leading a Chinese firm to bring to market a CUDA-compatible and reliable GPU at a fraction of the cost. It should be remembered that NVIDIA's /current/ value is based on their being locked out of their second-largest market (China) with no investor expectation of that changing in the future. Given the current geopolitical landscape, in the hypothetical case where a Chinese firm markets such a chip, we should expect that US firms would be prohibited from purchasing it, while it's less clear that Europeans or Saudis would be. Even so, if NVIDIA were not to lower their prices at all, US firms would be at a tremendous cost disadvantage while their competitors would no longer have one with respect to compute.
All hypothetical, of course, but to me that's the most convincing bear case I've heard for NVIDIA.
I suspect major algorithmic breakthroughs would accelerate the demand for GPUs instead of making it fall off, since the cost to apply LLMs would go down.
1 reply →
People will want more GPUs but will they be able to fund them? At what points does the venture capital and loans run out? People will not keep pouring hundreds of billions into this if the returns don't start coming.
1 reply →
Not that locked out: https://www.cnbc.com/2025/12/31/160-million-export-controlle...
1 reply →
Doesn't even necessarily need to be CUDA compatible... there's OpenCL and Vulkan as well, and likely China will throw enough resources at the problem to bring various libraries into closer alignment to ease of use/development.
I do think China is still 3-5 years from being really competitive, but still even if they hit 40-50% of NVidia, depending on pricing and energy costs, it could still make significant inroads with legal pressure/bans, etc.
3 replies →
> short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon
Or, you know, when LLMs don't pay off.
31 replies →
NVIDIA stock tanked in 2025 when people learned that Google used TPUs to train Gemini, something everyone in the community has known since at least 2021. So I think it's very likely that NVIDIA stock could crash for non-rational reasons.
edit: 2025* not 2024
It also tanked to ~$90 when Trump announced tariffs on all goods for Taiwan except semiconductors.
I don't know if that's non-rational, or if people can't be expected to read the second sentence of an announcement before panicking.
10 replies →
Google did not use TPUs for literally every bit of compute that led to Gemini. GCP has millions of high end Nvidia GPUs and programming for them is an order of magnitude easier, even for googlers.
Any claim from Google that all of Gemini (including previous experiments) was trained entirely on TPUs is a lie. What they are truthfully saying is that the final training run was done entirely on TPUs. The market shouldn't react heavily to that; it should instead react positively to the fact that Google is now finally selling TPUs externally and their fab yields are better than expected.
4 replies →
I really don't understand the argument that nvidia GPUs only work for 1-3 years. I am currently using A100s and H100s every day. Those aren't exactly new anymore.
It’s not that they don’t work. It’s how businesses handle hardware.
I worked at a few data centers on and off in my career. I got lots of hardware for free or on the cheap simply because the hardware was considered “EOL” after about 3 years, often when support contracts with the vendor ends.
There are a few things to consider.
Hardware that ages produces more errors, and those errors cost, one way or another.
Rack space is limited. A perfectly fine machine that consumes 2x the power for half the output still costs you. It's cheaper to upgrade a perfectly fine working system simply because the replacement performs better per watt in the same space.
Lastly, there are tax implications in buying new hardware that can often favor replacement.
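The perf-per-watt point is easy to put numbers on. This is a hedged sketch with hypothetical wattages, throughputs and electricity prices, ignoring cooling overhead:

```python
# Hypothetical: old and new GPUs both draw 700W, but the new generation
# delivers twice the throughput, so matching its output takes two old
# boxes (and two rack slots).
def annual_power_cost(watts, usd_per_kwh=0.10, hours=24 * 365):
    return watts / 1000 * hours * usd_per_kwh

old_cost = 2 * annual_power_cost(700)  # two old GPUs for the same output
new_cost = annual_power_cost(700)      # one new GPU
annual_savings = old_cost - new_cost   # per slot pair, on power alone
```

At datacenter scale that per-slot saving, multiplied across tens of thousands of slots plus the freed rack space, is what makes replacing working hardware rational.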
15 replies →
The common factoid raised in financial reports is that GPUs used in model training degrade from thermal stress due to their high utilization. The GPUs ostensibly fail. I have heard anecdotal reports of GPUs used for cryptocurrency mining having similar wear patterns.
I have not seen hard data, so this could be an oft-repeated, but false fact.
13 replies →
1-3 years is too short, but they aren't making new A100s. There are 8 in a server, and when one goes bad, what do you do? You won't be able to renew a support contract. If you want to DIY, you eventually have to start consolidating pick-and-pulls. Maybe the vendors will buy them back from people who want to upgrade and resell them. This is the issue we are seeing with A100s, and we are trying to see what our vendor will offer for support.
They're no longer energy competitive, i.e. the power they draw per unit of compute is worse than what's available now.
It's like if your taxi company bought taxis that were more fuel efficient every year.
30 replies →
Not saying you're wrong. A few things to consider:
(1) We simply don't know what the useful life is going to be, because of how new AI-focused GPUs for training and inference are.
(2) Warranties and service. Most enterprise hardware has service contracts tied to purchases. I haven't seen anything publicly disclosed about what these contracts look like, but the speculation is that they are much more aggressive (3 years or less) than typical enterprise hardware contracts (Dell, HP, etc.). If it gets past those contracts the extended support contracts can typically get really pricey.
(3) Power efficiency. If new GPUs are more power efficient this could be huge savings on energy that could necessitate upgrades.
3 replies →
If power is the bottleneck, it may make business sense to rotate to a GPU that better utilizes the same power if the newer generation gives you a significant advantage.
From an accounting standpoint, it probably makes sense to have their depreciation be 3 years. But yeah, my understanding is that either they have long service lives, or the customers sell them back to the distributor so they can buy the latest and greatest. (The distributor would sell them as refurbished)
You aren't trying to support ad-based demand like OpenAI is.
I think the story is less about the GPUs themselves, and more about the interconnects for building massive GPU clusters. Nvidia just announced a massive switch for linking GPUs inside a rack. So the next couple of generations of GPU clusters will be capable of things that were previously impossible or impractical.
This doesn't mean much for inference, but for training, it is going to be huge.
This seems to take for granted that China and their foundries and engineering teams will never catch up. This seems foolish. I'm working under the assumption that sometime in the next ten years some Chinese company will have a breakthrough and either meet Nvidia's level or leapfrog them. Then the market will flood with great, cheap chips.
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.
Their stock trajectory started with one boom (cryptocurrencies) and then seamlessly progressed to another (AI). You're basically looking at a decade of "number goes up". So yeah, it will probably come down eventually (or the inflation will catch up), but it's a poor argument for betting against them right now.
Meanwhile, the investors who were "wrong" anticipating a cryptocurrency revolution and who bought NVDA have not much to complain about today.
Personally I wonder even if the LLM hype dies down we'll get a new boom in terms of AI for robotics and the "digital twin" technology Nvidia has been hyping up to train them. That's going to need GPUs for both the ML component as well as 3D visualization. Robots haven't yet had their SD 1.1 or GPT-3 moment and we're still in the early days of Pythia, GPT-J, AI Dungeon, etc. in LLM speak.
3 replies →
That's the rub - it's clearly overvalued and will readjust... the question is when. If you can figure out when precisely then you've won the lottery, for everyone else it's a game of chicken where for "a while" money that you put into it will have a good return. Everyone would love if that lasted forever so there is a strong momentum preventing that market correction.
5 replies →
Crypto & AI can both be linked to part of a broader trend though, that we need processors capable of running compute on massive sets of data quickly. I don't think that will ever go down, whether some new tech emerges or we just continue shoveling LLMs into everything. Imagine the compute needed to allow every person on earth to run a couple million tokens through a model like Anthropic Opus every day.
1 reply →
Agree on looking at the company-behind-the-numbers. Though presumably you're aware of the Efficient Market Hypothesis. Shouldn't "slowed down datacenter growth" be baked into the stock price already?
If I'm understanding your prediction correctly, you're asserting that the market thinks datacenter spending will continue at this pace indefinitely, and you yourself uniquely believe that to be not true. Right? I wonder why the market (including hedge fund analysis _much_ more sophisticated than us) should be so misinformed.
Presumably the market knows that the whole earth can't be covered in datacenters, and thus has baked that into the price, no?
The EMH does not mean that markets are free of over-investment and asset bubbles, followed by crashes.
I saw a $100 bill on the ground. I nearly picked it up before I stopped myself. I realised that if it was a genuine currency note, the Efficient Market would have picked it up already.
I'll also point out there were insane takes a few years ago before nVidia's run up based on similar technical analysis and very limited scope fundamental analysis.
Technical analysis fails completely when there's an underlying shift that moves the line. You can't look at the past and say "nvidia is clearly overvalued at $10 because it was $3 for years earlier" when they suddenly and repeatedly 10x earnings over many quarters.
I couldn't get through to the idiots on reddit.com/r/stocks about this when there was non-stop negativity on Nvidia based on technical analysis and very narrowly scoped fundamental analysis. They showed a 12x gain in quarterly earnings at the time, but the PE (which looks at past quarters only) was 260x due to this sudden change in earnings, and pretty much all of reddit couldn't get past this.
I did well on this yet there were endless posts of "Nvidia is the easiest short ever" when it was ~$40 pre-split.
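The trailing-vs-forward P/E mechanics are just arithmetic. Here's a sketch with made-up per-share numbers chosen to land in the ballpark of the figures above:

```python
# Made-up per-share numbers, for illustration only.
price = 40.00
old_quarterly_eps = 0.0385
new_quarterly_eps = old_quarterly_eps * 12   # earnings suddenly 12x

# Trailing P/E still annualizes the four pre-jump quarters:
trailing_pe = price / (4 * old_quarterly_eps)   # ~260x, looks absurd
# Forward P/E annualizes the new run rate instead:
forward_pe = price / (4 * new_quarterly_eps)    # ~21.6x, looks ordinary
```

Same stock, same price, and the two multiples differ by exactly the earnings-growth factor, which is why a trailing PE is useless right after a step change in earnings.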
Also there's no way Nvidia's market share isn't shrinking. Especially in inference.
The large API/token providers and large consumers are all investing in their own hardware. So they're in an interesting position where the market is growing and NVIDIA is taking the lion's share of enterprise, but shrinking on the hyperscaler side (Google is a good example as it shifts more and more compute to TPUs). So they have a shrinking market share, but it's not super visible.
1 reply →
Market share can shrink but if the TAM is growing you can still grow.
But will the whole pie grow or shrink?
I’m sad about Groq going to them, because the market needs the competition. But ASIC inference seems to require a simpler design than training does, so it’s easier for multiple companies to enter. It seems inevitable that competition emerges. And e.g. a Chinese company will not be sold to Nvidia.
What’s wrong with this logic? Any insiders willing to weigh in?
I'm not an insider, but ASICs come with their own suite of issues and might be obsolete if a different architecture becomes popular. They'll have a much shorter lifespan than Nvidia hardware in all likelihood, and will probably struggle to find fab capacity that puts them on equal footing in performance. For example, look at the GPU shortage that hit crypto despite hundreds of ASIC designs existing.
The industry badly needs to cooperate on an actual competitor to CUDA, and unfortunately they're more hostile to each other today than they were 10 years ago.
> The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years
Isn’t this entirely dependent on the economic value of the AI workloads? It all depends on whether AI work is more valuable than that cost. I can easily see arguments why it won’t be that valuable, but if it is, then that cost will be sustainable.
100% this. All of this spending is predicated on a stratospheric ROI on AI investments at the proposed investment levels. If that doesn't pan out, we'll see a lot of people left holding the bag, including chip fabs, designers like Nvidia, and of course anyone that ponied up for that much compute.
I'm no AI fanboy at all. I don't think there will be AGI anytime soon.
However, it’s beyond my comprehension how anyone would think that we will see a decline in demand growth for compute.
AI will conquer the world like software or the smartphone did. It’ll get implemented everywhere, more people will use it. We’re super early in the penetration so far.
At this point, computation is in essence a commodity. And commodities have demand cycles. If other economic factors slow down or companies go out of business, they stop using compute or start fewer new products that use compute. Thus it is entirely realistic to me that demand for compute might go down. Or that we are just now over-provisioning compute in the short or medium term.
What if its penetration ends up being on the same level as modern crypto? The average person doesn't seem to particularly care about meme coins or bitcoin - it is not being actively used in day-to-day settings, and there are no signs of this status improving.
Doesn't mean that crypto is not being used, of course. Plenty of people do use things like USDT, gamble on bitcoin or try to scam people with new meme coins, but this is far from what crypto enthusiasts and NFT moguls promised us in their feverish posts back in the mid-2010s.
So imagine that AI is here to stay, but the absolutely unhinged hype train slows down and we settle into some kind of equilibrium of practical use.
> I'm no AI fanboy at all.
While thinking computers will replace human brains soon is rabid fanaticism, this statement...
> AI will conquer the world like software or the smartphone did.
Also displays a healthy amount of fanaticism.
I think the way to think about the AI bubble is that we're somewhere in 97-99 right now, heading toward the dotcom crash. The dotcom crash didn't kill the web, it kept growing in the decades that followed, influencing society more and more. But the era where tons of investments were uncritically thrown at anything to do with the web ended with a bang.
When the AI bubble bursts, it won't stop the development of AI as a technology. Or its impact on society. But it will end the era of uncritically throwing investments at anyone that works "AI" into their pitch deck. And so too will it end the era of Nvidia selling pickaxes to the miners and reaching soaring heights of profitability borne on the wings of pretty much all the investment capital in the world at the moment.
Bubble or not, it's simply strange to me that people confidently put a timeline on it. Naming the phases of the bubble and calling when they will collapse seems contrary to what a bubble actually is. Brad Gerstner was the first "influencer" I heard making these claims of a bubble timeline. It just seems downright absurd.
> technical analysis of the stock
AKA pictures in clouds
It's not flat growth that's currently priced in, but continuing high growth. Which is impossible.
> This article goes more into the technical analysis of the stock rather than the underlying business fundamentals that would lead to a stock dump. My 30k ft view is that the stock will inevitably slide as AI
Actually "technical analysis" (TA) has a very specific meaning in trading: TA is using past prices, volume of trading and price movements to, hopefully, give probabilities about future price moves.
https://en.wikipedia.org/wiki/Technical_analysis
But TFA doesn't do that at all: it goes into detail on one options pricing formula/method. In the typical options pricing model, all you're using is the current price (of the underlying, say NVDA), the strike price (of the option), the expiration date, the current interest rate and IV (implied volatility: influenced by recent price movements, but independently of any technical analysis).
Be it Black-Scholes-Merton (European-style options), Bjerksund-Stensland (American-style options), the binomial method as in TFA, or another open options pricing model: none of these use technical analysis.
Here's an example (for european-style options) where one can see the parameters:
https://www.mystockoptions.com/black-scholes.cfm
You can literally compute entire options chains with these parameters.
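To make the "just parameters, no TA" point concrete, here is a minimal sketch of the European-style put pricing those calculators implement. All the numbers are made up for illustration; nothing here is a real NVDA quote.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_put(spot: float, strike: float, t: float, r: float, iv: float) -> float:
    """Black-Scholes value of a European-style put.

    The only inputs are the ones those calculators ask for: current
    underlying price, strike, time to expiry (in years), risk-free
    rate, and implied volatility. No past-price charts anywhere.
    """
    d1 = (log(spot / strike) + (r + 0.5 * iv * iv) * t) / (iv * sqrt(t))
    d2 = d1 - iv * sqrt(t)
    return strike * exp(-r * t) * norm_cdf(-d2) - spot * norm_cdf(-d1)

# Made-up illustrative inputs, not a real quote: spot 180, strike 100,
# one year out, 4% rate, 55% IV.
print(bs_put(180.0, 100.0, 1.0, 0.04, 0.55))
```

Sweep the strike and expiry and you have an entire chain, computed from nothing but these parameters.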
Now it's known that many professional trading firms have their own options pricing methods and will arb when they think they've found incorrectly priced options. I don't know whether some of them mix actual forms of TA into their options pricing models or not.
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.
No matter if you're right or not, I'd argue you're doing what's called fundamental analysis (but I may be wrong).
P.S: I'm not debating the merits of TA and whether it's reading tea leaves or not. What I'm saying is that options pricing using the binomial method cannot be called "technical analysis", for TA is something else.
Fundamental analysis is great! But I have trouble answering concrete questions of probability with it.
How do you use fundamental analysis to assign a probability to Nvidia closing under $100 this year, and what probability do you assign to that outcome?
I'd love to hear your reasoning around specifics to get better at it.
I think the idea of fundamental analysis is that you focus on return on equity and see if that valuation is appreciably more than the current price (as opposed to assigning a probability).
Don't you need a model for how people will react to the fundamentals? People set the price.
“In a gold rush, sell shovels”… Well, at some point in the gold rush everyone already has their shovels and pickaxes.
Or people start to realize that the expected gold isn't really there and so stop buying shovels
The version I heard growing up was "In a gold rush, sell eggs."
Well, not to be too egregiously reductive… but when the M2 money supply spiked in the 2020 to 2022 timespan, a lot of new money entered the middle class. That money was then funneled back into the hands of the rich through “inflation”. That left the rich with a lot of spare capital to invest in finding the next boom. Then AI came along.
Once the money dries up, a new bubble will be invented to capture the middle class income, like NFTs and crypto before that, and commissionless stocks, etc etc
It’s not all pump-and-dump. Again, this is a pretty reductive take on market forces. I’m just saying I don’t think it’s quite as unsustainable as you might think.
How much did you short the stock?
Add in the fact that companies seriously invested in AI (and like workloads typically reliant on GPUs) are also investing more into bespoke accelerators, and the math for nVidia looks particularly grim. Google’s TPUs set them apart from the competition, as does Apple’s NPU; it’s reasonable to assume firms like Anthropic or OpenAI are also investigating or investing in similar hardware accelerators. After all, it’s easier to lock in customers if your models cannot run on “standard” kit like GPUs and servers, even if it’s also incredibly wasteful.
The math looks bad regardless of which way the industry goes, too. A successful AI industry has a vested interest in bespoke hardware to build better models, faster. A stalled AI industry would want custom hardware to bring down costs and reduce external reliance on competitors. A failed AI industry needs no GPUs at all, and an inference-focused industry definitely wants custom hardware, not general-purpose GPUs.
So nVidia is capitalizing on a bubble, which you could argue is the right move under such market conditions. The problem is that they’re also alienating their core customer base (smaller datacenters, HPC, gaming market) in the present, which will impact future growth.

Their GPUs are scarce and overpriced relative to performance, which itself has remained a near-direct function of increased power input rather than efficiency or meaningful improvements. Their software solutions - DLSS frame-generation, ray reconstruction, etc - are locked to their cards, but competitors can and have made equivalent-performing solutions of their own with varying degrees of success. This means it’s no longer necessary to have an nVidia GPU to, say, crunch scientific workloads or render UHD game experiences, which in turn means we can utilize cheaper hardware for similar results. Rubbing salt in the wound, they’re making cards even more expensive by unbundling memory and clamping down on AIB designs.

Their competition - Intel and AMD primarily - are happily enjoying the scarcity of nVidia cards and reaping the fiscal rewards, however meager they are compared to AI at present. AMD in particular is sitting pretty, powering four of the five present-gen consoles, the Steam Deck (and copycats), and the Steam Machine, not to mention outfits like Framework; if you need a smol but capable boxen on the (relative) cheap, what used to be nVidia + ARM is now just AMD (and soon, Intel, if they can stick the landing with their new iGPUs).
The business fundamentals paint a picture of cannibalizing one’s evergreen customers in favor of repeated fads (crypto and AI), and years of doing so has left those customer markets devastated and bitter at nVidia’s antics. Short of a new series of GPUs with immense performance gains at lower price and power points with availability to meet demand, my personal read is that this is merely Jensen Huang’s explosive send-off before handing the bag over to some new sap (and shareholders) once the party inevitably ends, one way or another.
> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.
Exactly, it is currently priced as though infinite GPUs are required indefinitely. Eventually most of the data centres and the gamers will have their GPUs, and demand will certainly decrease.
Before that, though, the data centres will likely fail to be built in full. Investors will eventually figure out that LLMs are still not profitable, no matter how many data centres you produce. People are interested in the product derivatives at a lower price than it costs to run them. The math ain't mathin'.
The longer it takes to get them all built, the more exposed they all are. Even if it turns out to be profitable, taking three years to build a data centre rather than one year is significant, as profit for these high-tech components falls off over time. And how many AI data centres do we really need?
I would go further and say that these long and complex supply chains are quite brittle. In 2019, a 13 minute power cut caused a loss of 10 weeks of memory stock [1]. Normally, the shops and warehouses act as a capacitor and can absorb small supply chain ripples. But now these components are being piped straight to data centres, they are far more sensitive to blips. What about a small issue in the silicon that means you damage large amounts of your stock trying to run it at full power through something like electromigration [2]. Or a random war...?
> The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years and we're already seeing this as Google and others are extending their depreciation of GPU's to something like 5-7 years.
Yep. Nothing about this adds up. Existing data centres with proper infrastructure are being forced to extend use for previously uneconomical hardware because new data centres currently building infrastructure have run the price up so high. If Google really thought this new hardware was going to be so profitable, they would have bought it all up.
[1] https://blocksandfiles.com/2019/06/28/power-cut-flash-chip-p...
[2] https://www.pcworld.com/article/2415697/intels-crashing-13th...
According to nvidia’s 2025 annual report [1], 34% of their sales for 2025 comes from just 3 customers.
Additionally, they mentioned that customers can cancel purchases with little to no penalty and notice [2].
This is not unique among hardware companies, but it's striking that all it takes is one customer walking away to cut their sales by 12% (~$14B).
To cut to the point, my guess is that nvidia is not sustainable, and at some point one or more of these big customers won’t be able to keep up with the big orders, which will cause them to miss their earnings and then it will burst. But maybe i’m wrong here.
[1] https://s201.q4cdn.com/141608511/files/doc_financials/2025/a..., page 155: > Sales to direct Customers A, B and C represented 12%, 11% and 11% of total revenue, respectively, for fiscal year 2025.
[2] same, page 116: > Because most of our sales are made on a purchase order basis, our customers can generally cancel, change, or delay product purchase commitments with little notice to us and without penalty.
I have lots of skepticism about everything involved in this, but on this particular point:
It's a bit like TSMC: you couldn't buy space on $latestGen fab because Apple had already bought it all. Many companies would have very much liked to order H200s and weren't able to, as they were all pre-sold to hyperscalers. If one of them stopped buying, it's very likely they could sell to other customers, though there might be more administrative overhead?
Now there are some interesting questions about Nvidia creating demand by investing huge amounts of money in cloud providers that will order nv hardware, but that's a different issue.
It's probably not very likely that if a large buyer pulled out, NVIDIA could just sell to other customers. If a large buyer pulls out, that's a massive signal to everyone else to begin cutting costs as well. The large buyer either knows something everyone else doesn't, or knows something that everyone else has already figured out. Either way, the large buyer pulling out signals "I don't think the overall market is large enough to support this amount of compute at these prices at current interest rates" and everybody else is doing the same math too.
None of those customers can afford to cancel their orders. OpenAI, Google and Meta cannot afford to get cheap on GPUs when presumably they believe AGI is around the corner. The first company to achieve AGI will win because at that point all gains will become exponential.
All the AI companies are locked in a death loop where they must spend as much money as possible, otherwise everything they invested will immediately become zero. No one is going to pay for an LLM when the competitor has AGI. So it's a death loop for everyone that has become involved in this race.
He doesn't really address his own question.
He's answering the question "How should options be priced?"
Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.
Whether Nvidia specifically takes a big dive depends much more on whether they continue to meet growth estimates than general volatility. If they miss earnings estimates in a meaningful way the market is going to take the stock behind the shed and shoot it. If they continue to exceed estimates the stock will probably go up or at least keep its present valuation.
I've been selling options for almost a decade now, including running trading algorithms, and was laughing a bit to myself because it was basically just the math in an everyday option chain. As you already know, anyone can look at the strike they are talking about, with the IV already cooked into it, on platforms like Think Or Swim or even Yahoo Finance. Some of the stuff can be pretty useful though in backtesting and exploration.
All that aside, I'm impressed it made it to the HN front page.
> Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.
Other way around: if NVidia sinks, it likely takes a bunch of dependent companies with it, because the likely causes of NVidia sinking all tell us that there was indeed an AI bubble and it is popping.
Indeed, the market as a whole would be affected. But isn't NVIDIA more of a software company than a hardware one? This bugs the shit out of me.
They are maintaining this astronomical growth through data center margins from the design of their chips, and all of that started from graphics for video games.
> But is not NVIDIA more of a software company than a hardware one?
No? That’s why they have almost no competition. Hardware starting costs are astronomical
I'm surprised more people are not talking about the fact that the two best models in the world, Gemini 3 and Claude Opus 4.5, were both trained on Google TPU clusters.
Presumably, inference can be done on TPUs, Nvidia chips, in Anthropic's case, new stuff like Trainium.
Google is a direct competitor to many LARGE buyers of GPUs and therefore a non starter from a business perspective. In addition, many companies cannot single source due to risk considerations. Hardware is different because buying GPUs is a capital investment. You own the asset and revisit the supplier only at the next refresh cycle, not continuously as with rented compute.
It goes to nearly zero if China invades Taiwan, and that seems like it has at least a 10% chance of happening in the next year or two.
It doesn't go to nearly zero. TSMC has a large fab in Arizona and they are continuing to expand it. They also have a fab in Washington, and in Japan. [1]
[1]https://www.tsmc.com/english/aboutTSMC/TSMC_Fabs
The fab in Washington is very old (notice it's still equipped for 8 inch wafers) and so pretty irrelevant to Nvidia's business.
I'm not quite sure what process they run there but I believe it was an acquisition 10+ years ago, not built from the ground up by them.
Edit: their Japan fab is also a mature node so not very relevant here. And their Arizona fab is a very very small portion of their volume and with far worse margin.
I agree. It's funny that this is one of the cited reasons for the (relative) value suppression of TSMC, but the same factors should apply to Nvidia too.
Going to zero is one potential outcome. Equally plausible is it goes up 10% in a relatively quick battle or diplomatic outcome which ends the geopolitical uncertainty.
There's approximately 0% chance that China will ship leading edge wafers from captured TSMC to the West.
If China does invade Taiwan, I feel like most people are going to have bigger problems than the Nvidia stock price.
I think they are already hedging for Taiwan. 1. They just pseudo-acquired Groq, fully made in USA (GlobalFoundries) and with a diversified supply chain. 2. And they just announced they will be re-introducing RTX 3090 made in Korea (Samsung). 3. And they plan to produce chips in Intel's new US fabs soon.
I think the bigger problems of the AI bubble are energy and that it's gaining a terrible reputation for being the excuse for mass layoffs while suffocating the Internet with slop/brainrot content. All while depending on government funding to grow.
But then again what won't? Non tech stocks?
Yes, lots of other companies would be affected to a greater or lesser extent (even non-tech stocks), but specifically any company that relies on manufacturing all their product in Taiwan will be affected most of all.
Industrial military complex and government contractors.
Gold stocks, basic materials, MSCI world and emerging market indexes. Look at their prices and see how very smart people are positioning their money.
The whole economy will crash. Probably won't be due to China invading Taiwan though. More likely because the president decided to delete their country's world reserve currency status (which is another word for a trade deficit).
What does the US gov't do in response? Wouldn't they throw globs of money at Intel and Nvidia?
They already have.
Idk the pro china side is getting more and more support, at this rate they’ll vote themselves into mainland
Well, the reality is that most people don't want a bloodbath and it's increasingly looking like external support won't come, so what you gonna do... life is a very complex chess game, gotta play your pieces right.
At this rate, even if they can't get the Taiwanese population to consent, it probably makes more sense to wait anyway to see how low America can sink. The lower America goes, the better their chance for success.
Where do you see the pro china side getting more and more support? As far as I can tell it's sharply swung towards maintaining independence in the past decade or two with single digit support of unification with the mainland.
https://esc.nccu.edu.tw/PageDoc/Detail?fid=7801&id=6963
https://esc.nccu.edu.tw/upload/44/doc/6963/Tondu202512.png
An EU-type agreement will keep peace for some time. Remove all trade barriers between the two countries, have a treaty preventing either side from being used militarily by a third party, no attacking each other, and free movement of all vessels through each other's seas. Maybe a few more.
I think Taiwanese elites can be bought, they say they can’t but I think that’s just part of the bargaining for a higher price. The overtures towards a costly and destructive invasion is Chinas attempt at lowering that price. As is the strategy of building up an indigenous chip manufacturing industry. The aggressive rhetoric from China has the added benefit of keeping the US on a self sabotaging aggressive posture.
I mean that's obviously the best outcome for the Chinese government. Same thing that happens/ed to Hongkong. War is bad for everybody.
US blows up the fabs on the way out!
/s (unless???)
But they're expected to have 8 or 9 aircraft carriers by 2035; doesn't it make sense to wait until then?
It’s only about 200km across the strait. They have over a thousand fighters and a couple of hundred bombers capable of crossing that gap.
If the US is fighting with Europe and South America, China might not need that many.
NVIDIA has been producing Blackwell in Arizona since October. Don't be dramatic.
There would be a supply crunch but a lot of dollars will be shuffled VERY fast to ramp up production.
Arizona fabs don't work without Taiwan's many sole-source suppliers for fab consumables. They'll likely grind to a halt after a few months, when stock runs out. All the dollar shuffling's not going to replace a supply chain that will take (generously) years to build, if ever.
They definitely made at least one wafer in Arizona in October.
Packaging? Assembling onto boards?
China invading Taiwan makes zero sense, they just flex those muscles for domestic consumption. They will probably take over Taiwan, but they'll do it how modern major powers do anything: propaganda, influence campaigns, and soft power.
Russia invading Ukraine also made zero sense, given their actual capabilities and the likely (now realized) consequences. The leader doesn't always have the best information, it turns out.
Either that, or the leader does have access to the best information, and they just DGAF. That condition seems to be going around too.
They're enjoying massive demand for GPUs due to AI blowing up, at a time when there isn't much competition, yet the technology is already plateauing, with similar offerings from AMD, not to mention proven training & inference chips from Google & AWS, plus the Chinese national strategy of prioritizing domestic chips.
The only way the stock could remain at its current price or grow (which is why you'd hold it) is if demand just keeps going up (with the same lifecycle as current GPUs) and there is no competition, and the latter, to me, is just never going to be the case.
Investors are convinced that Nvidia can maintain its lead because they have the "software" side, i.e. CUDA, which to me is so ridiculous, as if with the kind of capital that's being deployed into these datacenters you couldn't fit your models into other software stacks by hiring people...
or couldn't use a LLM to help port your CUDA code to "new framework", i.e. software is no longer a lock-in....
assuming LLM coding agents are good, but if they aren't any good, then what is the value of the CUDA code?
> One of the questions of the 2026 acx prediction contest is whether Nvidia’s stock price will close below $100 on any day in 2026.
Maybe I’m missing something, but isn’t this just a standard American put option with a strike of $100 and expiry of Dec 31st?
No because if it goes to $99.99, you don't win much. With a prediction contest it is either you win or you lose.
Not really. American put options will pay differently for 95 dollars vs 99 dollars, while this contract settles to 1 either which way.
It’s a binary option.
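Right, and that "settles to 1 either way" structure has a closed form under the usual Black-Scholes assumptions: a cash-or-nothing put is worth the discounted risk-neutral probability of finishing below the strike, e^(-rT)·N(-d2). A hedged sketch with illustrative inputs only:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def binary_put(spot: float, strike: float, t: float, r: float, iv: float) -> float:
    """Cash-or-nothing put: pays 1 if the underlying finishes below the
    strike, 0 otherwise -- the same 1 whether it closes at $99.99 or $50,
    unlike a vanilla put.

    Under Black-Scholes assumptions its value is the discounted
    risk-neutral probability of finishing below the strike.
    """
    d2 = (log(spot / strike) + (r - 0.5 * iv * iv) * t) / (iv * sqrt(t))
    return exp(-r * t) * norm_cdf(-d2)

# Illustrative inputs, not real quotes: at-the-money, one year, zero
# rate, 20% IV -- comes out a bit above 0.5 because the -iv^2/2 term
# drags d2 slightly negative.
print(binary_put(100.0, 100.0, 1.0, 0.0, 0.2))
```

That is also why reading a "market probability" off option prices is natural here: the binary's price (up to discounting) *is* the risk-neutral probability, not the real-world one.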
As others have noted, the article is analysing the actual financial markets angle.
For my two cents on the technical side, it is likely that any Western-origin shakiness will come from Apple and how it manages to land the Gemini deal and Apple Intelligence v2. There is an astounding amount of edge inference sitting in people’s phones and laptops that only slightly got cracked open with Apple Intelligence.
Data centre buildouts will get corrected when the numbers come in from Apple: how large of a share in tokens used by the average consumer can be fulfilled with lightweight models and Google searches of the open internet. This will serve as a guiding principle for any future buildout and heavyweight inference cards that Nvidia is supplying. The 2-5 year moat top providers have with the largest models will get chomped at by the leisure/hobby/educational use cases that lightweight models capably handle. Small language and visual models are already amazing. The next crack will appear when the past gen cards (if they survive the around the clock operation) get bought up by second hand operators that can provide capable inference of even current gen models.
If what we know about DC operators holds (e.g. Google and its aging TPUs that still get use), the providers with the resources to buy new space for newer gens will keep accumulating hardware, while the providers without them will need to continuously shave off the financial hit that comes with running less efficient older cards.
I’m excited to see future blogs about hardware geeks buying used inference stacks and repurposing them for home use :)
>when the numbers come in from Apple: how large of a share in tokens used by the average consumer can be fulfilled with lightweight models and Google searches of the open internet
is there any reason to expect that this information will ever be known outside of apple?
Accuracy wise we won't know the exact numbers, but insiders and industry experts usually are able to find ballpark figures that they share with the press. The alternative is the usual find out the estimates through competitors' lost MAU numbers in apps like ChatGPT for iOS.
It's entirely possible it will crash, but I also don't think it'll go bankrupt or anything.
I don't typically buy stock to flip it right away; I have some Nvidia stock that I bought the day after ChatGPT was launched, and I bought a bit more when it was $90/share about a ~year ago. If it drops to $100, then I'll still be in the black, but even if it drops to $50, I'm not going to worry because I figure that I can just hold onto it until another upswing.
Nvidia has been around long enough and has enough market penetration in datacenters and gaming that I don't think it's going to go bust, and I figure that it will eventually appreciate again just due to inflation.
> Nvidia has been around long enough and has enough market penetration in datacenters and gaming that I don't think it's going to go bust, and I figure that it will eventually appreciate again just due to inflation.
Shouldn't the same argument also apply to Intel?
Intel doesn't make desirable products
If it appreciates just due to inflation you didn't actually make any money. Unless your investment beats inflation, you lose value.
I don't dispute that all. I'm just ok with that being the outcome; if it keeps up with inflation then it's still better than storing it in a bank, and the thing that would upset me more than "not gaining" would be "actively losing".
Now obviously, if it drops below from what I paid for it and then it takes inflation to catch up, then yeah, that's definitely "lost money", but that's just the risk of the stock market, especially with individual stocks. I also think that if it crashes, Nvidia might still have another surge eventually, even if it doesn't get back to its full glory.
I definitely would not buy new Nvidia stock at its current price though.
It's a problem if you have to keep asking "are we in a bubble?"
Everyone is saying data center buildouts are the main thing to look out for. But those data centers with all those GPUs will need to replace those GPUs, right? Nvidia will come up with better, faster, more efficient GPUs.
LLM usage won't crash either; it might decline or taper off, but it's here to stay.
My concern is better models that won't need as much GPU compute, or China coming up with their own foundries and GPUs that compete. There is also the strategy issue: can Nvidia's leadership think globally enough? Will they start pursuing data centers in Europe, LatAm, Asia? Can they make GPUs cheap enough to compete in those regions?
The way things are, lots of countries want this tech local, but they can't deny the demand either.
Europe for example might not want anything to do with American AI companies, but they still need GPUs for their own models. But can Nvidia rebrand itself as a not-so-American-but-also-American company? Like Coca-Cola, for example. I.e.: not just operate in Europe but have an HQ in Europe with half their execs working from there, and the rest from California. Or perhaps Asia is better (doubt)? Either way, they can't live off of US demand forever, or ignore geopolitics.
There is one thing everybody forgets when making such predictions: companies don't stand still. Nvidia and every other tech business is constantly exploring new options, taking over competitors, buying startups with novel technologies, etc. Nvidia is no slouch in that regard, and their recent quasi-acquisition of Groq is just one example of this. So, when attempting to make predictions, we're looking at a moving target, not a system set in stone. If the people at the helm are smart (and they are), you can expect lots of action and ups and downs - especially in the AI sphere.
My personal opinion, having witnessed first hand nearly 40 years of tech evolution, is that this AI revolution is different. We're at the very beginning of a true paradigm shift: the commoditization of intelligence. If that's not enough to make people think twice before betting against it, I don't know what is. And it's not just computing that is going to change. Everything is about to change, for better or worse.
I don't think it'll crash. The US Federal Government will throw money at anything right now.
*checks calendar* Ah, NVIDIA earnings call is close - prepare for the inevitable doomer articles.
I would be wary of taking analysis like this article at face value unless you know enough about quant finance to check some of the working for yourself. Just a skim shows a few statements that are questionable at best. E.g.:
> the theory of unbiased random walks assumes constant volatility throughout the year
No. I’m pretty sure it doesn’t. If you assume a Brownian motion with constant volatility as your stochastic process for computing the walk, then of course vol is constant by definition, but you can use a stochastic vol process (eg Heston[1]), one with jumps, or even an SVJJ process[2] to compute the walk if you want to. As long as you don’t have a drift term and the jumps are symmetrical, the process will still (I think) be unbiased.
There are technical reasons why it may or may not be important to use stochastic vol, but if I recall correctly, it only really matters if you care about “forward volatility” (eg the volatility of Nvidia one year from some future point in time), which you would if you were pricing something that uses forward-starting options. Then the term structure of the volatility surface at a future date is important, so you need a stochastic vol model. If you care about the price evolution but not the future volatility, then you can validly make the simplifying assumption that jumps will cancel each other out over time and that volatility is a locally deterministic function of time and price (if not constant, which it obviously is not) and use something like a Dupire local vol model.[3]
More significantly, implied volatility is just the market price of a particular option expressed in terms of volatility. This is convenient for traders so they can compare option prices on a like for like basis between underlyers without constantly having to adjust for differences in the underlying price, strike and time. Implied volatility is not actually the overall expected volatility of the underlying instrument. For that, you would have to fit one of the models above to market prices and calculate the expectation over all strikes and times. And that still is just the market’s opinion of the volatility, not an actual probability even if you apply the BoE adjustment thing he does in the article.
[1] https://www.homepages.ucl.ac.uk/~ucahgon/Heston.pdf
[2] “SVJ” means stochastic vol with jumps (ie discontinuities) in the underlying price evolution. SVJJ means stochastic vol with jumps both in the price of the underlying and in the volatility. An example of this is the Matytsin model, which everyone just calls “SVJJ”, but it’s not the only possible SVJJ model https://www.maplesoft.com/support/help/maple/view.aspx?path=...
[3] https://www.math.kth.se/matstat/gru/5b1575/Projects2016/Vola...
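For the curious, the "use a stochastic vol process to compute the walk" idea from the comment above can be sketched in a few lines. This is a toy Euler discretization of a driftless Heston-style model; every parameter (`kappa`, `theta`, `xi`, `rho`, the vol levels) is invented for illustration, not calibrated to NVDA or anything else:

```python
import math
import random

def heston_walk(s0=100.0, v0=0.09, kappa=2.0, theta=0.09, xi=0.5,
                rho=-0.7, t=1.0, steps=64, n_paths=5000, seed=42):
    """Euler-type scheme for a driftless Heston process. With zero drift
    the price is a martingale, so E[S_T] = s0 -- the 'unbiased' property
    holds even though volatility itself is stochastic."""
    rng = random.Random(seed)
    dt = t / steps
    sq_dt = math.sqrt(dt)
    terminal = []
    for _ in range(n_paths):
        s, v = s0, v0
        for _ in range(steps):
            z1 = rng.gauss(0.0, 1.0)
            # correlate the variance shock with the price shock
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            vp = max(v, 0.0)  # full truncation keeps sqrt() well-defined
            s *= math.exp(-0.5 * vp * dt + math.sqrt(vp) * sq_dt * z1)
            v += kappa * (theta - vp) * dt + xi * math.sqrt(vp) * sq_dt * z2
        terminal.append(s)
    return terminal
```

Because there is no drift term, the simulated terminal prices average back to the starting price, which is exactly the "unbiased" behaviour described above, despite volatility following its own random process.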
This is more of a derivative pricing article and has nothing to do with nvidia really
NVIDIA Vera Rubin NVL72 unveiled at CES makes any other computer look like a pocket calculator, and that's why I wouldn't want to be bearish on NVDA right now, see https://www.nvidia.com/en-us/data-center/vera-rubin-nvl72
Who said that monads don't have any application?
They implement Applicative, so by definition they do
New competition is an issue. It wasn’t as lucrative to compete with nvidia in the past
How much of their turnover is financed, directly or indirectly, by themselves, then leveraged further by their 'customers' to collateralize further investments?
Are they already "too big to fail"? For better or worse, they are 'all in' on AI.
The thing is, in this gold rush, Nvidia is the one selling shovels.
Predicting any stock will crash, be it from a technical analysis or from looking at fundamentals, is a fool's game. As Keynes allegedly said, the market can stay irrational longer than you can remain solvent.
The poster child for this is Tesla. Nothing fundamental justifies Tesla's valuation.
IMHO the only rational way to look at the future of AI and the companies from profit from it is to look at geopolitics.
The market seems to have decided there's going to be one winner of the AI race. I actually don't think that'll be OpenAI. Of the companies currently in the race, I think it'll be Google or Nvidia. But I also don't think it'll be either of them.
The magic of software is that it is infinitely reproducible. That makes it difficult to build a wall around it. Microsoft, Facebook, Apple and Google have built moats around their software very successfully in spite of this. Google's big advantage in the AI race is their ability to build and manage data centers and that they'll probably end up relying on their own hardware rather than NVidia's.
I think China will be the AI winner or they'll make sure there is no winner. It's simply too important to them. For me, DeepSeek was a shot across the bow that they were going to commoditize these models.
The US blocked the export of the best lithography machines AND the best chips to China. IMHO this was a mistake. Being unable to import chips meant Chinese companies had no choice but to make their own. This created a captive market for China recreating EUV technology. Chinese companies have no choice but to buy Chinese chips.
The Chinese government has the patience and infrastructure for recreating ASML's technology and it's an issue of national security. And really all it takes is hiring a few key people to recreate that technology. So Western governments and analysts who said China will take 20+ years to catch up (if they ever do) simply don't understand China or the market they're talking about.
They sound exactly like post-WW2 generals and politicians who thought the USSR would take 20+ years to copy the atomic bomb. It took 4 years. And hydrogen bombs came even quicker.
There's a story that should get more attention: China has reportedly refused a deal for NVidia's latest chips [1]. If true, why do you think they're doing that? Because they don't want to be reliant on foreign chips. They're going to make their own.
[1]: https://ca.finance.yahoo.com/news/nvidia-stock-slides-china-...
That's smoke and mirrors. You can't logically predict the market; it has never worked.
Sure you can predict the market. Making money off of it beyond the regular risk-adjusted return is what's hard. (And the prediction of this article is indeed based on that assumption.)
This is fun math to play with, but it completely misses the point of how and why options are priced the way they are. Think of horse racing: when a horse is posted at 1000-to-1, the actual probability of that horse winning is much, much lower than those odds imply. The probability is non-zero, but the oddsmakers are considering who the other side is and why they are buying that ticket.
Most options are actually used to hedge large positions and are rolled over well before the "due date". YOLOing calls and puts is a Robinhood phenomenon, and the odds of "fair pricing" are heavily affected by these big players, so using that data as some sort of price discovery is flawed from the get-go.
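The horse-racing analogy can be made concrete. Posted odds embed the bookmaker's margin (the "overround"), so naively inverting them overstates every horse's chance; stripping the margin already pushes the longshot's probability below the naive 1-in-1001, and the comment's claim is that the true skew on longshots is larger still. A toy sketch with made-up odds:

```python
def implied_probs(fractional_odds):
    """Convert fractional odds (e.g. 1000.0 means '1000 to 1') into implied
    probabilities, then normalize away the bookmaker's margin (overround)."""
    raw = [1.0 / (o + 1.0) for o in fractional_odds]
    overround = sum(raw)  # > 1 when the book is taking a margin
    return [p / overround for p in raw]

# A favourite at 1-to-2, two mid-field horses, and a 1000-to-1 longshot:
probs = implied_probs([0.5, 2.0, 4.0, 1000.0])
```

After normalization the probabilities sum to one, and the longshot's de-margined probability comes out below the naive 1/1001 that its posted odds suggest.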
Are you saying options are not priced at the cost of hedging them? That implies a lot of money could be made by arbitraging between the hedge and the option.
That sounds like an egregious statement. Markets don't have simple persistent arbitrage opportunities like that, do they?
Was expecting some actual reasons presented as to why this would happen; instead got some math.
The simple answer to the question:
Nvidia stock crash will happen when the vendor financing bubble bursts.
They are engaged in a dangerous game of circular financing. So it is a case of when, not if, the chickens come home to roost.
It is simply not sustainable.
The real question is what else will this cause to fall when it does happen.
This is a "market can stay irrational" problem. Modern compute infrastructure (phones, 5G, data centers, LLMs) has its energy economics exactly backwards, which, combined with plastic waste, is causing a massive global economic distortion that will correct itself, because physics doesn't care about the White House, Vanguard, TSMC or anyone else. How long we can borrow from the future and put stress on the poor to prop up this insanely wasteful system, who can say.
10% given the info we have now? Or 10% given the info that is likely to come in the future and would satisfy the 10% prediction? Or are those two the same?
Its forward P/E is 24-26. That doesn't suggest a huge crash is coming. It could come down a bit, but they print money. They also have the potential car and robotics markets coming.
This kind of prediction is hard because it depends on when AI companies will crash. The market would need to lose confidence in AI, which would halt data-centre creation and then impact Nvidia.
There's a bet here on profitability and it needs to play out.
How long do investors normally wait to see if a bet on new technology is a winner? I imagine that's quite arbitrary?
I'm calling it - this is a submarine article to prove that Haskell is used in the real world to solve actual problems
https://archive.is/HXmoa
Worth noting that the implied volatility extracted here is largely a function of how far OTM the strike is relative to current spot, not some market-specific view on $100. If NVDA were trading at $250 today, the options chain would reprice and you'd extract similar vol for whatever strike was ~45% below. The analysis answers "what's the probability of a near-halving from here" more than "what's special about $100." Still useful for the prediction contest, but the framing makes it sound like the market is specifically opining on that price level.
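To illustrate the moneyness point: under plain Black-Scholes with zero rates, the risk-neutral probability of finishing below a strike is N(-d2), which depends on spot and strike only through their ratio. A minimal sketch with invented numbers (not real NVDA quotes):

```python
from math import log, sqrt, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def prob_below(spot: float, strike: float, vol: float, t: float) -> float:
    """Risk-neutral P(S_T < K) under Black-Scholes with zero rates: N(-d2).
    Note it depends on spot and strike only through the ratio spot/strike."""
    d2 = (log(spot / strike) - 0.5 * vol * vol * t) / (vol * sqrt(t))
    return norm_cdf(-d2)

# Same moneyness ratio gives the same probability regardless of price level:
p1 = prob_below(spot=180.0, strike=100.0, vol=0.55, t=1.0)
p2 = prob_below(spot=250.0, strike=250.0 * 100.0 / 180.0, vol=0.55, t=1.0)
```

Here `p1` and `p2` agree to rounding error, which is the parent's point: at the same implied vol, the options market assigns a "near-halving" probability based on the distance from spot, not on anything special about the $100 level itself.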
this is gpt, right?
There are grammatical mistakes and abbreviations, big tells that it's NOT ChatGPT.
I had a conversation (prompts) with Claude about this article because I didn't feel I could as succinctly describe my point alone.
Nvidia PE ratio: 44
I do hope they crash so that I can buy as much as possible at a discount.
Them being far above the median PE ratio for the S&P 500 tells you that a future correction would be a discount and you should buy? Please walk me through your logic on this one.
Every gambler thinks they can time the market, and buy the dip.
In general, they often get stung by the dead cat bounce, =3
https://en.wikipedia.org/wiki/Dead_cat_bounce
This implies you think a crash would be a temporary mispricing of the stock, which will recover in value, correct?
While I am no fan of NVIDIA, they effectively have a monopoly on CUDA GPUs.
This means that cash revenue will likely remain high long after the LLM hype bubble undergoes correction. The market will eventually saturate as incremental product improvements stall, and demand rolls off rather than implodes. =3
The option market is an insurance market.
Most people buy low-strike puts as insurance against catastrophic market events.
Since catastrophic crises are rare, the price of these puts is quite low. But since many people fear a crisis, the price is very inflated over the actual probabilities. Which is why there are lots of people selling those puts as a business. These guys will bite the dust in case of a major crisis, but will make a ton if the market stays afloat.
Realistically, the current US government is so obsessed with its image that it will do everything to avoid a market crash during its term. The president has been pushing for lower rates for a while, and he's likely going to succeed in removing the head of the Fed and do just that. Lowering interest rates is just another way of pumping investment.
NVidia is definitely not going below $100 in 2026.
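The put-selling business described in this comment can be sketched with a toy one-period expected-value calculation (all numbers invented): the seller profits on average exactly when crashes happen less often than the premium implies, and eats a large loss when one does occur.

```python
def seller_expected_pnl(premium: float, crash_prob: float,
                        crash_payout: float) -> float:
    """Expected P&L per contract-share for a put seller who collects
    `premium` up front and pays out `crash_payout` with probability
    `crash_prob` (toy one-period model, invented numbers)."""
    return premium - crash_prob * crash_payout

# The premium implies a ~4% crash chance (2.0 / 50.0); if the true
# probability is only 2%, selling the put has positive expected value:
ev = seller_expected_pnl(premium=2.0, crash_prob=0.02, crash_payout=50.0)
```

The break-even point is where the true crash probability equals `premium / crash_payout`; below that, the "fear premium" the comment describes is the seller's edge, and above it, the seller is the one holding the bag.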
I’m more curious how these “futures” contracts will work out. Supposedly, a bunch of RAM has been paid for and allocated that isn’t even made yet. If the bubble ever pops, the collateral damage is going to be on the order of the 2007 subprime mortgage crisis.
Since there's such an interdependence between Nvidia and the other companies involved in AI, to the point that if one fails they all fail, shouldn't the analysis focus on the weakest link in the AI circle jerk?
Nvidia is the biggest link; however, I'd wager OpenAI and the like are big enough to make a significant dent in the mammoth. So yeah, this analysis is sort of a spherical-cow-in-a-vacuum situation.
Still, it's interesting the probability is so high while ignoring real-world factors. I'd expect it to be much higher due to:
- another adjacent company dipping
- some earnings target not being met
- China/Taiwan
- just the AI craze slowing down
Yeah, no signs that the AI craze is slowing down, other than all the stories of it not living up to the hype, of it creating more jobs rather than replacing people, of it not doing what it says on the tin, the various security issues, and that whole "they can't expand for the next decade because they need more power plants" thing.
I mean, common-sense reasoning tells me that if OpenAI has decided to turn into an ad business, the actual return expected from investing in compute isn't going to be nearly as great as advertised.
"Predictably" prediction markets have opened up space in the void left by journalism for tea-leaf reading with the fig leaf of mathy jargon.
People don't actually believe this type of analysis... do they?
You have it turned upside down. The analysis is of people's beliefs. In other words, the underlying data is created from the beliefs of the people who trade it, and the analysis is taking those beliefs and applying it to a specific question.
The entire options market is built on this kind of analysis.
does that include the chance for a stock split?
Great question! The fine print of the text on Metaculus says the outcome will be judged as if there were no stock split. The question is essentially about the valuation of the company but phrased operationally about the stock price.
Click bait for teaching options analysis
ITT: "trust me bro, capitalism will fail this time."
It's easy to predict that a bubble will pop, but there's a variance in the timing of approximately half a human lifetime, and if you don't guess that correctly, you throw away yours.
Everything that can't go on forever will eventually stop. But when?
Well put and clearly explains why "timing the market" is never a good plan.
Technical analysis is amazing; it is the most refined form of pseudoscience.
This isn't technical analysis, this is an article on how to use the options market's price discovery mechanism to understand what the discovered price implies about the collective belief about the future price of the underlying.
That's what "technical analysis" means in the finance world, though... so, am I missing a joke?
how so? (i'm not too familiar)