Comment by lairv
20 hours ago
NVIDIA stock tanked in 2025 when people learned that Google used TPUs to train Gemini, something everyone in the community has known since at least 2021. So I think it's very likely that NVIDIA stock could crash for non-rational reasons
It also tanked to ~$90 when Trump announced tariffs on all goods for Taiwan except semiconductors.
I don't know if that's non-rational, or if people can't be expected to read the second sentence of an announcement before panicking.
The market is full of people trying to anticipate how other people are going to react and exploit that by getting there first. There's a layer aimed at forecasting what that layer is going to do as well.
It's guesswork all the way down.
A bunch of "Greater Fool" motivation too.
https://en.wikipedia.org/wiki/Greater_fool_theory
Personally, I try to predict how others will predict how yet others will react.
Keynesian beauty contest.
This was also on top of claims (Jan 2025) that DeepSeek had shown "we don't actually need as many GPUs, so Nvidia is less needed"; my impression, at least, was that this was one of the (now silly-seeming) reasons NVDA dropped then.
> I don't know if that's non-rational, or if people can't be expected to read the second sentence of an announcement before panicking.
These days you have AI bots doing sentiment-based trading.
If you ask me... all these excesses are a clear sign of one thing: we need to drastically rein in the stonk markets. The markets should serve us, not the other way around.
Google did not use TPUs for literally every bit of compute that led to Gemini. GCP has millions of high-end Nvidia GPUs, and programming for them is an order of magnitude easier, even for Googlers.
Any claim from Google that all of Gemini (including previous experiments) was trained entirely on TPUs is a lie. What they are truthfully saying is that the final training run was done entirely on TPUs. The market shouldn't react heavily to this; it should instead react positively to the fact that Google is now finally selling TPUs externally and their fab yields are better than expected.
> (including previous experiments)
How far back do you go? What about experiments into architecture features that didn’t make the cut? What about pre-transformer attention?
But more generally, why are you so sure that the team that built Gemini didn't exclusively use TPUs while they were developing it?
I think one of the reasons Gemini caught up so quickly is that they have so much compute at a fraction of everyone else's price.
Why should it not react heavily? What's stopping this from being the start of a trend for Google and even Amazon?
They are not lies.
JAX is very easy to use. Give it a try.
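For a taste, here's a minimal sketch of the grad/jit workflow (the toy loss and shapes are purely illustrative, nothing to do with any actual Gemini code):

    import jax
    import jax.numpy as jnp

    # Toy mean-squared-error loss; jax.grad differentiates it with respect
    # to the first argument, and jax.jit compiles it via XLA for CPU/GPU/TPU.
    def loss(w, x, y):
        return jnp.mean((x @ w - y) ** 2)

    grad_fn = jax.jit(jax.grad(loss))

    w = jnp.zeros(3)
    x = jnp.ones((8, 3))
    y = jnp.ones(8)
    print(grad_fn(w, x, y))  # gradient of the loss with respect to w

The same code runs unchanged on a TPU pod slice or an Nvidia GPU, which is a big part of the appeal.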