Comment by gck1
2 days ago
> Yes the models are smart, but you really can't "build things" despite the marketing if you actively beat back your users for trying
I think this is the most important takeaway from this thread, and at some point it will come back to bite Google and Anthropic.
OpenAI seems to have realized this and is actively doing the opposite. They welcomed OpenCode the same day Anthropic banned it, and X is full of posts saying Codex's $20 plan is more generous than Anthropic's $200 one, etc.
If you had told me this story a year ago without naming companies, I would have guessed it was OpenAI banning people and Google burning cash to win the race.
And it's not like their models are winning any awards in the community either.
My impression is there's a definite shortage of GPUs, and if OpenAI is more reliable it's because they have fewer customers relative to the number of GPUs they have. I don't think Google is handing out 429s because they are worried about overspending; I think it's because they literally cannot serve the requests.
This sounds very plausible. OpenAI has hoarded 40% of the world's RAM supply, which it likely has no use for other than to starve the competition. They (or other competitors) could be using the same strategy for other hardware.
Which is worrying, because if this continues, and if Google, which owns GCP, is struggling to serve requests, there's no telling what will happen to services like Hetzner.
> OpenAI has hoarded 40% of world's RAM supply
I believe OpenAI's purchasing is somewhat overstated, and it certainly has no effect on Google's current ability to serve Gemini requests. But it is obvious that there's a shortage of most components, and it's also obvious that even internally Google is having to make hard choices about who gets GPU time, and when.
I definitely think OpenAI likely has less use for GPUs than Google. Google has roughly $300B in annual revenue vs. about $20B for OpenAI. Even if you assume 100% of OpenAI's revenue goes to renting GPUs and they take a 50% loss on top, there's still plenty of room for Google to stay profitable, spend more on GPUs, and still not have enough of them. Google also simply has a wider variety of models to train and run, from Waymo to Search to whatever advertising models.
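The back-of-envelope arithmetic above can be made explicit. This is a toy sketch using the comment's own rough figures (the revenue numbers and the 50% loss assumption come from the comment, not from any filing):

```python
# Hypothetical figures from the comment above, in billions of USD.
openai_revenue = 20     # ~$20B annual revenue (rough estimate)
google_revenue = 300    # ~$300B annual revenue (rough estimate)

# If OpenAI spent 100% of its revenue on GPU rental and took a 50% loss
# on top of that, its total GPU spend would be 1.5x its revenue.
openai_gpu_spend = openai_revenue * 1.5

# Even if Google matched that spend dollar for dollar, it would retain
# the overwhelming bulk of its revenue.
google_remaining = google_revenue - openai_gpu_spend

print(openai_gpu_spend)   # OpenAI's implied max GPU spend
print(google_remaining)   # Google's revenue left over after matching it
```

The point is just that matching OpenAI's theoretical maximum GPU budget would consume a tenth of Google's revenue, so money is not the binding constraint; hardware supply is.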
https://docs.hetzner.com/general/infrastructure-and-availabi... price increases
OpenAI depends on the same hyperscalers (most notably Microsoft/Azure) as everyone else, and even has access to preferential pricing through that partnership.
A better explanation is that ChatGPT is still far and away the most popular AI product on the planet. With so many more users, multi-tenant effects are stronger: load is spread across a larger pool of GPUs. Think of S3: requests for a million files may hit literally a million different drives given the sheer size of S3, and even a huge spike from a single customer fades into relatively stable overall load because there are so many tenants. OpenAI likely has similar hardware efficiencies at play and can thus afford to be more generous with spikes from individual customers using OpenCode and the like.
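The multi-tenant smoothing effect described above can be sketched with a toy simulation (purely illustrative; the spike probability and load numbers are made up, and this says nothing about OpenAI's actual infrastructure). Each tenant is mostly idle but occasionally spikes hard; the relative variation of the aggregate shrinks roughly as 1/sqrt(N) with the number of tenants:

```python
import random
import statistics

random.seed(0)

def relative_spread(num_tenants, ticks=2000):
    """Coefficient of variation (stdev/mean) of total load over time."""
    totals = []
    for _ in range(ticks):
        # Each tenant is idle (load 1) most ticks, but spikes to 100
        # with 5% probability -- very bursty individually.
        total = sum(100 if random.random() < 0.05 else 1
                    for _ in range(num_tenants))
        totals.append(total)
    return statistics.stdev(totals) / statistics.mean(totals)

few = relative_spread(10)      # small provider: spiky aggregate
many = relative_spread(1000)   # large provider: smooth aggregate
print(f"10 tenants: {few:.2f}  1000 tenants: {many:.2f}")
assert many < few  # more tenants -> smoother total load
```

The same bursty per-tenant behavior produces a far steadier total at 1000 tenants than at 10, which is why a single customer's spike is easier to absorb on a bigger platform.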
I would guess the biggest AI product on the planet is Google's Search AI. Although even that might not be the case, unless your definition of AI is just "LLMs" and not any sort of ML that requires a GPU.
You can build plenty with the Google AI Pro plan and Antigravity. Yeah, there are some limits that should be higher, but you can still build stuff.
It's unfortunate, though, that they deceive by calling themselves "Open"AI when they are in fact closed. And the whole non-profit-to-profit conversion and the Microsoft deals are just untrustworthy and unethical.
They also actively employ dark strategies in cooperation with the CIA, and who knows when they will pull the rug out from under you again.
Do you really trust a fundamentally rotten group of people who avoid accountability?
I don't know what it's called when an irony itself becomes ironic, but that's where OpenAI is today. On one hand, they started the whole "we're closing things down for safety" line and normalized $200/mo subscriptions; yet now they're becoming the most open of the big three AI companies. Their tooling is open source, they're lenient with quotas on the lower plans, and their allowance of third-party integrations is also unique.
I would still consider the OpenAI name a misnomer, but of the three, they kind of are open.