Comment by lelanthran
20 hours ago
Firstly, the $200/month plan is sold at a loss; they'll make a profit on PAYG tokens, not plans.
Secondly, this is looking very risky: they are at the bottom of the value chain, and eventually they'll be running on razor-thin margins like all actors at the bottom of the value chain.
Anything they can offer is doable by their competitors (in Google's case, they can even do it cheaper due to owning the vertical, which OpenAI doesn't).
Their position in the value chain means they are in a precarious spot: any killer app for AI that comes along will be owned by a customer of OpenAI, and if OpenAI attempts to skim that value for itself, that customer will simply switch to a new provider, of which there are many, including, eventually, the customer themselves, should they decide to self-host.
Being an AI provider right now is a very risky proposition, because any attempt to capture value can be immediately met with "we're switching to a competitor" or even the nuclear option: "fine, we'll self-host an open model ourselves".
We'll know more only when we see what the killer app is, when it eventually comes.
The real revenue opportunity for OpenAI is advertising. More than 25% of Americans use ChatGPT instead of Google, and OpenAI has already announced a partnership with Shopify to list products directly. But for now they are focused on market share.
Google does not really own the complete AI stack, NVDA is extracting a lot of the value there.
Google has two other impediments to doing what ChatGPT does.
Google's entire business model is built around search. They have augmented search with AI, but that is not the same as completely disrupting an incredibly profitable business model with an unprofitable, unproven one.
Also... Americans are in the habit of going to ChatGPT now for AI. When you think of AI, you now think of ChatGPT first.
The real risk is that we are at the tail end of a long economic boom cycle, OpenAI is incredibly dependent on additional funding rounds, and if we go into a recession, access to that funding gets cut off.
I would argue that Google is even better placed for advertising. All they need to do is enable advertising in Gemini. There is a whole ecosystem already in place for Google advertising.
> Google does not really own the complete AI stack, NVDA is extracting a lot of the value there.
...not sure what you're implying.
Google most definitely has their own stack (spanning hardware-to-software) for AI. Gemini was trained on in-house TPUs:
https://www.forbes.com/sites/richardnieva/2023/12/07/google-...
Many people have a lot of context built up with ChatGPT. I know people who refuse to try Anthropic because it "doesn't know them as well" & can't answer their questions.
HN views this as negative but many people see this as a positive.
Except Google search keeps growing per their last earnings report. You'd think if 25% of Americans have switched to ChatGPT it would have hit the numbers by now...
We’re still in the period where people ask Google before reverting to ChatGPT. Wait until habits change.
I also believe the main business will be via APIs and integrations, but wouldn't be surprised if, on the consumer side, it ends up being on phones, in your house à la Alexa, in your car, etc. Big brands typically win in B2C. Tons of affiliate and transactional potential (i.e., do my grocery shopping or buy a t-shirt). That's assuming LLMs don't plateau and become generic with minor specialization, the way databases did.
> Google's entire business model is built around search. They have augmented search with AI
No, it's that Google Search doesn't find anything anymore. You search for a class name and it doesn't index those anymore. So you revert to asking it a question about your bug, but it's not AI-fied enough. Perplexity and ChatGPT find what Google chose to stop indexing.
Google may be built around advertising, but certainly not around Search.
I don't believe that 25% quote, at all.
>Google does not really own the complete AI stack, NVDA is extracting a lot of the value there.
Google doesn't use Nvidia hardware at all, except to offer it to customers on their cloud. They don't use it for training, nor do they use it for inference.
I feel like "at the bottom of the value chain" is a miscategorization. If you consider the base LLM as their sole offering I agree with you, but these companies have shown an eagerness to eat their way up the value chain. Agent mode, Search, Study Mode, and AI code editors are examples of products that could be higher-on-the-chain startups but are offered in-house by OpenAI.
This reminds me of Amazon choosing to sell products that it knows are doing well in the marketplace, out-competing third-party sellers. OpenAI is positioned to out-compete its competitors on virtually anything because they have the talent and, more importantly, control over the model weights and the ability to customize their LLMs. It's possible the "wrapper" startups of today are simply doing the market research for OpenAI and are in danger of being consumed by it.
OpenAI, valued at 300B, will never be able to produce the same "wrapper" products that these five-person startups are making. Same reason Facebook could not make Instagram, or Jira could never make Basecamp, for example.
Counterexample: Facebook made Threads, which now has a similar number of users to Twitter.
It is difficult to host a ~617B-parameter model locally.
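Rough napkin math, just for the weights (assuming "617B" means parameters; KV cache, activations, and batching overhead are ignored, so real requirements are higher):

    # Back-of-envelope memory needed just to load ~617B parameters.
    # The bytes-per-parameter values are assumptions (FP8 vs FP16).
    params = 617e9
    for precision, bytes_per_param in [("FP8", 1), ("FP16", 2)]:
        gb = params * bytes_per_param / 1e9
        print(f"{precision}: ~{gb:,.0f} GB of weights alone")
    # FP8: ~617 GB, FP16: ~1,234 GB -- a rack of 80 GB accelerators
    # before serving a single request, which is why "just self-host" is hard.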
OpenAI will never be profitable selling raw tokens.
They need the application layer that allows them to sell additional functionality and decouple the price of a plan from the cost of tokens. See Lovable: they abstract tokens away as "credits" and most likely sell them at a ~10x markup.
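A toy sketch of that decoupling (the provider price, credit size, and 10x multiplier are illustrative guesses, not Lovable's actual numbers):

    # Hypothetical application-layer pricing: users buy "credits",
    # the app buys tokens. Every number below is an assumption.
    COST_PER_1M_TOKENS = 3.00   # what the app pays its model provider, USD
    TOKENS_PER_CREDIT = 10_000  # how many tokens one "credit" covers
    MARKUP = 10                 # the speculated ~10x

    cost_per_credit = COST_PER_1M_TOKENS * TOKENS_PER_CREDIT / 1_000_000
    price_per_credit = cost_per_credit * MARKUP

    print(f"App's token cost per credit: ${cost_per_credit:.2f}")   # $0.03
    print(f"Price charged per credit:    ${price_per_credit:.2f}")  # $0.30
    # The user only ever sees "credits", so the plan price is decoupled
    # from the underlying token cost and the margin lives in the app layer.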
The idea of running a company that sells tokens is like starting a company that sells MySQL calls.
> The idea of running a company that sells tokens is like starting a company that sells MySQL calls.
I think DynamoDB is plenty profitable :)
I don't think they are at the bottom, and that's the issue.
Nvidia is at the bottom, or, if we're being charitable, the cloud providers.
They are the ones who would have the margins, from their rent-seeking.
And to be frank, other than consumers, everyone else is at the fucking bottom...
Getting squeezed on user acquisition when the margins of the old, cheap internet software services don't exist anymore.
I think you are correct, but my hot take is that they will capture most of the G7 through scummy regulatory capture and bundling with Microsoft. They will use this to mostly dominate the markets and run at small but profitable margins. They will then pad out revenue by bundling in advertising and agenda-based pay-to-play messaging. They will also do a bunch of military and government contracts, take positions in profitable applications (or simply copy them), and maybe even do a hardware offering. Ultimately the company will end up being something like a Facebook/Google/Palantir/Apple hybrid.

I'll admit the execution barrier is high, but the valuation is justified if they pull it off. These are proven executors with a nearly sociopathic capitalist mindset and deep ties to governments and corporations globally. I think it's likely they execute, and even if they fall short in the grand scheme, it's hard to imagine they fail badly enough to bring down the company.
Let's not forget this company was founded by basically stealing seed investment from the non-profit arm, completely abandoning the mission, crushing dissent in the company, and blackmailing the board. Sam will do anything to succeed, and they have the product and power base to do it.
This guy gets it. +1
Maybe the better analogy for LLM businesses isn’t SaaS but more like power generation.
If AI really becomes that ubiquitous then OpenAI capturing that value is no less ridiculous than ComEd capturing the value of every application of electric power.
A very good comparison. Why are electric companies and railways state-owned? Not entirely, of course: there is a fringe of private companies, but the core is state-owned and monopolistic. OpenAI will be like that. It is already flirting with the government to get the best access and to be able to control the thinking of officials. Manipulation of officials and politicians. Isn't that a beautiful, self-perpetuating profit?
> If AI really becomes that ubiquitous then OpenAI capturing that value is no less ridiculous than ComEd capturing the value of every application of electric power.
They do? The electric provider, last I checked, does not capture the value of every application of electric power.
Some business uses (amongst other things) $1 of electricity to make a widget that they then sell for $100; the value there is captured by the business, not by the provider.
Same with tokens: the provider (OpenAI, Anthropic, whoever) supplies the tokens, but the business selling a solution built on those tokens charges many orders of magnitude more for them once they're packed into the solution.
The provider can't just raise prices to capture that value (cos then the business would switch to a new provider, or, if they all raise prices, the business would self-host); they have to compete with the business by selling the same solution.
Going back to the electric company analogy, if the electricity supplier wants to capture more of the value in the widget, they have to create the widget themselves and compete with the business who is currently creating the widget.
If the business has a moat of any type (including customer service, customisation, market differentiation, etc) the electricity provider is out of luck.
What OpenAI has is the know-how to develop new models and train them efficiently. That's a kind of value they can provide even in a world where open-source local models are in common use.
Sure, but so do like a dozen other companies. Given that the models bump past each other every few months, I haven't seen anything that says they have any kind of competitive advantage.
So they become a consultancy?