I've fed thousands of dollars to Anthropic/OAI/etc for their coding models over the past year despite never having paid for dev tools before in my life. Seems commercially viable to me.
> I've fed thousands of dollars to Anthropic/OAI/etc for their coding models over the past year despite never having paid for dev tools before in my life. Seems commercially viable to me.
For OpenAI to produce a 10% return, every iPhone user on earth needs to pay $30/month to OpenAI.
That ain’t happening.
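A rough sanity check of that claim. The installed-base figure below is my own assumption (roughly 1.5B active iPhones is a commonly cited ballpark), not a number from the parent comment:

```python
# Back-of-envelope check of the "$30/month from every iPhone user" claim.
# Assumption (not from the thread): ~1.5B active iPhones worldwide.
iphone_users = 1.5e9
price_per_month = 30.0

annual_revenue = iphone_users * price_per_month * 12
print(f"Implied annual revenue: ${annual_revenue / 1e9:.0f}B")  # → $540B

# Even treating that revenue as pure profit (ignoring all costs),
# $540B/yr would only be a 10% annual return on this much capital:
capital_base = annual_revenue / 0.10
print(f"Capital base it would be a 10% return on: ${capital_base / 1e12:.1f}T")  # → $5.4T
```

So the subscription revenue needed is on the order of half a trillion dollars per year, which gives a sense of why the parent is skeptical.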
They don't sell their models only to individuals but also to companies, most likely with different business and pricing models, so that's an overly simplistic view of their business. Their spending increases YoY; we can safely assume that one of the reasons is the growing user base.

The time will probably come when we won't be allowed to use frontier models without paying anything, as we can today, and when this $30 will most likely double or triple.

The truth, though, is that R&D around AI models, and especially their hosting (inference), is expensive and won't get any cheaper without significant algorithmic improvements. Judging by history, my opinion is that we may very well be ~10 years from that moment.
EDIT: HSBC has just published some projections. From https://archive.ph/9b8Ae#selection-4079.38-4079.42
> Total consumer AI revenue will be $129bn by 2030
> Enterprise AI will be generating $386bn in annual revenue by 2030
> OpenAI’s rental costs will be a cumulative $792bn between the current year and 2030, rising to $1.4tn by 2033
> OpenAI’s cumulative free cash flow to 2030 may be about $282bn
> Squaring the first total off against the second leaves a $207bn funding hole
So, yes, expensive (and mind that's the rental costs alone) ... but foreseen to be penetrating into everything imaginable.
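One way to gut-check the consumer-revenue projection quoted above. The $30/month price point is carried over from the earlier comment as an assumption; HSBC does not state a per-user price:

```python
# How many paying subscribers would $129B/yr of consumer AI revenue imply
# at the $30/month price point mentioned upthread?
consumer_revenue_2030 = 129e9   # HSBC projection quoted above, per year
price_per_month = 30.0          # assumption: flat $30/mo subscription

subscribers = consumer_revenue_2030 / (price_per_month * 12)
print(f"Implied paying subscribers: {subscribers / 1e6:.0f}M")  # → 358M
```

That's far short of "every iPhone user on earth," but still a very large paying base spread across the whole consumer AI market, not just one vendor.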
Not sure where that math is coming from. Assuming it's true, you're ignoring that some users (me) already pay 10X that. Btw, according to Meta's SEC filings: https://s21.q4cdn.com/399680738/files/doc_financials/2023/q4... they made around $22/month per American user (not even a heavy user or an affluent iPhone owner) in Q3 2023. I assume Google would be higher due to larger market share.
If you fed thousands of dollars to them, but it cost them tens of thousands of dollars in compute, it’s not commercially viable.
None of these companies have proven the unit economics on their services
If all frontier LLM labs agreed to a truce and stopped training to save on cost, LLMs would be immensely profitable now.
That isn't what I've seen: https://www.wheresyoured.at/oai_docs/
Those are effectively made-up numbers: they were given to him by an anonymous source we have no way of corroborating, we can't even see the documents themselves, and they contradict not just OpenAI's official numbers but also first-principles analyses of what the economics of inference should be[1], the inference profit reports of other companies, and what an analysis of the inference market would suggest[2].
[1]: https://martinalderson.com/posts/are-openai-and-anthropic-re..., https://github.com/deepseek-ai/open-infra-index/blob/main/20...
[2]: https://www.snellman.net/blog/archive/2025-06-02-llms-are-ch...
https://simonwillison.net/2025/Aug/17/sam-altman/#:~:text=Su...
Also independent analysis: https://news.ycombinator.com/threads?id=aurareturn&next=4596...
google what you just said and look at the top hit
it's an AI summary
google eats that ad revenue
it eats the whole thing
it blocked your click on the link... it drinks your milkshake
so, yes, there's a $100 billion commercially viable product
Google Search has three sources of revenue that I am aware of: ad revenue from the search results page, sponsored search results, and AdSense revenue on the websites the user is directed to.
If users just look at the AI overview at the top of the search page, Google is hobbling two sources of revenue (AdSense, sponsored search results), and also disincentivizing people from sharing information on the web that makes their AI overview useful. In the process of all this they are significantly increasing the compute costs for each Google search.
This may be a necessary step to stay competitive with AI startups' search products, but I don't think this is a great selling point for AI commercialization.
And so ends the social contract of the web, the virtuous cycle of search engines sending traffic to smaller sites which collect ad revenue which in turn boosts search engine usage.
To thunderous applause.
Thank god. The fake search results, the money that manipulates our access to information: all gone. Finally we can try something else. I have a feeling it's going to be worse, though.