Comment by tonfa

10 hours ago

> They aren't going away but for some they may become prohibitively expensive after all the subsidies end.

Even if inference were subsidized (afaik it isn't when paying through API calls; subscription plans may indeed lose money on heavy users, but that's how any subscription model typically works: it can still be profitable overall).

Models are still improving/getting cheaper, so that seems unlikely.

> afaik it isn't when paying through API calls

There is no evidence for this. The claims that the API is "profitable on inference" are all hearsay. Any AI executive could immediately dismiss the misconception by making a public statement beholden to SEC regulation, yet none of them do.

> Models are still improving/getting cheaper

Diminishing returns have set in for quality, and for a while now the remaining quality gains have come at the cost of massive increases in token burn; it's not getting cheaper.

Worse yet, we're in an energy crisis. Iran has threatened to strike critical oil infrastructure, and repairs would take years.

AI is going to get significantly more expensive, soon.

It probably is still subsidized, just not as much. We won't know whether these APIs are profitable unless these companies go public, and until then it's safe to bet they are underpriced to win market share.

  • Third-party AI inference with open models is widely available and cheap. You're paying as much as proprietary mini-models or even less for something far more capable, and without any subsidies (other than the underlying capex and the expense of training the model itself).

  • Anthropic has shared that API inference has a ~60% margin. OpenAI's margin might be slightly lower since they price aggressively, but I would be surprised if it were much different.