
Comment by deaux

5 hours ago

> I'd put my money on token prices doubling to tripling over the next 12-24 months.

Chinese open weights models make this completely infeasible.

What do open weights have to do with how much it costs to run inference? Inference is heavily subsidized; the economics of it don't make any sense.

Anthropic and OpenAI could open-source their models and it wouldn't make it any cheaper to run them. You still need $500k in GPUs and a boatload of electricity to serve something like 3 concurrent sessions at a decent tok/s.

There are no open-source models, Chinese or otherwise, that can be run profitably while giving you productivity gains comparable to a foundation model. No matter what, running LLMs is expensive, the capex required per tok/s is only increasing, and the models are only getting more compute-intensive.

The hardware market would literally have to crash for this to make any sense from a profitability standpoint, and I don't see that happening, therefore prices have to go up. You can't just lose billions year after year forever. None of this makes sense to me. It's simple math, but everyone seems to be delusional about it right now.
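
For what it's worth, the "simple math" here is back-of-envelope. A minimal sketch of that arithmetic follows; every figure in it (capex, amortization period, power draw, electricity price, throughput) is an illustrative assumption, not a measured number.

```python
# Back-of-envelope cost per 1M output tokens for self-hosted inference.
# All figures are illustrative assumptions, not measured numbers.

gpu_capex_usd = 500_000          # assumed upfront GPU spend (the figure used above)
amortization_years = 3           # assumed useful life of the hardware
power_kw = 10.0                  # assumed average power draw of the serving node
electricity_usd_per_kwh = 0.10   # assumed electricity price
throughput_tok_per_s = 150       # assumed aggregate output tokens/s (e.g. 3 sessions x 50 tok/s)

hours_per_year = 24 * 365
tokens_per_year = throughput_tok_per_s * hours_per_year * 3600

capex_per_year = gpu_capex_usd / amortization_years
power_cost_per_year = power_kw * hours_per_year * electricity_usd_per_kwh
total_cost_per_year = capex_per_year + power_cost_per_year

cost_per_mtok = total_cost_per_year / (tokens_per_year / 1_000_000)
print(f"Implied cost: ~${cost_per_mtok:.2f} per 1M output tokens")
```

On those assumed numbers, amortized capex dwarfs the electricity bill, and utilization is the biggest lever: halving the sustained throughput roughly doubles the cost per token.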

  • Open weights mean that the current prices for inference on Chinese models are indicative of their cost to run, because anyone can host them and multiple independent providers compete on price:

    https://openrouter.ai/moonshotai/kimi-k2.5

    It's a fantasy to believe that every single one of these 8 providers is serving at heavily subsidized dumping prices 50% below cost, and that once the subsidies run out you'll suddenly pay double per 1M tokens for this model. It's extremely competitive with Sonnet 4.5 for coding at 20% of the token price (rough arithmetic in the sketch below).

    I encourage you to become more familiar with the market and to stop over-extrapolating purely from rumored OpenAI numbers.
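
    To make the "dumping prices" scenario concrete, here is a tiny sketch of the arithmetic; the reference price is a hypothetical placeholder, not a quoted list price for any model.

    ```python
    # Worst case from the comment above: every provider is dumping at 50% below cost.
    # The reference price is a hypothetical placeholder, not an actual list price.

    reference_price_per_mtok = 15.00                      # assumed closed-model price per 1M output tokens
    open_weights_price = 0.20 * reference_price_per_mtok  # "20% of the token price"

    implied_cost = open_weights_price / 0.50              # if the current price really were 50% below cost
    unsubsidized_price = implied_cost                     # what you'd pay once the assumed dumping stops

    print(f"current: ${open_weights_price:.2f} per 1M tokens")
    print(f"worst-case unsubsidized: ${unsubsidized_price:.2f} per 1M tokens "
          f"({unsubsidized_price / reference_price_per_mtok:.0%} of the reference price)")
    ```

    Even in that worst case, the price only doubles and stays well below the closed-model reference, which is the point being made above.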

      • I'm not making any guesses; I happen to know for a fact what it costs. Please go try to sell inference and compete on price. You actually have no clue what you're talking about. I knew when I sent that response that I was going to get "but Kimi!"
