Comment by parliament32

20 hours ago

Sure it is. They're well aware their product is a money furnace and that they'd have to charge users a few orders of magnitude more just to break even, which is obviously not an option. So all that's left is to convince users to burn tokens harder, so the graphs go up, so they can bamboozle more investors into keeping the ship afloat a bit longer.

If this claim is true (inference is priced below cost), it makes little sense that there are dozens of small inference providers on OpenRouter. Where are they getting their investor money? Is the bubble really that big?

Incidentally, the hardware they run on is well known. The claim should be easy to check.

  • To be clear, I'm talking about subscription pricing. Anthropic's API pricing is probably at-cost.

    I dare you to run CC on API pricing and see how much your usage actually costs.

    (We did this internally at work; that's where my "few orders of magnitude" comment above comes from.)

It's an option, and they're going to take it. Chinese models will be banned, and the labs will happily go dollar for dollar with plan price increases. $20 plans won't go away, but usage limits and restricted model access will push people to $40, $60, and $80 plans.

At cell-phone-plan adoption levels and cell-phone-plan prices, the labs are looking at a 5-10 year ROI.