Comment by oa335

18 hours ago

> Inference providers are already profitable.

> That surprises me, do you remember where you learned that?

Lots of sources, and you can do the math yourself.

Here are a few good ones:

https://github.com/deepseek-ai/open-infra-index/blob/main/20... (suggests DeepSeek is making an 80% raw margin on inference)

https://www.snellman.net/blog/archive/2025-06-02-llms-are-ch...

https://martinalderson.com/posts/are-openai-and-anthropic-re... (there's a HN discussion of this where it was pointed out this overestimates the costs)

https://www.tensoreconomics.com/p/llm-inference-economics-fr... (long, but the TL;DR is that serving Llama 3.3 70B costs around $0.28/million input tokens and $0.95/million output tokens at high utilization. These figures are close to what we see in the market: https://artificialanalysis.ai/models/llama-3-3-instruct-70b/... )
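To "do the math yourself", a raw inference margin is just revenue minus serving cost over revenue. Here is a minimal sketch using the tensoreconomics cost estimates quoted above; the market prices are hypothetical placeholders, not actual provider pricing:

```python
# Serving cost estimates for Llama 3.3 70B at high utilization,
# taken from the tensoreconomics article linked above.
COST_PER_M_INPUT = 0.28   # $ per million input tokens
COST_PER_M_OUTPUT = 0.95  # $ per million output tokens

# Hypothetical market prices, for illustration only -- check
# artificialanalysis.ai for what providers actually charge.
PRICE_PER_M_INPUT = 0.50
PRICE_PER_M_OUTPUT = 1.50

def raw_margin(input_mtok: float, output_mtok: float) -> float:
    """Raw margin fraction for a workload measured in millions of tokens."""
    revenue = input_mtok * PRICE_PER_M_INPUT + output_mtok * PRICE_PER_M_OUTPUT
    cost = input_mtok * COST_PER_M_INPUT + output_mtok * COST_PER_M_OUTPUT
    return (revenue - cost) / revenue

# A workload of 1M input and 1M output tokens:
# revenue = $2.00, cost = $1.23, raw margin = 38.5%
print(f"{raw_margin(1, 1):.1%}")
```

The actual margin depends heavily on utilization and on the input/output mix, since output tokens cost several times more to serve than input tokens.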