Comment by famouswaffles
10 hours ago
>But OpenAI doesn't have tiny COGS: inference is expensive as fuck.
No, inference is really cheap today, and people saying otherwise simply have no idea what they are talking about. Inference is not expensive.
Reply to famouswaffles
10 hours ago
>>But OpenAI doesn't have tiny COGS: inference is expensive as fuck.
>No, inference is really cheap today, and people saying otherwise simply have no idea what they are talking about. Inference is not expensive.
Clearly not cheap enough.
> Even at $200 a month for ChatGPT Pro, the service is struggling to turn a profit, OpenAI CEO Sam Altman lamented on the platform formerly known as Twitter Sunday. "Insane thing: We are currently losing money on OpenAI Pro subscriptions!" he wrote in a post. The problem? Well according to @Sama, "people use it much more than we expected."
https://www.theregister.com/2025/01/06/altman_gpt_profits/
So just raise the price or decrease the cost per token internally.
Altman also said 4 months ago:
https://simonwillison.net/2025/Aug/17/sam-altman/
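For a sense of the arithmetic the two sides are arguing over, here is a minimal break-even sketch in Python. The per-token rate is an assumed placeholder for illustration, not a published OpenAI cost figure.

    # Back-of-envelope: how much monthly usage a $200 ChatGPT Pro subscription
    # covers before the subscriber costs more to serve than they pay.
    # The inference cost below is an assumed illustrative figure, not OpenAI's actual COGS.

    subscription_price = 200.00              # USD per month (ChatGPT Pro, per the quote above)
    assumed_cost_per_million_tokens = 10.00  # USD per 1M tokens -- placeholder assumption

    break_even_tokens = subscription_price / assumed_cost_per_million_tokens * 1_000_000
    print(f"Break-even usage: {break_even_tokens:,.0f} tokens/month")
    # -> 20,000,000 tokens/month at these assumed rates; subscribers above that
    #    volume lose money for OpenAI, which is the dynamic Altman describes.

Whether inference counts as "cheap" then comes down to how far heavy Pro users exceed that break-even volume at whatever the real per-token cost is.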