
Comment by HugoDias

5 hours ago

According to their benchmarks, GPT 5.4 nano > GPT-5 mini in most areas, but I'm noticing that models are getting more expensive, not actually cheaper.

(all prices per 1M tokens)

GPT-5 mini: Input $0.25 / Output $2.00

GPT-5 nano: Input $0.05 / Output $0.40

GPT-5.4 mini: Input $0.75 / Output $4.50

GPT-5.4 nano: Input $0.20 / Output $1.25

Models are getting costlier, but per unit of performance they're getting cheaper. Perhaps they don't see a point in supporting really low-performance models?
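The jump can be made concrete with the figures quoted above (a minimal sketch, assuming the prices are USD per 1M tokens as quoted):

```python
# Prices quoted in the comment above, assumed USD per 1M tokens:
# (input price, output price) per model.
prices = {
    "GPT-5 mini":   (0.25, 2.00),
    "GPT-5 nano":   (0.05, 0.40),
    "GPT-5.4 mini": (0.75, 4.50),
    "GPT-5.4 nano": (0.20, 1.25),
}

def multiplier(old: str, new: str) -> tuple[float, float]:
    """Return (input, output) price multipliers going from old to new."""
    (oi, oo), (ni, no) = prices[old], prices[new]
    return ni / oi, no / oo

print(multiplier("GPT-5 mini", "GPT-5.4 mini"))  # ~3.0x input, ~2.25x output
print(multiplier("GPT-5 nano", "GPT-5.4 nano"))  # ~4.0x input, ~3.125x output
```

So the mini tier roughly tripled in price and the nano tier roughly quadrupled, which is the "costlier per token" half of the claim; whether it's cheaper per unit of performance depends on the benchmark gains.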

  • I'd be curious to know whether, from the enterprise / API-consumption perspective, these low-performance models aren't actually the most used ones. At least that matches our current scenario in terms of tokens in / tokens out. I'd totally buy the price increase if the new models are becoming more efficient, though, consuming fewer tokens.
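The efficiency argument can be checked per workload: a price increase is offset only if the new model uses proportionally fewer tokens per task. A minimal sketch, using the nano prices quoted above and a purely hypothetical request shape (2,000 input / 500 output tokens):

```python
# Effective per-request cost, with prices assumed USD per 1M tokens
# (figures from the parent comment) and a hypothetical request shape.
def cost(in_price: float, out_price: float,
         in_tokens: int, out_tokens: int) -> float:
    """Dollar cost of one request at the given per-1M-token prices."""
    return (in_price * in_tokens + out_price * out_tokens) / 1_000_000

# Same hypothetical workload on both models: 2,000 in / 500 out.
old = cost(0.05, 0.40, 2_000, 500)  # GPT-5 nano
new = cost(0.20, 1.25, 2_000, 500)  # GPT-5.4 nano
print(f"old ${old:.6f}, new ${new:.6f}, ratio {new / old:.2f}x")
```

On this shape the new model costs about 3.4x more per request at equal token usage, so it would need to cut token consumption by roughly that factor to break even on cost.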

Those are bigger models. The serving isn’t going to be cheaper.

Why expect cheaper, then? The performance is also better.

  • You seem to have insight into the size of OpenAI’s models.

    Care to share the parameter counts for them?