Comment by simianwords

10 hours ago

You are wrong: https://epoch.ai/data-insights/llm-inference-price-trends

This accounts for the fact that more tokens are used.

The chart shows that they're right, though. Newer models cost more than older models. Sure, they're better, but that's moot if the older models are no longer available or can't solve the problem they're tasked with.

  • This is incorrect: the cost to accomplish the same task with old models is far higher than with new models.

    > Newer models cost more than older models

    where did you see this?

    • On the link you shared: 4o vs 3.5 Turbo, price per 1M tokens.

      There's no such thing as "same task by old model". You might get comparable results or you might not (and this is why the comparison fails; it's not a comparison). The reason you pick the newer model is to increase the chances of getting a good result.

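One way to make the price-per-token vs cost-per-task distinction in this thread concrete is a quick back-of-the-envelope calculation. All numbers below are hypothetical illustrations, not the actual 3.5 Turbo or 4o rates:

```python
def cost_per_solved_task(price_per_mtok: float,
                         tokens_per_attempt: int,
                         success_rate: float) -> float:
    """Expected dollars to get one successful result.

    Models with a lower success rate need more retries on average
    (expected attempts = 1 / success_rate), so a cheap per-token
    price can still mean an expensive per-task price.
    """
    expected_attempts = 1 / success_rate
    return price_per_mtok * (tokens_per_attempt / 1_000_000) * expected_attempts

# Hypothetical older model: cheap per token, low success rate,
# and needs long prompting/retries per attempt.
old = cost_per_solved_task(price_per_mtok=0.50,
                           tokens_per_attempt=20_000,
                           success_rate=0.2)

# Hypothetical newer model: 5x the per-token price, but succeeds
# more often with fewer tokens per attempt.
new = cost_per_solved_task(price_per_mtok=2.50,
                           tokens_per_attempt=4_000,
                           success_rate=0.9)

print(f"old: ${old:.4f} per solved task")
print(f"new: ${new:.4f} per solved task")
```

Under these (invented) assumptions the newer model is cheaper per solved task despite the higher sticker price per million tokens, which is the point being argued above: the headline token price and the effective cost of getting a task done can move in opposite directions.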