simianwords (1 month ago): ok but the capabilities are also rising. what point are you trying to make?
oytis (1 month ago): That it's not getting cheaper?
jstummbillig (1 month ago): But it is, capability adjusted, which is the only way it makes sense. You can definitely produce last year's capability at a huge discount.
simianwords (1 month ago): you are wrong. https://epoch.ai/data-insights/llm-inference-price-trends
This accounts for the fact that more tokens are used.
techpression (1 month ago): The chart shows that they're right though. Newer models cost more than older models. Sure they're better, but that's moot if older models are not available or can't solve the problem they're tasked with.
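To make the token-count point concrete, here is a minimal sketch comparing cost per task rather than cost per token. All prices and token counts below are hypothetical and not taken from the Epoch data; the point is only that a newer model can cost more per task even while older per-token prices fall, if it emits many more (reasoning) tokens.

```python
# Hypothetical illustration: why per-token price alone can be misleading.
# A newer model may charge more per token AND emit more output tokens per task,
# so its cost per completed task can rise even as older-model prices drop.

def cost_per_task(price_per_mtok_in, price_per_mtok_out, tokens_in, tokens_out):
    """Dollar cost of one task, given $/1M-token prices and token counts."""
    return (price_per_mtok_in * tokens_in + price_per_mtok_out * tokens_out) / 1_000_000

# Made-up numbers for two generations of a model family.
old_model = cost_per_task(price_per_mtok_in=0.50, price_per_mtok_out=1.50,
                          tokens_in=2_000, tokens_out=500)      # terse answer
new_model = cost_per_task(price_per_mtok_in=1.25, price_per_mtok_out=10.00,
                          tokens_in=2_000, tokens_out=8_000)    # long reasoning trace

print(f"old model: ${old_model:.4f} per task")  # ~$0.0018
print(f"new model: ${new_model:.4f} per task")  # ~$0.0825
```

Whether cost is falling "capability adjusted" then depends on whether the newer model solves tasks the older one cannot, which is exactly what the thread is arguing about.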