simianwords 9 hours ago ok but the capabilities are also rising. what point are you trying to make?
oytis 9 hours ago That it's not getting cheaper?
jstummbillig 9 hours ago But it is, capability adjusted, which is the only way it makes sense. You can definitely produce last year's capability at a huge discount.
simianwords 9 hours ago you are wrong. https://epoch.ai/data-insights/llm-inference-price-trends This accounts for the fact that more tokens are used.
techpression 9 hours ago The chart shows that they're right though. Newer models cost more than older models. Sure they're better, but that's moot if older models are not available or can't solve the problem they're tasked with.
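As an aside, the arithmetic the thread is arguing about is easy to sketch. A minimal example with made-up numbers (not the epoch.ai figures): per-task cost is the per-token price times the tokens the model spends on the task, so a model that is cheaper per token but far more verbose can still come out more expensive per task.

    # Back-of-the-envelope sketch: per-task cost = (price per 1M tokens) * (tokens spent per task) / 1_000_000.
    # All numbers below are made up for illustration; they are not the epoch.ai data.

    old_price_per_mtok = 10.0      # hypothetical $ per 1M tokens, older model
    new_price_per_mtok = 8.0       # hypothetical $ per 1M tokens, newer model (cheaper per token)

    old_tokens_per_task = 2_000    # hypothetical tokens an older model spends on one task
    new_tokens_per_task = 20_000   # hypothetical: reasoning-style models emit far more tokens

    old_cost = old_price_per_mtok * old_tokens_per_task / 1_000_000   # $0.02 per task
    new_cost = new_price_per_mtok * new_tokens_per_task / 1_000_000   # $0.16 per task

    print(f"older model: ${old_cost:.4f} per task")
    print(f"newer model: ${new_cost:.4f} per task")

Whether this effect dominates in practice depends on the real price and token-usage trends, which is exactly what the linked chart is meant to settle.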