Comment by InsideOutSanta
11 hours ago
Sure, running an LLM is cheaper, but the way we use LLMs now requires way more tokens than last year.
10x the tokens today cost less than half of what X tokens cost in ~mid-2024.
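A quick sketch of the arithmetic behind that claim (the dollar figure and token count here are hypothetical, just to show the implied per-token drop):

    # Hypothetical numbers illustrating the claim: 10x the tokens today
    # cost less than half of what X tokens cost in ~mid-2024.
    x = 1_000_000                     # arbitrary token count
    old_cost = 10.0                   # assumed mid-2024 price for x tokens, in dollars
    new_cost = 0.5 * old_cost         # today's price for 10*x tokens, per the claim
    old_per_token = old_cost / x
    new_per_token = new_cost / (10 * x)
    print(old_per_token / new_per_token)  # 20.0 -> at least a 20x per-token price drop

So even with 10x the token usage, the total bill is lower, because the per-token price fell by 20x or more.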
ok but the capabilities are also rising. what point are you trying to make?
That it's not getting cheaper?
But it is, capability adjusted, which is the only sensible way to measure it. You can definitely reproduce last year's capability at a huge discount.
you are wrong. https://epoch.ai/data-insights/llm-inference-price-trends
This already accounts for the fact that more tokens are used.