Comment by magic_hamster
2 months ago
There's a lot in this comment that doesn't exactly fit.
First of all, there could be other solutions, such as B2B revenue subsidizing individual user plans, or more fine-grained model tiering by cost.
Also, yes, you can get some access for free, but even today the higher tiers of proprietary models run around $200/mo for individual users. That might still be subsidized, but it is definitely not free, and at $2,400 a year it's quite a chunk of money!
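Back-of-the-envelope, that figure is just the monthly price annualized (a trivial sketch, with $200/mo as the assumed plan price):

```python
# Hypothetical illustration: a $200/mo top-tier plan annualized.
monthly_cost = 200            # USD, assumed price of a top-tier individual plan
annual_cost = monthly_cost * 12
print(annual_cost)            # 2400
```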
I don't know what your setup is at the moment, but it's possible a more efficient hardware and software stack is available that you're not utilizing. Of course, this depends on what models you're trying to run.
I think smaller models will get a lot better, and hardware will become more optimized as well. We're starting to see this with NPUs and TPUs.
All this means running models will cost less, and upgrading the power grid may also reduce the cost of energy, making it more affordable still.
I don't see any way that AI will go away because it "hits a wall". We have long passed the point of no return.
You are looking at it from the individual's PoV, but the OP is taking a bird's-eye view from high above. The relevant measure is the total amount of effort already deployed today to provide all the existing AI services, which is enormous: data centers, electricity, planning/attention (entities focused on AI have less time to work on anything else), components (Nvidia shortage, RAM shortage), etc.
This is not about finance, but about the real economy and how much of it, and/or its growth, is being diverted to AI. The real economy is being reshaped, influencing many other sectors independently of AI use itself. AI competes heavily with other uses for many kinds of actual, physical resources - without yet having as much to show for it.
Just one example: https://www.technologyreview.com/2025/05/20/1116327/ai-energ... (it's worth skimming the whole piece; the headline on its own is useless)