Comment by Towaway69
8 hours ago
I think your answer is the reason why. LLM performance is fine when they're applied to things they can actually do.
Take LLMs out of that safe space and suddenly they're no silver bullet; in fact, they're useless.
So of course those making the 10x claim mean inside the safe space, where LLMs can handle all the activities required. You can't have it both ways: 10x speed-ups and tasks that are difficult and confusing for LLMs.
Right, I get that. I'm just saying it seems wrong to hold up minority examples. Nobody is pumping out AAA games at speed with LLMs, nor is anyone claiming to do so. There will likely always be some areas where LLMs are bad or useless.
How many people are writing CRUD apps in mainstream languages vs. COBOL, though? You don't need a 100% silver bullet that one-shots everything; you just need to recognize the signals that, for many use cases, a significant shift is happening. The safe space is expanding and velocity is increasing.
Definitely the safe space is expanding, but how fragile and expensive is this expansion?
AI requires a far larger stack of fragile resources to work, compared with an editor, a keyboard, and a human.
In some sense it's a bit like the bitcoin revolution, which slowed down once transaction times ballooned. And blockchains didn't replace databases as expected, probably for very good reasons: resources required vs. results delivered.
I personally agree that AI is great technology for some great new tools. But we still haven't found its limits: cost vs. results. What played out with bitcoin and blockchains is still an open question for AI.