Comment by danielparsons

2 days ago

You're looking at nearly the entire curve of the tech's development. That's like saying lightbulbs became 99% more energy efficient and therefore will become another 99% more efficient. But most technologies follow an S-curve.
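(For reference, the S-curve described here is commonly modeled with a logistic function. This is just an illustrative sketch with generic, made-up parameters, not data about any particular technology:)

```python
import math

def logistic(t, L=1.0, k=1.0, t0=0.0):
    """Logistic (S-curve) growth: slow start, fast middle, plateau near L."""
    return L / (1.0 + math.exp(-k * (t - t0)))

# The same-sized step buys a big gain near the midpoint of the curve
# but a tiny one near the plateau, which is why extrapolating past
# improvement rates forward can overshoot.
early_gain = logistic(1) - logistic(0)  # step taken mid-curve
late_gain = logistic(5) - logistic(4)   # same-sized step near the plateau
print(early_gain > late_gain)  # prints True
```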

>you're looking at nearly the entire curve of the tech's development

That's a pretty strong claim that needs some data, or at least a mathematical argument, to back it up. Otherwise it's like saying in the 1980s that PCs with 640 kB of RAM had reached their pinnacle in terms of real-life benefits for users, and that there was no reason to keep pushing the tech.

  • *entire curve to date (I should have clarified). Yes, it will keep getting better for a long time, but where we are on the curve is harder to say. There are lots of metrics to choose from, like "it's incorrect 90% less often than a year ago, so that's a 10x improvement!". But the real metric that matters is how useful it is to people, and based on user data it looks like the only area where it's getting exponentially more useful year over year is programming. Lots of coders are using it 10x more than before to code 10x faster. I'm not sure any other profession uses it as more than a juiced-up search engine / proofreader.

    • Tbf, that sounds like strong bias from someone who works exclusively in software development and simply hasn't found other uses. I have worked on integrating LLMs across quite a few applications and departments by now, and I can comfortably say that programming is not the only area where we see extreme benefits. I wouldn't even say it's the area that has seen the most benefit so far. There was a lot of mundane work outside of software development that was easy prey even for early models, and with the current cutting-edge models I'm fairly sure you could replace >75% of white-collar jobs if you just get the context engineering right. That's the hard part right now, not the raw intelligence needed for arbitrary data processing. But the frameworks are getting there fast.