
Comment by fallingknife

20 hours ago

Computing power requirements, and the fact that my subjective appraisal of these LLMs' performance doesn't match the insane scaling curves AI companies put out claiming their capabilities double every 6 months. Even if a 100x increase in computing power could match a software engineer (and that's far from certain), running it would cost more than a software engineer.

But really the burden of proof is on you here since you are making extraordinary claims of AI superintelligence replacing all jobs in 10-15 years. You are making the trillion pound baby argument, so you need to back it up.

It's not. This isn't a lawsuit, it's a casual discussion, and I don't really care whether I'm convincing you or not; I'm talking about this because I enjoy it. I can tell you, though, that DeepSeek R1 and RL approaches have shown the power to scale coding and reasoning much further without greatly increasing model size or data requirements, and that new improvements come non-stop in this field, driven by the billions being invested and all the minds focused on it all day long, since it's so obvious to everyone that it's powerful.