Comment by Kostchei
4 hours ago
Are you saying that people can't work out what to code using these? Or that code is not a worthy subject to use AI for? 'Cause I've got news for you...
1. Improving coding improves reasoning in the models. Having a verifiable answer that is not a single fixed thing makes for a good training test.
2. Software has been used for fairly serious things. We used to have skyscrapers full of people doing manual math; now we have campuses full of people doing manual code. You might argue that nobody would trust AI to write code when it matters. History tells us that if that is ever true, it will pass.
3. We are not going to run out of planet. It just feels to folks that there is not enough planet for their dreams, and we get population panic, energy panic, etc. There is a huge fusion reactor conveniently holding us in its gravity well and spewing out many orders of magnitude more energy than we can currently use. Chill.
I think at Gas Country levels we will need better networking systems. Maybe that backbone Nvidia just built....
Replacing human computers with electronic computers is nothing like what LLMs do or how they work. The electronic computer is straight-up automation: the same input gives you the same output every time. Electronic computers are actually pretty simple. They just do simple mathematical operations like add, subtract, multiply, and divide. What makes them so powerful is that they can do billions of those simple operations a second.
LLMs are not simple deterministic machines that automate rote tasks the way computers or compilers do. People, please stop believing and repeating that they are the next level of abstraction and automation. They aren't.
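A toy sketch of the distinction, if it helps (the probabilities are made up, not from any real model): the first function is the "electronic computer" case, the second samples its output from a probability distribution, which is roughly what an LLM's decoder does at nonzero temperature.

```python
import random

# Deterministic automation: the same input always produces the same output.
def add(a, b):
    return a + b

assert add(2, 3) == 5  # holds every single run

# Toy stand-in for an LLM decoding step: pick the next token by sampling
# from a probability distribution (hypothetical numbers for illustration).
def sample_next_token(probs):
    tokens = list(probs.keys())
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

probs = {"code": 0.5, "poetry": 0.3, "nonsense": 0.2}
print([sample_next_token(probs) for _ in range(5)])  # can differ run to run
```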