Comment by shinycode
5 months ago
We have thousands of old systems to maintain. I'm not sure everything could be rewritten or maintained by an LLM alone. If an LLM can build a whole system on its own and then maintain and fix it, it's not just us software developers who will suffer; it means there's nothing left to sell or market, because people will just ask an LLM to do it. I'm not sure that's possible.

ChatGPT gave me a list of commands for my EC2 instance, and one of them, when executed, made me lose SSH access. It didn't warn me. So "blindly" following an LLM through a cascade of instructions, at massive scale and over a long period, could also lead to massive bugs or data corruption. Who hasn't asked an LLM for code that contained mistakes we then had to point out to it? I doubt systems will stay robust under full autonomy without any human supervision. But it's a great tool for iterating and throwing away code after testing ideas.