Comment by slopinthebag

9 hours ago

I've seen folks say that the future of using computers will be with an LLM that generates code on the fly to accomplish tasks. I think this is a bit ridiculous, but I do think that operating computers through natural language instructions is superior for a lot of cases and that seems to be where we are headed.

I can see a future where software is built with a CLI interface underneath the (optional) GUI, letting an LLM hook directly into the underlying "business" logic to drive the application. Since LLMs are basically text machines, we just need somebody to invent a text-driven interface for them to use...oh wait!

Imagine booking a flight - the LLM connects to whatever booking software, pulls a list of available commands, issues commands to the software, and then displays the output to the user in some fashion. It's basically just one big language translation task, the kind of thing an LLM is best at, but you still have the guardrails of the CLI tool itself instead of having the LLM generate arbitrary code.
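A minimal sketch of the guardrail idea (the `flightctl`-style command set and names here are made up for illustration): the model can only choose from an enumerated list of commands the tool advertises, and anything outside that list is rejected rather than executed.

```python
import shlex

# Hypothetical command set a booking CLI might advertise, with arg counts.
COMMANDS = {
    "search": 3,  # search <from> <to> <date>
    "hold":   1,  # hold <flight-id>
    "book":   2,  # book <flight-id> <payment-token>
}

def run_command(line: str) -> str:
    """Validate and dispatch one CLI line; unknown commands are rejected."""
    parts = shlex.split(line)
    if not parts:
        return "error: empty command"
    cmd, args = parts[0], parts[1:]
    if cmd not in COMMANDS:
        return f"error: unknown command {cmd!r}"
    if len(args) != COMMANDS[cmd]:
        return f"error: {cmd} expects {COMMANDS[cmd]} args"
    # A real tool would hit the booking backend here; we just echo.
    return f"ok: {cmd}({', '.join(args)})"
```

Whatever the LLM emits, the worst it can do is a valid command from that list - `run_command("rm -rf /")` just returns an error.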

Another benefit is that the CLI output is introspectable. You can trace everything the LLM is doing if you want, and validate its commands before they run if necessary (I want to check before it uses my credit card). You don't get this if it's generating a Python script to hit some API.
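That approval step could be as simple as a thin wrapper (all names here are hypothetical): log every command the LLM issues, and hold anything that spends money until the user explicitly confirms.

```python
audit_log: list[str] = []   # full trace of everything the LLM issued
SENSITIVE = {"book"}        # commands that touch the credit card

def issue(line: str, approved: bool = False) -> str:
    """Log the command; block sensitive ones unless the user approved."""
    audit_log.append(line)
    parts = line.split()
    cmd = parts[0] if parts else ""
    if cmd in SENSITIVE and not approved:
        return f"blocked: {cmd!r} needs user approval"
    return f"executed: {line}"
```

Everything lands in `audit_log` either way, so you can always replay exactly what the model tried to do.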

Even before LLMs, developers were writing GUI applications as basically a CLI + GUI for testability, separation of concerns, etc. Hopefully that will become more common.

Also, this article was obviously AI generated. I'm not going to share my feelings about that.

Of course it's written by AI, I have a skill for it:

https://github.com/thellimist/thellimist.github.io/blob/mast...

https://github.com/thellimist/thellimist.github.io/blob/mast...

I dump a voice message and a blog post comes out. Then I modify a bunch of things and iterate for 1-2 hours to get it right.

  • Might need to iterate on them more, because they're still quite obviously machine-written, and a lot of people find it disrespectful to read content that was LLM generated.