Comment by robviren

13 hours ago

I have been playing with the idea of an LLM-native programming language focusing on token efficiency, comprehension, and attention. It is interesting to see what the various large models come up with. A common theme actually reminds me quite a bit of assembly. The verb prefixing, limited statements per line, and small concept surface area all appeared in multiple conversations across several larger models. The big difference is that assembly lacks semantic meaning, leaving some benefit on the table. I still cannot believe what some did with the tech; RCT is such a retro favorite.