Comment by pixl97

18 days ago

> because you are assuming their world-model of biking can be expressed in language. It can't!

So you can't build an AI model that simulates riding a bike? I'm not talking about an LLM; I'm talking about the kind of AI simulation we've been building virtual worlds with for decades.

So, now that you agree we can build AI models of simulations, what are those AI models doing? Are they using a binary language that can be summarized?

Obviously you can build an AI model that rides a bike, just not an LLM that does so. Even the transformer architecture would need significant modification to handle multiple input sensor streams: continuous data that you don't tokenize, and that might not even need self-attention, since sensor data doesn't have the long-range dependencies language does. The biking AI model would almost certainly not resemble an LLM very much.
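To make the contrast concrete, here is a minimal sketch of what "continuous data you don't tokenize" might look like: several sensor streams fused into a fixed-size feature vector by windowed averaging, with no vocabulary or token IDs anywhere. All names and the sample values are hypothetical, purely for illustration.

```python
# Hypothetical sketch: continuous sensor streams are windowed into
# fixed-size feature vectors rather than mapped to discrete tokens.

def fuse_sensor_window(streams, window):
    """Average each continuous stream over a time window and
    concatenate the results into one feature vector."""
    features = []
    for samples in streams:
        chunk = samples[:window]
        features.append(sum(chunk) / len(chunk))
    return features

# Three made-up continuous streams: gyroscope, wheel speed, steering angle.
gyro = [0.01, 0.02, 0.015, 0.018]
speed = [5.0, 5.1, 5.05, 5.2]
steer = [-0.2, -0.18, -0.19, -0.21]

vec = fuse_sensor_window([gyro, speed, steer], window=4)
# vec is a 3-element vector of real numbers, not a token sequence.
```

The point of the sketch is only that the model's input is a stream of real-valued vectors at a fixed rate, which is a fundamentally different interface than a sequence of vocabulary indices.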

Calling everything "language" is not some gotcha; the middle "L" in LLM refers to natural language. Binary code is not "language" in this sense, and the distinction matters. Robotics AIs are not LLMs; they are just AI.

  • >Binary code is not "language" in this sense

    Any series of self-consistent encoded signals can be a language. You could feed an LLM wireless signals until it learned how to connect to your wifi, if you wanted to. Just assign tokens. You're acting like words are something different from encoded information. It's the interconnectivity between those bits of data that matters.
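The "just assign tokens" idea can be sketched in a few lines: give every possible byte value its own token ID, so any self-consistent binary signal becomes a token sequence that an LLM-style model could in principle consume. The frame bytes below are made up for illustration; this shows only the encoding step, not that a model would learn anything useful from it.

```python
# Hypothetical sketch: a byte-level vocabulary of 256 tokens, where
# token ID n simply corresponds to byte value n.

def bytes_to_tokens(signal: bytes) -> list[int]:
    # Each byte (0-255) becomes its own token ID.
    return list(signal)

def tokens_to_bytes(tokens: list[int]) -> bytes:
    # Inverse mapping: the encoding is lossless.
    return bytes(tokens)

# Made-up stand-in for a captured wireless frame.
frame = b"\x80\x00\x3a\x01BEACON"
tokens = bytes_to_tokens(frame)
assert tokens_to_bytes(tokens) == frame  # round trip preserves the signal
```

This is essentially how byte-level tokenizers already treat arbitrary data, which is the weakest form of the claim above; whether such a token stream has the structure a transformer can exploit is the part the thread is actually arguing about.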