Comment by queenkjuul
14 days ago
My computer is a bit mover. It can even move bits to predict tokens.
We understand LLMs pretty well. That we can't debug them and inspect every step of every factor on every prediction doesn't mean we don't understand how they work.
We also know that convincing speech doesn't require consciousness.
>We understand LLMs pretty well. That we can't debug them and inspect every step of every factor on every prediction doesn't mean we don't understand how they work.
This is the definition of a lack of understanding. If you can't debug something or know what happens at every step, it means you don't understand the steps. Lack of knowledge about something = lack of understanding.
The amount of logic twisting going on here is insane.