Comment by ninetyninenine

14 days ago

>We understand LLMs pretty well. That we can't debug them and inspect every step of every factor on every prediction doesn't mean we don't understand how they work.

This is the definition of lack of understanding. If you can’t debug it or know what happens at every step, it means you don’t understand the steps. Lack of knowledge about something = lack of understanding.

The amount of logic twisting going on here is insane.