Comment by HarHarVeryFunny

2 days ago

> Yes - but LLMs also get this "embodied knowledge" passed down from human-generated training data.

It's not the same, though. It's the difference between merely reading about something and, having read the book and/or watched the video, actually learning to DO it yourself, acting on the contents of your own mind.

The LLM learns second-hand hearsay, with no idea of what's true or false, which generalizations are valid, or what would be hallucinatory, etc.

The human learns verifiable facts, uses curiosity to explore and fill in the gaps, gets to be creative, etc.

I think it's pretty obvious why LLMs have all the limitations and deficiencies that they do.

If second-hand hearsay (from thousands of conflicting sources) really were as good as first-hand experience and real-world prediction, then we'd not be having this discussion - we'd be bowing to our AGI overlords (well, at least once the AI also got real-time incremental learning, internal memory, looping, some type of (virtual?) embodiment, autonomy ...).

"The LLM learns 2nd hand heresay, with no idea of what's true or false, what generalizations are valid, or what would be hallucinatory, " - do you know what is true and what is false? Take this: https://upload.wikimedia.org/wikipedia/commons/thumb/b/be/Ch... - Do you believe your eyes or do you believe the text about it?