Comment by someguyiguess

21 hours ago

Any sufficiently complex LLM is indistinguishable from AGI

> Any sufficiently complex LLM is indistinguishable from AGI

Isn't this a tautology? We've de facto defined AGI as a "sufficiently complex LLM."

  • Yes! Same logic as the financials, in which the companies pass back and forth the same $200 Billion promissory note.

  • No, it’s just an example of something that’s indistinguishable from AGI. Of all the things that are or are indistinguishable from AGI, a sufficiently complex LLM is one. A sufficiently complex decision tree is probably another. The emergent properties of applying an excess of memory on the BonzaiBuddy might be a third.

If we take that statement as fact, then I don't believe we are even close to an LLM that is sufficiently complex.

However, I don't think the statement is even true. LLMs may not be on the right track to achieving AGI at all, and without starting from scratch down an alternate path, it may never happen.

LLMs seem to me like a complicated database lookup. Storage and retrieval of information is just a single piece of intelligence; there must be more to it than a statistical model of the probable next piece of data. Where is the self-learning without intervention by a human? Where is the output that wasn't asked for?
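To make the "statistical model of the probable next piece of data" framing concrete, here is a minimal sketch of next-token prediction: a bigram model that just counts which word follows which and returns the most frequent successor. This is purely illustrative (real LLMs learn neural representations rather than raw counts), and all names here are my own.

```python
# A toy "statistical model of the probable next piece of data":
# a bigram model that picks the most likely next word given the current word.
# Illustrative only; real LLMs do not work by literal frequency counting.
from collections import Counter, defaultdict


def train_bigram(corpus: str) -> dict:
    """Count which word follows which in the corpus."""
    words = corpus.split()
    follow = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        follow[cur][nxt] += 1
    return follow


def predict_next(model: dict, word: str):
    """Return the statistically most probable next word, or None if unseen."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]


model = train_bigram("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

The point of the toy: everything the model "knows" is retrieval of stored statistics. Nothing in this loop learns on its own after training, and nothing produces output it wasn't asked for.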

At any rate, no amount of hype is going to get me to believe AGI is going to happen soon. I'll believe it when I see it.