Comment by bbarnett

2 months ago

You're missing the point, I think.

The relationships you are describing are exactly why airflight/spaceflight and LLM/AGI make a good comparison.

We will never get AGI from an LLM. We will never fly to the moon via winged flight. These are examples of how one method of doing a thing will never succeed at a different thing.

Citing all the similarities between airflight and spaceflight makes my point! One may as well note that video games run on a computer platform and LLMs run on a computer platform and declare "They're the same!", as say airflight and spaceflight are the same.

Note that I was very clear and very specific: I referred to "winged flight" and the "low/high pressure" differential over a wing, which will never, ever, ever get one even to space, nor allow anyone to navigate once there. There is no "lift" in space.

Unless you can describe to me how a fixed wing with a low/high pressure differential is used to get to the moon, all the other similarities are inconsequential.

Good grief, people are blathering on about metallurgy. That's not a connection; it's just modern tech. It has nothing to do with the method of flying (the low/high pressure around the wing) and is used in every industry.

I love how incapable everyone in this thread has been of conceptual focus, of separating the specific from the generic. It's the same failure that leads people to think, generically, that LLMs will result in AGI. But they won't. Ever. No amount of compute will produce AGI via LLM methods.

LLMs don't think. They don't reason, they don't infer, they aren't creative, and they come up with nothing new. It's easiest to just say "they don't".

One key point here is that knowledge has nothing to do with intelligence. A cat is more intelligent than any LLM that will ever exist. So is a mouse. Correlative fact regurgitation is not intelligence, any more than a book on a shelf is intelligence, or the results of a Yahoo search 10 years ago were.

The most amusing part is when people mistake shuffled-up data output from an LLM for "signs of thought".