Comment by nyrikki

3 years ago

Your links claim that:

'GPT-4 is not "AI" because AI means "AGI"'

"AGI" is a stricter term that hasn't typically been applied to AI, which is an example of my claim above.

Since we lack general definitions, that usage isn't invalid, but under their claims no AI is thought to be possible.

In my experience, the definition most researchers would use is closer to: AI is a computer system that performs, within a restricted domain, work that typically requires a human.

"AI" is one of the few words that has a looser definition as jargon than in general discourse. In general discourse, "AI" has a precise meaning: "something that can think like a human." As jargon though, "AI" means "we can get funding for calling this 'AI'." I would say LLMs count as AI exactly because they can simulate human-like reasoning. Of course they still have gaps, but on the other hand, they also have capabilities that humans don't have. On balance, "AI" is fair, but it's only fair because it's close to the general usage of the term. The jargon sense of it is just meaningless and researchers should be ashamed of letting the term get so polluted when it had and has a very clear meaning.