Comment by sandgiant

2 years ago

You are probably joking, but I think it's actually very important to examine the language we use around LLMs, so that we don't get stuck in the assumptions and sociological biases associated with a vocabulary usually reserved for "magical" beings, as it were.

This goes both ways, by the way. I could be convinced that LLMs can achieve something like intuition, but I strongly believe it is a very different kind of intuition than the one we normally associate with humans and animals. Using the same label for both is thus potentially confusing, and (human pride aside) might even prevent us from appreciating the full scope of what LLMs are capable of.

I think the issue is that we're suddenly trying to pin down something that was previously fine left loosely understood, without any new information to go on.

If someone came to the table with "intuition is the process of a system inferring a likely outcome from given inputs via process X, not to be confused with matmultuition, which is process Y", that might be a reasonable proposal.