Comment by B1FF_PSUVM
3 months ago
> not as statistical machines, but geometric machines. When you train LLMs you are essentially moving concepts around in a very high dimensional space.
That's intriguing, and would make a good discussion topic in itself. Although I doubt the "we have the same thing in [various languages]" bit.
Mother/water/bed/food/etc easily translates into most (all?) languages. Obviously such concepts cross languages.
In this analogy they are objects in high dimensional space, but we can also translate concepts that don’t have a specific word associated with them. People everywhere have a way to refer to “corrupt cop” or “chess opening” and so forth.
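The geometric analogy can be sketched with a toy example. None of this comes from the thread: the vectors and similarity function below are made up purely for illustration, with 4 dimensions standing in for the hundreds or thousands real models use. The idea is that words for the same concept in different languages land near each other in the space, measured here by cosine similarity.

```python
import math

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical 4-dimensional "embeddings" (invented for this sketch):
water_en = [0.90, 0.10, 0.30, 0.00]   # "water" (English)
agua_es  = [0.88, 0.12, 0.28, 0.05]   # "agua" (Spanish) -- nearly the same direction
chess_opening = [0.00, 0.70, 0.10, 0.60]  # an unrelated, multi-word concept

# Translations of the same concept should be closer to each other
# than to an unrelated concept.
print(cosine(water_en, agua_es) > cosine(water_en, chess_opening))  # True
```

In this picture, "corrupt cop" or "chess opening" needs no single word: any concept is just a point (or region) in the space, whether or not a language has a dedicated lexical item for it.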
> Mother/water/bed/food/etc easily translates into most (all?) languages. Obviously such concepts cross languages.
See also: Swadesh List and its variations (https://en.wikipedia.org/wiki/Swadesh_list), an attempt to make a list of such basic and common concepts.
"Bed" and "food" don't seem to be in those lists though, but "sleep" and "eat" are.
What exactly do you mean by the doubting part? I thought it was fairly well established that LLMs have strong translation capabilities.
Sometimes you simply do not have the same concepts; life experiences differ.