Comment by rdevilla

16 hours ago

> Meaning is abstract. We can't express meaning: we can only signify it. An expression (sign) may contain the latent structure of meaning (the writer's intention), but that structure can only be felt through a relevant interpretation.

I'm reminded immediately of the Enochian language which purportedly had the remarkable property of having a direct, unambiguous, 1-to-1 correspondence with the things being signified. To utter, and hear, any expression in Enochian is to directly transfer the author's intent into the listener's mind, wholly intact and unmodified:

    Every Letter signifieth the member of the substance whereof it speaketh.
    Every word signifieth the quiddity of the substance.

    - John Dee, "A true & faithful relation of what passed for many yeers between Dr. John Dee ... and some spirits," 1659 [0].

The Tower of Babel is an allegory for the weak correspondence between human natural language and the things it attempts to signify (as opposed to the supposedly strong 1-to-1 correspondence of Enochian). The tongues are confused: people use the same words to signify entirely different referents, or cannot agree on which term should signify a single concept, and the society collapses. This is similar to what Orwell wrote about, and we have already implemented Orwell's vision, sociopolitically, in the early 21st century, through the culture war (nobody can define "man" or "woman" any more, sometimes the word "man" is used to refer to a "woman," etc).

LLMs just accelerate this process of severing any connection whatsoever between signified and signifier. In some ways they are maximally Babelian, in that they maximize confusion by increasing the quantity of signifiers produced while minimizing the amount of time spent ensuring that the things we want signified are being accurately represented.

Speaking more broadly, I think there is much confusion in the spheres of both psychology and religion/spirituality/mysticism in their mutual inability to "come to terms" and agree upon which words should be used to refer to particular phenomenological experiences, or come to a mutual understanding of what those words even mean (try, for instance, to faithfully recreate, in your own mind, someone's written recollection of a psychedelic experience on erowid).

[0] https://archive.org/details/truefaithfulrela00deej/page/92/m...

That's always been a fun idea. Even a thousand years ago, when most people couldn't read or write, we yearned for more. Even without a description of the problem and its domain, it's immediately obvious that perfect communication would be magic.

The problem is that it's impossible. Even if you could directly copy experience from one mind to another, that experience would be ungrounded. Experience is just as subjective as any expression: that's why we need science.

> through the culture war (nobody can define "man" or "woman" any more, sometimes the word "man" is used to refer to a "woman," etc).

That's a pretty mean rejection of empathy you've got going on there. People are doing their best to describe their genuine experiences, yet the only interpretations you have bothered to subject their expression to are completely irrelevant to them. Maybe this is a good opportunity to explore a different perspective.

> LLMs just accelerate this process of severing any connection whatsoever between signified and signifier.

That's my entire point. There was never any connection to begin with. The sign can only point to the signified. The signified does not actually interact with any semantics. True objectivity can only apply to the signified: never the sign. Even mathematics leverages an arbitrary canonical grammar to model the reality of abstractions. The semantics are grounded in objectively true axioms, but the aesthetics are grounded in an arbitrary choice of symbols and grammar.
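One way to make this concrete is to render the same arithmetic meaning in two different surface grammars. A toy sketch (the prefix evaluator here is purely illustrative, not any particular library): the symbols and grammar differ, but the grounded meaning does not.

```python
# Two arbitrary surface grammars for one semantics:
# infix "1 + 2 * 3" vs. prefix (Polish notation) "+ 1 * 2 3".
# The signs differ; the signified (the number 7) is the same.

def eval_prefix(tokens):
    """Evaluate a prefix-notation expression, consuming tokens left to right."""
    tok = tokens.pop(0)
    if tok == "+":
        return eval_prefix(tokens) + eval_prefix(tokens)
    if tok == "*":
        return eval_prefix(tokens) * eval_prefix(tokens)
    return int(tok)

infix_value = 1 + 2 * 3                          # Python's own infix grammar
prefix_value = eval_prefix("+ 1 * 2 3".split())  # a different grammar, same axioms

assert infix_value == prefix_value == 7
```

Swap the symbols or the word order and the evaluator changes, but as long as the semantics are preserved, every expression still denotes the same abstraction.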

The words aren't our problem. The problem is relevance. If we want to communicate effectively, we must find common ground, so that our intentions can be relevant to each other's interpretations. In other words, we must leverage empathy. My goal is to partially automate empathy with computation.