Comment by thomastjeffery

3 hours ago

That's always been a fun idea. Even a thousand years ago, when most people couldn't read or write, we yearned for more. Even without a description of the problem and its domain, it's immediately obvious that perfect communication would be magic.

The problem is that it's impossible. Even if you could directly copy experience from one mind to the other, that experience would be ungrounded. Experience is just as subjective as any expression: that's why we need science.

> through the culture war (nobody can define "man" or "woman" any more, sometimes the word "man" is used to refer to a "woman," etc).

That's a pretty mean rejection of empathy you've got going on there. People are doing their best to describe their genuine experiences, yet the only interpretations you've bothered to apply to their expressions are completely irrelevant to them. Maybe this is a good opportunity to explore a different perspective.

> LLMs just accelerate this process of severing any connection whatsoever between signified and signifier.

That's my entire point. There was never any connection to begin with. The sign can only point to the signified. The signified does not actually interact with any semantics. True objectivity can only apply to the signified, never the sign. Even mathematics leverages an arbitrary canonical grammar to model the reality of abstractions. Its semantics are grounded in objectively true axioms, but its aesthetics are grounded in an arbitrary choice of symbols and grammar.

The words aren't our problem. The problem is relevance. If we want to communicate effectively, we must find common ground, so that our intentions can be relevant to each other's interpretations. In other words, we must leverage empathy. My goal is to partially automate empathy with computation.