Comment by garte

5 days ago

I'm not sure it's that direct of a connection.

There's something to be said about the structuralist part of it: using large amounts of text as a rule set to return a semblance of truth seems to be a structuralist's wet dream.

It's like drawing the map for the king: the real is being represented by reducing a huge number of data points to a mixture of randomness and hard rules that pretend to be real.

At the very least it's a form of hyperreality as far as I understand it.

Indeed, this is what I was aiming at. However, the concern for (a semblance of) truth seems rooted in a view that locates meaning in what signs refer to. This view feels incomplete when set against a dyadic model, where the relationship between signifier and signified takes precedence over reference. The notion of the simulacrum only emerges in a technical culture that has elevated 'reality' to a special status. After all, what is 'reality' in technical systems if not itself a simulacrum? Hilbert's program, symbolic AI, rule systems, ontologies, the semantic web - they all struggled to capture reality as a whole precisely because they tried to grasp it through formal objects claiming universal scope via the machinery of those formalisms.

  • What does that have to do with LLMs?

    • The structuralists tried, unsuccessfully, to find similarities between the symbol systems of different cultures. They were convinced they could come up with some sort of formula for how a culture can be categorized and what stage of development it is in.

      Fundamentally it's about language. What does a word or a sentence represent? How do you get from a spoken or written text to something that is meaningful to you when it's presented to you?

      This intermediating process of communication is highly complicated and fraught with misconceptions, which means your brain applies a lot of fuzzy logic when trying to understand something.

      The stuff that happens between hearing or reading something and actually taking it in as something meaningful is a vast, as yet unexplored space that leaves a lot of room for speculation. This is what Baudrillard (and others) tried to describe and analyze.

      And it has nothing to do with math. Math is a whole other story and won't solve the problem for you, because it's a different kind of medium (or text, if you will). Math sits between you and the other, while language is something in yourself, so to speak.

      LLMs try to gather meaning from text stochastically. This is not the way we gather meaning as humans, and in a sense it is not how communication works in the real.

      But Baudrillard (and others) reasoned that we have left the real and live in hyperreality. His most famous example is Disney World: as soon as it existed, it started to infuse itself into our everyday lives. The simulation of a fantasy world (the actually existing Disney World theme park) began to become real outside the park while not being rooted in reality; it became a simulacrum, an emulation of reality. In that sense it is virtual. It's like virtual reality in the real world: a fantasy you can touch (and one that can be sold, and molded to be sold more successfully).

      The idea of sociability in social media is another example: it does not exist in the real sense; it is mediated by technology. Its origins are hinted at by the terminology of social interactions, but in the end it's a transactional, empty sort of sociability that promotes attention seeking and fast, easily digested pieces of symbolism over actual interactions. And more and more, this "new" sociability becomes part of our actual social lives.