Comment by kelseyfrog
3 months ago
"Oracularizing AI" has a lot of mileage.
It's not too much to say that AI, and LLMs in particular, satisfies the requisites to be considered a form of divination, i.e.:
1. Indirection of meaning - certainly less than the Tarot, I Ching, or runes, but all text is interpretive. Words, in a Saussurean sense, are always signifiers pointing to a signified[1], and per Barthes's death of the author[2], precise authorial intention is always inaccessible.
2. A sign system or semiotic field - obvious in this case: human language.
3. Assumed access to hidden knowledge - in the sense that LLM training sets are popularly believed to contain all the world's knowledge, this necessarily includes hidden knowledge.
4. Ritualized framing - Approaching an LLM interface is the digital equivalent of participating in other divinatory practices. It begins with setting the intention - to seek an answer. The querent accesses the interface, formulates a precise question by typing, and commits to the act by submitting the query.
They also satisfy several of the typical but not necessary aspects of divinatory practices:
5. Randomization - The stochastic nature of token sampling means each response is, in part, a random draw (a minimal sketch of this appears after the list).
6. Cosmological backing - There is an assumption that responses correspond to the training set and, indirectly, to the world itself. Meaning embedded in the output corresponds in some way - perhaps not obviously - to meaning in the world.
7. Trained interpreter - In this case, as in many divinatory systems, the interpreter and querent are the same.
8. Feedback loop - ChatGPT, for example, is obviously a feedback loop. Responses naturally invite another query, and another - a conversation.
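
(On point 5, here is a minimal sketch of what that randomization looks like, assuming a toy vocabulary and made-up logits rather than any real model: the decoder turns the model's raw scores into a probability distribution and draws the next token at random, which is why the same prompt can produce different readings.)

    import math
    import random

    def sample_next_token(logits, vocab, temperature=0.8):
        """Draw one token from a softmax distribution over raw model scores."""
        scaled = [score / temperature for score in logits]
        top = max(scaled)  # subtract the max before exp() for numerical stability
        exps = [math.exp(s - top) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        # random.choices draws proportionally to the weights - the stochastic step
        return random.choices(vocab, weights=probs, k=1)[0]

    # Toy vocabulary and logits, purely illustrative - not from any real model
    vocab = ["fortune", "journey", "loss", "love", "nothing"]
    logits = [2.1, 1.4, 0.3, 1.7, -0.5]

    # The same "question" asked five times gives five (possibly different) answers
    print([sample_next_token(logits, vocab) for _ in range(5)])

A temperature below 1 sharpens the distribution toward the likeliest token; above 1 it flattens it, making the "reading" more surprising.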
It's often said that sharing AI output is much like sharing dreams - only meaningful to the dreamer. In this framework, sharing AI responses is more like sharing Tarot card readings: again, only meaningful to the querent. They feel incredibly personalized, like horoscopes, but it's unclear whether that meaning is inherent to the output or simply the querent's desire to imbue the output with meaning by projecting their own onto it.
Like I said, I feel like there's a lot of mileage in this perspective. It explains a lot about why people feel the way they do about AI and about hearing about AI. It's also a bit unnerving: we created another divinatory practice, and a HUGE chunk of people participate and engage with it without calling it such, simply believing it, mostly because it doesn't look like Tarot, runes, or the I Ching, even though ontologically it fills the same role.
Notes:
1. https://en.wikipedia.org/wiki/Signified_and_signifier
2. https://en.wikipedia.org/wiki/The_Death_of_the_Author