Comment by ako

4 days ago

Why does it matter? If you count not just the people creating these hallucinations, but also the people accepting and using them, it must be billions and billions...

And that's the point. You need a critical mass of people buying into something. With LLMs, you just need ONE person with ONE model and modest enough hardware.

https://chat.mistral.ai/chat/8b529b3e-337f-42a4-bf36-34fd9e5...

>Here’s a concise and thoughtful response you could use to engage with ako’s last point:

---

"The scale and speed might be the key difference here. While human-generated narratives—like religions or myths—emerged over centuries through collective belief, debate, and cultural evolution, LLMs enable individuals to produce vast, coherent-seeming narratives almost instantaneously. The challenge isn’t just the volume of ‘bullshit,’ but the potential for it to spread unchecked, without the friction or feedback loops that historically shaped human ideas. It’s less about the number of people involved and more about the pace and context in which these narratives are created and consumed."

  • But people post on social networks, blogs, newspapers and other widely read places, while most LLM output lands in chat sessions with a single reader.

    • No, the web is now full of this bot generated noise.

      And even considering only the tools used in isolated sessions that aren't exposed by default: the most popular ones are tuned to favor engagement and retention over relevance. That's a separate point, since LLMs can certainly be tuned in a different direction, but in practice it does matter for social impact at scale. Even prime-time infotainment has by now covered people falling in love with chatbots or being encouraged into suicidal loops. "You're absolutely right" is not always the best answer.