Comment by threethirtytwo

17 days ago

I don’t think you’re rational. Part of being unbiased is being able to see bias in yourself.

First of all: nobody knows how LLMs work. Whether the singularity comes or not cannot be rationalized from what we know about LLMs, because we simply don’t understand LLMs. This is unequivocal. I am not saying I don’t understand LLMs. I’m saying humanity doesn’t understand LLMs, in much the same way we don’t understand the human brain.

So saying whether the singularity is imminent or not imminent based off of that reasoning alone is irrational.

The only thing we have is the black-box input and output of AI. That input and output is steadily improving every month. It forms a trendline, and the trendline is sloped towards singularity. Whether the line actually gets there is an open question, but you have to be borderline delusional if you think the whole thing can be explained away because you understand LLMs and transformer architecture. You don’t understand LLMs, period. No one does.

> Nobody knows how LLMs work.

I'm sorry, come again?

  • nobody can know how something that is non-deterministic works - by its very definition

    • LLMs are deterministic, simply because computers are at their core deterministic machines. LLMs run on computers and are therefore deterministic. The random number generator is an illusion, and an LLM that utilizes it produces only the illusion of indeterminism. Find the seed and the right generator and you can make an LLM consistently produce the same output from identical input.

      Despite determinism, we still do not understand LLMs.
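The seed argument above can be sketched in a few lines. This is a toy stand-in, not how any real LLM samples; the function name, vocabulary, and seed are all illustrative, but the principle is the same: a fixed seed fixes the "random" stream, so identical input yields identical output.

```python
import random

def sample_tokens(seed: int, n: int = 5) -> list[str]:
    """Toy stochastic 'decoder': draw n tokens from a tiny vocabulary
    using a seeded RNG (all names here are hypothetical)."""
    rng = random.Random(seed)  # fixed seed -> fixed pseudo-random stream
    vocab = ["the", "cat", "sat", "on", "mat"]
    return [rng.choice(vocab) for _ in range(n)]

# Same seed, same input -> identical output on every run.
assert sample_tokens(seed=42) == sample_tokens(seed=42)
# Different seeds usually diverge, producing the illusion of indeterminism.
```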

  • I think they meant "Nobody knows why LLMs work."

    • Because they encode statistical properties of the training corpus. You might not know why they work, but plenty of people know why they work, and understand the mechanics of approximating probability distributions w/ parametrized functions, and of selling it as a panacea for stupidity and the path to an automated, luxurious communist utopia.
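"Encoding statistical properties of the training corpus" can be shown in miniature with a bigram model, which is the simplest possible version of the idea; this is a sketch for illustration only, and the function name and toy corpus are assumptions, not anything from a real LLM.

```python
from collections import Counter, defaultdict

def bigram_model(corpus: list[str]) -> dict[str, dict[str, float]]:
    """Estimate P(next token | current token) from raw counts --
    a degenerate 'parametrized function' fit to a training corpus."""
    counts: defaultdict[str, Counter] = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.split()
        for cur, nxt in zip(tokens, tokens[1:]):
            counts[cur][nxt] += 1
    # Normalize counts into conditional probability distributions.
    return {cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
            for cur, nxts in counts.items()}

model = bigram_model(["the cat sat", "the cat ran", "the dog sat"])
# P(cat | the) = 2/3 and P(dog | the) = 1/3 in this toy corpus.
```

A transformer replaces the count table with billions of learned parameters, but the training objective is still approximating a conditional distribution over next tokens.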

this is wrong

  • It is not. I would suggest engaging with the other branch of this thread, because people who agreed with you voiced their opinion there and were proven utterly wrong.

    Humanity does not understand how LLMs work. This is definitive.