
Comment by polytely

2 months ago

> What separates this from human.

A lot. Like an incredible amount. A description of a thing is not the thing.

There is sensory input, qualia, pleasure & pain.

There is taste and judgement, disliking a character, being moved to tears by music.

There are personal relationships, being a part of a community, bonding through shared experience.

There is curiosity and openness.

There is being thrown into the world, your attitude towards life.

Looking at your thoughts and realizing you were wrong.

Smelling a smell that resurfaces a memory you forgot you had.

I would say the language completion part is only a small part of being human.

All of these things arise from a bunch of inscrutable neurons in your brain turning off and on again in a bizarre pattern, though. Who's to say that isn't what happens in the million-neuron LLM brain?

Just because it’s not persistent doesn’t mean it’s not there.

Like, I’m sort of inclined to agree with you, but it doesn’t seem like it’s something uniquely human. It’s just a matter of degree.

  • Sure, in some ways it's just neurons firing in some pattern. Figuring out and replicating the correct sets of neural patterns is another matter entirely.

    Living creatures have a fundamental impetus to grow and reproduce that LLMs and AIs simply do not have currently. Not only that, but animals have a highly integrated neurology that has had billions of years of being tuned to that impetus. For example, the ways that sex interacts with mammalian neurology are pervasive. The same goes for the need for food, etc. That creates very different neural patterns than training LLMs does.

    Eventually we may be able to re-create that balance of impetus, or will, or whatever we call it, to make sapience. I suspect we're fairly far from that, if only because the way we create LLMs is so fundamentally different.

"I would say the language completion part is only a small part of being human" Even that is only given to them. A machine does not understand language. It takes input and creates output based on a human's algorithm.

  • > A machine does not understand language

    You can't prove humans do either. You can see how often actual people struggle with understanding something that's written for them. In many ways, you can actually prove that LLMs are superior to humans right now when it comes to understanding text.

    • > In many ways, you can actually *prove* that LLMs are superior to humans right now when it comes to understanding text

      Emphasis mine.

      No, I don't think you can, without making "understanding" a term so broad as to be useless.

    • "You can't prove humans do either." Yes you can via results and cross examination. Humans are cybernetic systems(the science not the sci-fi). But you are missing the point. LLMs are code written by engineers. Saying LLMs understand text is the same as saying a chair understands text. LLMs' 'understanding' is nothing more than the engineers synthesizing linguistics. When I ask an A'I' the Capital of Ireland, it answers Dublin. It does not 'understand' the question. It recognizes the grammar according to an algorithm, and matches it against a probabilistic model given to it by an engineer based on training data. There is no understanding in any philosophical nor scientific sense.


That's a lot of words shitting on a lot of words.

You said nothing meaningful that couldn't also have been spat out by an LLM. So? What, then, IS the secret sauce? Yes, you're a never-resting stream of words that took decades, not years, to train, and has a bunch of sensors and other, more useless, crap attached. It's technically better, but how does that matter? It's all the same.