
Comment by wewewedxfgdf

12 hours ago

Its software. Software is not conscious.

If your brain is hardware then what are your thoughts?

Is a sperm conscious? Or an egg? When they come together, the eventual brain is not immediately conscious.

  • LLMs are word prediction engines.

    They clearly are not conscious; they are just guessing what words should come next.

    • The human brain is an electrical signal prediction machine.

      Anything that looks like intelligence will look like a prediction machine, because the alternative is logic being hardcoded a priori.

    • > They clearly are not conscious

      Consciousness is emergent. A human is not conscious by our definition until the moment they are. How will we be able to identify the singularity when it comes? I feel like this is what the article is really addressing.

      > LLMs are word prediction engines

      Humans can do this too, so what are the missing pieces for consciousness? Close a few loops on the learning pipeline and we might be there.
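The "word prediction engine" framing above can be sketched with a toy bigram model. This is a hypothetical minimal example for illustration only: real LLMs use learned neural networks over subword tokens, not raw frequency counts, but the interface is the same — given context, guess the next word.

```python
from collections import Counter, defaultdict

# Hypothetical tiny corpus; any text would do.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word after `word`, or None if unseen."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often in this corpus
```

The open question in the thread is whether scaling this up, plus "closing a few loops" of feedback and learning, is different in kind from whatever the brain does.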

I do appreciate how AI has been taught to spell properly, as in the difference between its and it's. Here, I initially thought you'd left out the apostrophe in its, but then I realized you might be saying 'the reason it is not conscious is because of -its- software', the latter not being conscious. Context and interpretation are rather critical. (I know - a truism!)