Comment by rng-concern

6 days ago

I've only skimmed the article, so I may have missed something, but they seem to be equating AGI with consciousness, or at least assuming that consciousness is required for AGI. I'm not convinced it is. I'm also not convinced that closely matching the biology of neurons is required for AGI, which the article seems to assume as well. That said, a really close match to human-style intelligence might be hard to achieve if we simplify the model too much, as we're doing. Perhaps we'll arrive at a different sort of intelligence, one that is just as general.