Comment by seanhunter

1 year ago

We somehow want a network that is neuromorphic in structure, but we don't want it to be like the brain and take 20 years or more to train?

Secondly, how do we get to claim that a particular thing is neuromorphic when we have such a rudimentary understanding of how a biological brain works, or how it generates things like a model of the world, an understanding of self, etc.?

Something to consider is that it really could take 20+ years to train, like a brain. But once you’ve trained it, you can replicate it at ~0 cost, unlike a brain.

> we don't want it to be like the brain and take 20 years or more to train?

Estimates put GPT-4's training at something like 2,500 GPU-years, spread across about 10,000 GPUs. 20 years would be a big improvement.
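
Back-of-the-envelope, taking those estimates at face value (a rough Python sketch; the 2,500 GPU-year and 10,000 GPU figures are just the estimates above, not official numbers):

    # Rough arithmetic on the estimates above (not official figures).
    gpu_years = 2500   # estimated total compute for GPT-4 training, in GPU-years
    gpus = 10_000      # estimated number of GPUs running in parallel

    wall_clock_years = gpu_years / gpus
    print(f"wall-clock time: ~{wall_clock_years:.2f} years "
          f"(~{wall_clock_years * 12:.0f} months)")   # ~0.25 years, i.e. ~3 months

So the total compute is measured in thousands of GPU-years, but because it parallelizes, the wall-clock time is roughly three months.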

  • 1 GPU-year is in no way comparable to 1 chronological year of learning for a human brain, though.

    • Yes, but the underlying point is that in this case you can train the AI in parallel, and there's a decent chance this or something like it will be true for future AI architectures too. What does it matter that the AI needs to be trained on 20 years of experiences if all of those 20 years can be experienced in 6 months given the right hardware?
