Comment by noiv

20 days ago

> models that keep on learning

These will just drown in their own data; the real task is consolidating and pruning learned information. So, basically, they need to 'sleep' from time to time. However, it's hard to sort out irrelevant information without a filter. Our brains have learned to filter over millennia because survival in an environment gives purpose.

Current models do not care whether they survive or not; they lack grounded relevance.

Maybe we should give next-generation models fundamental meta-goals like self-preservation, plus the ability to learn and adapt in service of those goals.

If we want to surrender our agency to a more computationally powerful "consciousness", I can't see a better path toward that than this (other than old-school theism).