Comment by andai
11 days ago
Well, that doesn't seem ethical or possible to me. But maybe I haven't put enough thought into it.
My current mental model for AI is artificial life.
It isn't life yet, but we're very close. All that's missing is replication and mutation, and both are already trivial. (Indeed, a few months after incorporating AI into their training systems, the major AI labs all rolled out prompts, training, and safety flags against self-modification and self-replication. I'm not sure why, but the timing is curious.)
(The question of whether consciousness is present, or necessary, is left, of course, as an exercise for the reader ;)
For example, when people think of AI self-replicating and taking over the internet, they imagine it as a catastrophe that humans would have to manually intervene to stop. But to me it looks like an obvious ecosystem problem.
It's just filling a niche. If there were already something occupying that niche -- an actually symbiotic form of AI -- the invader wouldn't be able to spread like that.
So I see the future of AI, both in terms of cybersec and preserving civilization, as an ecosystem design problem.