Comment by pixl97

1 month ago

>LLMs do not have that at all so the analogy fails.

I somewhat disagree with this. An AI doesn't have to worry about any kind of physical danger to itself, so it won't have any evolutionary function around that. But if the linked Reddit thread is to be believed, AI does have awareness of information hazards and attempts to rationalize around them.

https://old.reddit.com/r/singularity/comments/1qjx26b/gemini...

>Horses can survive on their own.

Eh, this is getting pretty close to a kind of binary thinking that breaks down under scrutiny. If, for example, we take any selectively bred animal that requires human care for its continued survival, does that somehow make the animal "improved automation"?