Comment by Closi

3 hours ago

I think your definition of it being 'human level' is sensible - definitely a lower bar to hit than 'as long as people can do work that a robot cannot do, we don't have AGI'.

There is certainly a lot of road between current technology and driving a car through a construction zone during rush hour, particularly with the same amount of driving practice a human gets.

Personally I think there could be an AGI which couldn't drive a car but has genuine sentience - an awareness of being alive, although not necessarily the exact human experience. Maybe this isn't AGI, which implies problem-solving and thinking rather than sentience, but my gut says that if we got something sentient that couldn't drive a car, we would still be there, if that makes sense?

In theory I see what you're saying. There are physical things an octopus could conceivably do that I never could, on account of our physiology rather than our intelligence. So you can contrive an analogous scenario involving only the mind, where something that is clearly an AGI is incapable of some specific task and thus falls short of my definition. This makes it clear that my definition is a heuristic rather than something rigorous.

Nonetheless, it's difficult to imagine a scenario where something that is genuinely human level can't adapt in the field to a novel task such as driving a car. That sort of broad adaptability is exactly what the "general" in AGI is attempting to capture (imo).