Comment by righthand
17 hours ago
AGI is currently an intentionally vague and undefined goal. This allows businesses to operate toward a goal, define the parameters themselves, and relish the "rocket launch"-esque hype without leaving the vague umbrella of AI. It lets businesses claim a double pursuit: not only are they building AGI, but all their work will surely benefit AI as well. How noble. Right?
Its vagueness is intentional and allows you to ignore the inconvenient truth and fill in the gaps yourself. You just have to believe it’s right around the corner.
"If the human brain were so simple that we could understand it, we would be so simple that we couldn’t." - without trying to defend such business practices, it does appear very difficult to define which properties are necessary and sufficient to constitute AGI.
What if the human brain were so complex that we were complex enough to understand it?