Comment by andy99
3 hours ago
> I think if you took an LLM of today and showed it to someone 20 years ago, most people would probably say AGI has been achieved.
I’ve got to disagree with this. All past pop-culture AI was sentient and self-motivated; it was human-like in that it had its own goals and autonomy.
Current AI is a transcript generator. It can do smart stuff but it has no goals; it just responds with text when you prompt it. It feels like magic, even compared to 4-5 years ago, but it doesn’t feel like what was classically understood as AI, certainly by the public.
Somewhere along the way, marketers redefined AGI to mean “does predefined tasks with human-level accuracy” or the like. That is more the definition of a good function approximator (how appropriate) than of what people think (or thought) about when considering intelligence.
> Current AI is a transcript generator. It can do smart stuff but it has no goals
That's probably not because of an inherent lack of capability, but because the companies that run AI products don't want to run autonomous intelligent systems like that.