Comment by Deegy
6 hours ago
We currently have human-in-the-loop AGI.
While it doesn't seem we can agree on a meaning for AGI, I think a lot of people think of it as an intelligent entity that has 100% agency.
Currently we need to direct LLMs from task to task. They don't yet possess full real-world context.
This is why I get confused when people talk about AI replacing jobs. It can replace work, but you still need skilled workers to guide it. To me, this could make humans even more valuable to businesses and result in even greater demand for labor.
If this is true, individuals need to race to learn how to use AI and use it well.
> Currently we need to direct LLMs from task to task.
Agent-loops that can work from larger-scale goals work just fine. We can't let them run with no oversight, but we certainly also don't need to micromanage every task. Most days I'll have 3-4 agent-loops running in parallel, executing whole plans, that I only check in on occasionally.
I still need to review their output periodically, but I certainly don't direct them from task to task.
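Roughly, the kind of loop I mean looks like this (a minimal sketch, not any particular framework's API; `call_model`, `plan`, and `review_every` are stand-ins I'm making up for illustration):

```python
# Minimal sketch of a goal-level agent loop with periodic human check-ins,
# rather than per-task direction. `call_model` is a placeholder for whatever
# LLM API you actually use; it is not a real library call.

def call_model(prompt: str) -> str:
    """Placeholder for an LLM call (cloud API, local model, etc.)."""
    raise NotImplementedError

def run_agent(goal: str, review_every: int = 5) -> None:
    # Ask the model to break the high-level goal into an ordered task list.
    plan = call_model(f"Break this goal into an ordered task list:\n{goal}").splitlines()

    for i, task in enumerate(plan, start=1):
        result = call_model(
            f"Goal: {goal}\nCurrent task: {task}\nDo the task and report the result."
        )

        # Oversight happens at checkpoints, not on every task: every few tasks
        # a human skims the output and can stop or redirect the run.
        if i % review_every == 0:
            print(f"--- checkpoint after task {i} ---\n{result}")
            if input("Continue? [y/n] ").strip().lower() != "y":
                break
```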
I do agree with you that we still need skilled workers to guide them, so I don't think we necessarily disagree all that much, but we're past the point where they need to be micromanaged.
If we can't agree on a definition of AGI, then what good is it to say we have "human-in-the-loop AGI"? The only folks that will agree with you will be using your definition of AGI, which you haven't shared (at least in this posting). So, what is your definition of AGI?