← Back to context

Comment by cyanydeez

11 hours ago

it is. itll still not create AGI without some breakthrough in instruction vs data separation of concerns

You can park a lot there. No offence but I love how AGI doesn't mean anything. It used to be that AI was a goal post. Now it is AGI. We could use characters from sci-fi culture to describe milestones. In order to achieve robocop level, we must solve the instruction vs data problem.

  • Thus it always was. I’m old enough to remember when “if AI could beat a grandmaster at chess” was considered the finish line.

    • Well, yeah… turns out that goal wasn’t a good indicator for AGI, so we re-evaluated. That’s changing your hypothesis in the face of evidence, not “moving the goalposts” in the fallacious sense.