Comment by ccppurcell

1 year ago

I would argue that the G in AGI means it can't require better prompting.

We should probably draw a distinction between a human-equivalent G, which certainly can require better prompting (why else did you go to school?!), and a god-equivalent G, which never requires better prompting.

Just using the term 'General' doesn't seem to communicate anything useful about the nature of intelligence.

  • School is not better prompting; it's actually the opposite: it's learning how to deal with poorly formed prompts!

That would be like saying that because humans’ output can be better or worse based on better or worse past experience (~prompting, in that it is the source of the equivalent of “in-context learning”), humans lack general intelligence.

  • This is more like the distinction between a Jr and Sr dev. The former needs the tasks to be pre-chewed and defined (“good prompts”), while the latter can deal with very ambiguous problems.

    • The entirety of a human's experience is the “prompt”. Current LLMs rely on the analog of instinct (in-built, pre-context training) for their behavior a lot more than humans do, because they have itty bitty tiny context windows, whereas humans have really big context windows for in-context learning.

  • No, it's saying that I have general intelligence in part because I am able to reason about vague prompts.