Comment by jncfhnb

1 year ago

I think one should feel comfortable arguing that AGI must at least be stateful and experience continuous time, such that a plain old LLM is definitively never going to be AGI, but an LLM called in a `while (true)` loop might be.

I don't understand why you believe it must experience continuous time. If you had a system which clearly could reason, which could learn new tasks on its own, which didn't hallucinate any more than humans do, but it was only active for the period required for it to complete an assigned task, and was completely dormant otherwise, why would that dormant period disqualify it as AGI? I agree that such a system should probably not be considered conscious, but I think it's an open question whether or not consciousness is required for intelligence.

  • Active for a period is still continuous during that period.

    As opposed to “active when called”. A function being called repeatedly over a length of time is reasonably “continuous”, imo.

    • I don't see what the difference between "continuous during that period" and "active when called" is. When an AI runs inference, that calculation takes time. It is active during the entire interval during which it is responding to the prompt. It is then inactive until the next prompt. I don't see why a system can't be considered intelligent merely because its activity is intermittent.

      1 reply →

  • I think it's noteworthy that humans actually fail this test: we have to go dormant for eight hours every day.

    • Yes, but our brain is still working and processing information at those times as well, isn't it? Even if not in the same way as it does when we're conscious.

      5 replies →

A consistent stateful experience may be needed, but I'm not sure about continuous time. I mean, human consciousness doesn't do that.

  • Human consciousness does though, e.g. the flow state. F1 drivers are a good example.

    We tend to not experience continuous time because we repeatedly get distracted by our thoughts, but entering the continuous stream of now is possible with practice and is one of the aims of many meditators.

    • Human consciousness is capable of it, but since most humans aren't in it much of the time, it would appear that it's not a prerequisite for true sentience.

    • What does it mean to “experience continuous time”?

      How do you know that F1 drivers experience it?

  • I would argue it needs to be at least somewhat continuous. Perhaps discrete on some granularity but if something is just a function waiting to be called it’s not an intelligent entity. The entity is the calling itself.

I try my best not to experience continuous time for at least eight hours a day.

  • Then for at least eight hours a day you don’t qualify as a generally intelligent system.

    • If I spend some amount of the day bathing, some amount of it scratching, some amount of it thinking vaguely about raccoons without any clear conclusions, and a lot of it drinking tea, I wonder how many seconds remain during which I qualified as generally intelligent.

      3 replies →

You could imagine an LLM being called in a loop with a prompt like

You observe: {new input}

You remember: {from previous output}

React to this in the following format:

My inner thoughts: [what do you think about the current state]

I want to remember: [information that is important for your future actions]

Things I do: [Actions you want to take]

Things I say: [What I want to say to the user]

...

I'm not sure whether that would qualify as an AGI as we currently define it. Given a sufficiently good LLM with strong reasoning capabilities, such a setup might be able to do many of the things we currently expect AGIs to do, including planning and learning new knowledge and new skills (by collecting and storing positive and negative examples in its "memory"). But its learning would be limited, and I suspect that as soon as such a system exists, we would agree it's not AGI.
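The loop described above can be sketched in a few lines. This is a minimal illustration, not a real implementation: the `llm` function is a stub standing in for an actual model call, and the field names simply mirror the prompt template from the comment.

```python
import re

def llm(prompt: str) -> str:
    # Stub standing in for a real LLM API call; it returns a fixed,
    # well-formed response purely so the loop below is runnable.
    return (
        "My inner thoughts: [the user greeted me]\n"
        "I want to remember: [the user said hello]\n"
        "Things I do: [none]\n"
        "Things I say: [Hello!]\n"
    )

PROMPT_TEMPLATE = (
    "You observe: {observation}\n"
    "You remember: {memory}\n\n"
    "React to this in the following format:\n"
    "My inner thoughts: [what do you think about the current state]\n"
    "I want to remember: [information that is important for your future actions]\n"
    "Things I do: [Actions you want to take]\n"
    "Things I say: [What I want to say to the user]\n"
)

def extract(field: str, response: str) -> str:
    """Pull the bracketed value for one field out of the model's reply."""
    match = re.search(rf"{field}: \[(.*?)\]", response)
    return match.group(1) if match else ""

def agent_loop(observations):
    memory = ""  # state carried from one call to the next
    replies = []
    for observation in observations:
        response = llm(PROMPT_TEMPLATE.format(observation=observation,
                                              memory=memory))
        # Persist what the model asked to remember across iterations.
        memory = extract("I want to remember", response)
        replies.append(extract("Things I say", response))
    return replies

print(agent_loop(["hello"]))  # -> ['Hello!']
```

The "memory" here is just the model's own previous output fed back in, which is exactly why the learning stays shallow: nothing updates except that string.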

  • This already exists (in a slightly different prompt format); it's the underlying idea behind ReAct: https://react-lm.github.io

    As you say, I'm skeptical this counts as AGI. Although I admit that I don't have a particularly rock solid definition of what _would_ constitute true AGI.

  • (Author here.) I tried creating something similar in order to solve Wordle etc., and the interesting part is that it is still insufficient. That's part of the mystery.

  • It works better to give it access to functions to call for taking actions and remembering things, but this approach does produce some interesting results.
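    One way to read "access to functions" is a small dispatch layer that routes model-emitted tool calls to real code. A hypothetical sketch, where the tool names, the JSON shape, and the registry are all made up for illustration:

    ```python
    import json

    MEMORY = []  # simple persistent store the tools can write to

    # Illustrative tool registry; names and signatures are invented here,
    # not taken from any particular LLM framework.
    TOOLS = {
        "remember": lambda arg: MEMORY.append(arg),
        "recall": lambda arg: [m for m in MEMORY if arg in m],
    }

    def dispatch(call_json: str):
        """Route a model-emitted JSON tool call to the matching function."""
        call = json.loads(call_json)
        return TOOLS[call["name"]](call["argument"])

    # As if the model had emitted these two tool calls:
    dispatch('{"name": "remember", "argument": "user likes tea"}')
    print(dispatch('{"name": "recall", "argument": "tea"}'))  # -> ['user likes tea']
    ```

    Compared with stuffing everything into a "I want to remember" field in the prompt, this lets the memory outlive the context window and be queried selectively.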