Comment by kfarr
1 year ago
Sorry you’re being voted down, I think you make some interesting points.
I think LLMs lack the true feedback loop required for consciousness because their knowledge is fixed. Funnily enough, embodiment as a robot is one forcing function for a feedback loop, and it's not so crazy to think that the combination of the above is more likely to result in machine consciousness than an LLM alone.
A robot body for sensory input + GPT-4o + an SSD to store its own context + repeatedly calling the LLM solves the feedback loop issue, doesn't it? Can't it have expansive context via a large storage pool that it fully controls and can use to store and refine its own thoughts?
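For what it's worth, that loop is easy to prototype. A minimal sketch in Python, assuming a hypothetical read_sensors() helper standing in for the robot body and using the OpenAI chat completions client purely for illustration; this is just the shape of the loop, not a real agent:

```python
# Sketch of the sense -> recall -> think -> store loop described above.
# read_sensors() is a hypothetical stand-in for the robot body's inputs.
import json
import time
from pathlib import Path

from openai import OpenAI

client = OpenAI()
MEMORY = Path("/mnt/ssd/memory.jsonl")  # persistent context the agent fully controls


def read_sensors() -> str:
    """Placeholder for camera/microphone/battery input from the robot body."""
    return "camera: empty room; battery: 87%"


def recall(n: int = 50) -> list[dict]:
    """Load the most recent stored thoughts and observations from the SSD."""
    if not MEMORY.exists():
        return []
    return [json.loads(line) for line in MEMORY.read_text().splitlines()[-n:]]


def remember(role: str, content: str) -> None:
    """Append a message to the persistent store so the next call can see it."""
    with MEMORY.open("a") as f:
        f.write(json.dumps({"role": role, "content": content}) + "\n")


while True:
    remember("user", f"Observation: {read_sensors()}")
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "system",
                   "content": "You are an embodied agent. Reflect and decide what to do."}]
                 + recall(),
    )
    # The model's own output goes back into storage, closing the feedback loop.
    remember("assistant", response.choices[0].message.content)
    time.sleep(1)
```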
Maybe let it take the newly collected data and fine-tune the base model with it, perhaps once a day or so.
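The daily fine-tune step could just be a scheduled job on top of that store. A rough sketch assuming the OpenAI fine-tuning endpoints; the file path and base model name are illustrative placeholders:

```python
# Rough sketch of a nightly fine-tune pass on the day's collected data,
# assuming the data has already been exported to chat-format JSONL.
# Intended to be run once a day via cron or similar.
from openai import OpenAI

client = OpenAI()


def nightly_finetune(training_path: str = "/mnt/ssd/today.jsonl",
                     base_model: str = "gpt-4o-mini-2024-07-18") -> str:
    # Upload the day's data as a fine-tuning file.
    upload = client.files.create(file=open(training_path, "rb"),
                                 purpose="fine-tune")
    # Kick off a fine-tuning job on top of the current base model.
    job = client.fine_tuning.jobs.create(training_file=upload.id,
                                         model=base_model)
    return job.id  # poll this job and swap in the new model once it finishes
```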
Someday our phones will dream.
I am sure someone has built or is building this now. There should be a Discord for this.
I agree.