Comment by layer8
2 months ago
Same. And the next step is that it must feed back into training, to form long-term memory and to continually learn.
I analogize this with sleep. Perhaps that is what is needed: six hours offline per day to LoRA-tune the base model on the day's accumulated context.
LLMs need to sleep too. Do they dream of electric sheep?
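To make the idea concrete, here is a minimal hand-rolled sketch of the "sleep" pass: the base weight matrix stays frozen, and only a low-rank LoRA adapter (factors A and B) is trained on toy data standing in for the day's accumulated context. Everything here (dimensions, learning rate, synthetic targets) is illustrative, not a real training setup.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 4, 2

W = rng.normal(size=(d_out, d_in))           # frozen base weights: never updated
A = rng.normal(scale=0.1, size=(r, d_in))    # trainable low-rank factor
B = np.zeros((d_out, r))                     # starts at zero: adapter is a no-op at first

# Stand-in for the day's accumulated context: toy input/target pairs.
X = rng.normal(size=(32, d_in))
Y = X @ rng.normal(size=(d_in, d_out))

lr = 0.05
for _ in range(300):                         # the offline "sleep" pass
    pred = X @ (W + B @ A).T                 # effective weight is W + B @ A
    err = pred - Y                           # gradient of 0.5 * MSE w.r.t. pred
    gB = err.T @ X @ A.T / len(X)            # gradients flow only into A and B;
    gA = B.T @ err.T @ X / len(X)            # W itself is left untouched
    A -= lr * gA
    B -= lr * gB
```

The appeal of doing this with LoRA rather than full fine-tuning is that the adapter is tiny relative to the base model, so a nightly pass is cheap, and a bad day's update can be discarded by simply dropping the adapter.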