Comment by jen729w
7 months ago
Well you can't risk Claude quitting overnight. It forgets everything it did the day before and now you have to start over ... must ... finish ... tonight ... within ... context ... window.
Fortunately, LLMs are stateless and thus unaffected by the passage of time: your context stays exactly as it was, as long as the tool maintaining it keeps running.
(Prompt caches are another matter: leaving a session overnight and resuming the next day will cost you a little extra on resume, if you're using models via pay-as-you-go API billing, since the cached prompt will have expired and the context gets reprocessed at the uncached rate.)
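To put a rough number on that "little extra", here's a back-of-envelope sketch. The per-token prices and the 150k-token context size are hypothetical placeholders, not any provider's actual rates; the point is just that a cache miss bills the whole context at the full input rate:

```python
# Back-of-envelope: extra cost of resuming after the prompt cache expires.
# Both rates are assumed placeholders, not any real provider's pricing.
CACHED_INPUT_PER_MTOK = 0.30   # $ per million tokens read from cache (assumed)
FRESH_INPUT_PER_MTOK = 3.00    # $ per million tokens processed uncached (assumed)

def resume_cost(context_tokens: int, cache_hit: bool) -> float:
    """Dollars to re-send the accumulated context on resume."""
    rate = CACHED_INPUT_PER_MTOK if cache_hit else FRESH_INPUT_PER_MTOK
    return context_tokens / 1_000_000 * rate

ctx = 150_000  # hypothetical accumulated context, in tokens
print(f"same-day resume (cache hit):  ${resume_cost(ctx, cache_hit=True):.3f}")
print(f"next-day resume (cache miss): ${resume_cost(ctx, cache_hit=False):.3f}")
```

So with these made-up numbers the overnight pause costs you the difference between roughly $0.045 and $0.45 on the first resumed request; annoying, but not "start over from scratch" territory.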