Comment by TeMPOraL
15 hours ago
Fortunately, LLMs are stateless and thus unaffected by the passage of time: your context stays exactly as it was for as long as the tool maintaining it is running.
(Prompt caches are another matter: they expire after a period of inactivity, so leaving a session overnight and resuming the next day will cost you a little extra on resume if you're using models via pay-as-you-go API billing.)
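To make the cache point concrete, here's a minimal sketch using Anthropic's prompt caching as one example of a pay-as-you-go API (the model ID, context string, and prompt are illustrative; other providers price caching differently). A stable prefix marked with cache_control is read at a discount while the cache is warm; after a long pause, the next request pays to write it again.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Illustrative stand-in for a long, stable context (system prompt,
# project files, conversation summary, etc.).
LONG_CONTEXT = "...many thousands of tokens of project context..."

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # illustrative model ID
    max_tokens=1024,
    system=[
        {
            "type": "text",
            "text": LONG_CONTEXT,
            # Mark the stable prefix as cacheable; requests that reuse
            # this exact prefix hit the cache while it is still warm.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Pick up where we left off."}],
)

# usage.cache_read_input_tokens > 0 on a warm-cache hit; after the cache
# expires (minutes of idle time), cache_creation_input_tokens is billed
# again on the next request: that is the "little extra on resume".
print(response.usage)
```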