Comment by protocolture
3 days ago
I think what I want to do is have a dodgy local LLM that picks up from context that the user is speaking to the LLM, and then enables it for 20 minutes or so.
But even that's a bit of a wild tradeoff.
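(As a rough illustration only: a minimal sketch of what that gating could look like, assuming a hypothetical is_addressed_to_llm classifier standing in for the small local model. None of these names come from the comment; they're placeholders.)

```python
import time

# Hypothetical stand-in: in practice this would ask a small local model whether
# the utterance sounds like it is addressed to the assistant rather than to
# another person in the room. Here it is just a crude phrase check.
def is_addressed_to_llm(utterance: str) -> bool:
    wake_phrases = ("hey assistant", "ok computer")
    return utterance.lower().startswith(wake_phrases)


class ActivationGate:
    """Keeps the assistant enabled for a fixed window after a detected address."""

    def __init__(self, window_seconds: float = 20 * 60):
        self.window_seconds = window_seconds
        self._enabled_until = 0.0

    def observe(self, utterance: str) -> None:
        # Re-arm the 20-minute window whenever the classifier thinks
        # the user is talking to the LLM.
        if is_addressed_to_llm(utterance):
            self._enabled_until = time.monotonic() + self.window_seconds

    @property
    def enabled(self) -> bool:
        return time.monotonic() < self._enabled_until


# Usage sketch: only forward transcribed speech to the main model while the
# gate is open.
gate = ActivationGate()
for utterance in ["hey assistant, set a timer", "anyway, as I was saying..."]:
    gate.observe(utterance)
    if gate.enabled:
        print("forward to LLM:", utterance)
```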