Comment by nineteen999
12 hours ago
That's just not a good use of my Claude plan. If you can make it so a self-hosted Llama or Qwen 7B can query it, then that's something.
If you're not willing to pay for your own LLM usage to try a free resource offered by the author, that's up to you. But why complain to the author about it? How does your comment enrich the conversation for the rest of us?
It's ultimately just a prompt; self-hosted models can use the system the same way, they just might struggle to write good SQL+vector queries to answer your questions. The prompt also works well with Codex, which comes with a lot of usage.
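For what it's worth, a "SQL+vector query" here just means ordinary SQL with a vector-distance ORDER BY. A minimal sketch, assuming a Postgres database with the pgvector extension and a hypothetical posts table (the schema and column names are illustrative, not from the project itself):

    -- Hypothetical schema: posts(id, title, body, embedding vector(3))
    SELECT p.title
    FROM posts p
    WHERE p.body ILIKE '%self-hosted%'            -- plain SQL keyword filter
    ORDER BY p.embedding <-> '[0.1, 0.2, 0.3]'    -- pgvector L2 distance to a query embedding
    LIMIT 5;

A 7B model can usually manage something this simple; it's combining filters, joins and the distance operator in one correct query where smaller models tend to slip.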
I think that’s just a matter of their capabilities, rather than anything specific to this?