Comment by lysace
1 year ago
Sent via your servers, I assume, since you are providing the Anthropic API key.
So we must trust both Anthropic's and your infrastructure with our code.
I agree that a local LLM is the way to go.
That's correct; our server just routes calls directly to Anthropic. Some users requested an option to enter their own API key and talk to Anthropic directly. I'll add this by the end of the week, possibly today if time permits.
A local LLM is still the end goal.
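For anyone curious what "talk to Anthropic directly" would mean in practice, here is a minimal sketch of building such a request client-side with your own key, so no intermediate server ever sees the code. The model name and parameters are illustrative, not what this tool actually uses:

```python
import json
import urllib.request

ANTHROPIC_URL = "https://api.anthropic.com/v1/messages"

def build_request(api_key: str, prompt: str,
                  model: str = "claude-3-5-sonnet-20241022") -> urllib.request.Request:
    """Build a direct call to Anthropic's Messages API.

    The API key stays on the client machine; nothing is routed
    through a third-party server.
    """
    payload = {
        "model": model,          # illustrative model name
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        ANTHROPIC_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )

# Sending is then just: urllib.request.urlopen(build_request(key, prompt))
```

The point is that the trust boundary shrinks to Anthropic alone; whether that is acceptable is a separate question from whether a local LLM is the end state.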
OK. I think you should be more transparent about this until everything runs locally.
I mean, more than stating "We don't store any of your code."
Got it, how about a manual trigger for diagram creation and an explicit description of LLM/server usage on the page with the trigger?