Comment by LiamPrevelige
1 year ago
That's correct, our server just routes calls directly to Anthropic. Some users have requested an option to input their own API key and talk to Anthropic directly. I'll add this by the end of the week, maybe today if time allows.
A local LLM is still the end goal.
OK. I think you should be more transparent about this until everything is local.
I mean, more than stating "We don't store any of your code."
Got it. How about a manual trigger for diagram creation, plus an explicit description of the LLM/server usage on the page with that trigger?
Maybe just the truth in plain text, up front?