Comment by LiamPrevelige

1 year ago

That's correct, our server just routes calls directly to Anthropic. Some users have requested an option to input their own API key and talk to Anthropic directly. I'll add this by the end of the week, possibly today if time permits.
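For anyone curious what "talk to Anthropic directly" would mean in practice, here's a minimal sketch: the client builds the HTTPS request itself with the user-supplied key, so nothing passes through our server. The endpoint and header names follow Anthropic's public Messages API; the function name and model string are just illustrative.

```python
import json

def build_anthropic_request(api_key: str, prompt: str) -> dict:
    """Assemble a direct Messages API request; the key goes straight to Anthropic."""
    return {
        "url": "https://api.anthropic.com/v1/messages",
        "headers": {
            # user-supplied key, never seen by any intermediary server
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
        "body": json.dumps({
            "model": "claude-3-5-sonnet-20241022",  # illustrative model name
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }
```

The request dict can then be sent with any HTTP client; the point is only that the key and code stay between the user's machine and Anthropic.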

Local LLM is still the end goal

OK. I think you should be more transparent about this until everything is local.

I mean, more than stating "We don't store any of your code."