Comment by gavmor

3 days ago

I.e., inextricably coupled to their services? Or is it a matter of swapping out a few "provider" modules?

Completely agnostic. If you run it locally, we provide a Docker Compose setup; if you have other deployment preferences, pointing to your own DB is a matter of changing an env var: https://github.com/appdotbuild/agent/blob/main/agent/trpc_ag...
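For illustration, a minimal local run might look roughly like this (a sketch only; the actual variable name and connection format are documented in the repo linked above, not guaranteed here):

    # minimal sketch, assuming a standard Postgres-style connection string env var
    export DATABASE_URL="postgres://user:pass@your-db-host:5432/yourdb"
    docker compose up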

We also include baseline Cursor rules in case you want to hack on this manually: https://github.com/appdotbuild/agent/tree/main/agent/trpc_ag...

The one place we are tied down is the LLM provider: you will need to supply your own keys for Anthropic / Gemini.
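Concretely, that usually amounts to exporting the provider keys before starting the agent (the key names below are assumptions based on the providers' standard SDK conventions; check the repo's README for the exact ones it reads):

    # assumed key names; only the providers you actually use are needed
    export ANTHROPIC_API_KEY="sk-ant-..."
    export GEMINI_API_KEY="..."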

We did a couple of runs on top of Ollama + Gemma, so expect support for local LLMs. I can't swear to the timeline, but one of our core contributors recently built a water-cooled rig with a bunch of 3090s, so my guess is "pretty soon".