zombot 6 days ago It doesn't say how to configure a local ollama model.
ethan_smith 6 days ago You can configure Ollama by setting OPENCODE_MODEL=ollama/MODEL_NAME and OPENCODE_BASE_URL=http://localhost:11434/api in your environment variables.
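A minimal sketch of the setup described above, assuming opencode honors the OPENCODE_MODEL and OPENCODE_BASE_URL environment variables as the comment states; "llama3" is a placeholder model name, substitute one shown by `ollama list`:

```shell
# Point opencode at a locally served Ollama model (per the comment above).
# Assumes opencode reads these variables; "llama3" is a placeholder name.
export OPENCODE_MODEL=ollama/llama3
export OPENCODE_BASE_URL=http://localhost:11434/api

# Confirm the values are set before launching opencode.
echo "$OPENCODE_MODEL at $OPENCODE_BASE_URL"
```

These exports only last for the current shell session; add them to your shell profile to make them persistent.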
stocksinsmocks 6 days ago You can’t edit files with Ollama-served models. Codex has the same problem. This is not an issue with Aider.