Comment by jasonm23

7 days ago

Crush has an open issue (about 2 weeks old) to add Ollama support; it's in progress.

FYI, it already works even without that feature branch (you just have to add your provider and models manually):

```json
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "models": [
        {
          "id": "llama3.2:3b",
          "model": "Llama 3.2 3B",
          "context_window": 131072,
          "default_max_tokens": 4096,
          "cost_per_1m_in": 0,
          "cost_per_1m_out": 0
        }
      ]
    }
  }
}
```
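If you're adding several local models, you can generate the entry programmatically instead of hand-editing JSON. A minimal Python sketch; the field names simply mirror the snippet above and the `ollama_provider` helper is my own invention, not part of Crush:

```python
import json

# Hypothetical helper: build a Crush-style provider entry for a local
# Ollama server. Field names mirror the JSON config shown above;
# Crush's actual schema may differ.
def ollama_provider(model_id, display_name, ctx=131072, max_tokens=4096):
    return {
        "type": "openai",                       # Ollama exposes an OpenAI-compatible API
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",                    # any non-empty string for a local server
        "models": [
            {
                "id": model_id,
                "model": display_name,
                "context_window": ctx,
                "default_max_tokens": max_tokens,
                "cost_per_1m_in": 0,            # local models cost nothing
                "cost_per_1m_out": 0,
            }
        ],
    }

config = {"providers": {"ollama": ollama_provider("llama3.2:3b", "Llama 3.2 3B")}}
print(json.dumps(config, indent=2))
```

Dump the result into the providers file (or merge it with what's already there) and restart Crush.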

why?

It's basic: edit the config file. I just downloaded Crush; the file is at `~/.cache/share/crush/providers.json`. Add your own provider or edit an existing one.

Edit `api_endpoint`, done.