Comment by amdivia
7 days ago
FYI, it already works even without this feature branch (you'll just have to add your provider and models manually):
```
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://localhost:11434/v1",
      "api_key": "ollama",
      "models": [
        {
          "id": "llama3.2:3b",
          "model": "Llama 3.2 3B",
          "context_window": 131072,
          "default_max_tokens": 4096,
          "cost_per_1m_in": 0,
          "cost_per_1m_out": 0
        }
      ]
    }
  }
}
```
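If you're scripting your setup, the same provider block can be generated programmatically. A minimal sketch, assuming only the JSON shape shown above; the `ollama_provider` helper name and its parameters are hypothetical, not part of any tool's API:

```python
import json


def ollama_provider(model_id: str, display_name: str,
                    context_window: int = 131072,
                    default_max_tokens: int = 4096) -> dict:
    """Build a provider entry matching the JSON shape above.

    Ollama exposes an OpenAI-compatible API, so "type" is "openai";
    the api_key value is not checked by Ollama but must be non-empty.
    """
    return {
        "providers": {
            "ollama": {
                "type": "openai",
                "base_url": "http://localhost:11434/v1",
                "api_key": "ollama",
                "models": [
                    {
                        "id": model_id,
                        "model": display_name,
                        "context_window": context_window,
                        "default_max_tokens": default_max_tokens,
                        "cost_per_1m_in": 0,   # local inference, no per-token cost
                        "cost_per_1m_out": 0,
                    }
                ],
            }
        }
    }


config = ollama_provider("llama3.2:3b", "Llama 3.2 3B")
print(json.dumps(config, indent=2))
```

This makes it easy to register several local models in a loop instead of hand-editing the config for each one.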