Comment by aiscoming
17 hours ago
VS Code supports local models (bring your own key/model).
You need a model server: Ollama, llama.cpp, or LM Studio.
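To illustrate what "model server" means here: these servers expose an OpenAI-compatible HTTP API that clients can point at. Below is a minimal sketch that builds such a request with only the standard library; the base URL assumes Ollama's default port (11434), and the model name and helper are hypothetical, not from the thread.

```python
import json
import urllib.request

# Assumption: Ollama's OpenAI-compatible API defaults to localhost:11434;
# LM Studio's defaults to localhost:1234. Adjust to your server.
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for a local server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Local servers typically ignore the key, but clients send one anyway.
            "Authorization": "Bearer local",
        },
    )

req = build_chat_request(OLLAMA_BASE, "llama3", "hello")
```

Sending `req` with `urllib.request.urlopen` would work once a server is actually running on that port.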
> bring your own key
Do you mean support for OpenAI-compatible API URLs in Copilot? If so, I believe you need either VS Code Insiders or a VS Code extension.