psychoslave 3 months ago

Can it run a local LLM with quick parameters?

iagooar 3 months ago
I would like to add support, but I do not have a computer powerful enough to run an LLM fast enough, so I am not able to test.
Is it possible to use an OpenAI-compatible API locally, or how does that work?
psychoslave 3 months ago

https://github.com/simonw/llm offers some hints on running models locally.
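For the OpenAI-compatible route raised above, the usual approach is to point a standard OpenAI client at a local server that speaks the same API. A minimal sketch, assuming a local server such as Ollama listening at http://localhost:11434/v1 with a model like "llama3.2" already pulled; the server, port, and model name are assumptions, not details from this thread:

    # Sketch: talk to a local, OpenAI-compatible server via the official
    # openai Python package, which lets you override base_url.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # assumed local endpoint (e.g. Ollama)
        api_key="unused",                      # local servers typically ignore the key
    )

    response = client.chat.completions.create(
        model="llama3.2",  # assumed local model name
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(response.choices[0].message.content)

Because the request shape is identical to the hosted OpenAI API, an application that already supports OpenAI should only need a configurable base URL and model name to work against a local backend.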