Comment by dymk
2 months ago
Can the base URL be overridden so I can point it at e.g. Ollama or any other OpenAI-compatible endpoint? I'd love to use this with local LLMs, for the speed and privacy boost.
https://github.com/chr15m/runprompt/blob/main/runprompt#L9
Seems like it would be; just swap the OpenAI URL here, or add a new one.
I've implemented this now. You can set it with BASE_URL or OPENAI_BASE_URL, which seems to be the rough standard. I also plan to use this with local LLMs. Thanks for the suggestion!
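For anyone curious, the override can be as simple as an environment-variable lookup with a fallback to the hosted endpoint. A minimal sketch in Python (illustrative only, not runprompt's actual code; the default URL is the standard OpenAI API base):

    import os

    DEFAULT_BASE_URL = "https://api.openai.com/v1"

    # First variable that is set wins; otherwise fall back to OpenAI's API.
    base_url = (
        os.environ.get("BASE_URL")
        or os.environ.get("OPENAI_BASE_URL")
        or DEFAULT_BASE_URL
    )

For example, BASE_URL=http://localhost:11434/v1 would point it at a local Ollama server, which exposes an OpenAI-compatible API on that path.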
Good idea. Will figure out a way to do this.
Perhaps instead of writing an LLM abstraction layer, you could use a lightweight one, such as @simonw's llm.
I don't want to introduce a dependency. Simon's tool is great but I don't like the way it stores template state. I want my state in a file in my working folder.
Simple solution: honor the OPENAI_API_BASE env var.
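For context, OPENAI_API_BASE was, I believe, the convention in the pre-1.0 openai Python client, while OPENAI_BASE_URL is the current one. Honoring both would be a one-line extension of the lookup sketched above (again illustrative, not the actual script):

    base_url = (
        os.environ.get("BASE_URL")
        or os.environ.get("OPENAI_BASE_URL")
        or os.environ.get("OPENAI_API_BASE")  # legacy openai-python name
        or DEFAULT_BASE_URL
    )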