Comment by dymk
1 day ago
Can the base URL be overridden so I can point it at e.g. Ollama or any other OpenAI-compatible endpoint? I'd love to use this with local LLMs, for the speed and privacy boost.
https://github.com/chr15m/runprompt/blob/main/runprompt#L9
Seems like it would be; just swap the OpenAI URL here or add a new one.
Good idea. Will figure out a way to do this.
Simple solution: honor the OPENAI_API_BASE env var.
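Something like this would do it. A minimal sketch, not runprompt's actual code: the variable names are illustrative, and `OPENAI_API_BASE` is the convention several OpenAI-compatible clients use for this.

```python
import os

# Fall back to the official endpoint when the env var is unset.
# (Names here are illustrative, not taken from runprompt itself.)
DEFAULT_API_BASE = "https://api.openai.com/v1"

# rstrip("/") so a trailing slash in the env var doesn't produce "//"
API_BASE = os.environ.get("OPENAI_API_BASE", DEFAULT_API_BASE).rstrip("/")
CHAT_COMPLETIONS_URL = API_BASE + "/chat/completions"
```

Then pointing it at a local Ollama instance (which serves an OpenAI-compatible API at `/v1`) is just `OPENAI_API_BASE=http://localhost:11434/v1 runprompt ...`.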
Perhaps instead of writing an llm abstraction layer, you could use a lightweight one, such as @simonw's llm.
I don't want to introduce a dependency. Simon's tool is great, but I don't like the way it stores template state; I want my state in a file in my working folder.