Comment by dymk

1 day ago

Can the base URL be overridden so I can point it at, e.g., Ollama or any other OpenAI-compatible endpoint? I'd love to use this with local LLMs for the speed and privacy boost.

Good idea. Will figure out a way to do this.
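One common way to support this (a sketch only, not this project's actual implementation; the function names here are hypothetical) is to read an optional base URL from the `OPENAI_BASE_URL` environment variable, which the official OpenAI SDKs also honor, and fall back to the hosted endpoint:

```python
import os

# Default hosted endpoint; overridden by OPENAI_BASE_URL if set.
DEFAULT_BASE_URL = "https://api.openai.com/v1"


def resolve_base_url(env=None):
    """Return the API base URL, preferring OPENAI_BASE_URL from the environment."""
    env = os.environ if env is None else env
    return env.get("OPENAI_BASE_URL", DEFAULT_BASE_URL).rstrip("/")


def chat_completions_url(env=None):
    """Build the chat-completions endpoint URL from the resolved base URL."""
    return resolve_base_url(env) + "/chat/completions"
```

With `OPENAI_BASE_URL=http://localhost:11434/v1`, requests would target Ollama's OpenAI-compatible API on its default local port; unset, they go to OpenAI.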

  • Perhaps instead of writing an LLM abstraction layer, you could use a lightweight one, such as @simonw's llm.

    • I don't want to introduce a dependency. Simon's tool is great but I don't like the way it stores template state. I want my state in a file in my working folder.