chr15m (1 day ago):
Good idea. Will figure out a way to do this.

khimaros (1 day ago):
simple solution: honor OPENAI_API_BASE env var

benatkin (1 day ago):
Perhaps instead of writing an llm abstraction layer, you could use a lightweight one, such as @simonw's llm.

chr15m (1 day ago):
I don't want to introduce a dependency. Simon's tool is great, but I don't like the way it stores template state. I want my state in a file in my working folder.

threecheese (7 hours ago):
Can you explain this decision a bit more? I'm using 'llm' and I find your project interesting.
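For context, a minimal sketch of what khimaros's suggestion (honoring OPENAI_API_BASE) could look like, assuming a Python tool that talks to an OpenAI-compatible endpoint via the openai client library; the client library, model name, and fallback behavior here are assumptions for illustration, not the project's actual code:

```python
import os
from openai import OpenAI

# Honor OPENAI_API_BASE if set (e.g. pointing at a local or self-hosted
# OpenAI-compatible server); fall back to the library's default endpoint.
client = OpenAI(
    base_url=os.environ.get("OPENAI_API_BASE"),  # None -> default api.openai.com
    api_key=os.environ.get("OPENAI_API_KEY"),
)

# Model name is illustrative and could also be made configurable via env var.
response = client.chat.completions.create(
    model=os.environ.get("OPENAI_MODEL", "gpt-4o-mini"),
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

With this pattern, pointing the tool at another provider is just a matter of exporting OPENAI_API_BASE before running it, with no abstraction layer or extra dependency required.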