Comment by dur-randir 17 hours ago: How do I connect it to a local llama.cpp instance?

GodelNumbering 15 hours ago:
It supports LMStudio or you can start a local endpoint, then run
OPENAI_COMPATIBLE_CUSTOM_KEY="xxx" dirac -y --provider "https://localhost/v1" --model <model_name> "hi..."
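For llama.cpp specifically, a minimal sketch of "start a local endpoint" might look like the following. It assumes llama.cpp's bundled `llama-server` binary, which exposes an OpenAI-compatible API (default port 8080); the model path is a placeholder, and the `dirac` flags simply mirror the command above:

```shell
# Start llama.cpp's OpenAI-compatible HTTP server
# (llama-server ships with llama.cpp; serves /v1/chat/completions on port 8080 by default)
llama-server -m ./models/your-model.gguf --port 8080

# Point dirac at the local endpoint. For a local llama-server the API key is
# typically just a non-empty placeholder (assumption: the key is not validated here)
OPENAI_COMPATIBLE_CUSTOM_KEY="xxx" dirac -y \
  --provider "http://localhost:8080/v1" \
  --model your-model "hi"
```

Note `http` rather than `https` here: llama-server does not serve TLS out of the box, so a plain-HTTP URL is the usual choice for a local instance.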