Comment by evilduck
18 hours ago
It does, it's just a bit annoying.
I have this set up as a shell script (or you could make it an alias):
codex --config model="gpt-oss-120b" --config model_provider=custom
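A minimal sketch of the wrapper script idea, assuming a hypothetical name `codex-local` (here as a shell function, which also works as the alias variant; extra arguments are forwarded to codex):

```shell
# Hypothetical wrapper function, e.g. in ~/.bashrc or saved as a script.
# "codex-local" is an assumed name; any extra args pass through to codex.
codex-local() {
  codex --config model="gpt-oss-120b" --config model_provider=custom "$@"
}
```

Then `codex-local` behaves like `codex` but always targets the local provider.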
with ~/.codex/config.toml containing (note the top-level default keys must come before the table header, otherwise TOML would nest them under [model_providers.custom]):
# Default model configuration
model = "gpt-oss-120b"
model_provider = "custom"

[model_providers.custom]
name = "Llama-swap Local Service"
base_url = "http://localhost:8080/v1"
http_headers = { "Authorization" = "Bearer sk-123456789" }
wire_api = "chat"
https://developers.openai.com/codex/config-advanced#custom-m...