I_am_tiberius 20 hours ago

Pity it doesn't support other LLMs.
evilduck 20 hours ago

It does, it's just a bit annoying.

I have this set up as a shell script (or you could make it an alias; a sketch of the alias form follows the command):

    codex --config model="gpt-oss-120b" --config model_provider=custom
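For the alias form, a minimal sketch (the name codex-local is just an illustrative choice, not from the original):

    alias codex-local='codex --config model="gpt-oss-120b" --config model_provider=custom'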
with ~/.codex/config.toml containing (note that model and model_provider are root-level keys, so they have to come before the table header, otherwise TOML attaches them to the table):

    # Default model configuration
    model = "gpt-oss-120b"
    model_provider = "custom"

    [model_providers.custom]
    name = "Llama-swap Local Service"
    base_url = "http://localhost:8080/v1"
    http_headers = { "Authorization" = "Bearer sk-123456789" }
    wire_api = "chat"
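To sanity-check the local endpoint before pointing codex at it, you can hit the models route (assuming llama-swap exposes the standard OpenAI-compatible /v1/models listing):

    curl -H "Authorization: Bearer sk-123456789" http://localhost:8080/v1/models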
https://developers.openai.com/codex/config-advanced#custom-m...