Comment by Squarex

7 days ago

I wonder why they named it so similarly to the normal codex model when it's much worse, though still cool of course.

Not sure what you mean. It IS the same model, just a smaller version of it. And gpt-5.3-codex is a smaller version of gpt-5.3 trained more on code and agentic tasks.

Their naming has been pretty consistent since gpt-5. For example, gpt-5.1-codex-max > gpt-5.1-codex > gpt-5.1-codex-mini.

  • What do you mean by "the same model, just a smaller version"? Codex should be a finetune of the "normal" version, so where did you get that it's smaller? It's not as simple as taking some weights from the model and creating a new one; normally the mini or flash models are trained separately on data from the larger model.