Comment by kingstnap

5 hours ago

> GPT‑5.3-Codex was co-designed for, trained with, and served on NVIDIA GB200 NVL72 systems. We are grateful to NVIDIA for their partnership.

This is hilarious lol

How so?

  • It's kind of a suck-up that more or less confirms the beef stories that were floating around this past week.

    In case you missed it, for example:

    Nvidia's $100 billion OpenAI deal has seemingly vanished - Ars Technica

    https://arstechnica.com/information-technology/2026/02/five-...

    Specifically, this paragraph is what I find hilarious.

    > According to the report, the issue became apparent in OpenAI’s Codex, an AI code-generation tool. OpenAI staff reportedly attributed some of Codex’s performance limitations to Nvidia’s GPU-based hardware.

    • There was never a $100 billion deal, only a letter of intent, which doesn't mean anything contractually.

    • > OpenAI staff reportedly attributed some of Codex’s performance limitations to Nvidia’s GPU-based hardware.

      Then they should design their own hardware. Somehow the other companies manage to produce fast-enough models.