Comment by zild3d

3 hours ago

What's surprising about that? Most of the minor version updates from all the labs are post-training updates; they don't change the knowledge cutoff.

Thanks for letting me know; I'll be waiting for the major update.

  • It's been like this since GPT-3.5. This is not a limitation; it's generally considered a natural outcome of the training process.

    So there's no major update in the sense you might be thinking of. Most of the time there's not even an announcement when (or if) training cutoffs are updated. It's just another byline.

    A six-month lag seems to be the standard across the frontier models.

    • I've actually started worrying that the volume of false data LLMs produce on the public internet might leave the knowledge cutoff permanently (and silently) frozen. Data after 2025 may be untrustworthy because it would poison training data at scale, so models would only cover major events without capturing the finer details.
