Comment by uoaei

13 hours ago

He's speaking to the entire feedforward Transformer-based paradigm. He sees little point in trying to squeeze more blood out of that stone and would rather move on to more appropriate ways to model ontologies per se, instead of the embedding-based methods popular today, which are crude for what we use them for.

His view really resonates with me, given my background in physics and information theory. I for one welcome his experimentation in other realms while so many others still hack away at their LLMs in pursuit of SOTA benchmarks.

If the LLM hype doesn't cool down fast, we're probably looking at another AI winter. It looks to me like he's just trying to ensure he'll have funding to chase the global maximum going forward.

  • > If the LLM hype doesn't cool down fast, we're probably looking at another AI winter.

    Is the real bubble ignorance? Maybe you'll cool down, but will the rest of the world? There will just be more DeepSeeks and more advances until the US loses its standing.

    • How is it a foregone conclusion that squeezing the stone will continue to produce blood?