Comment by phren0logy
14 days ago
That is how I read it. Transformer-based LLMs have limitations that are fundamental to the technology. It does not seem crazy to me that someone involved in research at his level would say they are a stepping stone to something better.
What I find most interesting is his estimate of five years, which is soon enough that I would guess he sees one or more potential successors.
In our field (AI), nobody can see even five months ahead, including people who are training a model today to be released five months from now. Predicting something five years out is about as accurate as predicting something a hundred years out.
Which would be a fair point if LeCun hadn't predicted the success of neural networks more broadly, about 30 years before most others.
That could be survivor bias. What else has he predicted?