Comment by HarHarVeryFunny

3 months ago

Depends on how far back you are going. There was the whole 1969 Minsky Perceptrons flap, where he argued that ANNs (i.e. single-layer perceptrons) were useless because they can't learn XOR (and no one at the time knew how to train multi-layer ANNs), which stifled ANN research and funding for a while. It would then be almost 20 years until the 1986 PDP handbook published Rumelhart, Hinton, and Williams' rediscovery of backpropagation as a way to train multi-layer ANNs, thereby making them practical.
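
To illustrate the XOR point: the four XOR cases aren't linearly separable, so a single-layer perceptron can't fit them, but a tiny network with one hidden layer trained by backpropagation learns them easily. A minimal numpy sketch (hidden size, learning rate, and step count are just illustrative choices):

    import numpy as np

    # XOR inputs and targets -- not linearly separable, so a single-layer
    # perceptron cannot fit them, but one hidden layer can.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(2, 4))   # input -> hidden weights
    b1 = np.zeros((1, 4))
    W2 = rng.normal(size=(4, 1))   # hidden -> output weights
    b2 = np.zeros((1, 1))

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0
    for step in range(5000):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # backward pass: gradients of squared error through the sigmoids
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # gradient descent update
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0, keepdims=True)

    print(out.round(3))   # approaches [[0], [1], [1], [0]]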

The JEPA parallel is just that it's not a popular/mainstream approach (at least in terms of well-funded research), but it may eventually win out over LLMs in the long term. Modern GPUs provide plenty of power for almost any artificial-brain type of approach, but of course they are expensive at scale, so lack of funding can be a barrier in and of itself.