Comment by famouswaffles

2 years ago

You're right, but I didn't say anything about guarantees, since that wasn't really the point of the argument. Yes, you can guarantee nothing, but the point of discussion was whether a forward pass would deny specific classes of computations simply because it was just a forward pass. It won't.

Yes, in the same way that a lookup table can "compute" anything if you make it large enough (and where you need to know its size beforehand).
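To make the lookup-table point concrete, here's a minimal sketch (the function choice and names are illustrative, not from the thread): any function over a *bounded* domain can be replaced by a precomputed table, but the table size must be fixed in advance and grows exponentially with input width.

```python
# Sketch: tabulating 8-bit modular addition as a lookup table.
# The table has 2**16 entries; widen the inputs and it blows up
# exponentially -- you must know the input size beforehand.

def build_add_table(bits=8):
    n = 1 << bits
    # Precompute every (a, b) -> (a + b) mod 2**bits.
    return {(a, b): (a + b) % n for a in range(n) for b in range(n)}

ADD_TABLE = build_add_table()

def add8(a, b):
    # No arithmetic at "runtime": the answer is just looked up.
    return ADD_TABLE[(a, b)]
```

So the table "computes" addition for every 8-bit input, yet it encodes no general algorithm: it cannot handle a single input outside the range it was built for.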

"Transformers are not Turing complete" https://www.lifeiscomputation.com/transformers-are-not-turin...

See here for example for a survey of work on the limits of computational expressivity of transformers https://www.semanticscholar.org/paper/Transformers-as-Recogn...

  • No, Transformers with memory are Turing complete. Like I said, the modifications are trivial.

    https://arxiv.org/abs/2301.04589

    • First, the discussion seemed to be about systems as implemented today (presumably you now concede that those actually do have practical computational limitations?), not systems that could theoretically be implemented. (Moreover, adding memory is no longer the simple "forward pass" system that you were arguing had no significant computational limitations.)

      Second, and more importantly, the fact that through clever human construction you can express a Turing machine using an LLM does not mean that such a machine is learned through gradient-descent training.

      There is no basis to the claim that today's systems have converged on model weights that implement "higher-order" computation.

      5 replies →