Comment by majormajor

2 years ago

It would be nice if authors wouldn't use a loaded-as-fuck word like "transcendence" for "the trained model can sometimes achieve better performance than all [chess] players in the dataset," because while that certainly demonstrates an impressive internalization of the game, it's also something that many humans can do. The machine, of course, can be scaled in breadth and performance, but... "transcendence"? Are they trying to be misinterpreted?

It transcends the training data; I get the intended usage, but it's certainly ripe for misinterpretation.

  • The word for that is "generalizes" or "generalization" and it has existed for a very long time.

    • I've been very confidently informed that these AIs are not AGIs, which makes me wonder what the "General" in AGI is supposed to mean, and whether generalization is actually the benchmark for advanced intelligence. If they're not AGI, then wouldn't a word other than "generalization" be more accurate for that level of generalization? It doesn't have to be "transcendence," but it seems weird to have a defined step we claim these models haven't reached while using the same word to describe a process we know they perform. I don't entirely get the nuance of the lingo, I guess. I'm just here for the armchair philosophy

  • That's trivial though, conceptually. Every regression line transcends the training data. We've had that since Wisdom of Crowds.
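
      The "Wisdom of Crowds" point is easy to make concrete. Here's a toy sketch of my own (not from the linked paper): simulate many noisy "experts" estimating a single true value, and compare the crowd average's squared error to the experts' average squared error. By Jensen's inequality the crowd can never do worse than the average expert, and with independent noise it usually does much better; that's the trivial sense in which an aggregate "transcends" its sources.

      ```python
      import random

      random.seed(0)
      TRUE_VALUE = 100.0

      # Each expert's guess is the true value plus independent Gaussian noise.
      experts = [TRUE_VALUE + random.gauss(0, 15) for _ in range(50)]

      # The "crowd" simply averages the individual guesses.
      crowd_guess = sum(experts) / len(experts)

      crowd_err = (crowd_guess - TRUE_VALUE) ** 2
      mean_expert_err = sum((g - TRUE_VALUE) ** 2 for g in experts) / len(experts)

      print(f"crowd squared error:       {crowd_err:.2f}")
      print(f"mean expert squared error: {mean_expert_err:.2f}")
      ```

      With independent noise the crowd's mean squared error shrinks roughly like 1/N in the number of experts, which is the whole trick: no individual got smarter, the aggregate just cancelled their errors.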