Comment by mherrmann

4 years ago

A nice definition of intelligence I've heard is exactly the ability to form models of the world with predictive power. And a model is essentially a compression of real-world data. Physical laws are a great example of this.

Creating models with predictive power is also a precise definition of science.

  • Can you recommend any philosophy of science (or life) treatises about this?

    I long considered myself a Popperian. A few years ago, I decided that I'm a "Predictionist" (a placeholder made-up word until I learn a better one). I'm struggling to figure out what that even means.

    I still agree with Popper. Emphatically.

    I'm just tired of arguing. I forfeit. I give up. I no longer believe that discourse is helpful, that people are persuadable, that we can share Truth.

    Instead, I just want to know the predictive strength of someone's Truth.

    For example:

    The Earth is flat? Oh? Cool. Please, tell me, how does that Truth help me?

    • My research involves applying Popper's epistemology to natural language processing, so I am quite involved in this.

      As far as I can tell, almost all of what Popper tried to do with quantitative measures of information is exactly what you are talking about.

      In particular, Conjectures and Refutations covers this really extensively, so I'd recommend reading or re-reading that, though The Logic of Scientific Discovery covers an early form. David Miller's Critical Rationalism covers it well too, along with some of its problems.

      That is, his notion (shared with positivists like Carnap and others) that science is a set of logical statements: a collection of statements is a theory, a theory entails a set of predictions, and that set of predictions is called the information content of the theory (sometimes I(c) or C(I) in his notation).

      If I(c) > I(c'), where c' is a competing theory, then the theory is said to have more explanatory power, i.e. it makes more predictions.

      This is part of his definition of what makes a good explanation, and of what David Deutsch calls "hard to vary".

      The other main part of the definition is about whether these statements reflect Truth in any way; that is covered by his notion of verisimilitude, or truthlikeness, which is quantified as the degree to which the information content of a theory I(c) can be corroborated.

      Both of these are essentially "the predictive strength of someone's Truth".

      The problem you and many others have probably encountered is that the information content of an explanation is *intractable*: it's an open set of statements which cannot be fully fleshed out. So we can never have a perfect quantification of whether my theories or your theories are better... there may indeed be statements entailed by flat-earth theory that have yet to be discovered, and that could be better corroborated and provide better information content than a non-flat-earth theory! Popper revels in this fact and fully embraces it.

      Beyond Popper, though, we need to understand more of the dynamics of "predictive strength". I am finding causality a great source of literature here, for which I would recommend Judea Pearl and The Book of Why, among other things.

      For philosophy of science in particular, there are tons of great articles on the Stanford Encyclopedia of Philosophy about Explanation that go into this in depth; in fact, the positivists like Carnap wrote amazing things about this which I would recommend.

  • Slight tweak to this, imo: science is models that can predict which new reframings/samples of current scientific-community-consensus SOTAs/benchmarks/datasets will disprove contemporary consensus :)

  • Not necessarily, since models that predict correctly can still be wrong. Science is figuring out the real mechanism.

    • I disagree with this definition. We have yet to produce a perfect model of the world (aka, a theory of everything). All models produced by "science" thus far are "wrong", at least on some level (e.g. Newton's model doesn't cover relativity). I think "Creating models with predictive power is also a precise definition of science." is a fair description.


    • I think that most work in quantum physics negates that claim.

      While we are improving our predictive power, we’re still baffled by the underlying nature of reality. We don’t know the “mechanism” by which the quantum world works.

I like to define intelligence as knowing data, but knowing data only creates idiot savants. What is lacking in AI today is artificial comprehension: what we're calling "artificial intelligence" lacks comprehension. Until the concepts handled by an AI are composable, forming virtual operating logical mechanisms, and the AI comprehends by trying combinations of those concepts, we are only creating idiot savants incapable of comprehending what they do.

How do you tell whether something you're evaluating for intelligence has actually formed a model?

  • If it efficiently ingests data with a non-trivial signal-to-noise ratio and returns actions/reactions that contain more signal and less noise.