Comment by Workaccount2

6 months ago

There is something awesome about incredibly large finite numbers. People gush about infinity, but I find it to be a pretty boring concept compared to finite numbers too large to even be written in this universe.

Infinity is aspirational. Infinity is a concept, simple and self-evident, yet paradoxical and oxymoronic.

I get intrigued by these large-number constructions at first, but ultimately they feel like a less elegant version of the same thing. It's all this mucking about with multiples and powers of multiples and powers, when we've already taken the idea to its limit; we can just stand back and marvel at that. What are we looking for? We already know you can always add another digit, so why invent ever more complicated ways to do that?

This isn't meant to be contradictory to what you're saying or anything, just interesting to explore these different perspectives on what mathematical concepts capture the imagination.

  • I'm wondering if there's a connection between large number hunters, unwritten rule proponents in sports and games, and modular synth collectors. There's a sort of satisfaction derived from finding and defining order according to aesthetic structures that are largely arbitrary but also emotionally resonant.

    Meanwhile, infinity is for those that embrace chaos, minimalism, nothingness.

Yeah!

Like, there is a perfectly finite number, but it is so large that there simply isn’t enough information in the universe to encode it in any format. How cool is that to just think about for a while?

  • I think such a number is going to have strange properties: for example, some number bigger than that unencodable number might be encodable, because a special pattern allows a special non-surjective recursive function to encode it. I am just wondering if there really is a smallest number such that no number greater than it is encodable.

    It is not obvious to me that encodable functions have bounded growth: is f(1) - f(0) for an encodable f always bounded in terms of the amount of data used to encode f? If so, what is that bound?

    • The parent suggested that the number couldn't be encoded due to its magnitude rather than its complexity. So while any number n with Kolmogorov complexity K(n) > 10^100 cannot be recursively encoded in the known universe, that number n need only be about 10^100 bits long. On the other hand, a number that's too large to be recursively encoded in the known universe would have to exceed BBλ2(10^100), where BBλ2 is an optimal busy beaver for prefix-free binary programs [1].

      [1] https://oeis.org/A361211

  • but couldn't you encode it as "the smallest number that cannot be encoded within the universe's matter/entropy"?

    • If you could encode it that way, then the definition is incoherent. After all, that encoding itself exists within the universe. If it resolved to a value, the self-reference would disqualify that value from being correct (this is essentially the Berry paradox).

    • Not really, as that implies you have a list of the numbers that can be encoded within the universe, but the universe would run out of room just keeping that list.

  • There is enough information if you assume reality is continuous. Pick a point A to be the origin. Then you can encode the number by placing something 1/N meters away from the origin.

  • For me it is far more interesting to consider the number of atoms in the solar system, as it implies a pretty obvious limit on the data storage that humanity can meaningfully use. Obviously only a tiny fraction of those atoms can be used to store information, and furthermore the energy required to actually perform that storage is enormous compared to the energy we have available at present.

  • Would you like to also be able to communicate about this number? You might have to reserve some atoms to form a being that could actually enjoy it. Considering such a perspective, the decoder for observing it should probably be embedded in the number itself.

  • You are stretching the definition of "is" here. Almost all finite numbers are impossible to actually count up to, so they exist only in the same sense that infinite numbers exist.
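
The Kolmogorov-complexity point in the subthread above rests on a simple counting argument, which can be sketched directly (illustrative code, not from the thread; `short_descriptions` is a hypothetical helper name):

```python
# Counting argument behind Kolmogorov complexity: there are only
# 2**k - 1 binary strings shorter than k bits, so descriptions under
# k bits cannot cover all 2**k numbers in [0, 2**k). By pigeonhole,
# some number below 2**k has K(n) >= k.

def short_descriptions(k: int) -> int:
    """Count binary strings with length strictly less than k."""
    return sum(2**i for i in range(k))  # 1 + 2 + 4 + ... = 2**k - 1

for k in (8, 64, 100):
    assert short_descriptions(k) == 2**k - 1
    # At least one of the 2**k candidate numbers is left undescribed.
    assert 2**k - short_descriptions(k) == 1
```

So a number with K(n) > 10^100 must already exist among numbers of roughly 10^100 bits, as the comment says; incompressible numbers are never far away.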
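
On the bounded-growth question above (whether an encodable f's values are bounded by the size of f's encoding): no computable bound of that sort exists; the tight bound is the busy-beaver function the sibling comment cites. A toy sketch of why tiny encodings can denote huge values (`tower` is a hypothetical illustration, not a function from the thread):

```python
# A description a few bytes long can denote a value whose magnitude
# vastly exceeds 2**(description length): here an 8-character string
# names a number that needs 65537 bits just to write down.

def tower(height: int, base: int = 2) -> int:
    """Power tower base**base**...**base of the given height."""
    value = 1
    for _ in range(height):
        value = base ** value
    return value

desc = "tower(5)"                            # 8 characters
value = tower(5)                             # 2**2**2**2**2 = 2**65536
assert len(desc) == 8
assert value.bit_length() == 65537
assert value.bit_length() > 2 ** len(desc)   # 65537 >> 256
```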

Indeed! Funnily enough, there seems to be something similar going on between defining large finite numbers and defining large countable ordinals.
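
The analogy can be made concrete at the finite levels of the fast-growing hierarchy, where each ordinal index names a faster-growing function (a minimal sketch using the standard definition f_0(n) = n + 1, f_{k+1}(n) = f_k iterated n times on n):

```python
# Finite levels of the fast-growing hierarchy: f_0 is the successor
# function, and each next level iterates the previous one n times.
# Larger ordinal indices name unimaginably faster-growing functions.

def f(k: int, n: int) -> int:
    if k == 0:
        return n + 1
    result = n
    for _ in range(n):          # iterate f_{k-1} exactly n times
        result = f(k - 1, result)
    return result

assert f(1, 3) == 6    # f_1(n) = 2n
assert f(2, 3) == 24   # f_2(n) = n * 2**n
# f_3(3) is already astronomically large, and the hierarchy indexed up
# to epsilon_0 outgrows every function Peano arithmetic proves total.
```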

And I’m assuming they can be converted to incredibly small finite numbers just as easily?