Comment by lisper
11 hours ago
This analysis is not quite fair. It takes locality (i.e. the speed of light) into account when designing UUID schemes but not when computing the odds of a collision. Collisions only matter if the colliding UUIDs actually come into causal contact with each other after being generated, so just as you have to take locality into account when designing UUID trees, you also have to take it into account when computing the odds of an actual local collision. A naive application of the birthday paradox doesn't apply, because it ignores locality, and a fair calculation of the required size of a random UUID comes out a lot smaller than the ~800 bits the article arrives at. I haven't done the math, but I'd be surprised if the actual answer is more than 256 bits.
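Edit: here is a rough sketch of what that math might look like. The patch size and rates below are made-up placeholders, not the article's numbers; the point is just that the birthday bound p ≈ 1 − exp(−n²/2^(b+1)) stays tiny once you only count IDs that can ever meet.

  import math

  def collision_probability(n_ids: float, bits: int) -> float:
      # Birthday approximation: p ~= 1 - exp(-n^2 / 2^(bits+1)).
      # Accurate while n_ids is far below 2^bits.
      return -math.expm1(-(n_ids ** 2) / 2.0 ** (bits + 1))

  def bits_needed(n_ids: float, max_p: float) -> int:
      # Smallest b that keeps collision probability <= max_p for n_ids IDs.
      return math.ceil(math.log2(n_ids ** 2 / (2 * -math.log1p(-max_p))))

  # Made-up causal patch: 10^9 machines, each minting 10^9 IDs/s,
  # all in causal contact for 1000 years (~3.15e7 s/year).
  n = 1e9 * 1e9 * 1000 * 3.15e7           # ~3e28 IDs total
  print(collision_probability(n, 256))    # ~4e-21: vanishingly small
  print(bits_needed(n, 1e-18))            # 249 bits for <= 1-in-10^18 odds

Even with deliberately generous assumptions about how much hardware stays in causal contact, 256 bits comes out comfortable.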
(Gotta say here that I love HN. It's one of the very few places where a comment that geeky and pedantic can nonetheless be on point. :-)
Reminds me of a time many years ago when I received a whole case of Intel NICs all with the same MAC address.
It was an interesting couple of days before we figured it out.
This is the right critique. The whole article is a fun thought experiment but it massively overestimates the problem by ignoring causality. In practice, UUID collisions only matter within systems that actually talk to each other, and those systems are bounded by light cones. 128 bits is already overkill for anything humans will build in the next thousand years. 256 bits is overkill for anything that could physically exist in this universe.
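Back-of-envelope for those two sizes, using the birthday approximation p ≈ n²/2^(b+1) (even odds land near 1.18·2^(b/2) IDs):

  128 bits: 50/50 collision odds at ~2 × 10^19 IDs
  256 bits: 50/50 collision odds at ~4 × 10^38 IDs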
You must consider both time and locality.
From now until protons have decayed and matter no longer exists is only 10^56 nanoseconds.
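(Checking the conversion, my arithmetic:

  10^56 ns = 10^47 s
  1 year ≈ 3.15 × 10^7 s
  10^47 s ≈ 3 × 10^39 years

which is in the ballpark of grand-unified-theory predictions for the proton lifetime; experimental lower bounds currently sit around 10^34 years.)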
If protons decay at all. There isn't really any reason to believe they're unstable.
And recent DESI data suggests that dark energy is not constant and that the universe will experience a big crunch at a little more than double its current age, for a total lifespan of about 33 billion years. No need to get wild with the orders of magnitude on years into the future: the infinite expansion to heat death over 10^100 years is looking less likely, and 10^11 years should be plenty.
https://www.sciencedaily.com/releases/2026/02/260215225537.h...
Protons can decay because the distinction between matter and energy isn't permanent.
In grand unified theories, two quarks inside the proton can exchange a massive messenger particle. The exchange flips their identities, turning the proton into a positron and a neutral pion, and the pion immediately converts into gamma rays.
Proton decayed!
That's such an odd way to use units. Why would you do 10^56 * 10^-9 seconds?
This was my thought. Nanoseconds are an eternity. You want to be using Planck units for your worst-case analysis.
Nanoseconds are a natural unit for processors operating around a gigahertz, since a nanosecond is roughly the length of a clock cycle.
If a CPU takes 4 cycles to generate a UUID and runs at 4 GHz, it churns out one every nanosecond.
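Tying that to the 10^56 ns figure upthread (toy arithmetic, deliberately ignoring locality): one such core running the whole time mints 10^56 UUIDs, and the birthday bound then wants about

  b ≈ 2 · log2(10^56) ≈ 372 bits

just to hold collisions within that single core's output to roughly a coin flip.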
If we take the many-worlds interpretation seriously, how many universes will we be making every time we assign a CCUID to something?
> many worlds interpretation
These are only namespaces. Many worlds can all have the same (many) random numbers, and they will never conflict with each other!
In that interpretation the total number of worlds does not change.
We don't "make" universes in the MWI. The universal wavefunction evolves to include all reachable quantum states. It's deterministic, because it encompasses all allowed possibilities.
Protons (and mass and energy) could also potentially be created. If this happens, the heat death could be avoided.
Conservation of mass and energy is an empirical observation; there is no theoretical basis for it. We just don't know of any process we can implement that violates it, but that doesn't mean no such process exists.
Conservation laws result from continuous symmetries in the laws of physics, as proven by Noether's theorem. Energy conservation, for example, follows from time-translation symmetry.
Proton decay is hypothetical.
So is the need for cosmologically unique IDs. We're having fun.
I got a big laugh at the “only” part of that. I do have a sincere question about that number, though: isn't time relative? How would we know that number to be true or consistent? My incredibly naive assumption would be that with less matter, time moves faster, sort of accelerating; so as matter “evaporates” the process accelerates and converges on that number (or close to it)?
Times for things like "age of the universe" are usually given as "cosmic time" for this reason. If it's about a specific object (e.g. "how long until a day on Earth lasts 25 hours") it's usually given in "proper time" for that object. Other observers/reference frames may perceive time differently, but in the normal relativistic sense rather than a "it all needs to wind itself back up to be equal in the end" sense.
The local reference frame (which is what matters for proton decay) doesn't see the outside world running slower or faster to any significant degree based on how much mass is around, until you start piling a lot of mass very close by.
Would this take into account IDs generated by objects moving at relativistic speeds? It would be a right pain to travel for a year to another planet, arrive 10,000 years late, and have a bunch of ID collisions.
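(For what it's worth: one year of proper time against 10,000 years of planet time means a Lorentz factor γ ≈ 10^4, and with β = v/c,

  γ = 10^4  →  1 − β² = 10^-8  →  v ≈ (1 − 5 × 10^-9) c

so the scenario takes some doing, but the timestamp mismatch is real.)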
I have to confess I have not actually done the math.
Oh no! We should immediately commence work on a new UUID version that addresses this use case.
Maybe the definitions are shifting, but in my experience “on point” is typically an endorsement, meaning “really/precisely good,” so I think what you mean is “on topic” or similar.
Pedantry ftw.
:-)
The answer is 42. I have it from a good source!
Hanson's Grabby Aliens model actually fits really well here if you're looking for some math to build on.