Comment by lisper

2 days ago

This was my thought. Nanoseconds are an eternity. You want to be using Planck units for your worst-case analysis.

Planck units are a mathematical convenience, not a physical limit. For instance, the Planck mass is on the order of the mass of an eyelash or a grain of sand.

  • Planck units are physical limits. The Planck mass is the limit of the mass of an elementary particle before it would form a black hole.

    • "Plank units are not physical limits on reality itself" is what I should have said. We can obviously have larger or smaller masses.

      The plank time is a limit on a measurement process, not the smallest unit of time.


    • Nope. What you say is a myth.

      The Planck mass is just the square root of the product of the natural units of angular momentum and velocity divided by the Newtonian constant of gravitation, i.e. sqrt(ħc/G).

      This Planck mass expresses a constant related to converting the Newtonian constant of gravitation from the conventional system of units to a natural system of units, which is why it appears, in place of the classic Newtonian constant, inside a much more complex expression that computes the Chandrasekhar limit.

      The Planck mass has absolutely no physical meaning, other than expressing, in a different system of units, a constant equivalent to the Newtonian constant of gravitation. Contrast this with some other truly universal constants, like the fine-structure constant (Sommerfeld's constant), which is the ratio between the speed of an electron revolving around a nucleus of infinite mass in the state of lowest total energy and the speed of light (i.e. that electron speed measured in natural units). The fine-structure constant is a measure of the intensity of the electromagnetic interaction, just as the Planck mass or the Newtonian constant of gravitation is a measure of the intensity of the gravitational interaction.

      The so-called "Planck units" have weird values because they are derived from the Newtonian constant of gravitation, which is extremely small. Planck has proposed them in 1899, immediately after computing for the first time what is now called as Planck's constant.

      He realized that Planck's constant provides an additional value suitable for a system of natural fundamental units, but his proposal was a complete failure because he did not understand the requirements for such a system. He started from the proposals Maxwell had made a quarter of a century earlier, but of the two alternatives Maxwell proposed for defining a unit of mass, Planck chose the bad one: using the Newtonian constant of gravitation.

      Any system of fundamental units in which the Newtonian constant of gravitation is fixed by convention, instead of being measured, is impossible to use in practice. The reason is that this constant can be measured only with great uncertainty. Decreeing by law that it has a certain value does not make the uncertainty disappear; it merely moves it into the values of almost all other physical quantities. In the Planck system of units, no absolute value is known with a precision good enough for modern technology. The only accurate values are relative, i.e. the ratios between two physical quantities of the same kind.

      The Planck system of units is only good for showing how a system of fundamental units MUST NOT be defined.

      Because the Planck units of length and time happen by chance to be very small, far beyond the range of any experiment ever done in the most powerful accelerators, absolutely nobody knows what would happen if a physical system could be that small. Claims that a particle of that size would collapse into a black hole are more ridiculous than claiming to have seen the Loch Ness Monster.

      The Einsteinian theory of gravitation is based on averaging the distribution of matter, so we can be pretty sure that it cannot be valid in the same form at the elementary-particle level, where you must deal with instantaneous particle positions, not with mass averaged over a great region of empty space.

      It became possible to use Planck's constant in a system of fundamental units only much later than 1899, i.e. after 1961, when the quantization of magnetic flux was measured experimentally. The very next year, 1962, an even better method was discovered with the prediction of the Josephson effect. The Josephson effect alone would have been sufficient to make the standard kilogram unnecessary, but metrology was simplified further by the discovery of the von Klitzing effect in 1980. Although this would have been possible much earlier, only since 2019 has the legal system of fundamental units depended on Planck's constant, but in a good way, not in the way Planck proposed.
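
      The verbal formula for the Planck mass can be sanity-checked numerically. A minimal sketch, using CODATA values for the constants (the specific numbers are my assumption, not from the comment):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s (natural unit of angular momentum)
c = 2.99792458e8        # speed of light, m/s (natural unit of velocity)
G = 6.67430e-11         # Newtonian constant of gravitation, m^3 kg^-1 s^-2

# Planck mass: sqrt(hbar * c / G), exactly the "square root of the product
# of angular momentum and velocity units divided by G" described above
planck_mass = math.sqrt(hbar * c / G)       # ~2.18e-8 kg, roughly a grain of sand

# The very small length and time units mentioned in the same comment
planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_time = planck_length / c             # ~5.4e-44 s

print(planck_mass, planck_length, planck_time)
```

      Note how the large measurement uncertainty of G (about 2e-5 relative) propagates directly into every one of these values, which is the practical objection raised above.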

If you go far beyond nanoseconds, energy becomes the limiting factor. You can only achieve ultra-fast processing if you dedicate vast amounts of matter to heat dissipation and energy generation. Think on a galactic scale: you cannot have molecular reactions occurring at femtosecond or attosecond timescales constantly and everywhere without overheating everything.

  • Maybe. It's not clear whether these are fundamental limits or merely technological ones. Reversible (i.e. infinitely efficient) computing is theoretically possible.

    • Reversible computing is not infinitely efficient, because irreversible operations, e.g. memory erasing, cannot be completely avoided.

      However, computing efficiency could be greatly increased by employing reversible operations whenever possible, and there is a fair chance this will be done in the future, but the efficiency will remain far from infinite.
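
      The thermodynamic floor for those unavoidable irreversible operations is the Landauer limit, k_B·T·ln(2) of heat per erased bit (my framing of the point, not the parent's). A quick sketch of the numbers, assuming room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact by definition since 2019)
T = 300.0           # assumed operating temperature, K

# Landauer limit: minimum heat dissipated when one bit is irreversibly erased
landauer_j_per_bit = k_B * T * math.log(2)  # ~2.87e-21 J per bit

# Even at this floor, erasing 1e9 bits per second dissipates only ~3 pW;
# real logic gates dissipate many orders of magnitude more per operation,
# which is the gap that reversible techniques could partly close.
power_watts = landauer_j_per_bit * 1e9
print(landauer_j_per_bit, power_watts)
```

      The limit scales with temperature, which is why the floor is nonzero anywhere above absolute zero: a reversible machine still pays it for every bit it must eventually erase.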