Comment by a-dub · 4 years ago

I've always wondered if all the weirdness that occurs at small scales, where classical mechanics breaks down, is just the effect of a sort of spatial aliasing: a continuous universe undersampled onto some kind of discrete substrate.

This confirms that I'm ignorant of real physics.
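
(For anyone who hasn't seen aliasing in action, here's a minimal numpy sketch of the effect I mean: a tone sampled below twice its frequency becomes indistinguishable from a slower one. Purely illustrative; no claim about actual physics.)

    import numpy as np

    fs = 10.0                        # sampling rate of the "discrete substrate" (Hz)
    t = np.arange(0, 1, 1 / fs)      # the sample instants over one second

    # A 9 Hz tone sampled at 10 Hz lies above the Nyquist frequency
    # (fs / 2 = 5 Hz) and folds down to |9 - 10| = 1 Hz.
    fast = np.sin(2 * np.pi * 9 * t)
    slow = np.sin(2 * np.pi * -1 * t)    # the 1 Hz alias (sign flipped by the fold)

    print(np.allclose(fast, slow))       # True: the two sample sets are identical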

You might find this[1] interesting:

> However, Savage said, there are signatures of resource constraints in present-day simulations that are likely to exist as well in simulations in the distant future, including the imprint of an underlying lattice if one is used to model the space-time continuum.

They did their experiment and did not find anything that would indicate an underlying lattice, but I can't find the paper right now.

[1]: https://www.washington.edu/news/2012/12/10/do-we-live-in-a-c...

I'm also ignorant of (though extremely curious about!) real physics, and have had the exact same thought since learning audio DSP principles! I'm always amazed that quantum mechanics was developed before Claude Shannon's theory of information and Hartley and Nyquist's sampling theory. I guess the question is whether Planck time is an actual physical limit or "just" about measurement.
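
Purely as a toy number (emphatically not established physics): if you treated the Planck time as a sample period, the implied "sample rate" of the universe would be:

    # Toy arithmetic only: treating Planck time as a sample period is a
    # metaphor, not physics. The value is the CODATA figure.
    t_planck = 5.391e-44           # Planck time, seconds

    f_sample = 1 / t_planck        # implied "sample rate": ~1.855e43 Hz
    f_nyquist = f_sample / 2       # implied Nyquist limit: ~9.27e42 Hz

    print(f"{f_sample:.3e} Hz (Nyquist {f_nyquist:.3e} Hz)")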

  • I come from the same angle: linear systems and DSP (initially audio).

    Basically, it's how computers perceive the physical world (a small sketch of that below).

    It's also where a lot of the theory in neuroscience comes from.
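
    To make "how computers perceive the physical world" concrete, here's the other half of digitization besides sampling: amplitude quantization. The bit depth here is arbitrary, just for illustration.

        import numpy as np

        def quantize(x, bits=3):
            """Round x in [-1, 1] onto a uniform grid of 2**bits levels."""
            step = 2.0 / 2 ** bits
            return np.clip(np.round(x / step) * step, -1.0, 1.0 - step)

        t = np.linspace(0, 1, 8, endpoint=False)
        x = np.sin(2 * np.pi * t)      # "continuous" amplitudes
        print(quantize(x))             # the coarse 3-bit view a computer keeps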

In general, I'm curious how devices that detect/measure things at super small scales can have both very high accuracy ratings and very high confidence.

Presumably these devices measure things previously unmeasurable, or at least not measurable with such high accuracy.

I mean, I get that we have hypotheses and reason to believe nature is going to behave in some way. Then you build the device to measure it, and it comes within some range that's not surprising, in line with previous devices that weren't as accurate.

If you're building a conventional scale, it seems more reasonable that you can have high confidence and high accuracy, because the stuff you're measuring is big enough to physically see and interact with, and there are almost limitless things you could use to cross-reference, etc.

Is there an ELI5 for how you can measure subatomic things with ridiculously high accuracy and confidence?

I guess I'm just in complete awe of how this is possible - not doubting that it is.

  • At a large scale it is also very difficult to achieve high accuracy. Simply defining a meter or a kilogram is a hard task; there is a canonical "kilogram" that you can go and visit in France (it anchored the SI definition until the 2019 redefinition in terms of the Planck constant), along with a carefully maintained set of copies built from it, which in turn are used to make the calibration weights we all work with. We then measure/calibrate the accuracy of any weighing system by how well it tracks the canonical kilogram.

    Similarly, for any "new" measurement, there will be a number of "calibration" measurements performed to ensure that the results are in line with other measurements of well-known things (a toy version of this is sketched below).
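
    As a toy illustration of that calibration step (entirely made-up numbers, just to show the idea of fitting a correction against trusted references):

        import numpy as np

        # Hypothetical readings from a new instrument against trusted
        # reference masses (grams): the traceability chain in miniature.
        reference = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
        readings = np.array([1.02, 2.03, 5.09, 10.15, 20.31])

        # Fit a linear model reading = gain * true + offset, then invert
        # it so future raw readings map back to calibrated values.
        gain, offset = np.polyfit(reference, readings, 1)

        def calibrate(raw):
            return (raw - offset) / gain

        print(calibrate(10.15))   # ~10.0: the device now tracks the references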

Ultimately everything has to be convertible to everything else. Kinetic energy must be convertible to particles, and even time must be inherently material if it can interact with space and matter.
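
The energy-to-mass direction of that convertibility is just E = mc²; a quick back-of-the-envelope with standard constants:

    c = 299_792_458.0       # speed of light, m/s (exact by definition)

    m = 1.0                 # one kilogram of mass
    E = m * c ** 2          # equivalent energy in joules

    print(f"{E:.3e} J")     # ~8.988e16 J, roughly 21 megatons of TNT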

  • I am not sure why you were downvoted. What you are saying is a consequence of Einstein's General Theory of Relativity.