i've always wondered if all the weirdness that occurs at small scales where classical mechanics breaks down was just the effect of a sort of spatial aliasing where the continuous universe is undersampled onto some kind of discrete substrate.
this confirms that i'm ignorant of real physics.
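To make the aliasing analogy concrete, here's a minimal sketch in Python/numpy (illustrative DSP only, not a claim about physics; the frequencies are made up): a sine above the Nyquist limit produces exactly the same samples as a lower-frequency one.

    import numpy as np

    # Undersampling demo: a 7 Hz sine sampled at 10 Hz (below its 14 Hz
    # Nyquist rate) is indistinguishable at the sample points from a 3 Hz sine.
    fs = 10.0                                # sampling rate, Hz
    t = np.arange(0, 2, 1 / fs)              # 2 seconds of sample times

    true_signal = np.sin(2 * np.pi * 7 * t)  # 7 Hz, above fs/2 = 5 Hz
    alias = np.sin(2 * np.pi * -3 * t)       # folds to |7 - 10| = 3 Hz (phase-flipped)

    print(np.allclose(true_signal, alias))   # True: the samples are identical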
You might find this[1] interesting:
> However, Savage said, there are signatures of resource constraints in present-day simulations that are likely to exist as well in simulations in the distant future, including the imprint of an underlying lattice if one is used to model the space-time continuum.
They did their experiment and did not find anything that would indicate an underlying lattice, but I can't find the paper right now.
[1]: https://www.washington.edu/news/2012/12/10/do-we-live-in-a-c...
I'm also ignorant of (though extremely curious about!) real physics, and have had the exact same thought since learning audio DSP principles! I'm always amazed that quantum mechanics was developed before Claude Shannon's theory of information and Hartley and Nyquist's sampling theory. I guess the question is whether Planck time is an actual physical limit or "just" about measurement.
i come from the same angle. linear systems and dsp (initially audio).
basically it's how computers perceive the physical world.
it's also where a lot of theory arises from in neuroscience.
In general, I'm curious how devices that detect/measure things at super small scales can have both very high accuracy ratings and very high confidence.
Presumably these devices measure things previously unmeasurable - or at least not previously measurable with such accuracy.
I mean, I get that we have hypotheses and reason to believe nature is going to behave in some way. And then you build the device to measure it, and it comes within some range that's not surprising, and it is in line with previous devices that weren't as accurate.
If you're building a conventional scale, it just seems more reasonable that you can have high confidence and high accuracy, because the stuff you're measuring is big enough to physically see and interact with, and there are almost limitless things you could use to cross-reference, etc.
Is there an ELI5 for how you can measure things subatomic with ridiculously high accuracy and confidence?
I guess I'm just in complete awe of how this is possible - not doubting that it is.
At a large scale it is also very difficult to achieve high accuracy. Simply defining a meter or a kilogram is a difficult task; however, there is a canonical "kilogram" that you can go and visit in France, along with a carefully maintained set of copies built from the canonical "kilogram", which in turn are used to make the calibration weights we all work with. We then measure/calibrate the accuracy of any weight-measuring system by how well it tracks the canonical kilogram.
Similarly, for any "new" measurement, there will be a number of "calibration" measurements performed to ensure that the results are in line with other measurements of well-known things.
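A toy sketch of that calibration chain (all numbers made up, and a simple linear fit standing in for a real metrology workflow): fit a hypothetical new scale's raw readings against reference masses traceable to the standard, then use the fit to correct future readings.

    import numpy as np

    # Made-up reference masses (kg) traceable to the standard kilogram,
    # and the raw readings a hypothetical new scale reports for them.
    reference = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
    raw = np.array([0.098, 0.492, 0.985, 1.972, 4.931])

    # Fit a simple linear correction raw -> true. Real metrology uses far
    # more elaborate models and uncertainty budgets; this is the bare idea.
    gain, offset = np.polyfit(raw, reference, 1)

    def corrected(reading):
        return gain * reading + offset

    print(corrected(2.957))  # a raw reading mapped to calibrated kilograms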
> In general, I'm curious how devices that detect/measure things at super small scales can have both very high accuracy ratings and very high confidence.
In many cases you cannot. Position and momentum, for example.
https://en.m.wikipedia.org/wiki/Uncertainty_principle
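For a sense of scale, a back-of-the-envelope number from the position-momentum relation delta_x * delta_p >= hbar/2 (the electron mass and the 1-angstrom confinement are just example inputs):

    # Uncertainty principle, order-of-magnitude only.
    hbar = 1.054571817e-34   # J*s
    m_e = 9.1093837015e-31   # electron mass, kg
    delta_x = 1e-10          # confine an electron to ~1 angstrom

    delta_p = hbar / (2 * delta_x)  # minimum momentum uncertainty, kg*m/s
    delta_v = delta_p / m_e         # corresponding velocity spread, m/s

    print(delta_p)  # ~5.3e-25 kg*m/s
    print(delta_v)  # ~5.8e5 m/s - huge on atomic scales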
Interferometers are nature's vernier calipers.
Ultimately everything has to be convertible to everything else. Kinetic energy must be convertible to particles, and even time must be inherently material if it can interact with space and matter.
I am not sure why you were down-voted. What you are saying is a consequence of Einstein's General Theory of Relativity.
As far as quantum gravity theories go I’m personally quite partial to Causal Dynamical Triangulation [1]
[1] https://arxiv.org/pdf/1905.08669.pdf#page79
is there any reason to believe space is in fact quantized? why should it be? genuine question!
If spacetime is continuous you effectively get infinite precision -> information density - at every point (of which there are an infinite number).
This seems unlikely for a number of reasons.
This doesn't mean spacetime is a nice even grid, but it does suggest it comes in discrete lumps of something, even if that something is actually some kind of substrate that holds the information which defines relationships between lumps.
> If spacetime is continuous you effectively get infinite precision -> information density - at every point (of which there are an infinite number).
This would be true if objects existed at perfectly local points. However we know that a perfectly localised wavefunction has spatial frequency components that add up to infinite energy. Any wavefunction with finite energy is band-limited. At non-zero temperature the Shannon-Hartley theorem will give a finite bit rate density over frequencies, and since the wavefunction is band limited it will therefore only have the ability to carry a finite amount of information.
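The Shannon-Hartley bound mentioned there, with made-up numbers just to show the shape of it: finite bandwidth plus non-zero noise gives a finite bit rate, no matter how finely the underlying signal could in principle vary.

    import math

    # Shannon-Hartley: C = B * log2(1 + S/N) bits per second.
    def capacity(bandwidth_hz, signal_power, noise_power):
        return bandwidth_hz * math.log2(1 + signal_power / noise_power)

    # Illustrative numbers: a 3 kHz channel at 30 dB SNR.
    print(capacity(bandwidth_hz=3000, signal_power=1.0, noise_power=0.001))
    # ~29900 bits/s - finite, despite the signal being "continuous"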
People complain about infinite information, but the theory of real fields is complete and consistent, while arithmetic is not:
https://en.wikipedia.org/wiki/Decidability_of_first-order_th...
> If spacetime is continuous you effectively get infinite precision -> information density - at every point (of which there are an infinite number).
Even if space is continuous, that doesn't mean we can get information in and out of it in infinite precision.
Look at quantum physics. Maxwell's equations don't suggest the existence of photons (quantized information). But atoms being atoms, they can only emit and absorb in quanta.
> If spacetime is continuous you effectively get infinite precision -> information density - at every point (of which there are an infinite number).
Without some smallest resolution, we'd need infinite amounts of information to track point particle displacement in one dimension.
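Quick arithmetic version of that point (the 1-metre line and the cutoff values are arbitrary): the bits needed to record a position to resolution eps grow as log2(1/eps), without bound as eps -> 0.

    import math

    # Bits to record a position on a 1-metre line to resolution eps metres.
    for eps in [1e-3, 1e-10, 1.616e-35, 1e-100]:
        print(eps, math.ceil(math.log2(1.0 / eps)), "bits")

    # At the Planck length (~1.6e-35 m) it's only ~116 bits; for a truly
    # continuous coordinate there is no finite answer.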
Obviously that "something" is the float type used to calculate the simulation. Probably some ultra-dimensional IEEE-style spec that some CPU vendor intern botched anyway. ;)
Care to to enlighten us these number of reasons?
That there is a cosmic speed limit may imply one unit of space per one unit of time. Nothing faster.
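For what it's worth, the "one unit of space per one unit of time" picture at least lines up dimensionally: by construction l_P = sqrt(hbar*G/c^3) and t_P = sqrt(hbar*G/c^5), so one Planck length per Planck time is exactly c.

    # CODATA values; l_P / t_P = c by construction.
    planck_length = 1.616255e-35   # m
    planck_time = 5.391247e-44     # s

    print(planck_length / planck_time)  # ~2.998e8 m/s, the speed of light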
Someone else asked about their hand moving in a pixelated versus continuous way in reality, and it occurred to me that if spacetime were discrete that would be a good reason for entanglement, intuitively speaking. Otherwise there wouldn't be an obvious way for information to be transferred across the discrete points? Maybe I'm wrong about this but it seems that way on first thought.
That doesn't mean anything has to be a particular way, but it at least would be intuitively consistent to me.
If it's true that the Planck length (1.616e-35 meters) is the smallest possible length [https://futurism.com/the-smallest-possible-length], then so far as we'll ever know, space is quantized.
When I think about that, I wonder if that 'quantization of space' is what determines the speed of light. And perhaps explains why inertial and gravitational mass are identical.
https://en.wikipedia.org/wiki/Cutoff_(physics)
Since gravity is the curvature of spacetime, quantizing gravity would mean quantizing spacetime (or quantizing geometry) also, which would lead to there being smallest units of space and time, perhaps somewhere around the Planck length and Planck time.
Look at it with the analogy of vinyl records and digital representations of that music: you can reach a point at which the quantized version represents the original in a way whose difference is not discernible (maybe 192 kHz / 32-bit float for some, but still quantized).
You equally hit limits of human perception, and even more so of technology and physics.
Maybe the universe is always N+1, with N being the best sampling rate known to man. Sure, we can infer, but when you want to know the answer to the exact decimal point, sometimes you have to accept that 1/3 is 1/3 and never exactly 0.333333333333, however many recurring digits you write.
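The 1/3 point in concrete terms (Python's standard decimal/fractions modules, nothing exotic): any finite binary or decimal representation is a quantized stand-in for the exact ratio.

    from decimal import Decimal
    from fractions import Fraction

    x = 1 / 3                       # IEEE 754 double: nearest representable value
    print(Decimal(x))               # 0.333333333333333314829616256...
    print(Fraction(x))              # 6004799503160661/18014398509481984, not 1/3

    print(Fraction(1, 3) * 3 == 1)  # True: exact rational arithmetic recovers 1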
Hint: 44100 Hz is plenty to exceed any human's perceptual resolution.
It would be hard to do aliasing without losing or gaining energy, wouldn't it?
Given that the real number line exists (as a mathematical construct), and has powerful properties (as a mathematical construct), it would be surprising if 'nature' did not take advantage of it.
> real number line exists (as a mathematical construct)
Not according to constructivism it does not.
I’ve always wondered what happens when I move my hand.
If I zoom in close on my hand, does it move like a sprite in a 2D video game? Always on a “grid”, with no way to be in between pixels?
> does it move like a sprite in a 2D video game? Always on a “grid”, with no way to be in between pixels?
There is no compelling evidence to suggest that this is the case, at least down to the scales we have been able to zoom into.
The Planck scale is very tiny and we are a long way from being able to see anywhere near that level.
If space is relational (i.e. not a separate thing in itself) - and it seems so - then you have a shortest possible distance, and that delimits what a chunk of space is.
One thing that's always irritated me about popular explanations of string theory, like Brian Greene's, is that they'll use phrases like "It turns out that..."
I think, "Wait a second! Nothing has 'turned out', because there are no experiments."
So now there will be. I applaud this effort. If something is untestable, then ignorant people who equate "faith in religion" with "faith in science" are right. Let's get some data that can only be explained by Theory X, and then see if Theory X predicts more things that also turn out to be true. If Theory X turns out to be one of the many variations of string theory, then you've got something there.
'it turns out that' usually means that it follows from a bunch of math that's not worth talking about. Not that it's supported by experiment.
Rather a subtle distinction, when a noted scientist is writing for a lay audience.
In the same vein it bugs me when physicists “explain” the interior of black holes. Until we have a quantum theory of gravity we really don’t know what’s inside a black hole.
Question in this regard: current theory seems to predict a singularity with "infinite" space-time curvature (IIUC) at the center of a black hole. This seems utterly improbable to me. Is it the consensus to treat this as a given, or is it considered a weird quirk of general relativity that people are trying to overcome?