TrappedLight Posted September 26, 2013 (edited)

This is strictly a question and not a conjecture. Let me start with what we know for sure. It is well known that in Fermi-Dirac statistics, fermionic matter interacts in such a way that spins pair up and cancel; this gives rise to atomic chemistry and the periodic table. The exclusion principle is truly responsible for the vast elemental variety we observe in this world. Perhaps more interesting than that, Fermi-Dirac statistics produce a measurable effect by which fermionic matter cannot occupy the same state, and in the limiting case, the same physical point in the vacuum.

However, there is a huge exception: massless radiation, made of bosons, actually can occupy the same state! You could theoretically squeeze as much radiation into a point as you like, without limit and without any consequence, quantum mechanically speaking. Or is radiation really free of such uncertainty? Remember, the uncertainty principle concerns being located at a specific position, and squeezing many photons into a point would seem to violate this; but as explained, it doesn't, because photons do not generally interact with each other, notwithstanding that their quantum wave functions can interfere with each other, statistically speaking.

However, the familiar form of the uncertainty principle, [math]\Delta p \Delta x \geq \frac{\hbar}{2}[/math], is in fact only one of possibly many different uncertainty conditions. One particular condition which could limit the radiation at a point comes from a geometric derivation of the uncertainty principle, via the Cauchy-Schwarz inequality. Basically, one can even find uncertainty in the dimensions of space: the uncertainty principle is not bound to single physical systems like a particle, but can be seen as a geometric condition on the vacuum itself.
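To make the Cauchy-Schwarz connection concrete, here is a sketch of the standard derivation (the Robertson relation), using generic observables [math]A[/math] and [math]B[/math]: for a state [math]|\psi\rangle[/math], define [math]|f\rangle = (A - \langle A \rangle)|\psi\rangle[/math] and [math]|g\rangle = (B - \langle B \rangle)|\psi\rangle[/math]. The Cauchy-Schwarz inequality [math]\langle f|f\rangle \langle g|g\rangle \geq |\langle f|g\rangle|^2[/math], after keeping the imaginary part of [math]\langle f|g\rangle[/math], gives

[math]\sigma_A \sigma_B \geq \frac{1}{2} |\langle [A,B] \rangle|[/math]

With [math]A = x[/math], [math]B = p[/math] and [math][x,p] = i\hbar[/math], this reduces to the familiar [math]\Delta x \Delta p \geq \hbar/2[/math]. The inequality itself is purely geometric (it holds for any two vectors in a Hilbert space), which is the sense in which the uncertainty principle can be read as a geometric condition.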
In fact, this idea plays a role in loop quantum gravity theories, in what is called triangulation: the geometric placement of systems in space and time at very small scales. Triangulation does what it says on the tin; it measures three systems as geometric properties of the three-dimensional vacuum, and is actually very similar to Julian Barbour's approach of describing a three-point triangle in space to measure dynamical change, replacing the conventional understanding of time. I have no doubt he is aware how similar, if not identical, the things being described are across these different branches of physics.

If there is an uncertainty relation in space, it would characterize how much uncertainty was present at the big bang. Since all the sides of the space triangle converge to a point (in relativity this is treated as an infinite curvature), there would theoretically be an infinite amount of uncertainty inherent in the vacuum. We also know that radiation, not matter, was probably the fundamental material present when the big bang first expanded, especially within the geometrogenesis picture relating low-energy and high-energy physics. Even though this radiation could overlap, unlike ordinary fermionic particles with mass, is it possible there was still an application of this infinite uncertainty at the beginning of the universe?

We know the uncertainty principle is the cornerstone of quantum mechanics, but quantum mechanics breaks down at the Planck scale. Even relativity fails to answer what happens there, but what if that has to do with the physics gaining a large amount of uncertainty in all physical parameters?

Edited September 26, 2013 by TrappedLight
ADreamIveDreamt Posted September 26, 2013

A Vacuum was once a Broom... You could answer your own questions from my Design.
swansont Posted September 26, 2013

"We know the uncertainty principle is the corner stone of quantum mechanics, but quantum mechanics breaks down at the Planck scale."

We know this? The Planck energy is ~2 × 10^9 J. The Planck mass is ~22 micrograms. QM works just fine many, many orders of magnitude below this.
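For anyone who wants to check swansont's figures, here is a quick back-of-the-envelope calculation from the standard definitions of the Planck units (CODATA constants; the variable names are just for illustration):

```python
import math

# Physical constants (SI units, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2

# Planck energy: E_P = sqrt(hbar * c^5 / G)
E_P = math.sqrt(hbar * c**5 / G)

# Planck mass: m_P = sqrt(hbar * c / G)
m_P = math.sqrt(hbar * c / G)

print(f"Planck energy ~ {E_P:.2e} J")                 # ~1.96e9 J, i.e. ~2 x 10^9 J
print(f"Planck mass   ~ {m_P * 1e9:.1f} micrograms")  # ~21.8 micrograms, i.e. ~22 ug
```

Both numbers match the post: the Planck energy is about the chemical energy in a car's fuel tank, and the Planck mass is the mass of a grain of sand, which is why "QM breaks down at the Planck scale" cannot mean QM fails at those everyday magnitudes.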
TrappedLight (Author) Posted September 26, 2013

It does? I didn't know that. Oh, no, I was thinking of relativity, I think. Einstein-Dirac Statistics? lol... I was thinking of Bose-Einstein statistics vs Fermi-Dirac statistics and must have mixed the two.
MigL Posted September 28, 2013

Special relativity is directly responsible for degeneracy, i.e. the inability of two particles to occupy the same state. When you put an electron in a box and start making the box smaller and smaller, you are making its position more and more definite. According to the HUP, its momentum must then become more and more indeterminate. If you make the box small enough, the momentum can become so high that the electron's speed would exceed that of light. This is obviously not possible, so there is a lower limit to the size of the box you can put an electron in.

If you force the electron into the nucleus (making neutrons), the mass component of the momentum is larger, so you can stay below light speed and keep decreasing the size of the box. This is the progression from white dwarf star to neutron star, and these limits were first worked out by S. Chandrasekhar on the ship from India to England on his way to do graduate work.
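MigL's shrinking-box argument can be put into rough numbers. A minimal sketch (a naive nonrelativistic estimate, so the cutoff is order-of-magnitude only; the function and variable names are just for illustration): using Delta-p >= hbar / (2 * Delta-x), the momentum uncertainty reaches m*c once the box shrinks to about half the particle's reduced Compton wavelength:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
m_e = 9.1093837015e-31   # electron mass, kg
m_n = 1.67492749804e-27  # neutron mass, kg (for the electron-into-nucleus comparison)

def relativistic_box_limit(mass):
    """Box size at which Delta-p = hbar/(2*Delta-x) reaches p = mass*c.

    Below this size, the naive speed estimate Delta-p/mass would exceed c,
    which is MigL's 'lower limit to the size of the box'."""
    return hbar / (2 * mass * c)

print(f"electron limit ~ {relativistic_box_limit(m_e):.2e} m")  # ~1.9e-13 m
print(f"neutron  limit ~ {relativistic_box_limit(m_n):.2e} m")  # ~1.1e-16 m
```

The neutron's limit is roughly 1800 times smaller than the electron's, which illustrates MigL's point: the larger mass lets you keep shrinking the box, and that is the white dwarf to neutron star progression.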