Mordred

Resident Experts
Everything posted by Mordred

  1. Each topological space is invariant and thus rigid in all of its geometric degrees of freedom; any variation of, say, a length generates a different topological space. The transformations between topological spaces will vary between observers. When it comes to amplitudes in the vector space there is no propagation, since H=0 within that space \[\Sigma^{d^{n-1}}\] However, you will have non-trivial tunneling amplitudes between spaces \[\Sigma_0^{d^{n-1}}, \quad \Sigma_1^{d^{n-1}}\] through an intervening manifold M with \[\partial M=\Sigma_0^\ast \cup \Sigma_1\] Hope that helps answer the question. A simplistic description: there are no local degrees of freedom (local being defined by the space propagation occurs on); only the global topology remains.
  2. I don't think you fully understand Swansont's question. All particle creation must obey all of the conservation laws. That list includes charge, lepton number, linear and angular momentum, isospin, color, flavor, and mass-energy. You must apply every applicable conservation law in your examination; you cannot choose one and ignore the others. That would be rather tricky to do with your proposal once you consider that the experiment has also been done using quantum-dot emitters (single quanta, i.e. photons).
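As an illustrative sketch (toy bookkeeping only, not the quantum-dot experiment above), checking additive quantum numbers for a process such as pair production, gamma -> e- + e+, amounts to comparing sums before and after:

```python
# Toy bookkeeping check of additive quantum numbers for pair production,
# gamma -> e- + e+. The assignments are the standard electric charge and
# electron lepton number; the dictionary and function names are illustrative.
particles = {
    "gamma": {"charge": 0, "lepton": 0},
    "e-":    {"charge": -1, "lepton": 1},
    "e+":    {"charge": 1, "lepton": -1},
}

def conserved(initial, final, quantum_number):
    """True when the summed quantum number matches before and after."""
    total = lambda states: sum(particles[p][quantum_number] for p in states)
    return total(initial) == total(final)

print(conserved(["gamma"], ["e-", "e+"], "charge"))  # True
print(conserved(["gamma"], ["e-", "e+"], "lepton"))  # True
```

A forbidden process such as gamma -> e- + e- fails the same check, which is the point: every listed quantum number must balance simultaneously.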
  3. You can eliminate any issue with action at a distance by applying the mathematics of the Euler-Lagrange equations.
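For reference, the Euler-Lagrange equations for a Lagrangian L(q_i, \dot{q}_i, t) in generalized coordinates are \[\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right)-\frac{\partial L}{\partial q_i}=0\] and the field-theory form, for a Lagrangian density \(\mathcal{L}\) of a field \(\phi\), is \[\partial_\mu\left(\frac{\partial \mathcal{L}}{\partial(\partial_\mu\phi)}\right)-\frac{\partial \mathcal{L}}{\partial \phi}=0\] Because the dynamics follow from a local Lagrangian density, interactions propagate through the field rather than acting at a distance.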
  4. The reason I am happy with the field-excitation view is that I have yet to encounter any form of particle interaction that QFT cannot adequately explain. Take electron spin: in the particle view, the distribution of spin statistics would require superluminal angular momentum, whereas in the field view, with its greater effective radius, this is easily accounted for. Particles popping in and out of existence are easily described through the creation/annihilation operators. Other phenomena that are tricky to describe in the particle view but easily described in the QFT view include quantum tunneling, Bose and Fermi condensates, and electron creation using photons (which has been done experimentally; the method is rather interesting). The above are just some examples. Even so, I still feel it is wrong to consider fields as being fundamental, regardless of the theory's accuracy and range of predictive ability.
  5. Which term didn't you understand? Field: a set of values under a geometric treatment. Energy: the ability to perform work. Potential energy: the energy (ability to perform work) due to location (geometry); for example, gravitational potential energy at sea level vs. the top of Mt. Everest. The graph I posted is the potential energy and how it evolves prior to and after electroweak symmetry breaking. Prior to electroweak symmetry breaking, elementary particles did not have mass. After symmetry breaking, leptons and neutrinos gain mass due to the Higgs field potential. https://en.wikipedia.org/wiki/Higgs_boson Perhaps this might also help; it's a very straightforward FAQ by Professor Matt Strassler: https://profmattstrassler.com/articles-and-posts/the-higgs-particle/the-higgs-faq-2-0/
  6. The bowl you're referring to is the potential energy curve whose minima correspond to the vacuum expectation value. https://en.wikipedia.org/wiki/Spontaneous_symmetry_breaking Equation 2 on that link, in a plot, looks like this without the full 3d rotation: https://www.wolframalpha.com/input?i=plot+V(\phi)%3D-5|\phi|^2%2B|\phi|^4 The high point is called the false vacuum potential, prior to electroweak symmetry breaking. As the potential rolls from the top point down to the minimum on either the left or the right, it settles at the current vacuum expectation value of 246 GeV today, which gives rise to the mass term. It is a potential energy graph, so nothing is "within" it.
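As a quick numerical sketch of the same toy potential from the Wolfram Alpha plot, V(phi) = -5|phi|^2 + |phi|^4, the "bottom of the bowl" can be located by a simple scan (the Standard Model analogue of that minimum is the 246 GeV vacuum expectation value; the numbers here are just the toy coefficients):

```python
import numpy as np

# Toy Mexican-hat potential from the plot: V(phi) = -5|phi|^2 + |phi|^4.
# Analytically the minimum sits at |phi| = sqrt(5/2) ~ 1.5811.
def V(phi):
    return -5 * np.abs(phi) ** 2 + np.abs(phi) ** 4

phi = np.linspace(0.0, 3.0, 300_001)   # scan field values
vev = phi[np.argmin(V(phi))]           # location of the minimum

print(vev)  # ~1.5811, i.e. sqrt(5/2)
```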
  7. I agree on that; at times I see numerous posters across various media preferring to avoid probability or statistics. Unfortunately they take it to such extremes that they refuse any theory that involves probability. It may be as simple as a preference for easier mathematics that is more readily understandable.
  8. While I don't particularly bother with attempting to define a fundamental reality, I do support the view of particles being field excitations. At one time this would not have been the case: I used to be an avid supporter of the particle view. That view gradually changed as I studied QFT and various research into the subject. However, even then I recognize that the field excitation isn't fundamental either; in truth I cannot name anything truly fundamental. The thing is, wave-particle duality does describe what we observe: depending on the examination, either the point-like or the wave-like characteristics will be involved. Sometimes the point-like description is better suited, other times the wave-like. The reason I don't feel fields are fundamental is that a field is simply a set of values under a geometric treatment. This describes all forms of fields, physical and mathematical alike. Many often forget that "physical" has, among its numerous definitions, the meaning of any measurable property; by that definition any measurable quantity can be described as physically real. You often see that argument used in the distinction between real and virtual particles: a real particle must be measurable and hence have a quantum of action. As far as I can tell, that's likely as close to fundamental as will ever be possible. Though we allow for virtual particles, many consider them more a mathematical convenience than an actuality. In QFT one doesn't typically refer to "particles" but rather to field states, so virtual particles are simply field permutations that cannot be localized with well-defined boundaries.
  9. As you stated, that's from the perspective of the observer at infinity; the infalling observer sees no difference. The flipping of the space vs. time coordinates marks the point where the mathematics breaks down into a mathematical singularity condition.
  10. I've always been curious as to why so many have issues with probability. If you have a system or state where you have more than one possible outcome it's only natural to model all possible outcomes and give the probability of those possible outcomes. This is true in classical as well as quantum mechanics. So why is probability in quantum mechanics an issue?
  11. Janus did the last post, but you're welcome lol. Good post Janus, well detailed. +1
  12. String theory uses path integrals. The difference is that the strings of string theory are fully mathematically described. If you're serious about trying to define your strings, I would recommend you study how string theory does so and then apply that mathematics to your string.
  13. There have been numerous estimates using a wide variety of methods, generally involving measuring the total mass and factoring out all known baryonic mass sources. This article, for example, estimates that roughly 90% of the total mass is dark; however, the values I have come across are fairly varied. The link is more the textbook answer than a research paper; I don't know what the bounds on the mass estimates are. https://sites.astro.caltech.edu/~george/ay20/eaa-darkmatter-obs.pdf
  14. Fermi's Golden Rule \[\Gamma=\frac{2\pi}{\hbar}|V_{fi}|^2\frac{dN}{dE_f}\] density of states \[\langle x|\psi\rangle\propto \exp(ik\cdot x)\] with periodic boundary condition of size "a" \[k_x=2\pi n/a\] number of momentum states \[dN=\frac{d^3p}{(2\pi)^3}V\] decay rate \[\Gamma\] Hamiltonian coupling matrix element between initial and final states \[V_{fi}\] density of final states \[\frac{dN}{dE_f}\] number of particles remaining at time t (decay law) \[\frac{dN}{dt}=-\Gamma N\] proper-lifetime probability \[p(t)\,\delta t=-\frac{1}{N}\frac{dN}{dt}\delta t=\Gamma\exp(-\Gamma t)\,\delta t\] mean lifetime \[\tau=\langle t\rangle=\frac{\int_0^\infty t\,p(t)\,dt}{\int_0^\infty p(t)\,dt}=\frac{1}{\Gamma}\] relativistic decay length \[L_0=\beta\gamma c\tau\] average number after some distance x \[N=N_0\exp(-x/L_0)\]
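The mean-lifetime result can be checked numerically: for the decay law p(t) = Gamma exp(-Gamma t), the ratio of integrals above should come out to 1/Gamma. A minimal sketch (Gamma = 2.0 is an arbitrary illustrative value):

```python
import numpy as np

# Numerical check that <t> = 1/Gamma for p(t) = Gamma * exp(-Gamma t).
gamma = 2.0
t = np.linspace(0.0, 50.0 / gamma, 200_001)  # integrate far into the tail
p = gamma * np.exp(-gamma * t)

# <t> = integral of t*p(t) dt divided by integral of p(t) dt (Riemann sums
# on a fine uniform grid are accurate enough for this check).
tau = np.sum(t * p) / np.sum(p)

print(tau)  # ~0.5 = 1/Gamma
```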
  15. While both answers provided are good, the QFT viewpoint may help. As Lorentz Jr and Exchemist mentioned, you have the fields involved, the EM field being a composite field as per Maxwell. As that field becomes perturbed, the anisotropic disturbances generate potential energy differences. This in turn increases the particle number density of gauge photons for the EM field, the momentum force mediator. So in essence a light beam generates its own medium comprised of gauge bosons (the gauge photon), which however will be off-shell in this case. The photon becomes real as per the quantum-of-action terms. So in essence one could view it as the photon generating its own localized medium via the EM field permutations. In QFT that's done via the creation/annihilation operators and the Lorentz-invariant Klein-Gordon equations.
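For reference, the free Klein-Gordon equation mentioned above, in natural units (\(\hbar=c=1\)) for a scalar field \(\phi\) of mass m, reads \[(\partial_\mu\partial^\mu+m^2)\phi=0\] Its Lorentz invariance is what makes it the natural relativistic starting point for the field quantization via creation/annihilation operators.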
  16. I wouldn't know about most instances. It's enough to be aware that, as both modelling and interpretations lead to testing, they are both useful tools in model development, in QM as well as in other theories.
  17. I would have to say a combination of both. Sometimes the tests developed to test an interpretation have led to QM development; other times it's the tests of a mathematical model that have led to QM development.
  18. If you're applying the QM mathematical method, then yes, that follows the Copenhagen interpretation. However, QM isn't the only methodology; other methodologies can have their own subsequent interpretations. In essence the choice breaks down to which mathematical method best describes the state or the evolution of states. If you don't religiously apply one theory over another, but apply the aspects that best suit the situation, you will invariably gain a far better understanding of how physics describes physical processes. However, QM in and of itself has several different interpretations as to its deterministic and stochastic aspects. I lost track of the numerous QM-based interpretations years ago lol
  19. Well, as I typically choose to ignore the interpretational aspects of any article or video, I found the paper useful insofar as the mathematics being applied, which essentially breaks down to an examination of error-margin elimination at the weak limit. The paper examines the method of using a combination of preselection and postselection without causality violation, as it does not involve any causal signal sent from the postselection to the past. Rather, it makes predictions of past events based on the postselection results, as well as predictions of future events from the preselection correlations. The paper suggests this dual methodology will eliminate errors and minimize the error margin, to better understand the evolutionary history of the entangled particle pair.
  20. Do what I do: ignore interpretations and stick to the numbers. It's too easy for a choice of interpretation to be followed with religious-like zeal.
  21. I'll have to study the article in greater detail; however, if you recall, above I described how non-locality vs. locality is defined in regard to Bell-type experiments. You happened to find an article that clearly states that point: "2.5. Dynamical nonlocality and the whole-part dialogue Dynamical nonlocality [37] impacts the dialogue concerning the relationship between parts and wholes. Motivated by the AB non-locality and by weak measurements, we look for new manifestations of the dynamics of QM which are not predicted by the dynamics of classical mechanics. The key difference is that the equation of motion of QM exhibits a new kind of non-locality, which is best described by using modular variables." Equations 2.32 and 2.33 cover this examination for the local (classical) vs. non-local (quantum) cases. As I have time tomorrow I will read it in more detail. Thanks for sharing.
  22. It's amazing the amount of output a star like our Sun emits per second. Our Sun outputs roughly \[3.8\times 10^{26}\] watts; 1 watt equals 1 joule per second. The Earth receives roughly 1400 watts/m^2 of that energy. Per Wikipedia, the peak emission wavelength of our Sun is approximately 883 nm; converting to energy per photon gives roughly 2.25\times 10^{-19} joules. A quick back-of-envelope calculation gives on the order of 10^{45} photons per second, but that's a very rough estimate (I also only applied the peak wavelength, not the entire ensemble of wavelengths), so the total photon count radiated is far, far higher. So yes, a star emits an incredible amount of EM radiation. However, I wouldn't advise thinking of light as a stream of bullet-like photons. Instead you're better off understanding light as a superposition of EM waves, where the sum of the energy levels of the waves in a given volume corresponds to a probable number density of photons as per Bose-Einstein statistics. The photons of that wave do not necessarily have to originate from the star but can be generated en route, as well as interfered with en route. The number density will still correspond with the mean energy density or blackbody temperature.
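The back-of-envelope calculation can be reproduced in a few lines; the inputs are the post's approximate values, not precision data:

```python
# Rough photon output of the Sun using only the peak wavelength.
h = 6.626e-34       # Planck constant, J*s
c = 2.998e8         # speed of light, m/s
L_sun = 3.8e26      # approximate solar luminosity, W (J/s)
lam_peak = 883e-9   # quoted peak emission wavelength, m

E_photon = h * c / lam_peak    # energy per photon, E = h*c/lambda
n_per_sec = L_sun / E_photon   # rough photon emission rate

print(E_photon)   # ~2.25e-19 J
print(n_per_sec)  # ~1.7e45 photons per second
```

Since the real spectrum spreads over all wavelengths, with much of the power at longer (lower-energy) wavelengths, the true photon count is higher than this single-wavelength estimate.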
  23. You're welcome.
  24. One of the things about mathematics is that the symbology often takes second place to the relations (I could literally hand someone 10 different articles covering precisely the same thing and no two papers would apply the same symbology, outside of common standardized forms). However, P is simply the entangled particle state, with A and B identifying each particle's state, whether it is spin up, down, left, right, etc. M is simply the probability density matrix of each, with rho \[\rho\] being the density operator from that matrix, e.g. via the momentum or position operator. The trace of the matrix M is best covered here: https://en.wikipedia.org/wiki/Trace_(linear_algebra) The above was simply an FYI insofar as one has to take care in how the terms local vs. non-local apply; for a given examination the usage can often vary.
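As a minimal sketch of a density matrix and its trace (the names here are illustrative and not the notation of the paper under discussion), consider the single-qubit state |+> = (|0> + |1>)/sqrt(2):

```python
import numpy as np

# Pure-state density matrix rho = |psi><psi| for |+> = (|0> + |1>)/sqrt(2).
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

tr_rho = np.trace(rho)        # trace of a density matrix is always 1
purity = np.trace(rho @ rho)  # Tr(rho^2) = 1 exactly when the state is pure

print(tr_rho)  # 1.0
print(purity)  # 1.0
```

The trace being basis-independent is what makes it useful here: probabilities sum to one no matter which symbology, or basis, a given paper chooses.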
  25. You can't really separate the two: gravity is spacetime curvature, and the time dilation itself results from that curvature, where inertial mass and gravitational mass have equivalency (m_i=m_g). As I mentioned, they did test the weak equivalence principle on the Moon: "Lunar Laser Ranging Tests of the Equivalence Principle with the Earth and Moon" https://arxiv.org/abs/gr-qc/0507083 The strong equivalence principle is also included; however, as the test involves laser ranging, you are in actuality also testing time dilation via the rate of the signals. It might help to further understand that with spacetime curvature you get distortions in the light rays from an object. Take two lasers and fire them in parallel: if the beams stay parallel, you have a flat geometry with no time dilation; if the beams converge, you have positive curvature, i.e. gravity, with time dilation. So it's quite possible to test for time dilation without having to place a clock on the Moon; you can literally use lasers or other EM signals.
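The converging-beams picture is captured by the geodesic deviation equation, which relates the separation \(\xi^\mu\) of two nearby geodesics to the Riemann curvature tensor (sign conventions vary between textbooks): \[\frac{D^2\xi^\mu}{d\tau^2}=-R^\mu{}_{\nu\alpha\beta}\,u^\nu\xi^\alpha u^\beta\] In flat spacetime the Riemann tensor vanishes, the right-hand side is zero, and initially parallel beams stay parallel; any convergence signals nonzero curvature.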