Everything posted by Duda Jarek

  1. I haven't looked at these models more closely yet, but the number of reviewers at world-class journals who verified and approved these papers, showing good agreement between several types of experiments (different scatterings, energy levels, magnetic properties, ...) and what the consequences of the Coulomb and Lorentz forces really are, suggests that a simple counterargument won't 'prove' that they are just wrong. To do that, science requires further simulations and comparisons with experiment, which is why I'm asking whether anyone has experience with them.
     I'm not saying that nature has to be understandable, but rather that there is a dangerous and historically well-known social phenomenon: when people believe they cannot understand something, they introduce, for example, Zeus to explain lightning, and having such an 'explanation' suppresses further search. I believe scientists should be very careful about giving up this way in difficult situations; it is a kind of acceptance of 'intelligent design'. Do you disagree? So before giving up on understanding, we should, for example, take what we do understand to its real limits of applicability. In my opinion, a situation in which the general belief of the scientific community is that the history of atomic models we can understand ended almost a century ago, while there are in fact practically unknown, much better modern models, is simply unhealthy.
     As a scientist, I believe that to understand something 'inconceivable', the basic approach is to deeply understand the consequences of what we do understand: the essence of the 'inconceivability' has to hide somewhere in what this fully exhausted picture is still missing. The modern history of physics shows that this basic approach has not only been neglected, but even seems to have been silenced; and it is not the obsolete Bohr model that people should be taught first! So do you believe we should just give up trying to understand the inconceivable dogmas of QM? If not, isn't a really deep understanding of the full consequences of what we can be sure of (like the Coulomb and Lorentz forces) the basic approach?
  2. The problem with QM is that Feynman's words, "I think I can safely say that nobody understands quantum mechanics", probably still apply. Don't you see a problem in the foundation of our understanding being inconceivable? On the other side we have classical mechanics: intuitive, natural, without any controversies. Don't you think that to finally understand QM it should be helpful to see what using just the classical Coulomb and Lorentz laws really leads to, take them to their limits of applicability, and look closely at what is still missing for 'full QM'? Especially since the Bohr model is still used to understand and calculate various phenomena, and the Bohr radius is in common use. So I think it should be useful to be at least aware that the history doesn't end with Bohr and Sommerfeld, as is generally believed, but that there are also modern classical models which have been shown, in well peer-reviewed papers, to agree much better with experimental results, sometimes even better than QM (as in http://www.cyf.gov.pl/gryzinski/teor7ang.html ).
     In the 'classical' picture, electrons are localized objects, as in these photos of atoms: http://www.mizozo.com/tech/09/2009/15/first-picture-of-an-atom.html - we can measure where exactly single electrons were before being torn off. Classically, a point-like electron moves on some trajectory around the nucleus, stabilizing its own statistics thermodynamically (through some complicated deterministic motion) toward the expected probability density (maximizing entropy), and is finally torn off by the potential. A natural thermodynamical model, the Boltzmann distribution among possible trajectories, says that this stabilized probability density (time average) is exactly the same as for the lowest quantum state (similar to Feynman path integrals). Brownian motion is a good enough approximation of this finally mathematically correct thermodynamical model for diffusion in liquids, but it is no longer sufficient for a fixed structure of defects in solids: http://physicsworld.com/cws/article/news/41659
     We can look at coupled pendulums through their positions (classical picture), but also through their normal modes, in which their evolution is a 'superposition of rotations of phases' in the eigenbasis of the evolution operator (quantum picture). Taking a lattice of such pendulums, we get a crystal with phonons. Now take the infinitesimal limit and we get a field theory, like waves on water, GRT, EM, Klein-Gordon, QFT; going to the 'normal modes', the eigenbasis of the evolution differential operator, we again get a 'superposition of rotations', the quantum picture, like interference of 'classical' waves on water. It's because these PDEs are hyperbolic, 'wave-like': in all these theories the basic excitations are waves. We can see classical mechanics as a result of quantum mechanics (as in the Ehrenfest theorem); maybe it also works the opposite way, maybe they are just equivalent? These modern classical atomic models clear some of the way toward seeing QM no longer as only an inconceivable dogmatic theory, but, for example, as naturally emerging in mathematically clear and natural field theories.
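     The normal-mode picture above can be made concrete in a few lines. Below is a minimal sketch (my own illustration, not from the linked material), assuming a chain of unit-mass pendulums with unit restoring term and an assumed coupling constant kc: diagonalizing the coupling operator turns the evolution into independent rotations of phase, one per mode.
[code]
import numpy as np

# Minimal sketch: N coupled pendulums, small oscillations, unit masses.
# In position coordinates the evolution is d_tt x = -K x; in the eigenbasis
# of K each coordinate rotates its phase independently with omega_i = sqrt(lambda_i).

N = 5
kc = 1.0                                          # assumed coupling constant
K = np.eye(N)                                     # assumed restoring term (g/l = 1)
K += 2 * kc * np.eye(N) - kc * np.eye(N, k=1) - kc * np.eye(N, k=-1)

lam, V = np.linalg.eigh(K)                        # normal modes of the chain
omega = np.sqrt(lam)

x0, v0 = np.random.randn(N), np.zeros(N)          # some initial condition
a, b = V.T @ x0, V.T @ v0                         # mode coordinates

def x(t):
    """Position at time t: superposition of independently rotating modes."""
    return V @ (a * np.cos(omega * t) + b / omega * np.sin(omega * t))

print(np.allclose(x(0.0), x0))                    # True: reconstruction check
print(omega)                                      # normal-mode frequencies
[/code]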
  3. I've recently found that after the Bohr model, a classical model was introduced by Gryzinski in which electrons follow almost radial free-fall trajectories toward the nucleus; due to magnetic moments the trajectory is bent by the Lorentz force, so the electron returns to its initial distance. This model is a natural consequence of the classical scattering theory developed by the author. In almost 20 peer-reviewed papers in top journals he claims to show that these models, using just the Coulomb and Lorentz forces, give really good agreement with experiment (in contrast to Bohr). These conceptually simple calculations were verified and approved by many world-class reviewers, so one could think that such impressive models should be well known ... but surprisingly I cannot even find any constructive comments about them? http://en.wikipedia.org/wiki/Free-fall_atomic_model I'm very interested in finding some serious comments about these modern classical models that finally agree with experiment. Have you even heard about them? About anyone working on them?
  4. Sounds similar? http://www.sciencenews.org/view/generic/id/61673/title/Behold%2C_the_antilaser "(...)cause the device to start and then stop absorbing light" ...
  5. A standard random walk on a graph, in which every outgoing edge from a given point is equally probable, doesn't really maximize entropy as mathematics expects from thermodynamical models, but does so only locally. Such models lead to Brownian motion in the continuous limit, which is a good enough approximation to model diffusion in fluids, but is no longer appropriate within the fixed structure of solids, like the recently measured stationary electron probability density on a defected lattice of potential wells on a semiconductor surface: http://physicsworld.com/cws/article/news/41659 We would rather say that this probability density is the quantum mechanical ground state ... but this sample is macroscopic, so we should expect some current flow behind it, some thermodynamical behavior of these 'quanta of charge'. It turns out that when we use a stochastic model which finally does what mathematics expects from us, really maximizing entropy, the stationary probability density goes exactly to the quantum mechanical ground state, as we would thermodynamically expect from quantum mechanics. So maybe starting from such models we could better understand the dynamics of current flow at the quantum scale...
     I've just made a Mathematica demonstration which allows one to compare electron conductance through a defected lattice using both models: the standard Generic Random Walk (classical) and these new models based on Maximal Entropy Random Walk. It allows one to observe both the stationary probability distribution and the dynamics of current flow for different defect densities and applied potential gradients: http://demonstrations.wolfram.com/preview.html?draft/93373/000008/ElectronConductanceModelUsingMaximalEntropyRandomWalk or https://docs.google.com/leaf?id=0B7ppK4IyMhisMTRiNGZjYWItMDU0NS00OTFjLTg0NmQtOWE4ZTg5ZTkzMTJk&hl=en They give a completely different qualitative picture, and I would like to ask which of them corresponds better to conductance at the quantum level? For example, in the standard model, for even the smallest applied potential we immediately get an almost uniform current flow through the whole sample, while in these new models we usually require some nonzero minimal potential gradient to 'soak' out of entropy wells through some complicated entropic landscape. And generally I would be grateful for any remarks and comments about the demonstration.
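     To make the comparison above easier to picture, here is a minimal numerical sketch (my own toy version, not the linked Mathematica demonstration), assuming a small square lattice with roughly 20% of the sites removed as defects: the generic random walk's stationary density is proportional to node degree and stays nearly uniform, while the maximal entropy random walk's stationary density is proportional to the square of the dominant eigenvector of the adjacency matrix and localizes in the largest defect-free region.
[code]
import numpy as np

# Toy sketch: stationary densities of a generic random walk (GRW) vs a
# maximal entropy random walk (MERW) on a small 2D lattice with defects.

rng = np.random.default_rng(0)
L = 20
keep = rng.random((L, L)) > 0.2                    # ~20% defect sites (assumption)
idx = {p: i for i, p in enumerate(zip(*np.nonzero(keep)))}
n = len(idx)

A = np.zeros((n, n))                               # adjacency of the defected lattice
for (x, y), i in idx.items():
    for q in ((x + 1, y), (x, y + 1)):
        if q in idx:
            A[i, idx[q]] = A[idx[q], i] = 1.0

# GRW: stationary probability is proportional to node degree
deg = A.sum(axis=1)
p_grw = deg / deg.sum()

# MERW: stationary probability is proportional to psi_i^2,
# where psi is the dominant eigenvector of the adjacency matrix
w, V = np.linalg.eigh(A)
psi = np.abs(V[:, -1])
p_merw = psi**2 / (psi**2).sum()

# GRW spreads almost uniformly; MERW localizes in the largest defect-free region
print("max/mean GRW :", p_grw.max() / p_grw.mean())
print("max/mean MERW:", p_merw.max() / p_merw.mean())
[/code]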
  6. In the only models I can think of as theories of everything (the reason) - deterministic ones - at each point of spacetime there is a single concrete situation; there is no point in talking about probabilities and so about entropy. While building some THERMODYNAMICAL MODEL OVER THIS SOLUTION, at each point we usually consider a ball around it and average over it, getting some effective local parameters like entropy or temperature. For our history of the Universe, this allows us to assign a local entropy to each point of spacetime, and the 2nd law of thermodynamics says that we move along the four-dimensional gradient of this entropy, so there had to be an entropy minimum at our Big Bang (or maybe Bounce), which probably created the entropy gradient giving the 2nd law of thermodynamics. Quantum mechanics by definition ignores the dynamics behind wavefunction collapse and only gives the probability distribution of its result, like thermodynamical models do ... which is interpreted by some people as meaning that spacetime is an infinitely quickly branching tree of parallel universes ... I completely disagree: the four-dimensional nature of our spacetime already leads to much nonintuitiveness, like the (confirmed) Wheeler experiment, or that to translate the amplitudes we work with into real probability we should square them, against Bell's intuition, or that it allows for powerful 'quantum' computers: http://www.scienceforums.net/forum/showthread.php?p=569143
  7. I always thought that thermodynamics/statistical physics is an effective theory, the statistical result of some fundamental physics below it, but recently theories starting from an 'entropic force' as fundamental have become popular (based on holographic scenarios, as in http://arxiv.org/abs/1001.0785 ). I was taught that to introduce effective local thermodynamical parameters for a given concrete situation, for each point we average inside some ball around it to get, for example, local entropy or temperature. For a simple mathematician like me this sounds like nonsense: in a fundamental theory describing the evolution of everything there should be one concrete history of our universe, with no place for the direct probabilities of scenarios required to define e.g. entropy. So I wanted to ask if someone could explain why we can even think about fundamental 'entropic' theories? To start the discussion I would like to briefly recall what looks to me like a clear distinction between deterministic and stochastic/thermodynamical models:
     DETERMINISTIC models - the future is completely determined:
     - the evolution of gas in a tank is the full dynamics of all its particles; for a given valve opening a concrete number of particles escapes,
     - it's usually Lagrangian mechanics of some field: there is some scalar/vector/tensor/'behavior of functional' (QFT) at each point of our spacetime, such that 'the action is optimized', i.e. each point is in equilibrium with its four-dimensional neighborhood (spacetime is a kind of '4D jello'),
     - the evolution equations (Euler-Lagrange) are HYPERBOLIC PDEs; the linearized behavior of coordinates in the eigenbasis of the differential operator is d_tt x = - lambda x (0 < lambda = omega^2), so in the linear approximation we have a superposition of rotations of coordinates, a 'unitary' evolution, and so such PDEs are called wave-like: the basic excitations on the water surface, in EM, GR, Klein-Gordon are just waves,
     - the model has FULL INFORMATION: there is no place for direct probability/entropy in electromagnetism, general relativity, K-G etc.; the model has TIME (CPT) SYMMETRY (no 2nd law of thermodynamics - there is still unitary evolution in a thermalized gas or a black hole).
     THERMODYNAMICAL/STOCHASTIC models - there is some probability distribution among possible futures:
     - gas in a tank is usually seen as thermalized, which allows us to describe it by a few statistical parameters like entropy (the sum of -p*lg(p)) or temperature (average energy per degree of freedom); for a specific valve opening, the number of escaped particles is given only by a probability distribution,
     - they are used when we don't have full information or want to simplify the picture, so we assume some mathematically universal STATISTICAL ENSEMBLE among POSSIBLE SCENARIOS (like particle arrangements), optimizing entropy (uniform distribution) or free energy (Boltzmann distribution),
     - thermodynamical/stochastic evolution is usually described by diffusion-like PARABOLIC PDEs; the linearized behavior of coordinates in the eigenbasis of the differential operator is d_t x = - x / tau (tau - 'mean lifetime'), so in the linear approximation we have exponential decay (forgetting) of coordinates; this evolution is called thermalization: in the limit only the coordinates with the longest lifetimes survive, which we call thermodynamical equilibrium and can usually describe using just a few parameters,
     - these models don't have time symmetry: we cannot fully trace the (unitary?) behavior, so we have INFORMATION LOSS, entropy growth, the 2nd law of thermodynamics.
     Where am I wrong in this distinction? I agree that the 'entropic force' is extremely powerful, but it is still a statistical result: for example, if during a random walk, instead of maximizing entropy locally, which leads to Brownian motion, we do it right - globally - we thermodynamically get the probability density of the lowest quantum state, and single defects create macroscopic entropic barriers/wells/interactions: http://demonstrations.wolfram.com/GenericRandomWalkAndMaximalEntropyRandomWalk/ For me the problem with quantum mechanics is that it sits between these pictures: we usually have unitary evolution, but sometimes entropy grows while the wavefunction collapses - no mystical interpretation is needed to understand it: entropy maximization following from a mathematically universal uncertainty principle is just enough ( http://arxiv.org/abs/0910.2724 ). What do you think about this distinction? Can thermodynamical models be not only effective (the result), but fundamental (the reason)? Can quantum mechanics alone be fundamental?
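     A tiny numerical illustration of the wave-like vs diffusion-like distinction above (my own toy example, with made-up values lambda = 4 and tau = 2): in the eigenbasis of the evolution operator, a wave-like coordinate keeps rotating (information preserved), while a diffusion-like coordinate decays exponentially and is forgotten.
[code]
import numpy as np

# Hyperbolic vs parabolic behavior of a single eigenbasis coordinate.

t = np.linspace(0.0, 10.0, 6)
lam = 4.0        # assumed eigenvalue, omega^2 for the wave-like case
tau = 2.0        # assumed mean lifetime for the diffusion-like case
x0 = 1.0

# d_tt x = -lam * x  ->  oscillation (rotation of phase, information preserved)
x_wave = x0 * np.cos(np.sqrt(lam) * t)

# d_t x = -x / tau   ->  exponential decay (thermalization, information lost)
x_diff = x0 * np.exp(-t / tau)

for ti, xw, xd in zip(t, x_wave, x_diff):
    print(f"t={ti:5.1f}   wave-like: {xw:+.3f}   diffusion-like: {xd:.3f}")
[/code]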
  8. In the modern view of quantum mechanics, wavefunction collapse is no longer a 'mystical, outside of physics' phenomenon, but is seen as the result of interaction with the environment ('einselection'): there is still some concrete unitary evolution behind it. So there should exist a 'Hamiltonian of the Universe' describing the evolution of everything. We have a similar situation in (classical!) field theories: for Euler-Lagrange equations (like Klein-Gordon: [math] d_{tt} \psi = \Delta\psi - m^2 \psi [/math] ) the evolution operator is self-adjoint and can be diagonalized (spectral theorem). The evolution of the [math] \lambda [/math] coordinate is [math] d_{tt} x = \lambda x [/math]. So this operator should be non-positive, because otherwise some coordinates would explode. For negative eigenvalues we get unitary evolution: as in quantum mechanics, we can imagine it as a superposition of different eigenfunctions, 'rotating' at different speeds. And so such hyperbolic PDEs are called wave-like.
     We have limited knowledge and cannot fully trace these unitary evolutions - from our perspective they 'lose their coherence':
     - we don't/can't know precise parameters, like initial conditions,
     - we cannot fully trace complicated motion (chaos),
     - thermodynamically stable states usually have their own dynamics, like atomic orbitals or quantum phases.
     If we model this lack of knowledge with a proper statistical ensemble among possible scenarios - maximizing uncertainty not locally as in Brownian motion, but globally - we get thermodynamical evolution toward the quantum mechanical ground state probability density. These new models also show why, to translate from the amplitude we work with to the probability, we should take 'the square' ( http://arxiv.org/abs/0910.2724 ).
     To understand the strength of quantum computers, it should be enough to focus on models with a constant (fixed) number of particles, for which classical field theory is enough. What is nonintuitive about them is that the natural picture for such Lagrangian mechanics is 'static 4D': particles are not just 'moving points', but rather their trajectories in spacetime ... let's look at what this gives us in terms of computational capabilities. A quantum algorithm usually looks like:
     - initialize qubits,
     - use Hadamard gates to get a superposition of all possible inputs,
     - calculate a classical function of the input,
     - extract some information from the superposition of results.
     Look at the classical function calculation: it has to use reversible gates, like (x,y,z) -> (x,y,z XOR f(x,y)); they are also reversible classically, so we can easily reverse the whole function calculation on a standard computer. Unfortunately it's not so simple; there is a problem: such reversible calculations usually require quite a large number of auxiliary (qu)bits, which have been initialized (to zero). When taking the classical reverse of such a function, we rather cannot control that these auxiliary (qu)bits are zeros - they would usually just be random - so we wouldn't really have calculated what we wished. If we could, for example, calculate the square of a number modulo N, or the product of two numbers, using a 'small' number of auxiliary bits, we could guess their final value (e.g. randomly) and in a small number of trials we would be able to reverse such a function (getting all zeros), which would allow us to factorize N - so probably simple multiplication requires a linear number of auxiliary bits.
     The strength of quantum computers is that they can 'mount qubit trajectories' in both past and future: simultaneously initialize the auxiliary qubits and, using measurement, focus only on scenarios having the same final value (the measured one). In the case of Shor's algorithm, we wouldn't even need to know all the scenarios to make the Fourier transform - knowing two would already be enough: if two such powers give the same value modulo N, then the power with the difference of their exponents is 1 modulo N. On the 18th page of my presentation there is a diagram for Shor's algorithm: https://docs.google.com/fileview?id=0B7ppK4IyMhisODI5ZTU4YjYtNmU0MC00ZTM3LTg5MWQtMTJiYTY4MWVkOTJk&hl=en
     For physics it's natural to find the global minimum of the action, but simulating such a computer in classical field theory, even after simplifications, would probably still be difficult; anyway, it suggests that to attack algorithmically difficult problems, we should translate them into continuous ones. For example, in the 3SAT problem we have to valuate variables to fulfill all alternatives of triples of these variables or their negations - note that x OR y can be changed into optimizing [math] ((x-1)^2+y^2)((x-1)^2+(y-1)^2)(x^2+(y-1)^2)[/math] and analogously seven terms for an alternative of three variables. Finding the global minimum of the sum of such polynomials over all clauses would solve our problem. I've just found information suggesting that this has apparently been done successfully for a few years, by enforcing that there is only one minimum, so that the local gradient shows the way to the solution: http://en.wikipedia.org/wiki/Cooperative_optimization What do you think about it?
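     Here is a minimal sketch of the clause-to-polynomial translation described above, on a made-up 3-variable instance, minimized with plain gradient descent (this is not the cooperative optimization method from the link, which additionally enforces a single minimum; naive gradient descent can of course get stuck in local minima).
[code]
import numpy as np
from itertools import product

# Sketch: each 3SAT clause becomes a polynomial that vanishes exactly at its
# seven satisfying 0/1 corners; the clause polynomials are summed and the sum
# is minimized over the continuous cube [0,1]^n.  Example instance is made up.

# clause = ((variable index, negated?), ...)
clauses = [((0, False), (1, False), (2, True)),    # x0 OR x1 OR not x2
           ((0, True),  (1, False), (2, False)),   # not x0 OR x1 OR x2
           ((0, False), (1, True),  (2, False))]   # x0 OR not x1 OR x2

def clause_poly(v, clause):
    """Product of squared distances to the clause's satisfying corners."""
    total = 1.0
    for corner in product((0.0, 1.0), repeat=3):
        satisfied = any(corner[k] == (0.0 if neg else 1.0)
                        for k, (_, neg) in enumerate(clause))
        if satisfied:
            total *= sum((v[var] - corner[k])**2
                         for k, (var, _) in enumerate(clause))
    return total

def cost(v):
    return sum(clause_poly(v, c) for c in clauses)

def grad(v, eps=1e-6):                             # numerical gradient
    g = np.zeros_like(v)
    for i in range(len(v)):
        e = np.zeros_like(v); e[i] = eps
        g[i] = (cost(v + e) - cost(v - e)) / (2 * eps)
    return g

v = np.random.default_rng(1).random(3)             # continuous starting point
for _ in range(3000):
    v = np.clip(v - 0.02 * grad(v), 0.0, 1.0)      # stay inside the cube

print("relaxed point:", np.round(v, 3), "  cost:", round(cost(v), 6))
print("rounded assignment:", (v > 0.5).astype(int))
[/code]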
  9. Electromagnetic properties like charge or magnetic moment imply forces inversely proportional to some power of the distance - so according to such a simplified picture, infinities appear inside/near particles - so at least particles are some kind of singularities of the EM field - I would call that itself (a part of?) their internal structure, wouldn't you? Returning to the topic, as I've written there, I totally agree that such interactions would be extremely small, but such a lens for neutrinos, with a focal distance of e.g. millions of kilometers, could still be useful ... for example to finally determine their magnetic moment, which, as you've suggested, could even be impossible with the detection approach used today ...
  10. Neutrinos are extremely difficult to catch because they interact extremely weakly with matter ... but they probably have internal magnetic structure and a magnetic moment - so shouldn't they interact a bit with strong EM fields? In accelerators we use magnetic devices to focus beams of charged particles; maybe an analogue could be constructed for particles having only a magnetic moment? I know - the first problem could be that their spin direction is random - but there are ways to order it, like the Stern-Gerlach experiment (... for low energy) ... ? The other problem is that the focal distance of such a lens would depend on their energy ... but maybe there are ways to handle such 'chromatic aberrations' ... ? Let's imagine we could build such an extremely weak magnetic lens for neutrinos and place a relatively small detector at its focal point - at some scale it should become more effective than standard detectors ... The interesting fact is that it could have extremely long focal lengths - 1 km: we could place the detector underground ... 12700 km: we could place the detector on the other side of the Earth ... or much more when placing the lens on a spaceship ... What do you think about it? Is such a lens doable (for low-energy neutrinos)? Additionally it should allow us to say much more about neutrinos than standard detectors ...
  11. When arguing for the quantization of the magnetic flux going through a superconducting ring, we say that the 'quantum phase' has to 'close on itself' - make some integer number of 'rotations' along the ring. But in such considerations (or e.g. the Josephson junction) nobody really thinks about this phase as the quantum phase of the individual electrons, but as an ORDER PARAMETER - a local value of a statistical physics model, describing the relative 'quantum' phase, or more generally the phase of some periodic motion (e.g. http://rmf.fciencias.unam.mx/pdf/rmf-s/53/7/53_7_053.pdf ). So what about solutions of the Schroedinger equation for hydrogen? We know well that there are better approximations of the physics, like the Dirac equation ... which still ignores the internal structure of particles ... So doesn't that make Schroedinger's phase also just an order parameter of some approximate statistical model ... describing the relative phase of the internal periodic motion of electrons ("zitterbewegung")?
     What I'm trying to argue is that when we get below these statistical models, to 'the real quantum phase' (which for example describes the phase of the particle's internal periodic motion), we will no longer have to 'fuzzy' everything, but will be able to work in a deterministic picture. These statistical models work only with the relative phase of some periodic motions and rather cannot distinguish the absolute phase ... but generally we should be careful about taking this gauge invariance as a fundamental assumption - that physics doesn't really care about its local values.
     I was recently told that prof. Gryzinski also didn't like the approach in which physicists, unable to handle some problems, hid everything behind the mystical cape of quantum mechanics and declared it the fundamental level. He spent his life (he died in 2004) explaining atomic physics, the 'problems' which led to the belief that QM is the lowest level, using classical physics. There are many of his papers in good journals. His lectures can be found here: http://www.cyf.gov.pl/gryzinski/ramkiang.html
     Another argument is the recently mapped electron densities on the surface of semiconductors: http://physicsworld.com/cws/article/news/41659 So we have some potential wells and electrons jumping between them. These wells create a lattice with defects, which we can approximately model using a graph on which these electrons make some 'walks'. There is a huge number of small interactions there, so we should rather look for a statistical model, some random walk on this graph. Which random walk? The standard one (maximizing entropy locally) gives just Brownian motion, without localization properties ... From quantum mechanics we should also expect evolution toward the quantum ground state of this lattice ... and the random walk maximizing entropy globally also gives exactly this quantum ground state probability density ... and similar 'fractal patterns' as in the pictures from the experiment.
  12. Ok - you say perturbative, but forgot to add: approximation ... What about the nonperturbative picture? In all of these theories time is continuous - when e.g. a particle decays, it's not that we have one particle at the first moment, then a magical 'poof' and we have two particles - physics doesn't like rapid changes, and especially such discontinuities - it would make this process a smooth, continuous transformation of one particle into two. What the perturbative expansion does is consider different scenarios in some probability distribution. If particles are not made of the field, then 'where' are they? It would suggest that they 'live' parallel to the field (?) - so how can they affect it? Why does the EM field 'care' about them? In QFT they are excitations of a harmonic potential well - so they still have some momentum structure and thus, after a Fourier transform, some spatial structure, don't they? If they really don't have any internal structure, are they 'point-like'? Do they have infinite density? So e.g. the electric field goes to infinity near them? Quantum mechanics is used as a magical cape protecting from inconvenient questions ... but there are also nonperturbative field theories - deterministic (!) mechanics of density functionals - governed by concrete Euler-Lagrange equations ...
  13. I just don't like the picture in which the field is one thing and particles something completely different - abstract beings which somehow can influence the field - while they could just be built of the same field, as some special local solutions, for example topological singularities, which we see in many fields. So how do you imagine particles? There also appears a question: if particles have some internal structure, is it affected by fields? Can fields change its properties a bit, like mass, charge, magnetic moment? That could also look like time dilation ... Thanks for the paper.
  14. Ohh ... I believe they are much more than only background fields - look at the Gauss law: it sums the charges inside, charges which are almost point-like - so we can almost define such an electron as a nearly point-like topological singularity of the electric field ... maybe particles are just such special local solutions of some field ... for example spin is often pictured as the quantum phase making something like this around the particle: http://demonstrations.wolfram.com/SeparationOfTopologicalSingularities/
  15. I couldn't download the paper, but still, if time dilation corresponded e.g. just to acceleration, it would suggest that there is nothing so special about gravity that makes it completely qualitatively different from electromagnetism, would it?
  16. Why SR? Both gravitomagnetism and 'the default theory' of relativity are Lorentz invariant. The question is whether electromagnetism and gravity are really so qualitatively different? How do we reconcile that with the expected unification theories? What about the various problems of intrinsic curvature, like renormalization?
  17. Yes - it's a 30-year-old paper from "General Relativity and Gravitation" - a good-looking peer-reviewed journal. And it's also very surprising to me that I couldn't find any commenting papers - neither negative nor positive? There was only the comment "Despite experimental conformation it appears to have been ignored for three decades." in Howard A. Landman's paper http://www.riverrock.org/~howard/QuantumTime4.pdf Another citation of that paper is by the same author (David Apsel), about using this effect for pulsars: http://arxiv.org/abs/gr-qc/0104025
  18. It is far from widely known that before Einstein's theory, there was Heaviside's simpler approach to making gravitation Lorentz invariant - by using a second set of Maxwell's equations, with e.g. density of mass in place of density of charge: http://en.wikipedia.org/wiki/Gravitomagnetism This much less philosophically controversial theory (matter is not imprisoned in an infinitely thin submanifold of something with which it doesn't interact ... it doesn't allow for wormhole-like solutions ...) agrees well with most observations (?), even with Gravity Probe B. Some papers say it does so even better: http://www.mrelativity.net/Papers/14/tdm5.pdf There are also strong arguments that the electromagnetic field causes time dilation as well, for example from measuring muon lifetimes in muonic atoms: http://www.springerlink.com/content/wtr11w113r22g346/ My interest in this subject started while I was working on a model in which the main dynamics is local rotations in 4D, and it turned out to lead to a natural unification of electromagnetism and gravitomagnetism: spatial rotations give Maxwell's equations, while small rotations of the time axis (a kind of central axis of the light cone) give the second set of Maxwell's equations, for gravity (5th section of http://arxiv.org/abs/0910.2724 ). What do you think about it? Why is 'the only proper approach', intrinsic curvature, better than gravitomagnetism?
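     For reference, the 'second set of Maxwell's equations' mentioned above can be written, in the Heaviside-style analogy with mass density [math]\rho[/math] and mass current [math]\mathbf{j}[/math] (sign and factor conventions vary between authors, and the GR-linearized 'GEM' form carries extra numerical factors), as:
     [math] \nabla \cdot \mathbf{E}_g = -4\pi G \rho, \quad \nabla \cdot \mathbf{B}_g = 0, \quad \nabla \times \mathbf{E}_g = -\frac{\partial \mathbf{B}_g}{\partial t}, \quad \nabla \times \mathbf{B}_g = \frac{1}{c^2}\frac{\partial \mathbf{E}_g}{\partial t} - \frac{4\pi G}{c^2}\mathbf{j} [/math]
     with the minus signs reflecting that like masses attract rather than repel.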
  19. While de-exciting to the ground state, e.g. electrons just take the lowest orbitals and become indistinguishable from the QM point of view - we also lose entanglement - it's a thermodynamical process. About phonons - mechanical waves of atoms - they are qualitatively very similar to photons ("mechanical waves of the electromagnetic field"). Metals have a crystalline, periodic structure, transmit sound well, and generally what you hear are resonances, while rubber is a random structure of long organic chains, which diffuses phonons ... there is also a large difference in sound velocity and so in resonant frequencies ... but generally this is not for this discussion and you should look at some solid state physics book ...
     About the 'squares' in probabilistic theories ... If for a given Markov process we focused on chains infinite in one direction, the probability distribution would be the eigenstate of the stochastic matrix (without the squares). When we focus on a position inside chains infinite in both directions, we get the squares - intuitively, because two random variables (chains) meet there, one from the past and one from the future, and in this gluing they have to give the same value - the square comes from multiplying both probabilities. Please look at my paper - this argument, that it is just a natural result of the 4D nature of our world, is expanded there ... generally, while trying to predict the 'charged points' idealization using some measurements, at any scale - microscopic (QM) or macroscopic (deterministic) - these squares should appear ... The simplest model, the maximal entropy random walk on a graph, in contrast to the standard random walk http://demonstrations.wolfram.com/GenericRandomWalkAndMaximalEntropyRandomWalk/ has strong localization properties like quantum mechanics (Anderson localization).
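     A tiny numeric check of the 'squares' claim above (my own sketch, on an arbitrary small example graph): building the maximal entropy random walk from the dominant eigenvector psi of the adjacency matrix, the stationary probability of a node comes out proportional to psi squared, i.e. the product of the two chains glued at that node.
[code]
import numpy as np

# MERW on a small graph: S_ij = (A_ij / lam) * psi_j / psi_i, where (lam, psi)
# is the dominant eigenpair of the adjacency matrix A.  The stationary density
# is psi_i^2 (normalized) - the 'square' from gluing past and future chains.

A = np.array([[0, 1, 0, 0, 0],      # arbitrary small connected example graph
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 1],
              [0, 0, 1, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)

w, V = np.linalg.eigh(A)
lam, psi = w[-1], np.abs(V[:, -1])                 # dominant eigenvalue/vector

S = (A / lam) * psi[None, :] / psi[:, None]        # MERW transition matrix
pi = psi**2 / (psi**2).sum()                       # claimed stationary density

print("rows of S sum to 1:", np.allclose(S.sum(axis=1), 1.0))
print("pi is stationary:  ", np.allclose(pi @ S, pi))
print("pi = psi^2:        ", np.round(pi, 4))
[/code]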
  20. Yes, it turns out that a pure Boltzmann distribution among trajectories leads to probability densities exactly equal to the squares of the eigenfunctions of the Schrödinger operator. In this thermodynamic picture the time propagator is not unitary, but stochastic - thermodynamically everything wants to de-excite, as in QM. To introduce interference into this picture, some rotation of some internal degree of freedom of the particles is required (in the ellipsoid field it's caused by the particle's electric charge).
  21. Thanks for the book, I'll look at it. In the approach from my paper we just take thermodynamics among trajectories (the Boltzmann distribution) and automatically get the thermodynamical behavior of quantum mechanics - that everything wants to de-excite to the ground state - we get concrete trajectories which statistically average to the quantum mechanical probability distribution of the ground state.
  22. Argumentation based on the Bell inequalities says that the quantum mechanical 'squares' cannot be the result of some 'hidden variables' which would give QM as a statistical result. Imagine two systems idealized as charged points - if they are macroscopic, they can be described deterministically, while if they are microscopic, because of these 'squares', they cannot? So somewhere while rescaling, these 'squares' would have to emerge somehow ... how? I have seen only one attempt to create a probabilistic model for a macroscopic system of which we can measure only some properties (for example because of distance) - and in this model, thermodynamics among trajectories, these 'squares' appear naturally at any scale (see my paper).
  23. Imagine we hold a flat surface with a spinning top (a gyroscopic toy) on it. When we change the angle of the surface, the top generally follows the change, but additionally makes some complicated 'precessive sinusoid/cycloid-like motion' around the expected trajectory. An electron's spin is something different, but it is sometimes imagined as a spinning charge ... its quantum mechanical phase rotates with time ... there is Larmor precession ... Let's look at the Bohr model of the atom - quantum mechanics made it obsolete, but it still gives quite good predictions: http://en.wikipedia.org/wiki/Bohr_model Its main (?) shortcoming is that it says the lowest energy state should be spherically asymmetric (an orbit), while quantum mechanics says the ground state is symmetric. Generally, higher angular momentum states in the Bohr model correspond to quantum mechanical states with angular momentum lower by 1, as in this case. What if we extended the Bohr model by treating the electron as 'a top'? The electron's spin projection during such precessive motion could change between -1/2 and +1/2, so intuitively it should 'fuzzy' the angular momentum by 1 - exactly the difference between the Bohr model and quantum mechanics, e.g. removing the orbit for the ground state... The quantum mechanical probability density of states can be seen as appearing naturally thermodynamically ( http://arxiv.org/abs/0910.2724 ). Deterministic but chaotic-looking precessive motion could be the main source of the statistical noise this model requires for such thermodynamical behavior. What do you think about it? Have you heard about extending the Bohr model by considering electron precession?
     Merged post follows: There is the so-called (Bohr's) correspondence principle, which says that quantum mechanics reproduces classical physics in the limit of large quantum numbers: http://en.wikipedia.org/wiki/Correspondence_principle so for large orbits, especially in Rydberg atoms http://en.wikipedia.org/wiki/Rydberg_atom electrons look like they are just moving on classical trajectories - this 'quantum noise' is no longer essential. To extend the Bohr model to different angular momenta, the Bohr-Sommerfeld model with more 'elliptical' orbits was introduced: http://en.wikipedia.org/wiki/File:Sommerfeld_ellipses.svg One source of the loss of these simple orbits can be found in something like the Mercury precession, which allows such an orbit to rotate. It doesn't have to be seen only as a mass-related effect; there are arguments that the electric field can also cause GR-related effects, like time dilation: http://www.springerlink.com/content/wtr11w113r22g346/ The other source of nonstandard behavior, and so of this statistical noise, can be the precessive motion I mentioned - angular momentum conservation says that the total, orbital angular momentum + spin, is conserved. Precession - a rotating electron spin - is allowed and is compensated by the electron's orbital angular momentum to conserve 'j'. So such rotations should 'fuzzy' the orbital angular momentum by 1, as in the difference between the Bohr model and QM. There is also, for example, a very complicated magnetic interaction between the particles in an atom, and finally the only practical model for working with such an extremely complicated system may be through probability densities, and so quantum mechanics...
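     For reference to the 'quite good predictions' mentioned above, here are the standard Bohr-model numbers for hydrogen (textbook values, my own short calculation): the energy levels E_n = -13.6 eV / n^2 and the Balmer wavelengths they predict.
[code]
# Standard Bohr-model numbers for hydrogen: E_n = -Ry / n^2 and the
# Balmer-series wavelengths (transitions n -> 2) they predict.

RY_EV = 13.605693          # Rydberg energy [eV]
HC_EV_NM = 1239.841984     # h*c [eV*nm]

def E(n):
    return -RY_EV / n**2   # Bohr energy of level n [eV]

for n in range(1, 5):
    print(f"E_{n} = {E(n):8.3f} eV")

for n in (3, 4, 5):        # Balmer lines, measured at ~656, 486, 434 nm
    print(f"{n} -> 2 : {HC_EV_NM / (E(n) - E(2)):6.1f} nm")
[/code]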
  24. Because I'm still thinking about its qualitative consequences - like that it also allows for proton decay at extremely high temperatures, which could solve the problem of black holes with their infinite densities ...
     Merged post follows: Another argument for proton decay: how could the nonzero total baryon number of the observed universe (the matter-antimatter asymmetry) have been created if baryon number were always conserved?
  25. For many different models we can fit parameters so that, for example, their first approximations suit observations well ... I value a theory's integrity higher - when its full consequences qualitatively agree with what we observe, then it's worth fitting its parameters ... For example: has general relativity been confirmed beyond the first approximation (gravity, time dilation, gravitational lensing, Mercury precession)? The consequences of the intrinsic curvature it introduces are enormous - it allows for wormholes ... it says that we live in an infinitely thin submanifold of something with which we do not interact at all ... Lorentz invariant gravitation can be introduced much more simply in flat spacetime - by a second set of Maxwell's equations (with e.g. mass density instead of charge density); time dilation can be explained e.g. by slightly rescaling masses, charges, spins in a gravitational potential, which would in the first approximation rescale the whole of matter, making EM interactions be transferred faster (5th section of http://arxiv.org/abs/0910.2724 )