aaabha Posted July 9, 2010

Can we be sure that decay times are constant? Particle decay is clearly a statistical process. Generally speaking, particles are stable solutions of some underlying physics (such as a field theory): local or global energy minima for given constraints such as spin or charge. From an energetic point of view, particle decay should then amount to escaping a local energy minimum by crossing an energy barrier and finally settling into a lower minimum, just as in thermodynamics(?). The energy required to cross such a barrier usually comes from thermal noise, so particle decay would seem to require some temperature of the vacuum...

Generally the universe is built not only of particles; it also carries the different interactions: EM, weak, strong, gravitational. That alone gives the vacuum a huge number of degrees of freedom, i.e. fundamental excitations, which do not necessarily have nonzero mass (like photons)... If there is any interaction between them, thermodynamics says they should thermalize: their energies should equilibrate. We can measure the thermal noise of the EM part of these degrees of freedom (the 2.725 K microwave background), and the degrees of freedom corresponding to the other interactions (weak, strong, gravitational) have had billions of years to thermalize, so they should have a similar temperature. The EM part provides about 6*10^-5 of the vacuum energy required to obtain the expected cosmological constant; maybe the other interactions carry the rest of it...

In any case, we believe this microwave background is cooling, so 'the temperature of the universe' should be falling too. Shouldn't it then become more difficult for particles to cross the energy barrier to reach a lower energy minimum? That would increase decay times... We have experimental evidence that physical constants like e and G are unchanged with time, but is the same true of decay times? Maybe radiometrically dated objects are a bit younger than expected...
A similar situation holds, for example, for excited electrons.
swansont Posted July 9, 2010

The dependence of radioactive decay on temperature is something that can be tested, and it has been tested. The energies involved are typically of order 1 MeV. Room temperature corresponds to a fraction of an eV, and the current microwave background is a couple of orders of magnitude colder than that.
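To put those scales side by side, here is a minimal sketch (assuming only the standard value of Boltzmann's constant in eV/K; the 1 MeV figure is the typical decay energy quoted above) comparing the thermal energy kT at room temperature and at the CMB temperature with an MeV-scale barrier:

```python
import math

K_B = 8.617333e-5        # Boltzmann constant in eV/K
DECAY_ENERGY_EV = 1.0e6  # typical nuclear decay energy scale, ~1 MeV

for label, temperature in [("room temperature (300 K)", 300.0),
                           ("CMB (2.725 K)", 2.725)]:
    kt = K_B * temperature  # thermal energy scale in eV
    # Boltzmann factor for thermally crossing a 1 MeV barrier;
    # the exponent is so huge the factor underflows to exactly 0.0
    suppression = math.exp(-DECAY_ENERGY_EV / kt)
    print(f"{label}: kT = {kt:.3e} eV, exp(-E/kT) = {suppression}")
```

The factor exp(-E/kT) is astronomically small at either temperature, which is why thermal activation cannot be what drives MeV-scale decays, and why cooling of the microwave background would not be expected to change them.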
Farsight Posted July 11, 2010

Radioactive decay rates aren't necessarily constant. I imagine they vary with gravitational time dilation, but that apart, see the wiki article on changing decay rates and have a browse on decay+neutrino+seasonal to look further than CMBR photons. Note that most of the "particle zoo" particles with unambiguous mass aren't stable. Only the electron, the proton, and their antiparticles are stable. A charged pion lasts about 26 nanoseconds, a muon about 2.2 microseconds, and a free neutron about 15 minutes. As far as I know, you don't need to add any energy for these decays to occur. The neutron is usually stable inside an atom, but not always. Some half-lives are so very short that they surely have to be associated with nuclear structure rather than with the addition of energy or disruption by ambient photons or neutrinos.
swansont Posted July 11, 2010

"Radioactive decay rates aren't necessarily constant. I imagine they vary with gravitational time dilation"

In which case time is dilated. The decay rates don't vary with respect to the nucleus's frame.

"but that apart, see the wiki article for changing decay rates and have a browse on decay+neutrino+seasonal to look further than CMBR photons."

What do neutrinos have to do with temperature?
Farsight Posted July 12, 2010

We should talk about time properly, swansont, and about what clocks actually measure. Neutrinos have nothing to do with temperature, but some people claim that they can influence decay rates. I'll ask around and see if anybody has conducted any tests, say at ArgoNeuT.
Mr Skeptic Posted July 12, 2010

Decay rates can change depending on the conditions, but the conditions needed to make a noticeable difference are the sort of thing you can create in the lab, with some difficulty; they are not the sort of conditions found in distinctly non-vaporized mummies or fossils.

Dating methods depend on more assumptions than just decay rates, just so you know, and decay rates are the least likely to be at fault, unless the physical constants really do change. You need to know the initial concentrations, correctly measure the final concentrations, and know the amount of contamination/dilution (or assume none). The initial concentration of C-14 (relative to C-12) might not be all that stable, seeing as C-14 is constantly created from nitrogen by radiation and carbon is used by living things. We changed the relative C-14 concentration with our nuclear testing and also by releasing old carbon from coal and peat. But it's usually the dating methods for longer timescales that people get upset about.
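As an aside on how those assumptions enter the arithmetic, here is a minimal sketch of the standard radiometric age formula t = (t_half / ln 2) * ln(N0/N); the C-14 half-life is the commonly quoted value and the 25% sample fraction is just an illustrative number:

```python
import math

HALF_LIFE_C14 = 5730.0  # commonly quoted C-14 half-life, in years

def radiometric_age(remaining_fraction, half_life):
    """Age inferred from the fraction N/N0 of the parent isotope left,
    assuming a constant decay rate and a known initial concentration."""
    return (half_life / math.log(2)) * math.log(1.0 / remaining_fraction)

# A sample retaining 25% of its original C-14 is two half-lives old:
print(radiometric_age(0.25, HALF_LIFE_C14))  # ~11460 years
```

If the decay constant had been different in the past, or the assumed initial concentration is off, the same measured fraction maps to a different age, which is exactly the set of assumptions under discussion here.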
swansont Posted July 12, 2010

"We should talk about time properly Swanson, and what clocks actually measure. Neutrinos have nothing to do with temperature. But some people claim that they can influence decay rates. I'll ask around and see if anybody has conducted any tests, say at ArgoNeuT."

The issue is not what some people claim; it is that the OP asks a specific question, and your response is not on point.