Mr Skeptic Posted April 27, 2008 Radioactive carbon, C-14, is made when cosmic-ray neutrons strike nitrogen-14 atoms, knocking out a proton and leaving an extra neutron. Thus, the rate of C-14 production is (almost completely) independent of the concentration of carbon in the atmosphere. But by burning fossil fuels that have been in the ground long enough for their C-14 to decay, we are increasing the concentration of non-radioactive carbon without increasing the rate of C-14 production. So the relative concentration of radioactive carbon should be decreasing, even though the absolute amount of C-14 stays the same. Does this mean that we will be exposed to less background radiation (since our bodies would contain less C-14) because of our burning of fossil fuels?
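A minimal sketch of that dilution argument, assuming the fossil-derived CO2 simply mixes into the existing atmospheric carbon pool; the ppm values are illustrative assumptions, not measurements:

```python
# Minimal sketch of the dilution argument: fossil-fuel CO2 contains essentially
# no C-14, so adding it lowers the C-14/C ratio without changing the total
# amount of C-14.  The ppm values below are illustrative assumptions, and
# ocean/biosphere exchange is ignored.
preindustrial_ppm = 280.0  # assumed baseline atmospheric CO2
added_fossil_ppm = 100.0   # assumed fossil-derived (C-14-free) addition

total_ppm = preindustrial_ppm + added_fossil_ppm
relative_c14 = preindustrial_ppm / total_ppm  # fraction of the old C-14/C ratio

print(f"C-14/C ratio falls to {relative_c14:.0%} of its original value")
# -> about 74% in this illustration; doubling the carbon with fossil CO2
#    would push it down to 50%
```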
Cap'n Refsmmat Posted April 27, 2008 We caused a huge increase in C-14 levels via nuclear testing, so I doubt the effect would make much difference.
swansont Posted April 27, 2008 I recall reading that certain recently-living organisms collected near a large fossil-fuel-burning source were radiocarbon dated (IIRC it was grass near a major highway). It gave the wrong values because much of the carbon was very old, which skewed the results. So, in principle, the answer is yes. If we double the amount of CO2 in the atmosphere through fossil-fuel burning, we can expect the relative concentration of C-14 to drop by half, on average, and that would hold true for organisms in equilibrium with the atmosphere.
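For a sense of how badly that dilution skews a date, here is a hedged sketch using the standard radiocarbon age formula; the 50% dilution figure is an assumption for illustration, not a value from the grass anecdote:

```python
import math

# Hedged sketch of how fossil-carbon dilution skews a radiocarbon date for a
# still-living sample (the highway-grass effect).  The 50% dilution below is
# an assumed figure for illustration, not a value from any actual study.
HALF_LIFE_YEARS = 5730.0
mean_life = HALF_LIFE_YEARS / math.log(2)  # ~8267 years

measured_fraction = 0.50  # assumed: sample has half the normal C-14/C ratio
apparent_age = -mean_life * math.log(measured_fraction)

print(f"Apparent age: {apparent_age:.0f} years, even though the sample is alive")
# -> about 5730 years of error, one full half-life's worth
```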
Mr Skeptic Posted April 27, 2008 We caused a huge increase in C-14 levels via nuclear testing, so I doubt the effect would make much difference. Wow. I looked at a graph of the spike, and it was quite impressive. (The spike I saw was for local C-14, though.) It does seem to have had a global effect, and apparently it allows a person's birth year to be determined to within about 1.6 years, provided it is known whether they lived in the northern or southern hemisphere and they were born after 1950. I recall reading that certain recently-living organisms collected near a large fossil-fuel-burning source were radiocarbon dated (IIRC it was grass near a major highway). It gave the wrong values because much of the carbon was very old, which skewed the results. So, in principle, the answer is yes. If we double the amount of CO2 in the atmosphere through fossil-fuel burning, we can expect the relative concentration of C-14 to drop by half, on average, and that would hold true for organisms in equilibrium with the atmosphere. So I was right that we could reduce that. However, after doing some research, it seems that C-14 accounts for only about 1/40th as much radiation dose as potassium-40, and then there are other, larger sources such as radon, so even eliminating C-14 would have a negligible effect on our exposure to background radiation. So much for that idea.
swansont Posted April 27, 2008 So I was right that we could reduce that. However, after doing some research, it seems that C-14 accounts for only about 1/40th as much radiation dose as potassium-40, and then there are other, larger sources such as radon, so even eliminating C-14 would have a negligible effect on our exposure to background radiation. So much for that idea. That would be because K-40 has about 50% more activity in your body and a decay energy around 10x higher than C-14's.
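Taking just those two round-number factors at face value (both treated as assumptions here), the crude dose ratio works out to roughly 15x rather than 40x; real dosimetry also depends on decay type and where the energy is deposited:

```python
# Crude comparison of K-40 vs C-14 as internal dose sources, using only the
# two round-number factors quoted above (both treated as assumptions here).
# Real dosimetry also depends on decay type and where the energy is deposited,
# so this is just the ratio of "activity x energy per decay".
activity_ratio = 1.5  # K-40 decays per second in the body, relative to C-14
energy_ratio = 10.0   # energy released per decay, relative to C-14

dose_ratio = activity_ratio * energy_ratio
print(f"K-40 deposits roughly {dose_ratio:.0f}x the energy of C-14 in the body")
# -> ~15x, somewhat smaller than the 1/40th figure quoted earlier
```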
Mr Skeptic Posted April 27, 2008 That would be because K-40 has about 50% more activity in your body and a decay energy around 10x higher than C-14's. So, would it be possible to reduce the amount of K-40 in the body? And, more importantly, would that have any health benefits? Obviously, radon would be the easiest to reduce.
thedarkshade Posted April 27, 2008 There are approximately 1000 atoms of K-40 decaying in your body every second; about 90% of those decays produce beta particles, and most of the remaining 10% produce gamma rays. And there are approximately 12 atoms of C-14 decaying every minute for every gram of carbon in your body.
swansont Posted April 27, 2008 There are approximately 1000 atoms of K-40 decaying in your body every second; about 90% of those decays produce beta particles, and most of the remaining 10% produce gamma rays. And there are approximately 12 atoms of C-14 decaying every minute for every gram of carbon in your body. I went through the math the last time you posted that number and got ~4400 dps for K-40 (vs ~3000 for C-14).
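A back-of-the-envelope check of those figures, assuming textbook values for body composition (roughly 140 g of potassium and 16 kg of carbon in a 70 kg person), isotopic abundances, and half-lives; none of these inputs come from the thread itself:

```python
import math

# Back-of-the-envelope check of the ~4400 dps (K-40) and ~3000+ dps (C-14)
# figures.  Assumes a 70 kg person with ~140 g of potassium and ~16 kg of
# carbon; abundances and half-lives are textbook values, not numbers taken
# from the thread.
AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

def activity_bq(mass_g, molar_mass, isotope_fraction, half_life_years):
    """Decays per second: N atoms of the isotope times ln(2) / half-life."""
    atoms = mass_g / molar_mass * AVOGADRO * isotope_fraction
    decay_constant = math.log(2) / (half_life_years * SECONDS_PER_YEAR)
    return atoms * decay_constant

k40 = activity_bq(mass_g=140, molar_mass=39.1,
                  isotope_fraction=1.17e-4, half_life_years=1.25e9)
c14 = activity_bq(mass_g=16_000, molar_mass=12.0,
                  isotope_fraction=1.2e-12, half_life_years=5730)

print(f"K-40: {k40:.0f} Bq, C-14: {c14:.0f} Bq")
# -> roughly 4400 Bq and 3700 Bq, in line with the numbers above
```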
Mr Skeptic Posted April 27, 2008 I went through the math the last time you posted that number and got ~4400 dps for K-40 (vs ~3000 for C-14). http://en.wikipedia.org/wiki/Potassium K-40 occurs in natural potassium (and thus in some commercial salt substitutes) in sufficient quantity that large bags of those substitutes can be used as a radioactive source for classroom demonstrations. In healthy animals and people, K-40 represents the largest source of radioactivity, greater even than C-14. In a human body of 70 kg mass, about 4,400 nuclei of K-40 decay per second. The activity of natural potassium is 31 Bq/g. Unfortunately, K-40 has a half-life in the billions of years and makes up a fixed fraction of all natural potassium, so I don't see any simple way to remove it from the diet.
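As a quick consistency check on those quoted numbers, 4,400 decays per second at 31 Bq per gram of natural potassium implies roughly 140 g of potassium in a 70 kg body:

```python
# Quick arithmetic check on the quoted Wikipedia figures: 31 Bq per gram of
# natural potassium and ~4,400 decays per second in a 70 kg body together
# imply the body's potassium content.  Nothing here beyond the quoted numbers.
activity_per_gram = 31.0  # Bq per gram of natural potassium (quoted)
body_activity = 4400.0    # K-40 decays per second in a 70 kg person (quoted)

potassium_mass = body_activity / activity_per_gram
print(f"Implied potassium content: {potassium_mass:.0f} g")
# -> about 142 g, consistent with typical body-composition estimates
```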