seriously disabled Posted October 19, 2011 How hot are stellar cores really? The Sun has a core temperature of about 15 million Kelvin, but what if a star is 500 or even 1000 times more massive than the Sun? What would its core temperature be then? How do you calculate it? It's obvious to me that no electronic component, and nothing solid, could survive at these temperatures, but I would still like to know. Edited October 19, 2011 by seriously disabled
Schrödinger's hat Posted October 20, 2011 There are a few ways. You can look at the surface (the size, temperature, etc.) and the mass of the star and extrapolate using thermodynamics. You can also look at the spectra of the reactions happening inside it: by seeing which elements are fusing, and at what rates, you can work out a relation between temperature and pressure.
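To put a rough number on that thermodynamic extrapolation: the virial theorem gives an order-of-magnitude central temperature T_c ~ G M m_p / (k_B R). A minimal Python sketch, assuming an ideal, pure-hydrogen gas and dropping the order-one proportionality constant:

# Order-of-magnitude central temperature from the virial theorem.
G   = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23   # Boltzmann constant, J/K
m_p = 1.673e-27   # proton mass, kg

def core_temperature(mass_kg, radius_m):
    """T_c ~ G * M * m_p / (k_B * R) for an ideal hydrogen gas."""
    return G * mass_kg * m_p / (k_B * radius_m)

M_sun, R_sun = 1.989e30, 6.957e8
print(core_temperature(M_sun, R_sun))   # ~2e7 K, same ballpark as the Sun's ~1.5e7 K

Note that a 500 solar-mass star is not 500 times hotter: its radius also grows with mass, so M/R, and hence T_c, rises much more slowly than the mass itself.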
Airbrush Posted October 20, 2011 Astronomers see very few stars more massive than 150 solar masses. The most massive yet discovered, at over 250 solar masses, may be a binary. The question is how hot the cores of these most massive stars get. How much heat does it take to fuse the elements just below iron into iron? Edited October 20, 2011 by Airbrush
baric Posted October 20, 2011 The best way to get a feel for stellar interior temperatures is to understand that progressively higher temperatures are required to fuse heavier elements in the stellar core. For example, silicon burning requires about 3 billion Kelvin. Check out this link for more info: http://en.wikipedia....nucleosynthesis Edited October 20, 2011 by baric
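To make that progression concrete, here are rough, commonly quoted ignition temperatures for the successive burning stages; the exact thresholds vary with stellar mass, so treat these as order-of-magnitude values:

# Approximate ignition temperatures for successive core-burning stages.
# These are rough literature figures; real thresholds depend on the star.
ignition_K = {
    "hydrogen": 1.5e7,   # ~15 million K, the Sun's core today
    "helium":   1.0e8,
    "carbon":   6.0e8,
    "neon":     1.2e9,
    "oxygen":   1.5e9,
    "silicon":  3.0e9,   # the ~3 billion K mentioned above
}
for fuel, temp in ignition_K.items():
    print(f"{fuel:>8}: {temp:.1e} K")

Each stage also burns faster than the one before it; silicon burning, the last, lasts only about a day.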
Airbrush Posted October 20, 2011 Thanks for the link, baric. This is a very interesting question. True, silicon burning reaches 3.5 billion Kelvin, but Wiki also says the highest temperature that can be reached at the center of the most massive stars is up to 5 billion Kelvin, and it lasts for only seconds, just before a very dramatic moment: a Type II supernova, which synthesizes half of all the elements heavier than iron. These elements are created in about a second. I'm more familiar with heat measured in Fahrenheit, so 5 billion Kelvin is about 9 billion F, a very hot moment (F = 1.8(K - 273.15) + 32). What was the highest temperature at the first moments of the Big Bang? Wiki says after several minutes the universe was about 1 billion K. Wikipedia: "...The entire silicon-burning sequence lasts about one day and stops when nickel-56 has been produced. Nickel-56 (which has 28 protons) has a half-life of 6.02 days and decays via beta radiation (in this case, "beta-plus" decay, which is the emission of a positron) to cobalt-56 (27 protons), which in turn has a half-life of 77.3 days as it decays to iron-56 (26 protons). However, only minutes are available for the nickel-56 to decay within the core of a massive star. At the end of the day-long silicon-burning sequence, the star can no longer release energy via nuclear fusion because a nucleus with 56 nucleons has the lowest mass per nucleon (proton and neutron) of all the elements in the alpha process sequence. Although iron-58 and nickel-62 have slightly higher binding energies per nucleon than iron-56,[2] the next step up in the alpha process would be zinc-60, which has slightly more mass per nucleon and thus would actually consume energy in its production rather than release any. The star has run out of nuclear fuel and within minutes begins to contract. The potential energy of gravitational contraction heats the interior to 5 GK (430 keV), and this opposes and delays the contraction. However, since no additional heat energy can be generated via new fusion reactions, the contraction rapidly accelerates into a collapse lasting only a few seconds. The central portion of the star gets crushed into either a neutron star or, if the star is massive enough, a black hole. The outer layers of the star are blown off in an explosion known as a Type II supernova that lasts days to months. The supernova explosion releases a large burst of neutrons, which synthesizes in about one second roughly half the elements heavier than iron, via a neutron-capture mechanism known as the r-process (where the "r" stands for rapid neutron capture)." http://en.wikipedia.org/wiki/Silicon_burning_process Edited October 20, 2011 by Airbrush
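The conversion Airbrush uses is easy to wrap in code; at billions of Kelvin the 273.15 offset and the +32 are negligible, so Fahrenheit is essentially just 1.8 times the Kelvin value. A quick sketch:

# Kelvin-to-Fahrenheit conversion: F = 1.8 * (K - 273.15) + 32.
def kelvin_to_fahrenheit(kelvin):
    """At astrophysical temperatures the offset and the +32 barely matter."""
    return 1.8 * (kelvin - 273.15) + 32

print(kelvin_to_fahrenheit(5e9))    # ~9.0e9 F, the supernova-core figure above
print(kelvin_to_fahrenheit(1e11))   # ~1.8e11 F, the early-universe figure below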
Airbrush Posted October 27, 2011 The question of how hot it can get at the center of the most massive stars led me to the Wiki article I posted above, which said that at the moment of a Type II supernova the temperature reaches an incredible 9 billion degrees Fahrenheit. That made me wonder how hot the Big Bang started out. I finally found this by using ask.com. This article says that at 1/100 of a second after the Big Bang the temperature was about 100 billion Kelvin. So I immediately convert it to Fahrenheit, because I like a scale I am familiar with, and I get 180 billion degrees F. Anyone know how hot the Big Bang was after a millionth, billionth, or trillionth of a second? http://csep10.phys.utk.edu/astr162/lect/cosmology/hotbb.html
baric Posted October 27, 2011 Quoting Airbrush: "So I immediately convert it to Fahrenheit, because I like a scale I am familiar with, and I get 180 billion degrees F." How often do you deal with temperatures of billions of degrees that you prefer Fahrenheit? Everything in science is measured in Kelvin; you should probably adjust to that and just use Fahrenheit for the normal everyday stuff (weather, etc.). It will save you a lot of conversion time!
Airbrush Posted October 27, 2011 You are correct. And yet, in the spirit of wanting science to be better understood, I prefer to use terms that are most recognizable to myself and my audience, the average non-scientist. Scientists are smart people who can easily convert units in their heads; I and others may lack that ability. Go ahead and use the Kelvin scale, and the average non-scientist will wonder if that is a lot hotter or colder than Fahrenheit. But they won't say so, because they are embarrassed that they don't know what the Kelvin scale means (they only spent a little time discussing it in high school) and have spent the rest of their lives hearing temperatures in Fahrenheit on a daily basis. In discussing temperatures of billions of degrees, the difference between Fahrenheit and Kelvin is significant: Fahrenheit is approximately 1.8 times Kelvin, almost DOUBLE. My question still stands for an expert here: if the Big Bang was 100 billion Kelvin at 1/100 of a second after the Big Bang, how hot was it at one trillionth of a second? Just how hot can hot get? Edited October 27, 2011 by Airbrush
baric Posted October 27, 2011 Temperature is a measure of the average kinetic energy of the particles contained within the sample being measured. At some early point after the Big Bang you are no longer dealing with particles, so the notion of temperature doesn't apply, at least in a traditional sense. Also, neutron stars start out at about one trillion Kelvin before cooling down, which is 10 times the temperature you listed.
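Baric's point can be made concrete with the ideal-gas relation (1/2) m v_rms^2 = (3/2) k_B T: the hotter the gas, the faster its particles move. A small illustration, classical and non-relativistic, so it breaks down exactly where the traditional notion of temperature fails:

import math

k_B = 1.381e-23   # Boltzmann constant, J/K
m_p = 1.673e-27   # proton mass, kg

def v_rms(temp_K, mass_kg=m_p):
    """Root-mean-square speed from (1/2) m v^2 = (3/2) k_B T."""
    return math.sqrt(3.0 * k_B * temp_K / mass_kg)

print(v_rms(1.5e7))   # protons in the Sun's core: ~6e5 m/s
print(v_rms(1e12))    # ~1.6e8 m/s, roughly half light speed:
                      # the classical formula is already failing here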
Airbrush Posted October 27, 2011 Very interesting, thank you. Can you give me a link to this incredible info? A neutron star starts out hotter than a Type II supernova during the one second it creates half the elements heavier than iron? Wow. I am going to do some wiki searching to see if I can verify this info. Here it is; you are correct, sir. Ask.com: "According to Wikipedia, a newly formed neutron star would have a temperature of 10^11 - 10^12 Kelvin, but after a year it will cool down to 10^6 (a million) Kelvin, due to the large number of neutrinos it emits." Read more: http://wiki.answers.com/Q/How_hot_is_a_neutron_star#ixzz1c0akxuU4 Then is it correct to say that the hottest anything can get in the universe, including at the earliest moments of its birth, is about a trillion Kelvin? Edited October 27, 2011 by Airbrush
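As for whether a trillion Kelvin is the ceiling: the early universe beats it. During the radiation-dominated era temperature falls roughly as T proportional to t^(-1/2), with a commonly used normalization of T ~ 1.5e10 K * (1 s / t)^(1/2). This ignores changes in the particle content of the universe, so treat it as an order-of-magnitude guide only, but it does reproduce the ~100 billion K at 1/100 s cited earlier. A sketch under that assumption:

import math

def early_universe_temp(t_seconds):
    """Radiation-era estimate: T ~ 1.5e10 K * sqrt(1 s / t).
    Ignores changes in the number of relativistic species, so treat
    the very-early-time outputs as order-of-magnitude only."""
    return 1.5e10 * math.sqrt(1.0 / t_seconds)

print(early_universe_temp(1e-2))    # ~1.5e11 K, the ~100 billion K figure above
print(early_universe_temp(1e-6))    # ~1.5e13 K at a millionth of a second
print(early_universe_temp(1e-12))   # ~1.5e16 K at a trillionth of a second

So by this estimate, at a trillionth of a second the universe was on the order of 10^16 K, far hotter than any newborn neutron star.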