xastorm Posted November 2, 2011

Suppose we have a gas that becomes completely ionized at room temperature, i.e. the thermal energy of the surrounding atmosphere alone is enough to turn it into a plasma. Since a plasma loses energy by radiation to its surroundings, we could harness the emitted electromagnetic energy, violating the second law of thermodynamics.
mathematic Posted November 2, 2011

How did it get to be a plasma at room temperature?
questionposter Posted November 3, 2011 (edited)

Quote (xastorm): Suppose we have a gas that becomes completely ionized at room temperature...

It won't work, because you still need to put that energy into making the plasma in the first place. Say you put x energy into making the plasma. Because of a number of small factors, such as some light being radiated at unusable wavelengths, you will only ever get back less than x in usable energy; that's conservation of energy. If you could generate the plasma from sunlight here on Earth, that would be efficient, but it's pretty hard to do.
xastorm Posted November 3, 2011 (Author)

Quote (mathematic): How did it get to be a plasma at room temperature?

Well, I'm not trying to make a crank violation of the second law; this is a thought experiment. I am assuming we created a gas that ionizes at room temperature from the thermal energy of the atmosphere alone. Although such a gas is far from existing, there is no obvious connection between the validity of the second law and the possibility of creating it. Or is there?

Quote (questionposter): It won't work, because you still need to put that energy into making the plasma in the first place...

My assumption is that we have some type of gas which becomes a plasma at, say, 0 degrees. You need no energy input to make the plasma in the first place, because the thermal energy of the atmosphere is enough to ionize the gas. So whatever energy you harvest is not wasted: if you capture even 1% of the radiated power, you have extracted energy from the atmosphere and driven it away from thermal equilibrium without doing external work, which necessarily violates the second law of thermodynamics.
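(For a sense of scale on why such a gas is "far from existing", the sketch below compares the mean thermal energy per particle at room temperature with some standard first ionization energies. This is a back-of-the-envelope illustration only; the listed elements are ordinary reference values, not anything proposed in the thread.)

```python
# Back-of-the-envelope: ambient thermal energy vs. typical ionization energies.
# Illustrative only; the values are standard reference numbers.
k_B = 8.617e-5  # Boltzmann constant, eV/K

T = 300.0                        # room temperature, K
thermal_energy = 1.5 * k_B * T   # mean kinetic energy per particle, ~0.039 eV

ionization_energies_eV = {
    "caesium (lowest of any element)": 3.89,
    "hydrogen": 13.6,
    "helium": 24.6,
}

print(f"Mean thermal energy at {T:.0f} K: {thermal_energy:.3f} eV")
for species, E_i in ionization_energies_eV.items():
    print(f"{species}: {E_i:.2f} eV, i.e. ~{E_i / thermal_energy:.0f}x the thermal energy")
```

Even caesium, the most easily ionized element, needs roughly a hundred times the mean thermal energy per particle at 300 K, which is why ordinary gases are essentially un-ionized at room temperature.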
swansont Posted November 3, 2011

Quote (xastorm): Suppose we have a gas that becomes completely ionized at room temperature...

Does a plasma normally lose energy to its surroundings simply because it's a plasma, or because it's at a higher temperature? Under this scenario I expect that the plasma would receive as much radiant energy from the surroundings as it loses.
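(A minimal sketch of the balance swansont describes, treating the plasma and its surroundings as grey bodies exchanging thermal radiation; the emissivity and area below are arbitrary placeholder values.)

```python
# Net radiative exchange between a body and its surroundings (grey-body approximation).
# Sketch only: emissivity and area are arbitrary; the point is the T^4 difference.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiated_power(T_body, T_surroundings, emissivity=0.5, area=1.0):
    """Power the body loses to its surroundings; zero when the temperatures match."""
    return emissivity * SIGMA * area * (T_body**4 - T_surroundings**4)

print(net_radiated_power(300.0, 300.0))  # 0.0 -- emission exactly balanced by absorption
print(net_radiated_power(400.0, 300.0))  # positive -- only a hotter body loses energy on net
```

The net flux depends only on the difference of the fourth powers of the temperatures, so a body at the same temperature as its surroundings absorbs exactly as much as it emits.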
xastorm Posted November 3, 2011 (Author)

Quote (swansont): Does a plasma normally lose energy to its surroundings simply because it's a plasma, or because it's at a higher temperature?...

I think the major losses from the plasma are radiation losses, i.e. thermal bremsstrahlung. In that case the plasma is not losing energy because the surrounding medium has a lower temperature; am I right? I am assuming that such a low-energy plasma would be optically thin enough for most of the radiated energy, which would have low frequency, to pass through it without re-absorption.

The point now is that the plasma will receive radiant energy from the surroundings to balance these losses, since it is surrounded by black bodies, so thermal equilibrium is reached. If we tried to make a one-way shield for radiation, blocking the surrounding radiation by reflecting it, the shield itself would act as a black body: it would lose energy by radiation to the plasma and absorb energy from the surroundings, so the plasma would again gain as much heat as it loses. So what defends the second law is that we can't stop black bodies from absorbing and emitting radiation until they attain thermal equilibrium. I think case closed.
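(As a rough check on "which would have low frequency": thermal radiation characteristic of ~300 K peaks deep in the infrared. The sketch below just evaluates Wien's displacement law; nothing in it is specific to a plasma or to bremsstrahlung.)

```python
# Wien's displacement law: where does thermal radiation at a given temperature peak?
WIEN_B = 2.898e-3  # Wien displacement constant, m*K
C = 2.998e8        # speed of light, m/s

def peak_wavelength_m(T_kelvin):
    """Wavelength of peak blackbody emission at temperature T."""
    return WIEN_B / T_kelvin

T = 300.0
lam = peak_wavelength_m(T)
print(f"Peak at {lam * 1e6:.1f} micrometres (~{C / lam / 1e12:.0f} THz) for {T:.0f} K radiation")
```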
swansont Posted November 3, 2011

Quote (xastorm): I think the major losses from the plasma are radiation losses, i.e. thermal bremsstrahlung...

I'm not sure how good an assumption that is. If the plasma is optically thin, then the scattering rate is also small, so the amount of energy radiated is small. Also, the Compton scattering cross section increases as you lower the energy: http://en.wikipedia.org/wiki/Klein–Nishina_formula

Quote (xastorm): The point now is that the plasma will receive radiant energy from the surroundings to balance these losses...

Yes, that was my point.

Quote (xastorm): If we tried to make a one-way shield for radiation... I think case closed.

Yes.
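(A sketch of the energy dependence swansont points to: the total Klein–Nishina cross section grows toward the Thomson value as the photon energy drops. The sample photon energies below are arbitrary.)

```python
# Total Klein-Nishina cross section for Compton scattering off a free electron,
# showing that it grows toward the Thomson value as the photon energy drops.
# Sketch only; the sample photon energies are arbitrary.
import math

SIGMA_THOMSON = 6.652e-29   # Thomson cross section, m^2
M_E_C2_EV = 511.0e3         # electron rest energy, eV

def klein_nishina_total(photon_energy_eV):
    """Total Compton cross section in m^2 (Klein-Nishina formula)."""
    x = photon_energy_eV / M_E_C2_EV
    if x < 1e-4:
        # Low-energy (Thomson) limit, avoiding loss of precision for tiny x
        return SIGMA_THOMSON * (1.0 - 2.0 * x)
    term1 = (1 + x) / x**3 * (2 * x * (1 + x) / (1 + 2 * x) - math.log(1 + 2 * x))
    term2 = math.log(1 + 2 * x) / (2 * x)
    term3 = (1 + 3 * x) / (1 + 2 * x) ** 2
    return 0.75 * SIGMA_THOMSON * (term1 + term2 - term3)

for E in (1.0e3, 100.0e3, 1.0e6, 10.0e6):  # photon energies in eV
    print(f"{E:>10.0f} eV -> {klein_nishina_total(E):.2e} m^2")
```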