jones123 Posted May 12, 2013

Hi, I'm having trouble understanding the principles of dew point temperature. From Wikipedia:

"The dew point is the temperature below which the water vapor in a volume of humid air at a given constant pressure will condense into liquid water."

"Relative humidity is the ratio of the partial pressure of water vapor in an air-water mixture to the saturated vapor pressure of water at a prescribed temperature."

So RH = e/e_s.

My question is: if air cools, it can hold less water vapor, so e_s will decrease (Clausius-Clapeyron relation) and RH must therefore increase. But how should I interpret the dew point temperature? The definition says the air must cool at a constant pressure (e_s), yet I just said that e_s must decrease when air cools, so basically it can never be constant while the air cools? It seems a bit confusing to me. I hope I made my problem clear and that someone can explain! Thanks!
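To make the RH = e/e_s relationship concrete, here is a minimal Python sketch using the Magnus approximation for the saturation vapour pressure e_s(T). The coefficient set and the 25 °C / 50 % RH starting parcel are illustrative assumptions, not anything from the posts above:

```python
import math

# Magnus-type approximation for saturation vapour pressure over water.
# A, B, E0 are one commonly quoted coefficient set; values vary slightly by source.
A = 17.62    # dimensionless
B = 243.12   # degrees Celsius
E0 = 6.112   # hPa, roughly e_s at 0 degrees Celsius

def saturation_vapor_pressure(t_c):
    """e_s(T) in hPa from the Magnus approximation."""
    return E0 * math.exp(A * t_c / (B + t_c))

# Illustrative parcel: 25 C and 50 % relative humidity.
# Its actual vapour pressure e = RH * e_s(T) stays fixed as it cools.
e = 0.50 * saturation_vapor_pressure(25.0)

for t in (25.0, 22.0, 19.0, 16.0, 13.0):
    rh = 100.0 * e / saturation_vapor_pressure(t)   # RH = 100 * e / e_s(T)
    print(f"T = {t:4.1f} C   e_s = {saturation_vapor_pressure(t):6.2f} hPa   RH = {rh:6.1f} %")

# e_s falls as the parcel cools, so RH climbs; once RH reaches 100 %
# (around 13.9 C for this parcel) further cooling condenses water out instead.
```

The key distinction the question is circling around: e (the vapour actually present) stays constant as the parcel cools, while e_s, which depends only on temperature, falls.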
John Cuthber Posted May 12, 2013

Cooling the air leaves the dew point unchanged until the air gets to (or just below) the dew point. Then the water condenses out as fog.
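This point, that cooling changes the relative humidity but not the dew point, can be checked numerically: the dew point depends only on the actual vapour pressure e, so holding e fixed while the temperature drops should return the same dew point every time. A minimal sketch, again assuming the Magnus approximation and an illustrative 25 °C / 50 % parcel:

```python
import math

A, B, E0 = 17.62, 243.12, 6.112   # Magnus coefficients (one common set); E0 in hPa

def saturation_vapor_pressure(t_c):
    """e_s(T) in hPa from the Magnus approximation."""
    return E0 * math.exp(A * t_c / (B + t_c))

def dew_point(t_c, rh_percent):
    """Dew point in C from temperature and RH, via the inverted Magnus formula."""
    gamma = math.log(rh_percent / 100.0) + A * t_c / (B + t_c)
    return B * gamma / (A - gamma)

# Illustrative parcel: vapour pressure fixed at its value for 25 C and 50 % RH.
e = 0.50 * saturation_vapor_pressure(25.0)

for t in (25.0, 20.0, 15.0):
    rh = 100.0 * e / saturation_vapor_pressure(t)
    print(f"T = {t:4.1f} C   RH = {rh:5.1f} %   dew point = {dew_point(t, rh):5.2f} C")

# The printed dew point is the same (about 13.9 C) at every temperature:
# cooling raises RH, not the dew point, until T itself reaches the dew point.
```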
studiot Posted May 12, 2013

The Clausius-Clapeyron equation is far too heavy a tool for understanding dew point (DP). To understand DP and relative humidity (RH) you really need to start with the difference between saturated and unsaturated vapours. Are you familiar with these?