alpha2cen Posted December 25, 2010 We can measure visible light intensity using the photoelectric effect: an electromagnetic wave strikes electrons at a solid surface, and the emitted electrons produce a current. How, then, can we measure the intensity of electromagnetic waves at much shorter or much longer wavelengths?
swansont Posted December 25, 2010 The photoelectric current is proportional to the number of photons.
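A minimal sketch (Python) of that proportionality, assuming an idealized detector; the wavelength, optical power, and quantum efficiency below are made-up illustrative values, not numbers from the thread:

```python
# Photocurrent from photon flux: I = (quantum efficiency) x (photons per second) x e
# All input values below are assumptions for illustration.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
e = 1.602e-19   # elementary charge, C

wavelength = 500e-9        # assumed: 500 nm (green) light
optical_power = 1e-3       # assumed: 1 mW reaching the photocathode
quantum_efficiency = 0.5   # assumed: half the photons eject an electron

photon_energy = h * c / wavelength            # joules per photon
photon_rate = optical_power / photon_energy   # photons per second
current = quantum_efficiency * photon_rate * e

print(f"{photon_rate:.3e} photons/s -> {current:.3e} A of photocurrent")
```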
lemur Posted December 25, 2010 The photoelectric current is proportional to the number of photons. Related question: if visible light carries many times more energy per photon than infrared, and UV even more, why doesn't visible light burn us up or fry our optic nerves? I suppose with the eye, the retina only receives an amount of light determined by the size of the pupil, but what about all the visible light on our skin, etc.? Does the skin simply reflect most of the energy and thereby avoid being affected by it? Or is there just not that much energy in sunlight despite the broadness of its spectrum?
swansont Posted December 25, 2010 Visible light can burn you and/or fry your eyes. It's a matter of the rate of photons hitting you and the area involved, i.e., the intensity. Don't stare at the sun or use a magnifying glass with your skin as the target.
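To put rough numbers on the role of intensity, a quick back-of-the-envelope sketch (Python); the lens and spot sizes are assumptions, and 1000 W/m^2 is a typical clear-sky solar irradiance at the ground:

```python
import math

# Why focusing burns: the same collected power spread over a much smaller
# area means a much higher intensity. Lens and spot sizes are assumed.
solar_irradiance = 1000.0   # W/m^2, typical clear-sky value at the ground
lens_diameter = 0.05        # assumed: 5 cm magnifying glass
spot_diameter = 0.001       # assumed: 1 mm focal spot

lens_area = math.pi * (lens_diameter / 2) ** 2
spot_area = math.pi * (spot_diameter / 2) ** 2

collected_power = solar_irradiance * lens_area
spot_intensity = collected_power / spot_area

print(f"Collected power: {collected_power:.2f} W")
print(f"Spot intensity: {spot_intensity:.2e} W/m^2 "
      f"({spot_intensity / solar_irradiance:.0f}x unfocused sunlight)")
```

With these assumed sizes the focal spot sees a few thousand times the intensity of ordinary sunlight, which is why the same visible light that merely warms your skin can burn it when concentrated.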
lemur Posted December 25, 2010 Visible light can burn you and/or fry your eyes. It's a matter of the rate of photons hitting you and the area involved, i.e., the intensity. Don't stare at the sun or use a magnifying glass with your skin as the target. I know this. I phrased my question wrong. What I was really trying to ask is why the infrared in sunlight is strong enough to feel warm on the skin but the visible light doesn't seem to be (or does it, and I'm just assuming the warmth is due to infrared?). Also, an incandescent 60 W bulb is too hot to touch but a CFL is not, even though it's making as much visible light. Since visible light is higher in frequency than infrared, I would expect it to be delivering substantially more energy/heat than the infrared. Is it that sunlight and the incandescent bulb are emitting infrared in greater amounts than visible light for some reason? Do sunlight and the tungsten filament (or whatever metal is in the bulb) have emission curves whose peaks lie in the infrared range, or something like that?
Mr Skeptic Posted December 25, 2010 Each photon of UV carries more energy than each photon of visible light, which carries more energy than each photon of IR. But you also have to consider the intensity to find the total energy. IR is associated with heat because hot objects are so bright in the IR part of the spectrum, not because of the amount of energy individual IR photons carry. Once photons reach the energy of UV or higher, they will damage your molecules regardless of intensity, like a bunch of tiny bullets.
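For concreteness, the energy per photon at a few representative wavelengths; the particular choices of 300 nm, 550 nm, and 10 microns are just illustrative:

```python
# Energy per photon, E = h*c/lambda, for representative UV, visible, and IR wavelengths
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electronvolt

for label, wavelength_nm in [("UV", 300), ("visible", 550), ("IR", 10000)]:
    energy = h * c / (wavelength_nm * 1e-9)
    print(f"{label:8s} {wavelength_nm:6d} nm -> {energy / eV:6.3f} eV per photon")
```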
swansont Posted December 25, 2010 http://www.spectralcalc.com/blackbody_calculator/blackbody.php Put in ~600 K and set the band between 0.1 microns (100 nm) and 15 microns. Visible light is ~0.4 to 0.7 microns. You'll see that there's a lot of energy present in the IR. Incandescent lights have luminous efficiencies of only a few percent. http://en.wikipedia.org/wiki/Incandescent_light_bulb#Efficiency_comparisons
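The same comparison can be sketched numerically by integrating Planck's law over the two bands mentioned above; this sketch assumes the ~600 K temperature and the 0.1 to 15 micron range from the calculator example:

```python
import math

# Fraction of a 600 K blackbody's output that falls in the visible band,
# found by numerically integrating Planck's law (trapezoidal rule).
h = 6.626e-34   # J*s
c = 2.998e8     # m/s
k = 1.381e-23   # J/K

def planck(lam, T):
    """Spectral radiance B_lambda(T) in W / (m^2 sr m)."""
    return (2 * h * c**2 / lam**5) / math.expm1(h * c / (lam * k * T))

def band_radiance(lam_lo, lam_hi, T, n=20000):
    """Integrate planck() from lam_lo to lam_hi (metres)."""
    step = (lam_hi - lam_lo) / n
    total = 0.5 * (planck(lam_lo, T) + planck(lam_hi, T))
    for i in range(1, n):
        total += planck(lam_lo + i * step, T)
    return total * step

T = 600.0  # kelvin, as in the calculator example above
visible = band_radiance(0.4e-6, 0.7e-6, T)
full = band_radiance(0.1e-6, 15e-6, T)
print(f"Visible fraction of the 0.1-15 micron output at {T:.0f} K: {visible / full:.2e}")
```

The visible fraction comes out vanishingly small, which is the point being made: almost all of the output of an object at a few hundred kelvin is in the IR.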
lemur Posted December 25, 2010 (edited) Each photon of UV carries more energy than each photon of visible light, which carries more energy than each photon of IR. But you also have to consider the intensity to find the total energy. IR is associated with heat because hot objects are so bright in the IR part of the spectrum, not because of the amount of energy individual IR photons carry. Once photons reach the energy of UV or higher, they will damage your molecules regardless of intensity, like a bunch of tiny bullets. So could you say that there's more energy in sunlight from visible light than from IR, only that energy isn't vibrating molecules in a way that causes the temperature of the object to rise? That doesn't make sense, because all energy must get converted to heat if it is absorbed, no? Do black objects absorb more IR as well as more visible light? Do light-colored objects reflect both parts of the spectrum equally well? Or do some light-colored objects absorb more IR and others less? Do some black objects absorb visible light but reflect other wavelengths? edit: maybe this chart from Wikipedia will be helpful: http://en.wikipedia.org/wiki/File:Solar_Spectrum.png Edited December 25, 2010 by lemur
swansont Posted December 25, 2010 Objects that are black in the visible may not be in other parts of the spectrum, and vice versa.
lemur Posted December 25, 2010 Objects that are black in the visible may not be in other parts of the spectrum, and vice versa. So if you had two objects, one black and the other white, would the difference in their temperatures reflect only the amount of energy in visible light, or would the black object also be absorbing more IR and UV? Also, if you had an IR camera, is there any way to distinguish between IR light reflecting off something and the IR radiation the object emits because of its own heat?
swansont Posted December 25, 2010 Ideally, something that is black absorbs all radiation incident on it and something that is white absorbs none of it. For two identical objects, the amount of energy they radiate depends on T^4. The difference in radiation vs reflection might be detectable by looking at the spectrum, since the blackbody spectrum has a distinct shape.
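A small sketch of that T^4 scaling using the Stefan-Boltzmann law; the temperatures chosen are illustrative:

```python
# Stefan-Boltzmann law: radiated power per unit area = emissivity * sigma * T^4
sigma = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_flux(T, emissivity=1.0):
    """Power radiated per square metre by a surface at temperature T (kelvin)."""
    return emissivity * sigma * T**4

for T in (300, 600, 1200):
    print(f"T = {T:5d} K -> {radiated_flux(T):10.1f} W/m^2 (ideal black surface)")
```

Doubling the temperature multiplies the radiated power by 16, which is why hot objects dominate their surroundings so strongly in an IR image.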
alpha2cen (Author) Posted December 26, 2010 So if you had two objects, one black and the other white, would the difference in their temperatures reflect only the amount of energy in visible light, or would the black object also be absorbing more IR and UV? Also, if you had an IR camera, is there any way to distinguish between IR light reflecting off something and the IR radiation the object emits because of its own heat? So we can detect IR intensity by using the temperature difference between the detector and a reference temperature. Is that right?