Posted

It may be just me, but can you explain this?

"This is not the case on the moon, where the photons arrive effectively one at a time in a very narrow stream and hence only excite a small number of receptors. Each receptor only registers the signal for a short time so you only get a faint and largely incomplete image."

As far as my eyes are concerned the stars are point sources. The light from them arrives at my eyes along a very narrow line.

The moon has no atmosphere to scatter photons. A slightly larger number of photons would reach my eyes on the Moon than on Earth.

The " we never went to the Moon" nutters say that the lack of stars in the background of pictures taken on the Moon proves that we weren't there.

Cleverer people point out that it's just the same as on Earth: you don't normally see stars in a picture where the exposure is set for daylight.

If you face away from the sun and look up at the stars while on the Moon they should be just as visible as here on a dark night.

The stars are just as bright (actually slightly brighter) and the background just as black (actually slightly blacker).

Just as many photons (strictly, slightly more) hit the retina and, I suspect, they pretty much all hit the same receptor cell, much as they do on Earth.

What's the difference?

Posted
Oh, that would be interesting indeed. If the entire universe were rotating, then it would have a center and we would be able to find it, right?

Conventional "wisdom" says the universe has no centre, but in fact I have arguments against this (All proprietary info I'm afraid at the moment). The indications from CMBR images is that if there is rotation it is very small, but if there is any rotation at all, then in principle it must be about something.


Merged post follows:

Consecutive posts merged
It may be just me, but can you explain this?

"This is not the case on the moon, where the photons arrive effectively one at a time in a very narrow stream and hence only excite a small number of receptors. Each receptor only registers the signal for a short time so you only get a faint and largely incomplete image."

As far as my eyes are concerned the stars are point sources. The light from them arrives at my eyes along a very narrow line.

The moon has no atmosphere to scatter photons. A slightly larger number of photons would reach my eyes on the Moon than on Earth.

The " we never went to the Moon" nutters say that the lack of stars in the background of pictures taken on the Moon proves that we weren't there.

Cleverer people point out that it's just the same as on Earth: you don't normally see stars in a picture where the exposure is set for daylight.

If you face away from the sun and look up at the stars while on the Moon they should be just as visible as here on a dark night.

The stars are just as bright (actually slightly brighter) and the background just as black (actually slightly blacker).

Just as many photons (strictly, slightly more) hit the retina and, I suspect, they pretty much all hit the same receptor cell, much as they do on Earth.

What's the difference?

Then I'm afraid you have unique eyes; everyone else on Earth sees them twinkling, which is caused exactly as I have described. The stars themselves do not change; it is only the amount of light which is registered, exactly as I have told you. Seek an alternative source if you do not believe me. Photos on the Moon would have to use considerably longer exposures to register images.

I defy you to hold your head still enough to get all the photons striking anywhere near just one receptor, here or on the Moon.

Posted

I personally do not see stars twinkling. However, they say it is caused by our atmosphere and fluctuations in it, because the stars are pointlike sources. The planets are not supposed to twinkle, since they are not pointlike sources. But maybe I'm blind, because I can't see any difference.

Posted
I personally do not see stars twinkling. However, they say it is caused by our atmosphere and fluctuations in it, because the stars are pointlike sources. The planets are not supposed to twinkle, since they are not pointlike sources. But maybe I'm blind, because I can't see any difference.

I apologise; I understood all human eyes to register the twinkle similarly, but I accept this could be only a majority. Logically it might depend on where you live and where you observe from. I can see that people at altitude, or in areas of low atmospheric disturbance, would see less twinkling than at sea level or in heavy atmosphere generally. The planets and stars all twinkle for me too, though I understand that the planets are not supposed to. Maybe atmospheric pollution is starting to alter this phenomenon?

Posted

I still don't see how the twinkling makes much odds. In the absence of an atmosphere the photons will all hit the same receptor cell. They will, therefore, trigger it. You will see a bright spot.

On a clear night the stars don't twinkle much; occasionally you get clear spots of air. The stars don't "go out" when this happens.

Posted
I still don't see how the twinkling makes much odds. In the absence of an atmosphere the photons will all hit the same receptor cell. They will, therefore, trigger it. You will see a bright spot.

On a clear night the stars don't twinkle much; occasionally you get clear spots of air. The stars don't "go out" when this happens.

John, you're not listening to either me or Mr Skeptic; Earth's atmosphere makes all the difference!!! Stars (at the distances involved) are much tinier than you can readily conceive, and their light behaves only as a point source. Earth's atmosphere, whether twinkling or not, smears out this tiny amount of light (as I have described) and makes them appear much bigger than they actually look in space.

You need to read a bit about eyes, receptors and quantum behaviour to appreciate the very low probability that even two photons from a "point source" would hit the same receptor cell. Even if they did, receptors are so small you would barely notice.

Posted

I can't tell what this argument is about.

 

Twinkling is caused by the atmosphere. Sure.

 

Is the claim that stars are invisible without the distortion of the atmosphere? Because that is simply not true. Astronauts can see the stars, as long as their eyes aren't adjusted for brighter lights, just like on the ground. So basically if anything lit by sunlight is in view, or even if cabin lights are on (just like looking out the window of a well-lit house at night), you can't see anything. The Apollo astronauts couldn't see stars from the moon because it was daytime. The cameras couldn't see them because their exposures were too short, because it was too bright.
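To put rough numbers on that exposure gap, here is a minimal sketch in Python. Both exposure values are my own assumed figures from standard photography (the "sunny 16" value for direct sunlight and a commonly quoted value for a moonless starry sky), not anything from this thread:

```python
# Rough sketch of the exposure gap between a sunlit scene and a starfield,
# in photographic exposure values (EV at ISO 100). Both EV figures are
# assumptions: EV 15 is the "sunny 16" value for direct sunlight, and
# EV -7 is a commonly quoted figure for a moonless starry sky.
daylight_ev = 15
starlight_ev = -7

stops = daylight_ev - starlight_ev   # each stop is a factor of 2 in light
ratio = 2 ** stops
print(f"{stops} stops apart: a factor of about {ratio:,} in light")
# -> 22 stops apart: a factor of about 4,194,304 in light
```

With the scene and the stars millions of times apart in brightness, no single exposure can capture both.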

 

Or is it something else?

Posted

No, there's nothing else. The question was whether you would "notice" them on the Moon, rather than whether you can see them. I think you have helped me explain that they would not be as noticeable on the Moon, for the reasons I (and you) have now laid out. So thanks for that.

Posted
Is the claim that stars are invisible without the distortion of the atmosphere? Because that is simply not true. Astronauts can see the stars, as long as their eyes aren't adjusted for brighter lights, just like on the ground. So basically if anything lit by sunlight is in view, or even if cabin lights are on (just like looking out the window of a well-lit house at night), you can't see anything. The Apollo astronauts couldn't see stars from the moon because it was daytime. The cameras couldn't see them because their exposures were too short, because it was too bright.

 

I wouldn't be surprised if somewhere deep in the NASA archives is a poorly exposed image from the Moon, with the lunar landscape bright white and washed out and stars clearly visible. The cameras were manual, operated by guesswork, so the astronauts probably screwed up a few images. We only get to see the nice ones.

Posted
John, you're not listening to either me or Mr Skeptic; Earth's atmosphere makes all the difference!!! Stars (at the distances involved) are much tinier than you can readily conceive, and their light behaves only as a point source. Earth's atmosphere, whether twinkling or not, smears out this tiny amount of light (as I have described) and makes them appear much bigger than they actually look in space.

You need to read a bit about eyes, receptors and quantum behaviour to appreciate the very low probability that even two photons from a "point source" would hit the same receptor cell. Even if they did, receptors are so small you would barely notice.

 

I don't think that argument holds water. If the photon flux were so small that you don't get multiple photons on a receptor (in some arbitrary time), how does spreading it out via atmospheric effects make it appear brighter?

Posted

"You need to read a bit about eyes, receptors and quantum behaviour to appreciate the very low probability that even two photons from a "point source" would hit the same receptor cell."

Where do the other photons go?

Plenty make the journey, or you wouldn't be able to see the stars from Earth.

The receptor cells are pretty tightly packed, so what happens to the other photons?

Posted

Here is an experimental demonstration of the dimming of a star by the atmosphere:

 

[Attached image: plot of apparent magnitude against airmass]

 

http://spiff.rit.edu/classes/phys445/lectures/atmos/atmos.html

 

One airmass means looking at the star when it is directly overhead, and an airmass of 2 corresponds to looking about 30 degrees above the horizon. A larger apparent magnitude means the object is dimmer. As expected, the longer the path length through the atmosphere, the dimmer the star.

 

It looks roughly linear with path length, which is what Beer's law would predict: attenuation is exponential in path length, and magnitudes are logarithmic in flux, so extinction in magnitudes comes out linear in airmass.
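For anyone who wants to play with the numbers, here is a minimal sketch of that relationship. The extinction coefficient of 0.2 magnitudes per airmass is an assumed typical clear-site value at visible wavelengths, not a number taken from the linked data:

```python
import math

# Beer's law says flux falls exponentially with path length (airmass X);
# magnitudes are logarithmic in flux, so the extinction is linear in X.
# k = 0.2 mag/airmass is an assumed typical clear-site value in V band.
k = 0.2

def apparent_magnitude(m0, airmass):
    """Magnitude of a star of true (top-of-atmosphere) magnitude m0
    seen through the given number of airmasses."""
    flux_fraction = 10 ** (-0.4 * k * airmass)      # Beer's law
    return m0 - 2.5 * math.log10(flux_fraction)     # = m0 + k * airmass

for X in (1.0, 1.5, 2.0, 3.0):
    print(f"airmass {X}: m = {apparent_magnitude(2.0, X):.2f}")
# Prints 2.20, 2.30, 2.40, 2.60: a straight line in airmass, as in the plot.
```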

Posted
I wouldn't be surprised if somewhere deep in the NASA archives is a poorly exposed image from the Moon, with the lunar landscape bright white and washed out and stars clearly visible. The cameras were manual, operated by guesswork, so the astronauts probably screwed up a few images. We only get to see the nice ones.

Yes, exactly: overexposed.

Posted

He meant overexposed for the Moon's surface; for taking a photograph of the stars it would be perfectly exposed.

 

The same as changing exposure settings for indoor and outdoor lighting, etc.

Posted (edited)
I don't think that argument holds water. If the photon flux were so small that you don't get multiple photons on a receptor (in some arbitrary time), how does spreading it out via atmospheric effects make it appear brighter?

I think I've explained perfectly well how and why atmospheric effects only enhance the ability of humans to observe stars on Earth, using our "less than perfect" eyesight and its (slightly time-lapsed) relationship with our "less than perfect" brains; they do not add to their brightness.

This topic was raised during the Apollo era and has arisen several times since, mostly in relation to debunking theories. I have related what I recall to be the consensus view. To the best of my recollection, very few (possibly none) of the astronauts, either on the Moon or in orbiting craft, recalled seeing stars when interviewed.

Edited by Akhenaten2
Posted
I think I've explained perfectly well how and why atmospheric effects only enhance the ability of humans to observe stars on Earth, using our "less than perfect" eyesight and its (slightly time-lapsed) relationship with our "less than perfect" brains; they do not add to their brightness.

This topic was raised during the Apollo era and has arisen several times since, mostly in relation to debunking theories. I have related what I recall to be the consensus view. To the best of my recollection, very few (possibly none) of the astronauts, either on the Moon or in orbiting craft, recalled seeing stars when interviewed.

 

I don't recall seeing stars when the Sun is in full view here on Earth, either. You're mixing two effects here.

 

Telescopes in space have no trouble collecting multiple photons from these "point sources" on receptors much smaller than what we have in our eyes. So well, in fact, that we can actually tell they aren't point sources.

 

I don't find your argument compelling.

Posted (edited)

"I think Iv'e explained perfectly well how and why atmospheric effects only enhance the ability of humans to observe stars on earth using our "less than perfect" eyesight and its (slightly time-lapsed) relationship with our "less than perfect" brains - not that they add to their brightness."

 

I don't.


Merged post follows:

Consecutive posts merged

I'd also still like to know where the other photons go.

Actually, at one level I know exactly where they go. The star is pretty close to a point source, but my eye isn't infinitely large. Since it has a finite aperture, the photons going through it are diffracted. They spill out into a diffraction-limited splodge at the retina. Roughly speaking, the Airy disk will be about a micron across, which is about the same size as a cell. The photons will be distributed across that diffraction pattern.
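Here's a quick back-of-envelope check of that estimate in Python. The wavelength, pupil diameter, and 17 mm effective focal length are my assumed values (the focal length is the figure used later in this post); depending on pupil size the answer comes out between roughly one and a few microns, so the same order as a receptor:

```python
# Back-of-envelope size of the Airy disk at the retina. All values are
# assumptions: green light, a dark-adapted pupil, and the ~17 mm effective
# focal length that comes up later in the thread.
wavelength = 550e-9     # m
pupil = 6e-3            # m, pupil diameter
focal_length = 17e-3    # m, optical centre to retina

# Radius to the first dark ring of the Airy pattern: 1.22 * lambda * f / D
airy_radius = 1.22 * wavelength * focal_length / pupil
print(f"Airy disk diameter ~ {2 * airy_radius * 1e6:.1f} microns")
# ~3.8 microns with these numbers; smaller with a wider pupil
```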

 

Any light source that would, from the point of view of geometrical optics, give an image smaller than about 1 micron would give the same diffraction-limited blur at the retina.

 

So yesterday I got bored. I took some cooking foil, put it against a flat piece of glass and pushed the point of a sewing needle through it. That gave me a small hole in an opaque layer of metal. I checked and the hole is about 100µm across. (Not the most accurate measurement in history but I think it's within a factor of 2.)

I then folded it over the end of a torch so I had a small (about 0.1mm) bright spot.

Then I looked at it from 2.5 metres away and it looked like a small bright spot.

 

I may have got the arithmetic wrong, but I think that the image of that spot at my retina would have been diffraction limited (plus any distortions from the imperfection of my eyes). My eye is about 25 mm front to back and the object was about 2500 mm away, so the image should have been about 100 times smaller than the object. Actually, the eye behaves as if the optical centre is about 17 mm from the retina, so the factor is more like 150 to one.

That gives an image a micron across or less at the retina, but that's smaller than the diffraction limit, so what I actually saw was a diffraction-limited spot (distorted by whatever my eye does).
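A quick sketch redoing that similar-triangles sum (the pinhole size and distances are the figures stated above; the code just repeats the arithmetic):

```python
# Redoing the similar-triangles arithmetic with the figures stated above.
hole = 100e-6            # m, pinhole diameter (measured to within a factor of 2)
distance = 2.5           # m, torch to eye
nodal_to_retina = 17e-3  # m, effective optical centre to retina

image = hole * nodal_to_retina / distance
print(f"geometric image size ~ {image * 1e6:.2f} microns")  # ~0.68 microns
# Smaller than the diffraction limit, so the visible spot is diffraction limited.
```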

 

Now, here's the killer.

If I made the hole 10 times smaller and the lamp 100 times brighter I would see exactly the same thing. Similarly, if I made it 100 times brighter and 10 times more distant it would look the same. Also if I made it 10 times bigger but 10 times more distant it would still form the same image at my eye- a dot distorted by diffraction and the faults in my eye.

I can keep doing those operations until I have a very bright, very distant, fairly big star.

Whatever I do, I still see a diffraction (and aberration) limited dot.
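Those three operations can be checked numerically. A minimal sketch, treating the flux at the eye as surface brightness times source area over distance squared, all in arbitrary units (that flux model is my assumption about what "brighter" means here):

```python
import math

def flux_at_eye(brightness, radius, distance):
    # Photon flux reaching the eye from a small uniform source:
    # surface brightness * source area / distance^2 (arbitrary units).
    return brightness * radius**2 / distance**2

base = flux_at_eye(1.0, 1.0, 1.0)
print(math.isclose(flux_at_eye(100.0, 0.1, 1.0), base))  # 10x smaller, 100x brighter
print(math.isclose(flux_at_eye(100.0, 1.0, 10.0), base)) # 100x brighter, 10x further
print(math.isclose(flux_at_eye(1.0, 10.0, 10.0), base))  # 10x bigger, 10x further
# All three print True: the photon flux at the eye is unchanged in each case.
```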

 

Unless you can claim that the twinkling of the light as it travels 10 feet in still air is responsible for the fact that I can see a torch with some perforated tin-foil over it, then this experiment seems to indicate that twinkling is not needed.

You can see an arbitrarily small thing, provided that it's bright enough.

In the limiting case (and a star is pretty close to it) what you see is the point spread function of the lens.

 

 

BTW, re. "You need to read a bit about eyes, receptors and quantum behaviour to appreciate the very low probability that even two photons from a "point source" would hit the same receptor cell. Even if they did, receptors are so small you would barely notice."

LOL.

I'm a chemist specialising in spectroscopy, and I have also studied imaging systems (including the eye) as part of another university course I did just for fun. But the real killer is that you seem to have gotten so hung up on the quantum physics that you forgot the basics. What you see when you look at a star isn't a true point, but the distortions of the eye.

There would be nothing to stop the folks on the moon seeing exactly the same thing, but slightly better.

Edited by John Cuthber
Consecutive posts merged.
Posted
You need to read a bit about eyes, receptors and quantum behaviour to appreciate the very low probability that even two photons from a "point source" would hit the same receptor cell. # Even if they did, receptors are so small you would barely notice.

 

 

(#Emphasis added by Strontidog.)

 

Uhhhh... one thing you're forgetting, Akhenaten2, is just how many photons we're talking about, if it's even visible light.

 

You make it sound like a single line of photons hitting one point in a continuous stream. Don't forget that this is a blanket of photons, basically hitting everywhere.

 

By your definition, if you stepped a foot to the left, the thing would be invisible. And believe me, I understand the inverse-square law and distance, and the distance is HUGE. One thing to remember is that every time the distance doubles, the intensity of the photon flux drops by 75%, for a point source.

 

And that works fine for the first few light years; the flux really does get reduced a lot.

 

But once you've passed a few hundred light years, you have to keep doubling the distance for any significant reduction in photon flux density to occur. After a while, it takes millions of light years just to reduce the density to 1/4 of what it was. And that means that going from a billion light years to two billion light years still only takes you down to 25% intensity; it's still 1/4 as bright.

 

In some ways, any star you can see with the naked eye is hitting the Earth (or the Moon) with just as encompassing a blanket of visible-light photons as the Sun does. It can still be seen from any point that isn't blocked, just not with as many photons as a closer star. And yes, I know that space isn't a perfect vacuum, especially over these distances. There's cosmic dust, and other suns, and maybe dark matter (which may or may not block photons; the jury is still out on that...)

 

There are still plenty of photons to go around. More than enough. Don’t worry, we’ll make more.

 

Do the point-source equations, and you'll find that after a few light years it takes more and more distance to reduce the flux density.
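Here is that point in a few lines of arithmetic (the unit luminosity is arbitrary, not a real star):

```python
# Inverse-square flux from a point source, in arbitrary luminosity units.
def flux(distance_ly):
    return 1.0 / distance_ly**2

# Doubling the distance always cuts the flux to 25%...
print(flux(2) / flux(1))        # 0.25
print(flux(2e9) / flux(1e9))    # 0.25
# ...but the *absolute* drop per extra light year shrinks with distance:
print(1 - flux(2) / flux(1))        # 75% lost over 1 ly, starting at 1 ly
print(1 - flux(101) / flux(100))    # ~2% lost over 1 ly, starting at 100 ly
```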

 

No, the twinkling would be from the atmosphere. But the attenuation of the atmosphere would reduce the intensity, not increase it. There are fewer photons coming in (but still a blanket, if you will) when you are looking through miles of atmosphere than when you're looking from the Moon.

 

Apparent size might be increased by atmospheric scattering, but brightness would be reduced.

 

It's all in the numbers.

 

Bill Wolfe

Posted
John, you're not listening to either me or Mr Skeptic; Earth's atmosphere makes all the difference!!! Stars (at the distances involved) are much tinier than you can readily conceive, and their light behaves only as a point source. Earth's atmosphere, whether twinkling or not, smears out this tiny amount of light (as I have described) and makes them appear much bigger than they actually look in space.

You need to read a bit about eyes, receptors and quantum behaviour to appreciate the very low probability that even two photons from a "point source" would hit the same receptor cell. Even if they did, receptors are so small you would barely notice.

 

There is no problem at all, because what you are actually looking at is an Airy disk (or, more accurately, a point spread function, since the eye is not a perfect lens). The object does not need to be above a certain size to be seen, since anything with a subtended angle less than the diffraction limit of 1.22 λ/d will look the same anyway.
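To put numbers on that limit (the pupil diameter and the stellar angular size below are my own illustrative figures, not from this thread):

```python
import math

# Rayleigh criterion for the eye, with assumed values: green light and a
# roughly daylight pupil diameter.
wavelength = 550e-9   # m
pupil = 5e-3          # m

theta = 1.22 * wavelength / pupil             # radians
arcsec = math.degrees(theta) * 3600
print(f"diffraction limit of the eye ~ {arcsec:.0f} arcseconds")   # ~28

# For comparison, Betelgeuse, one of the largest stars in angular terms,
# subtends only ~0.05 arcseconds: hundreds of times below the limit.
```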


Merged post follows:

Consecutive posts merged
Roughly speaking, the Airy disk will be about a micron across, which is about the same size as a cell. The photons will be distributed across that diffraction pattern.

 

Actually, this is a really cool thing about the eye: the spacing between rods and cones is pretty much the same as the diffraction limit of the eye.
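That match can be sanity-checked with the same assumed numbers as above; the foveal cone spacing of roughly 2-3 microns is my own figure from memory:

```python
# Diffraction-limited spot radius at the retina, same assumed values as above.
wavelength = 550e-9   # m
pupil = 5e-3          # m
focal_length = 17e-3  # m, effective optical centre to retina

spot_radius = 1.22 * wavelength / pupil * focal_length
print(f"spot radius ~ {spot_radius * 1e6:.1f} microns")   # ~2.3 microns
# Foveal cones are spaced roughly 2-3 microns apart, so the retinal
# sampling is well matched to the optics, as the post above notes.
```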


Merged post follows:

Consecutive posts merged
I don't recall seeing stars when the Sun is in full view here on Earth, either. You're mixing two effects here.

 

Telescopes in space have no trouble collecting multiple photons from these "point sources" on receptors much smaller than what we have in our eyes. So well, in fact, that we can actually tell they aren't point sources.

 

I don't find your argument compelling.

 

Well, the reason we don't see stars on Earth in daylight is that all the scattered light obscures them. On the Moon, though, it is a different story, because there is no scattered light. If the astronauts on the Moon couldn't see the stars (and I say if, because I have no idea), it would be for the same reason that you can't see stars if you go out at night and shine a torch into your eyes while looking at the sky.

Posted

John, re post 67: "You can see an arbitrarily small thing, provided that it's bright enough."

This was an interesting experiment you did; could you complete it for me?

By my calcs your distance/size relationship is out by a factor of about 100 (for a Sun-sized object at about 4 ly distance). Also, your torch brightness could be out by a similar factor.

So if you repeat the setup, but with the torch and pinhole a (more appropriate) 250 metres away, without brightening the torch, but additionally shining the light down a small-bore tube onto the pinhole (to simulate more of a point source), please advise if you can still see it.

Posted

You have missed the point twice.

 

The spot on the retina is as small as it's going to get because it's diffraction limited.

Making the object smaller will not make any difference to the size of the image.

 

 

Also, stars are brighter than the torch, and, as I pointed out earlier, there are plenty of photons getting here; that's why you can see the stars at night.

 

The stars are bright enough and they will form an image at the retina that looks exactly the same size and shape as the image of a torch-lit pinhole.

 

 

BTW, you have not yet answered my question.

Where do the other photons go?

Posted

No, John, I haven't missed any point. It's not about size; it's about the amount of light. The "apparent" brightness of your torch (at 2.5 metres) is far, far greater than that of stars. There is little comparison between your torch-lit pinhole and starshine except in your imagination.

Many photons will get through, but many will be refracted in the atmosphere to areas peripheral to the main image and seen as colours; many will similarly be lost, and some reflected back into space.

Posted (edited)

Do I have to put this in really big letters for you?

 

As I pointed out earlier, there are plenty of photons getting here; that's why you can see the stars at night.

 

The stars are bright enough and they will form an image at the retina that looks exactly the same size and shape as the image of a torch-lit pinhole.

 

 

BTW, you have not yet answered my question.

Where do the other photons go?

I know the torch is brighter. I don't care because it doesn't matter.

The point of the torch experiment is to show that the eye can see small things if they are bright.

Do you actually understand what things like "diffraction limited" and "point spread function" mean?

 

And if you don't soon answer the question about the photons that get to the eye but are not registered, people will think you are a troll.

 

Don't forget that more photons get to the eye on the Moon than would do here on Earth, and those photons are enough to let you see the star from here.

What you are trying to say is that brighter (more actual photons) is darker (less visible, even against an even blacker background).

Edited by John Cuthber
Posted
By my calcs your distance/size relationship is out by a factor of about 100

 

I'd just like to agree with John's arithmetic here:

 

If I made the hole 10 times smaller and the lamp 100 times brighter I would see exactly the same thing. Similarly, if I made it 100 times brighter and 10 times more distant it would look the same. Also if I made it 10 times bigger but 10 times more distant it would still form the same image at my eye- a dot distorted by diffraction and the faults in my eye.

The factor by which the flux is reduced is (r/d)^2, where r is the radius of the shiny object (shiny :)) and d is the distance to said shininess. Like John says, if you make r 10 times smaller then the flux will be dimmer by a factor of 100; making the object 100 times brighter will compensate. Distance can also be compensated for by a brighter object, as he says: 10 times more distant and 100 times brighter = the same apparent brightness.

Posted
I'd just like to agree with John's arithmetic here:

 

 

The factor by which the flux is reduced is (r/d)^2, where r is the radius of the shiny object (shiny :)) and d is the distance to said shininess. Like John says, if you make r 10 times smaller then the flux will be dimmer by a factor of 100; making the object 100 times brighter will compensate. Distance can also be compensated for by a brighter object, as he says: 10 times more distant and 100 times brighter = the same apparent brightness.

 

The arithmetic is solid, Iggy, but I think you're off a little on the theory. It may be counterintuitive, but it doesn't matter how big the light source is (the radius of the aperture), not once the measurement is taken at a distance 10+ times the source radius (5+ times the diameter). That's the minimum distance for the source to be considered a point source.

 

Any closer, and you have to use disc or planar formulas to calculate intensity.

 

All point sources 'look' like they are the same size, unless the light is being scattered by some medium, which, while reducing intensity, will magnify the apparent size.

 

The formula isn't (r/d)^2 (where r is the radius of the light source).

 

The inverse-square formula is i/d^2 (where i is the intensity of the photon flux).

 

I've used this formula for years with gamma and x-ray point sources, and it's a well-tested and verified field tool. And attenuation due to air is ALWAYS a factor. A reducing factor, at that.

 

The whole inverse-square law is based on the point source being a perfect sphere, with the photons spreading out uniformly over a sphere of area 4 pi d^2.

 

The suggestion of shining the light through a tube is a little weird. You'd get a collimated beam then, and the inverse-square law wouldn't apply.

 

In so many words: if, as in the example, you increase the intensity at the source by 100 times and reduce the size of the source by ten times, the intensity (from the same distance) will be 100 times higher. The efficiency and refractive capabilities of the detector are moot; it's the same detector both times. The size of the source is irrelevant, as long as we're still talking about a point source.

 

And visible stars, as seen from space or from the surface of the Earth, are point sources by ANY definition (except our star, of course).

 

More stars are visible without an atmosphere to block them; they will not look as big from the Moon, and they will not twinkle.

 

Bill Wolfe
