
There is a controversy about this picture:

http://ipac.jpl.nasa.gov/media_images/ssc2005-22a_medium.jpg

 

It is discussed, e.g., here:

http://www.badastronomy.com/bablog/?p=206

 

It is from the Spitzer Space Telescope. The bottom half is labeled

Infrared Background Light from First Stars

 

The splotches in the foreground are where they masked out brighter objects in order to get at the heavily redshifted background: infrared patterns in what looks like otherwise empty sky.

 

They published in Nature this week:

http://arxiv.org/abs/astro-ph/0511105

But it isn't settled yet that the pattern they see really is the first starlight. If it is, it would be heavily redshifted by the roughly 20-fold expansion of the universe since those first stars condensed and began shining. This would have stretched the wavelengths way out, cooling down the light.

 

What started out as 0.2 micron light would be stretched about 20-fold and would now be 4 micron light, and that is typical of the infrared channels they were imaging with.
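A quick back-of-the-envelope check of that stretch factor (just a sketch of my own, not from the paper): at redshift z the wavelengths get stretched by a factor of 1+z, so z = 20 gives 21, and

z = 20
lam_emit = 0.2                 # microns, rough rest-frame UV from a hot star (assumed value)
lam_obs = (1 + z) * lam_emit   # redshift stretch: observed = (1 + z) * emitted
print(lam_obs)                 # about 4.2 microns, close to the 4 micron channel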

 

One prominent astronomer who is criticising this is Ned Wright.

A lot of people have learned cosmology from his courses, his website, and his FAQ. He has built up great trust and respect.

 

To see his criticism, look here:

http://www.cnn.com/2005/TECH/space/11/02/early.stars.ap/index.html

 

Scroll down to where he is quoted, near the end of the article, saying he thinks it is wrong.

 

Kashlinsky is the lead author. The Nature article is called

Tracing the first stars with cosmic infrared background fluctuations

It is in the 3 November 2005 issue.

 

The estimate is that the light is 13.5 billion years old, based on the time since the big bang being 13.7 billion years. So these stars were burning around 200 million years after the bang.

 

I will get the redshift. Yeah, redshift 20 is right:

http://www.earth.uni.edu/~morgan/ajjar/Cosmology/cosmos.html

A redshift of 20 corresponds to light emitted when the universe was 180 million years old.

Close enough to 200. This is using the Morgan calculator with standard parameters.
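As a cross-check (my own sketch with assumed WMAP-era parameters, not the Morgan calculator itself), roughly the same number comes out of a standard flat Lambda-CDM calculation, e.g. with astropy:

from astropy.cosmology import FlatLambdaCDM

# roughly the "standard parameters" of the time (assumed values, not from the paper)
cosmo = FlatLambdaCDM(H0=71, Om0=0.27)

print(cosmo.age(0))    # ~13.7 Gyr: age of the universe now
print(cosmo.age(20))   # ~0.18 Gyr: age of the universe when the z = 20 light was emitted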

 

I see they looked at light in this range:

3.6, 4.5, 5.8 and 8 microns

Visible light is 0.3 to 0.7 microns, so that is roughly typical of what a star might emit.

These were very big stars (around 100 solar masses) and much hotter than the Sun, so the emitted light may have peaked shorter than 0.3 microns.

OK, that gives an idea of the stretch-out factor (a rough check is sketched below).
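Running the same stretch backwards over their four channels (again just a rough sketch of my own) shows the rest-frame light would have been near-UV to blue, which fits hot massive stars:

z = 20
bands_obs = [3.6, 4.5, 5.8, 8.0]                   # Spitzer IRAC channels, in microns
bands_rest = [lam / (1 + z) for lam in bands_obs]  # divide out the (1 + z) stretch
print(bands_rest)                                  # about 0.17, 0.21, 0.28, 0.38 microns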

 

Have to go, back later. I think Ned Wright could be wrong and these could be valid results. Have to see; it isn't settled yet.
