imatfaal Posted June 3, 2016

http://hubblesite.org/newscenter/archive/releases/2016/17/text/

Astronomers using NASA's Hubble Space Telescope have discovered that the universe is expanding 5 percent to 9 percent faster than expected. "This surprising finding may be an important clue to understanding those mysterious parts of the universe that make up 95 percent of everything and don't emit light, such as dark energy, dark matter, and dark radiation," said study leader and Nobel Laureate Adam Riess of the Space Telescope Science Institute and The Johns Hopkins University, both in Baltimore, Maryland.

https://www.theguardian.com/science/2016/jun/03/universe-is-expanding-up-to-9-faster-than-we-thought-say-scientists

Physicist and lead author Adam Riess said: "You start at two ends, and you expect to meet in the middle if all of your drawings are right and your measurements are right. But now the ends are not quite meeting in the middle and we want to know why."

On the search for more details.

! Moderator Note: BTW, this thread is for discussing the NEWS - if you want to query the underlying theory or question accelerated expansion, then the main forum is where to ask questions and the speculations forum is the place to pose alternatives.
Daecon Posted June 3, 2016

In a way, this is quite encouraging. It means there's still more to learn, and a way forward to learn it.
imatfaal (Author) Posted June 3, 2016

Definitely agree - "hiccough" in my title was deliberately provocative. The more we learn, the greater the quantity of data, and the deeper our understanding of those data, the closer we get to a good solution.

Bit more commentary - still not found the meat tho:

"So – science being science – the quest is now on to find out why. The boffins have narrowed it down to three possibilities. Firstly, our calculation on the effects of dark energy could be wrong. Dark energy, which can't be detected on current instruments, is already causing the expansion of the universe and may have additional properties that theorists haven't accounted for. The second option is that in the early period after the Big Bang, a new kind of subatomic particle burst out travelling at just under the speed of light. This would have sped up the expansion of the early universe and would explain the discrepancies in current theory. 'We know so little about the dark parts of the universe, it's important to measure how they push and pull on space over cosmic history,' said Lucas Macri of Texas A&M University in College Station, a key collaborator on the study, published in The Astrophysical Journal. The third option is that Einstein's theories of gravitation are wrong, or at least in serious need of revision. That opens up a whole new can of worms."

http://www.theregister.co.uk/2016/06/02/universe_expanding_faster/

And because you cannot have enough photographs/images/illustrations from NASA/Hubble:

This Hubble Space Telescope image shows one of the galaxies in the survey to refine the measurement for how fast the universe expands with time, called the Hubble constant. The galaxy, UGC 9391, contains two types of stars that astronomers use to calculate accurate distances to galaxies, a key measurement in determining the Hubble constant.
The red circles mark the locations of Cepheid variable stars. These stars pulsate at rates that correspond to their true brightness, which can be compared with their apparent brightness as seen from Earth to accurately determine their distance. The blue "X" at bottom right denotes the location of supernova 2003du, a special class of exploding star called a Type Ia supernova. These supernovae are another commonly used cosmic yardstick. They flare with the same brightness and are brilliant enough to be seen from relatively longer distances. Astronomers calibrate the supernovae with the Cepheids in galaxies such as UGC 9391 so that they can accurately calculate the distances to faraway exploding stars. UGC 9391 resides 130 million light-years from Earth. The observations for this composite image were taken between 2012 and 2013 by Hubble's Wide Field Camera 3.

http://hubblesite.org/newscenter/archive/releases/2016/17/image/b/

Got it: http://hubblesite.org/pubinfo/pdf/2016/17/pdf.pdf

"The Hubble constant (H0) measured locally and the sound horizon observed from the cosmic microwave background radiation (CMB) provide two absolute scales at opposite ends of the visible expansion history of the Universe. Comparing the two gives a stringent test of the standard cosmological model. A significant disagreement would provide evidence for fundamental physics beyond the standard model, such as time-dependent or early dark energy, gravitational physics beyond General Relativity, additional relativistic particles, or nonzero curvature. Indeed, none of these features has been excluded by anything more compelling than a theoretical preference for simplicity over complexity. In the case of dark energy, there is no simple explanation at present, leaving direct measurements as the only guide among numerous complex or highly tuned explanations."

BTW - Perlmutter's, Schmidt's and Riess's lectures at the Nobel Prizegiving are worthy of a good watch.
http://www.nobelprize.org/nobel_prizes/physics/laureates/2011/ - this was for the original discovery of the accelerated expansion.
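A quick aside on the distance-ladder logic in the caption above: the step from "true brightness vs. apparent brightness" to a distance is the standard distance modulus relation. A minimal sketch in Python (the magnitudes below are illustrative numbers, not values from the paper):

```python
def distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus: m - M = 5*log10(d) - 5."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A Cepheid whose period implies a true (absolute) magnitude M = -5,
# seen from Earth at apparent magnitude m = 20 (illustrative values):
d = distance_pc(20.0, -5.0)
print(f"{d:.3e} pc")  # 1.000e+06 pc, i.e. 1 Mpc
```

The same relation is applied to Type Ia supernovae once their peak absolute magnitude is calibrated against Cepheids, which is exactly the cross-calibration the caption describes.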
Strange Posted June 3, 2016

Fascinating stuff. One detail in the NASA article seems wrong to me:

"Type Ia supernovae, another commonly used cosmic yardstick, are exploding stars that flare with the same brightness and are brilliant enough to be seen from relatively longer distances."

I think it is the rate at which the brightness falls (the light curve) that is predictable for Type 1a supernovae, not the absolute brightness.
imatfaal (Author) Posted June 3, 2016

Those press releases are too often written by the PRO, or at least heavily edited. I will check up, as I thought it could be absolute - they are all the same sort of progenitor, which pop at exactly the same mass, therefore the process is the same time and time again.

"When the white dwarf reaches 1.4 solar masses, or about 40 percent more massive than our Sun, a nuclear chain reaction occurs, causing the white dwarf to explode. The resulting light is 5 billion times brighter than the Sun. Because the chain reaction always happens in the same way, and at the same mass, the brightness of these Type Ia supernovae are also always the same. The explosion point is known as the Chandrasekhar limit, after Subrahmanyan Chandrasekhar, the astronomer who discovered it."

From the same source - HubbleSite - so possibly with the same error, but it bears out my memory and the press release: http://hubblesite.org/hubble_discoveries/dark_energy/de-type_ia_supernovae.php
Strange Posted June 3, 2016

Seems like you (and they) are right. With a caveat:

"In a series of papers in the 1990s the survey showed that while Type Ia supernovae do not all reach the same peak luminosity, a single parameter measured from the light curve can be used to correct unreddened Type Ia supernovae to standard candle values."
https://en.wikipedia.org/wiki/Type_Ia_supernova#Light_curve

From the same article:

"Recently it has been discovered that type Ia supernovae which were considered the same are in fact different, moreover a form of the type Ia supernova which is relatively infrequent today was far more common earlier in the history of the universe. This could have far reaching cosmological significance and could lead to revision of estimation of the rate of expansion of the universe and the prevalence of dark energy. More research is needed.[55][56]"

I assume this is taken into account in the above research.
imatfaal (Author) Posted June 3, 2016

"Because the collapse always happens at the same mass, the luminosity of the explosion is always the same."
http://www.astro.ex.ac.uk/people/hatchell/rinr/candles.pdf

Of course, it may be that you are getting mixed up with Cepheid variables - with them it is completely the case: their luminosity is in a strict relationship with the period of their cycle.

Edit - I see you have a much better link. Will read up on yours above.
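The period-luminosity ("Leavitt law") relation referred to here can be sketched as a one-liner. The coefficients below are one published calibration (roughly the Benedict et al. 2007 values); different studies fit slightly different numbers, so treat this as illustrative:

```python
import math

def cepheid_abs_mag(period_days):
    """Approximate Leavitt law for classical Cepheids (one calibration):
    M_V = -2.43 * (log10(P) - 1) - 4.05, with the period P in days."""
    return -2.43 * (math.log10(period_days) - 1) - 4.05

print(cepheid_abs_mag(10.0))   # -4.05 by construction of this calibration
print(cepheid_abs_mag(100.0))  # more negative (brighter) for longer periods
```

Measure the period, read off the absolute magnitude, compare with the apparent magnitude, and the distance follows.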
DrmDoc Posted June 5, 2016

According to this NASA HubbleSite press release, the Hubble constant has been refined to 73.2 kilometers per second per megaparsec, reducing the uncertainty from 3.3% to 2.4%. This means that, according to the article, "the distance between cosmic objects will double in another 9.8 billion years." Enjoy!
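The quoted doubling time can be roughly sanity-checked from H0 alone: with a constant expansion rate the scale factor grows as exp(H0·t), so distances double after ln(2)/H0. The article's 9.8 billion years comes from a full cosmological model, so this constant-rate estimate lands a little lower:

```python
import math

MPC_KM = 3.0857e19        # kilometres per megaparsec
SEC_PER_YR = 3.156e7      # seconds per year (approx.)

H0 = 73.2                 # km/s/Mpc, the value quoted in the press release
H0_per_sec = H0 / MPC_KM  # expansion rate converted to 1/s

hubble_time_gyr = 1.0 / H0_per_sec / SEC_PER_YR / 1e9
doubling_time_gyr = math.log(2) / H0_per_sec / SEC_PER_YR / 1e9

print(f"Hubble time   ~ {hubble_time_gyr:.1f} Gyr")    # ~13.4 Gyr
print(f"Doubling time ~ {doubling_time_gyr:.1f} Gyr")  # ~9.3 Gyr, constant-rate approximation
```

Close enough to the press release's 9.8 billion years to see where the number comes from.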
Strange Posted June 5, 2016

Someone beat you to it: http://www.scienceforums.net/topic/95480-another-universal-expansion-hiccough/
DrmDoc Posted June 5, 2016

Darn it! A day late and a dollar short!
swansont Posted June 5, 2016

! Moderator Note: Identical topics merged
EdEarl Posted June 7, 2016

I don't understand this paper, "Reconciling Planck with the local value of H0 in extended parameter space", but it seems relevant to this thread. The abstract:

"The recent determination of the local value of the Hubble constant by Riess et al. 2016 (hereafter R16) is now 3.3 sigma higher than the value derived from the most recent CMB anisotropy data provided by the Planck satellite in a ΛCDM model. Here we perform a combined analysis of the Planck and R16 results in an extended parameter space, varying simultaneously 12 cosmological parameters instead of the usual 6. We find that a phantom-like dark energy component, with effective equation of state w = −1.29 (+0.15/−0.12) at 68% c.l., can solve the current tension between the Planck dataset and the R16 prior in an extended ΛCDM scenario. On the other hand, the neutrino effective number is fully compatible with standard expectations. This result is confirmed when including cosmic shear data from the CFHTLenS survey and CMB lensing constraints from Planck. However, when BAO measurements are included we find that some of the tension with R16 remains, as also is the case when we include the supernova type Ia luminosity distances from the JLA catalog."

I think a 3.3 sigma difference in Hubble constant is large; this paper doesn't explain the difference, but it does suggest things to investigate.
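The "3.3 sigma" figure is roughly what you get by combining the two quoted uncertainties in quadrature. A back-of-envelope check in Python, using the central values as commonly quoted at the time (the paper's own comparison is more careful, so its number differs slightly from this naive estimate):

```python
import math

h0_local, err_local = 73.24, 1.74  # Riess et al. 2016 (R16), km/s/Mpc
h0_cmb, err_cmb = 66.93, 0.62      # Planck 2016 ΛCDM inference, km/s/Mpc

# Naive tension: difference over the quadrature-combined uncertainty
tension = abs(h0_local - h0_cmb) / math.hypot(err_local, err_cmb)
print(f"tension ~ {tension:.1f} sigma")  # ~3.4 sigma with these inputs
```

This simple quadrature treatment ignores correlations and the asymmetry of the posteriors, which is why the published figure is quoted as 3.3 sigma rather than exactly this value.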
MigL Posted June 7, 2016

Just for consideration...

A galaxy at great distance, formed shortly after the Big Bang, would have stars composed of approx. 3/4 hydrogen and 1/4 helium, as those were the available gases at the time. This star then proceeds to burn out and become a white dwarf with a ferrous, inactive core, but continues to accumulate mass from either a nearby star or a gaseous nebula through which it is travelling. This mass is in the form of hydrogen/helium.

A close galaxy would have newer stars with a non-trivial amount of heavy elements (atomic weight greater than 26). When their core becomes ferrous and goes inactive at the white dwarf stage, they also continue to accumulate mass by the same method, but this additional mass also includes an appreciable amount of 'heavy' elements.

Now when both distant and close stars reach a certain mass, they go Type 1a supernova before collapsing to a neutron star. My question is, do we know enough about the process (and how differences manifest themselves) to say that the recent (close) and long ago (distant) Type 1a events are equivalent? And if not, could the 'differences' explain the accelerated expansion?
imatfaal (Author) Posted June 9, 2016

I think Type 1a supernovae are only formed from a slowly rotating carbon-oxygen white dwarf with a significant other. White dwarfs are one of the most common stars - and only a tiny subset are primed for this form of final deflagration. Iron-core white dwarfs are stable within most timescales and will just cool and go dark.
Sensei Posted June 9, 2016

Fascinating stuff. One detail in the NASA article seems wrong to me: I think it is the rate at which the brightness falls (the light curve) that is predictable for Type 1a supernovae, not the absolute brightness.

"The typical visual absolute magnitude of Type Ia supernovae is Mv = −19.3 (about 5 billion times brighter than the Sun), with little variation.[14]"
https://en.wikipedia.org/wiki/Type_Ia_supernova
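The "5 billion times brighter than the Sun" parenthetical follows directly from the magnitude difference, taking the Sun's absolute visual magnitude as about 4.83:

```python
# Luminosity ratio from a magnitude difference: L1/L2 = 10**((M2 - M1)/2.5)
M_SN = -19.3   # typical Type Ia peak absolute magnitude (from the quote)
M_SUN = 4.83   # Sun's absolute visual magnitude (commonly quoted value)

ratio = 10 ** ((M_SUN - M_SN) / 2.5)
print(f"{ratio:.2e}")  # ~4.5e9, i.e. roughly 5 billion Suns
```

Each step of 5 magnitudes is a factor of 100 in brightness, which is where the 2.5 in the exponent comes from.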
MigL Posted June 10, 2016

My bad, Imatfaal. I had confused it with the Chandrasekhar limit, while this is the carbon/oxygen re-ignition mass and subsequent runaway reaction.

But could the presence of heavier nuclei in the core of recent/nearby white dwarfs produce a moderating effect on the reaction, such that far-away Type 1a supernovae are actually more luminous and farther away in comparison with near/recent ones?
Raider5678 Posted June 10, 2016

Just to be clear, "cosmic objects" refers to galaxies or galaxy clusters, right? Not smaller things like solar systems. Just wondering...
Strange Posted June 10, 2016

Yes. Expansion only occurs on very large scales. Galaxy clusters (and smaller things) are held together by their gravity.
imatfaal (Author) Posted June 10, 2016

Beyond my ken - although I do remember that a short and rapid production of vast amounts of the heavier elements is part of the early Type 1a supernova; thus, considering that all Type 1a will have a large portion of auto-created heavier elements in their make-up from a stage soon after ignition, I am not sure if leached heavier elements would make a significant difference. I also think it possible that if enough heavier elements had leached into the white dwarf environment to vary luminosity from the standard candle value, then this might also be enough to halt the seemingly quite well-balanced requirements for the Type 1a supernova in the first place.