beecee Posted March 19, 2021

https://phys.org/news/2021-03-accuracy-cosmological-analysis-technique-mock.html

Researchers confirm accuracy of cosmological data analysis technique using mock data:

Astronomers have played a game of guess-the-numbers with cosmological implications. Working from a mock catalog of galaxies prepared by a Japanese team, two American teams correctly guessed the cosmological parameters used to generate the catalog to within 1% accuracy. This gives us confidence that their methods will be able to determine the correct parameters of the real universe when applied to observational data.

The basic equations governing the evolution of the universe can be derived from theoretical calculations, but some of the numbers in those equations, the cosmological parameters, can only be derived through observations. Cosmological parameters tied to the unobservable parts of the universe, like the amount of dark matter or the expansion of the universe driven by dark energy, must be inferred by looking at their effects on the distribution of visible galaxies. There is always uncertainty when working with the dark part of the universe, and it is hard to be sure that the models and data analysis are accurate.

more at link...

the paper:
https://journals.aps.org/prd/abstract/10.1103/PhysRevD.102.123541

Blinded challenge for precision cosmology with large-scale structure: Results from effective field theory for the redshift-space galaxy power spectrum:

ABSTRACT: An accurate theoretical template for the galaxy power spectrum is key for the success of ongoing and future spectroscopic surveys. We examine to what extent the effective field theory (EFT) of large-scale structure is able to provide such a template and correctly estimate cosmological parameters. To that end, we initiate a blinded challenge to infer cosmological parameters from the redshift-space power spectrum of high-resolution mock catalogs mimicking the BOSS galaxy sample but covering a 100 times larger cumulative volume. This gigantic simulation volume allows us to separate systematic bias due to theoretical modeling from the statistical error due to sample variance. The challenge is to measure three unknown input parameters used in the simulation: the Hubble constant, the matter density fraction, and the clustering amplitude. We present analyses done by two independent teams, who have fitted the mock simulation data generated by yet another independent group. This allows us to avoid any confirmation bias by the analyzers and to pin down possible tuning of the specific EFT implementations. Both independent teams have recovered the true values of the input parameters within subpercent statistical errors corresponding to the total simulation volume.
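To make the "guess-the-numbers" game concrete: at its core, this kind of analysis means comparing a theoretical power spectrum template against the measured one and finding the parameter values that fit best, with the true values withheld from the fitters. Below is a deliberately simplified Python sketch of that blinded-fit logic. It is not the teams' actual EFT pipeline; the power-law template, mock data, and error bars are all invented for illustration.

```python
# Toy illustration of blinded parameter recovery from a power spectrum.
# NOT the EFT analysis from the paper: the template, "mock data", and
# uncertainties below are invented purely to show the fitting logic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
k = np.linspace(0.02, 0.2, 30)  # wavenumber bins in h/Mpc (illustrative)

def template(k, amplitude, slope):
    """Stand-in theory 'template': a simple power law, not a real EFT model."""
    return amplitude * k ** slope

# An independent group generates the mock with hidden true parameters,
# mimicking the blinded setup: the fitting "teams" never see these values.
true_params = (2000.0, -1.5)
sigma = 0.03 * template(k, *true_params)             # assumed 3% errors per bin
data = template(k, *true_params) + rng.normal(0.0, sigma)

def chi2(params):
    """Goodness of fit between the template and the mock measurement."""
    return np.sum(((data - template(k, *params)) / sigma) ** 2)

# The "team" fits blind, starting from a deliberately wrong guess.
fit = minimize(chi2, x0=np.array([1000.0, -1.0]), method="Nelder-Mead")
amp_fit, slope_fit = fit.x
print(f"recovered amplitude: {amp_fit:.1f}  (true: {true_params[0]})")
print(f"recovered slope:     {slope_fit:.3f} (true: {true_params[1]})")
```

The blinding is the important design choice: because the group generating the mock keeps the true values secret until the fits are submitted, the analyzers cannot consciously or unconsciously tune their pipelines toward the "right" answer.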
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Supplementary:
https://phys.org/news/2021-02-supercomputer-cosmic-clock.html

FEBRUARY 16, 2021

Supercomputer turns back cosmic clock:

Astronomers have tested a method for reconstructing the state of the early universe by applying it to 4000 simulated universes using the ATERUI II supercomputer at the National Astronomical Observatory of Japan (NAOJ). They found that, together with new observations, the method can set better constraints on inflation, one of the most enigmatic events in the history of the universe. The method can shorten the observation time required to distinguish between various inflation theories.

Just after the universe came into existence 13.8 billion years ago, it suddenly increased more than a trillion trillion times in size in less than a trillionth of a trillionth of a microsecond, but no one knows how or why. This sudden inflation is one of the most important mysteries in modern astronomy. Inflation should have created primordial density fluctuations that would have affected where galaxies developed, so mapping the distribution of galaxies can rule out models of inflation that don't match the observed data.

more at link...

the paper:
https://journals.aps.org/prd/abstract/10.1103/PhysRevD.103.023506

Constraining primordial non-Gaussianity with postreconstructed galaxy bispectrum in redshift space:

ABSTRACT: The galaxy bispectrum is a promising probe of inflationary physics in the early Universe as a measure of primordial non-Gaussianity (PNG), but its signal-to-noise ratio is significantly affected by the mode coupling due to nonlinear gravitational growth. In this paper, we examine the standard reconstruction method of linear cosmic mass density fields from nonlinear galaxy density fields to decorrelate the covariance in redshift-space galaxy bispectra. In particular, we evaluate the covariance of the bispectrum for massive-galaxy-sized dark matter halos with reconstruction by using 4000 independent N-body simulations. Our results show that the bispectrum covariance for the postreconstructed field approaches the Gaussian prediction at scales of k < 0.2 h Mpc⁻¹. We also verify that the leading-order PNG-induced bispectrum is not affected by details of the reconstruction with perturbative theory. We then demonstrate the constraining power of the postreconstructed bispectrum for PNG at a redshift of approximately 0.5. Further, we perform a Fisher analysis to forecast PNG constraints from galaxy bispectra including anisotropic signals. Assuming a massive galaxy sample in the Sloan Digital Sky Survey Baryon Oscillation Spectroscopic Survey, we find that the postreconstructed bispectrum can constrain the local, equilateral, and orthogonal types of PNG with ΔfNL ∼ 13, 90, and 42, respectively, improving the constraints from the prereconstructed bispectrum by a factor of 1.3–3.2. In conclusion, reconstruction plays an essential role in constraining various types of PNG signatures at the level of ΔfNL ≲ 1 with the galaxy bispectrum from upcoming galaxy surveys.
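The "Fisher analysis" mentioned at the end of that abstract is a standard forecasting tool: it estimates how tightly a survey could constrain a parameter (here fNL) from the derivatives of the observable with respect to each parameter and the covariance of the measurement. Here is a minimal, generic Python sketch of that machinery; the toy observable and its made-up fNL dependence stand in for the real postreconstructed bispectrum model, which is far more involved.

```python
# Minimal Fisher-forecast sketch. The observable and its parameter
# dependence are toy stand-ins, NOT the reconstructed-bispectrum model
# from the paper; only the Fisher machinery itself is generic.
import numpy as np

k = np.linspace(0.02, 0.2, 20)  # illustrative wavenumber bins in h/Mpc

def observable(k, f_nl, amplitude):
    """Toy observable with an invented linear f_NL dependence."""
    return amplitude * k ** -1.5 * (1.0 + 0.01 * f_nl * k)

fiducial = {"f_nl": 0.0, "amplitude": 2000.0}   # assumed fiducial model
sigma = 0.05 * observable(k, **fiducial)        # assumed 5% errors per bin
step = {"f_nl": 0.1, "amplitude": 1.0}          # finite-difference step sizes

def derivative(name):
    """Numerical derivative of the observable w.r.t. one parameter."""
    up, down = dict(fiducial), dict(fiducial)
    up[name] += step[name]
    down[name] -= step[name]
    return (observable(k, **up) - observable(k, **down)) / (2.0 * step[name])

# Fisher matrix for a diagonal covariance:
# F_ij = sum over bins of (dO/dp_i)(dO/dp_j) / sigma^2
names = list(fiducial)
F = np.array([[np.sum(derivative(a) * derivative(b) / sigma**2)
               for b in names] for a in names])

# Marginalized 1-sigma forecasts: square roots of the inverse-Fisher diagonal.
errors = np.sqrt(np.diag(np.linalg.inv(F)))
for name, err in zip(names, errors):
    print(f"forecast 1-sigma on {name}: {err:.3g}")
```

A tighter forecast error on fNL means the survey can discriminate between more inflation models, which is exactly the factor-of-1.3–3.2 improvement from reconstruction that the authors quantify.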
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

OK, if I understand these two papers properly, they are confirming the general reliability of current cosmological methodologies for prediction, and validating the computer simulation methods behind them. Am I correct? If this is so, then we can be reasonably confident in the theories and models that arise from such data. Does this also add validity to the "reality" of what our data is telling us?