Can Supernovae Measurements Constrain the Curvature of the Universe?

  • Thread starter: AWA
  • Tags: Universe
AI Thread Summary
The discussion centers on the viability of the Einstein-de Sitter universe model, which posits a flat universe with total matter density equal to the critical density and no cosmological constant. Observations indicate that the actual total matter density is significantly lower than critical, with a concordance value around Ωm = 0.27, raising questions about the model's validity. The conversation also explores the implications of supernova observations on cosmological parameters, suggesting that while these observations alone do not definitively constrain dark energy or curvature, they can be combined with other data to yield tighter constraints. The consensus is that models like L-CDM fit observations better than the Einstein-de Sitter model, which has largely failed observational tests. Ultimately, the discussion highlights the complexities of cosmological modeling and the challenges in reconciling various observational data.
AWA
Up until the 80's this was the model favored by the mainstream, and it was admittedly in accordance with observations. Could we recover it if some day DE and DM are (hypothetically) shown not to exist and are explained by some other effect? Or does it have too many problems on its own?
 
If I recall correctly, the Einstein de Sitter Universe is one in which:

\Omega_m = 1,

\Omega_\Lambda = 0


The major problem with this model is that numerous different observations intended to estimate the total matter density (including both dark matter and baryons) have come up with the result that it is much less than the critical density. The "concordance" value (the one that is the most consistent with all of the observations) is around Ωm = 0.27 (or something). If you get rid of dark matter, the problem becomes much worse (EDIT: estimates that are strongly constrained by big bang nucleosynthesis models put Ωb ~ 0.05).

When you combine that with the fact that the observations strongly favour a non-zero cosmological constant, well...
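For concreteness, here is a minimal sketch (my own illustration, using the round numbers quoted in this thread) of how these density parameters relate to the critical density \rho_c = 3H_0^2 / 8\pi G:

```python
# Sketch: compute the critical density for an assumed H0 = 70 km/s/Mpc and
# compare the Omega_m = 0.27 and Omega_b ~ 0.05 values quoted above against it.
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
Mpc = 3.0857e22  # metres per megaparsec

H0 = 70e3 / Mpc                            # Hubble constant in s^-1
rho_crit = 3 * H0**2 / (8 * math.pi * G)   # critical density, kg/m^3

print(f"rho_crit = {rho_crit:.2e} kg/m^3")
print(f"total matter (Omega_m = 0.27): {0.27 * rho_crit:.2e} kg/m^3")
print(f"baryons only (Omega_b = 0.05): {0.05 * rho_crit:.2e} kg/m^3")
```

This makes the point quantitative: the measured matter density falls short of critical by roughly a factor of four, and baryons alone by a factor of twenty.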
 
cepheid said:
If I recall correctly, the Einstein de Sitter Universe is one in which:

\Omega_m = 1,

\Omega_\Lambda = 0


The major problem with this model is that numerous different observations intended to estimate the total matter density (including both dark matter and baryons) have come up with the result that it is much less than the critical density. The "concordance" value (the one that is the most consistent with all of the observations) is around Ωm = 0.27 (or something). If you get rid of dark matter, the problem becomes much worse (EDIT: estimates that are strongly constrained by big bang nucleosynthesis models put Ωb ~ 0.05).

When you combine that with the fact that the observations strongly favour a non-zero cosmological constant, well...

But you are limiting yourself to the flat Einstein-de Sitter model; the way it was taught at the time, you also had the closed and the open universes. In the case of the open solution, Ωm was less than 1, with no need to reach the critical density.
I know the CMB seems to point to flatness, but these results are very recent and could be subject to later revision.
 
AWA said:
Up until the 80's this was the model favored by the mainstream, and it was admittedly in accordance with observations. Could we recover it if some day DE and DM are (hypothetically) shown not to exist and are explained by some other effect? Or does it have too many problems on its own?
Hypothetically, of course, as an effective theory. But observationally this model has utterly failed.

Btw, looks like the Einstein de Sitter universe is a flat model:
http://www.britannica.com/EBchecked/topic/139301/cosmology/27596/The-Einstein-de-Sitter-universe
 
Chalnoth said:
Hypothetically, of course, as an effective theory. But observationally this model has utterly failed.
But not as much as the L-CDM


Chalnoth said:
Btw, looks like the Einstein de Sitter universe is a flat model:
http://www.britannica.com/EBchecked/topic/139301/cosmology/27596/The-Einstein-de-Sitter-universe
Sure, in my second post I was also referring to the open Friedmann model, which basically only defers in that it doesn't reach the critical density, so cepheid was right. I wasn't debating him on that, but I guess it looked like that. To be precise, though: yes, the Einstein-de Sitter model is the spatially Euclidean one.
 
AWA said:
But not as much as the L-CDM
Uh, what? Are you actually claiming that L-CDM has failed observational tests? More so than the Einstein-de Sitter universe?
 
AWA said:
Sure, in my second post I was also referring to the open Friedmann model, which basically only defers in that it doesn't reach the critical density, so cepheid was right. I wasn't debating him on that, but I guess it looked like that. To be precise, though: yes, the Einstein-de Sitter model is the spatially Euclidean one.

Emphasis mine. I'm assuming you meant "differs", in which case it seems like you're advocating for an open universe with no cosmological constant. (I'm not trying to be a jerk and nitpick your grammar; I'm just trying to be very clear about how I'm interpreting your posts so that we can communicate effectively.) Anyway, the model you propose definitely doesn't match observations as well as LCDM. In fact, such an open model is excluded with pretty high confidence, for example even just by Type Ia SNe observations alone. This is illustrated nicely in Figure 7 from this paper:

http://adsabs.harvard.edu/abs/1999ApJ...517..565P

I also find the figure really nifty and generally useful.

Chalnoth said:
Uh, what? Are you actually claiming that L-CDM has failed observational tests? More so than the Einstein-de Sitter universe?

Chalnoth's incredulity ---> seconded. If this is what you meant, where did you get this idea?
 
cepheid said:
I'm assuming you meant "differs" in which case it seems like you're advocating for an open universe with no cosmological constant.
Sorry about the word swapping; I should have read it before posting it.
I am not advocating for it, but in an open Friedmann model maybe the SNIa observations could be interpreted geometrically rather than as a cosmological constant: with k < 0 (equivalently Ω_total < 1), the volume of a sphere is greater than (4/3)πr³, and therefore the light of a SN spreads over a surface bigger than 4πr² and is perceived as fainter than expected for a given distance.
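The geometric effect described here can be made concrete with a small sketch (an illustration I am adding, not a fit to any data): in hyperbolic space with curvature radius R, a sphere of proper radius r has surface area 4πR²sinh²(r/R), which always exceeds the flat-space 4πr², so a source of fixed luminosity appears dimmer.

```python
# Sketch: compare the area of a sphere in open (hyperbolic) space with the
# flat-space value. R and r are in arbitrary units; r/R sets the strength
# of the effect.
import math

def sphere_area_open(r, R):
    """Surface area of a sphere of proper radius r, curvature radius R."""
    return 4 * math.pi * (R * math.sinh(r / R))**2

def sphere_area_flat(r):
    """Flat-space surface area, 4*pi*r^2."""
    return 4 * math.pi * r**2

r, R = 1.0, 2.0
ratio = sphere_area_open(r, R) / sphere_area_flat(r)
print(ratio)  # > 1: the light is spread over a larger surface
```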
 
AWA said:
Sorry about the word swapping; I should have read it before posting it.
I am not advocating for it, but in an open Friedmann model maybe the SNIa observations could be interpreted geometrically rather than as a cosmological constant: with k < 0 (equivalently Ω_total < 1), the volume of a sphere is greater than (4/3)πr³, and therefore the light of a SN spreads over a surface bigger than 4πr² and is perceived as fainter than expected for a given distance.
Well, yes, the supernova observations alone do not put much constraint on whether it is dark energy or curvature. However, in order to go all the way to having zero cosmological constant, you need to have either a Hubble rate or supernova intrinsic brightness that is inconsistent with other observations (and not just slightly inconsistent, but completely and utterly wrong).

In cosmology, the degeneracy between various parameters in one particular sort of observation is removed by combining it with different observations. For instance, if you combine supernova, WMAP, and BAO data, you end up with very tight errors around \Omega_k = 0. Just to make sure we're not doing something else wrong, though, we can also use the distance ladder to get an estimate of the supernova intrinsic brightness, and nearby Hubble expansion measurements to get a handle on the Hubble expansion rate. Add these observations to the supernova data, and you get much the same result (albeit with larger error bars).
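As a toy illustration of this combining of datasets (the numbers below are made up, not real constraints): for a single parameter such as \Omega_k, combining independent Gaussian constraints by inverse-variance weighting is the one-dimensional analogue of summing chi-squared surfaces, and the joint error is always tighter than the best individual one.

```python
# Toy sketch: combine independent Gaussian constraints on one parameter
# by inverse-variance weighting. The pseudo-"SN", "CMB" and "BAO"
# values below are invented for illustration only.
def combine(measurements):
    """measurements: list of (value, sigma); returns (mean, sigma)."""
    weights = [1.0 / s**2 for _, s in measurements]
    mean = sum(v * w for (v, _), w in zip(measurements, weights)) / sum(weights)
    sigma = (1.0 / sum(weights))**0.5
    return mean, sigma

# Three made-up constraints on Omega_k:
m, s = combine([(0.05, 0.10), (-0.01, 0.02), (0.00, 0.03)])
print(m, s)  # joint sigma is smaller than any individual error bar
```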
 
Chalnoth said:
Well, yes, the supernova observations alone do not put much constraint on whether it is dark energy or curvature. However, in order to go all the way to having zero cosmological constant, you need to have either a Hubble rate or supernova intrinsic brightness that is inconsistent with other observations (and not just slightly inconsistent, but completely and utterly wrong).

In cosmology, the degeneracy between various parameters in one particular sort of observation is removed by combining it with different observations. For instance, if you combine supernova, WMAP, and BAO data, you end up with very tight errors around \Omega_k = 0. Just to make sure we're not doing something else wrong, though, we can also use the distance ladder to get an estimate of the supernova intrinsic brightness, and nearby Hubble expansion measurements to get a handle on the Hubble expansion rate. Add these observations to the supernova data, and you get much the same result (albeit with larger error bars).
Ok, I was leaving WMAP and BAO data aside just for the moment (but notice that, for instance, BAO data alone permit \Lambda = 0; see: http://supernova.lbl.gov/Union/ ).
I am interested in the error bars on H_0: what kind of H_0 would be necessary to fit the SNIa data? 90-100 km/s/Mpc? Larger? Smaller?
 
AWA said:
Ok, I was leaving WMAP and BAO data aside just for the moment (but notice that, for instance, BAO data alone permit \Lambda = 0; see: http://supernova.lbl.gov/Union/ ).
But again, in order to get BAO data alone to permit \Lambda = 0, you need other parameters to be absurd. You can do the same with WMAP alone (and get a different set of completely wrong parameters).

AWA said:
I am interested in the error bars on H_0: what kind of H_0 would be necessary to fit the SNIa data? 90-100 km/s/Mpc? Larger? Smaller?
It's been quite a while since I looked at this in detail. But right now, H_0 is known to within about 5%.

Anyway, I looked a bit more, and I'd have to do the work over again to be sure, but according to the Supernova Legacy Survey first-year release (four years ago), it's actually not possible to fit \Omega_\Lambda = 0. You can see the paper here:
http://xxx.lanl.gov/abs/astro-ph/0510447

The relevant figure is figure 5, where even the 99.7% confidence contours nowhere intersect \Omega_\Lambda = 0 for \Omega_m > 0. I'd have to look into the situation in more detail to see if this can be modified by changing H_0, but I'm not sure any longer that supernovae can actually fit a universe with no dark energy.
 
So how exactly do they get the fit with probability of \Omega_\Lambda > 0 = 0.99?
Only from the magnitudes and the redshifts, or do they have more factors constraining that confidence limit?
 
AWA said:
So how exactly do they get the fit with probability of \Omega_\Lambda > 0 = 0.99?
Only from the magnitudes and the redshifts, or do they have more factors constraining that confidence limit?
Well, the way it works is you perform the following calculation for each supernova:

D_L = (1+z) D_M
D_M = \frac{c}{H_0}\frac{1}{\sqrt{\Omega_k}}\sinh\left(\sqrt{\Omega_k}\int_0^z\frac{dz'}{E(z')}\right)
E(z) = \frac{H(z)}{H_0} = \sqrt{\Omega_m(1+z)^3 + \Omega_\Lambda + \Omega_k(1+z)^2}

Note that as \Omega_k \to 0, the hyperbolic sine and the factors of \Omega_k just disappear. Also, if \Omega_k < 0, the hyperbolic sine becomes a sine (since \sin(x) = \sinh(ix)/i).

The luminosity distance, D_L, is what we measure when we look at the brightness of the supernova. So we sample this value at many different redshifts by looking at many different supernovae. The effect of the curvature, as you can see, is twofold. First, it modifies the Hubble expansion rate, which changes the value of the integral. Second, there is an additional geometric factor.

So, as you can see, the dependence of this observation on \Omega_k is non-trivial, and in principle, sampling enough supernovae can potentially give constraints on \Omega_k. The key is getting a broad redshift range.
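The three relations above can be sketched numerically. A minimal pure-Python illustration (the concordance-like parameter values are assumptions for demonstration, not fitted results):

```python
# Sketch of the luminosity-distance relation quoted above; the sign of
# Omega_k selects the sinh (open), sin (closed), or plain (flat) branch.
import math

C = 299792.458  # speed of light, km/s

def E(z, Om, OL, Ok):
    """Dimensionless Hubble rate H(z)/H0 for matter + Lambda + curvature."""
    return math.sqrt(Om * (1 + z)**3 + OL + Ok * (1 + z)**2)

def comoving_integral(z, Om, OL, Ok, n=2000):
    """Trapezoidal approximation of the integral of dz'/E(z') from 0 to z."""
    dz = z / n
    total = 0.5 * (1.0 / E(0.0, Om, OL, Ok) + 1.0 / E(z, Om, OL, Ok))
    for i in range(1, n):
        total += 1.0 / E(i * dz, Om, OL, Ok)
    return total * dz

def luminosity_distance(z, H0=70.0, Om=0.27, OL=0.73):
    """Luminosity distance D_L = (1+z) D_M, returned in Mpc."""
    Ok = 1.0 - Om - OL
    I = comoving_integral(z, Om, OL, Ok)
    if abs(Ok) < 1e-8:    # flat limit: the sinh and Omega_k factors drop out
        DM = (C / H0) * I
    elif Ok > 0:          # open universe
        DM = (C / H0) * math.sinh(math.sqrt(Ok) * I) / math.sqrt(Ok)
    else:                 # closed universe: sinh becomes sin
        DM = (C / H0) * math.sin(math.sqrt(-Ok) * I) / math.sqrt(-Ok)
    return (1 + z) * DM

print(luminosity_distance(0.5))  # a few thousand Mpc for these parameters
```

Varying Om, OL (and hence Ok) at a fixed set of redshifts shows the non-trivial dependence on curvature described above: both the integrand and the geometric prefactor change.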
 