I'm having trouble deriving the amount of dimming expected of standard candles (e.g., Type Ia supernovae) as a result of dark energy.

Without dark energy, the standard GR solution (matter-only, at critical density, i.e. Einstein-de Sitter) is that the observed bolometric flux of a standard candle varies with redshift z as 1/(1+z - [1+z]^{1/2})^{2}, in units where 2c/H_{0} = 1. This expression is the product of two terms: 1/D^{2} multiplied by 1/(1+z)^{2}. Here D = 1 - [1+z]^{-1/2} is the comoving (geometric) distance, which gives the dimming of light due to geometry alone; the extra 1/(1+z)^{2} accounts for the redshifted photon energy and the reduced photon arrival rate. The luminosity distance is D_{L} = (1+z)D, so the product simplifies to 1/(1+z - [1+z]^{1/2})^{2}.
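As a quick numerical sanity check, the decomposition above can be verified in a few lines. This is only a sketch; distances are expressed in units of 2c/H_0, so the Hubble constant drops out:

```python
import math

def comoving_distance(z):
    """Comoving distance in an Einstein-de Sitter universe, in units of 2c/H0."""
    return 1.0 - 1.0 / math.sqrt(1.0 + z)

def flux(z):
    """Observed bolometric flux of a standard candle (arbitrary normalization):
    geometric dilution 1/D^2 times the extra 1/(1+z)^2 from photon redshift
    and time dilation."""
    D = comoving_distance(z)
    return 1.0 / D**2 / (1.0 + z)**2

def flux_closed_form(z):
    """The simplified expression 1/(1+z - [1+z]^(1/2))^2."""
    return 1.0 / ((1.0 + z) - math.sqrt(1.0 + z))**2

# The product form and the simplified form agree at every redshift tested.
for z in (0.5, 1.0, 2.0):
    assert math.isclose(flux(z), flux_closed_form(z))
```

The agreement follows because (1+z)·D = (1+z) - (1+z)^{1/2}, so squaring the luminosity distance reproduces the closed form.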

Suppose that the recent stretching of space due to dark energy is by a factor b (b > 1). Obviously this would change the redshift, replacing 1+z with b(1+z) for a given distant object. (This factor b is the extra stretch that occurred between the time when a given distant object emitted a photon and the present, when the photon is received.) How would the factor b change the geometric distance term?

The observed luminosity at z=1 is only about half of the value 1/(1+z - [1+z]^{1/2})^{2}. If one simply replaces 1+z with b(1+z), the luminosity-vs-redshift curve remains the same instead of dropping to about half at z=1, because the substitution merely relabels the redshift axis.
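Both points can be checked numerically. The sketch below first shows that the substitution 1+z → b(1+z) leaves the flux at a given *observed* redshift unchanged, then compares the Einstein-de Sitter flux at z = 1 with a flat LambdaCDM model. The parameter choice Omega_m = 0.3 is an illustrative assumption, not something from the original post:

```python
import math

def flux_eds(z):
    """EdS flux: 1/((1+z) - sqrt(1+z))^2, arbitrary normalization."""
    return 1.0 / ((1.0 + z) - math.sqrt(1.0 + z))**2

# 1) Replacing 1+z with b(1+z) only relabels the redshift axis: the
# substituted formula equals the original formula evaluated at the
# observed redshift z_obs, where 1 + z_obs = b(1+z).
b, z = 1.3, 1.0
z_obs = b * (1.0 + z) - 1.0
substituted = 1.0 / (b * (1.0 + z) - math.sqrt(b * (1.0 + z)))**2
assert math.isclose(flux_eds(z_obs), substituted)

# 2) Flat LambdaCDM comparison (assumed Omega_m = 0.3): integrate the
# comoving distance numerically and compare fluxes at z = 1.
def E(z, om=0.3):
    """Dimensionless Hubble rate H(z)/H0 for flat LambdaCDM."""
    return math.sqrt(om * (1.0 + z)**3 + (1.0 - om))

def comoving_lcdm(z, n=1000):
    """Trapezoidal integral of dz/E(z), in units of c/H0."""
    h = z / n
    s = 0.5 * (1.0 / E(0.0) + 1.0 / E(z))
    s += sum(1.0 / E(i * h) for i in range(1, n))
    return s * h

d_lcdm = 2.0 * comoving_lcdm(1.0)                  # D_L at z=1, units c/H0
d_eds = 2.0 * 2.0 * (1.0 - 1.0 / math.sqrt(2.0))   # EdS D_L at z=1
print(round((d_eds / d_lcdm)**2, 2))               # flux ratio LCDM/EdS, ~0.58
```

The ratio of roughly 0.58 is consistent with the statement that the observed flux at z = 1 is "about half" the matter-only prediction, which is why a simple relabeling of the redshift cannot reproduce the observed dimming.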

**Physics Forums | Science Articles, Homework Help, Discussion**


# Standard Candle Dimming Due to Extra Expansion
