# Finding the apparent magnitude of the Sun, and finding distances from it

1. May 16, 2009

### TFM

1. The problem statement, all variables and given/known data

During its lifetime, a massive star converts about one solar mass of hydrogen into iron in its core, releasing about 0.7 per cent of the rest-mass energy in the form of radiation. When it explodes as a supernova, the outer envelope of the star (with a mass of 10 Msun, say) is ejected at speeds of up to 10,000 km s−1.

b)

About 1 per cent of the supernova energy is radiated in the optical over a timescale of about 1 month. Estimate the average luminosity of the supernova in units of the solar luminosity over this time.

done

c)

Using the above numbers, estimate the distance to which supernovae can be detected in an optical survey that images each field of the sky down to an apparent magnitude of m_V = 18 once per month. (The absolute magnitude of the Sun is M_V(Sun) = 4.8.)

2. Relevant equations

$$F = \frac{L}{4\pi d^2}$$ (1)

$$m = m_0 - 2.5 \log_{10}(F)$$ (2)

$$m = M + 5(\log_{10} d - 1)$$ (3)
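As a sanity check on equations (1) and (3), here is a minimal Python sketch (the function names `flux` and `apparent_magnitude` are just illustrative helpers, not from the problem). Note that the distance modulus in equation (3) assumes d is measured in parsecs:

```python
import math

def flux(L, d):
    """Eq. (1): flux (W m^-2) from luminosity L (W) at distance d (m)."""
    return L / (4 * math.pi * d**2)

def apparent_magnitude(M, d_pc):
    """Eq. (3): apparent magnitude from absolute magnitude M
    and distance d_pc in parsecs (distance modulus)."""
    return M + 5 * (math.log10(d_pc) - 1)

# Sanity checks:
# - a star at exactly 10 pc has m = M by definition
print(apparent_magnitude(4.8, 10))            # 4.8
# - the Sun's luminosity at 1 AU gives roughly the solar constant
print(flux(3.828e26, 1.496e11))               # ~1361 W m^-2
```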

3. The attempt at a solution

Okay, I have done b), for which I got an answer of 10.8 solar luminosities.

However, I am now on part c). I know I need to use the magnitude equation (2) together with the flux of the star (1). Now, m_0 is the reference magnitude, which will be the Sun's. I am trying to calculate this, since they give M for the Sun. However, using (3), I calculate the m value to be around 50, which is much bigger than the answer I found on the internet, 26. I did the following:

$$m = M + 5(log_{10}d - 1)$$

M = 4.8, d = 1 AU = 1.5 × 10^11 m

$$m = 4.8 + 5(\log_{10}(1.5 \times 10^{11}) - 1)$$

$$m = 4.8 + 5(11 - 1)$$

$$m = 4.8 + 5(10)$$

m = 54.8
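To show how sensitive this result is to the unit of d, here is a quick script (the helper name `m_app` is just for illustration) that evaluates equation (3) with d expressed in metres and then in parsecs, the unit the distance-modulus formula actually assumes:

```python
import math

M_sun = 4.8
AU_m = 1.5e11           # 1 AU in metres, as used above
AU_pc = 1.0 / 206265.0  # 1 AU in parsecs

def m_app(M, d):
    """Eq. (3): m = M + 5(log10(d) - 1); valid when d is in parsecs."""
    return M + 5 * (math.log10(d) - 1)

print(m_app(M_sun, AU_m))   # ~55.7  (d in metres, as in the attempt above)
print(m_app(M_sun, AU_pc))  # ~-26.8 (d in parsecs; the Sun's actual m_V)
```

The second value matches the well-known apparent magnitude of the Sun, about -26.7.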

Any ideas where I have gone wrong?