Finding the magnitude difference between two stars

  1. 1. The problem statement, all variables and given/known data

    Determine the apparent magnitude difference between Sirius and the Sun, as seen from the Earth. How much more luminous is Sirius than the Sun?

    2. Relevant equations

    f = sigma*T_eff^4
    m - n = -2.5*log(f(m)/f(n))

    3. The attempt at a solution

    The apparent magnitude difference is m - n, with m and n being the apparent magnitudes of Sirius and the Sun. I probably need to find the flux of each star: flux = 5.67e-8 W/(m^2*K^4) * T_eff^4. The effective temperature of Sirius is 30000 K and T_eff = 5780 K for the Sun. So f(Sirius)/f(Sun) = (30000 K)^4/(5780 K)^4 = 726. Therefore, m(Sirius) - n(Sun) = -2.5*log(726) = -7.15.
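    A short script checks the arithmetic in this attempt (a sketch only; as the replies point out, sigma*T^4 is the flux at the star's surface, not the flux observed at Earth, so the method itself is questionable):

```python
import math

# Check of the arithmetic in the attempt above. Note: sigma*T^4 is the flux
# at the star's *surface*, not the flux received at Earth, so this does not
# actually give the apparent magnitude difference.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)

T_sirius = 30000.0   # K, value used in the attempt
T_sun = 5780.0       # K

f_sirius = SIGMA * T_sirius**4   # surface flux of Sirius
f_sun = SIGMA * T_sun**4         # surface flux of the Sun

ratio = f_sirius / f_sun             # ~726, as in the attempt
delta_m = -2.5 * math.log10(ratio)   # ~-7.15 (brighter star has the smaller magnitude)
print(round(ratio), round(delta_m, 2))
```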
     
  3. mgb_phys

    mgb_phys 8,952
    Science Advisor
    Homework Helper

    The Stefan-Boltzmann law alone isn't enough to give the total luminosity of a star; you also need its diameter - a halogen spotlight is at a similar temperature to Sirius but doesn't have quite the same power output.
    What other information are you given about Sirius? You will also need to know the flux from the Sun or its absolute magnitude.

    You can easily look these up, so I'll tell you: m(Sirius) = −1.47, m(Sun) = −26.74
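    With those two magnitudes, the answer follows directly from Pogson's relation (a minimal sketch using the looked-up values quoted above):

```python
import math

m_sirius = -1.47    # apparent magnitude of Sirius, as quoted above
m_sun = -26.74      # apparent magnitude of the Sun

# Sirius appears fainter than the Sun, so its magnitude is larger:
delta_m = m_sirius - m_sun           # 25.27
# Pogson's relation: a difference of delta_m magnitudes corresponds to
# an observed flux ratio of 10^(0.4*delta_m).
flux_ratio = 10 ** (0.4 * delta_m)   # Sun appears ~1.3e10 times brighter
print(delta_m, f"{flux_ratio:.2e}")
```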
     
  4. Why do I need to calculate the total luminosity when I can easily calculate the flux of each star by looking up its effective temperature? The ratio of the two stars' fluxes is what appears in the equation for the apparent magnitude difference.
     
    Last edited: Oct 8, 2007
  5. mgb_phys


    You can't work out the fluxes of the two stars just from their temperatures unless you also know that their diameters are the same.
    To calculate the magnitude difference you need to know either the magnitude of each star or the flux of each - these are observed parameters.
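    The point about diameters can be made concrete with L = 4*pi*R^2 * sigma*T^4. The radius and temperature below are rough catalogue values for Sirius A, assumed for illustration rather than taken from this thread:

```python
import math

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
R_SUN = 6.96e8    # solar radius, m

# Illustrative catalogue values (assumptions, not numbers from the thread):
# Sirius A: T_eff ~ 9940 K, R ~ 1.71 R_sun
T_sirius, R_sirius = 9940.0, 1.71 * R_SUN
T_sun, R_sun = 5780.0, R_SUN

def luminosity(R, T):
    """Total power output of a blackbody sphere: L = 4*pi*R^2 * sigma * T^4."""
    return 4 * math.pi * R**2 * SIGMA * T**4

# Temperature alone is not enough: the radius enters squared.
L_ratio = luminosity(R_sirius, T_sirius) / luminosity(R_sun, T_sun)
print(round(L_ratio, 1))   # roughly 25x the Sun's luminosity
```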
     