For an assignment, we were told to use a program titled "Photoelectric Photometry of the Pleiades", located at this website.

It is basically a simulation of a telescope, in which we can "measure" the apparent magnitude of the stars in the Pleiades cluster.

My question is as follows:

Show that the program is at least faking the results continuously. Measure the V magnitudes of any two stars which are close together. Also record the counts for each star. Verify that the program is correctly converting flux ratios to magnitude differences. To do this, you will have to determine how the flux is related to the photon count. (Don't forget the sky.)

My answer is as follows:

The photon count is proportional to the flux, since the detector accumulates photons at a rate set by the incident energy flux over a fixed integration time.

Therefore,

[tex]\frac{P_2}{P_1}=\frac{F_2}{F_1}=\frac{1067491}{1976639}[/tex]

(I used the average photon counts from both stars).

Theoretically, the difference in magnitudes should be,

[tex]m_2 - m_1 = 6.43 - 5.76 = 0.67[/tex]

(I used the apparent magnitudes that I "recorded" with the telescope).

Using the flux ratio, the difference in magnitudes should be,

[tex]m_2 - m_1 = -2.5\log_{10}\left(\frac{1067491}{1976639}\right) = 0.669 \approx 0.67[/tex]

So I obtained the expected result. However, I did not account for the sky background at all, which was a "hint" given in the question.
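The calculation above can be sketched as a short function that also folds in the sky hint: subtract the sky background count from each star's raw count before forming the ratio, since only the sky-subtracted counts are proportional to the stars' fluxes. (The function name and the sky parameters are my own illustration; only the two measured counts come from the simulation.)

```python
import math

def mag_diff(counts1, counts2, sky1=0.0, sky2=0.0):
    """Magnitude difference m2 - m1 from photon counts.

    Photon count is proportional to flux, so after subtracting each
    star's sky background count, the count ratio equals the flux
    ratio and m2 - m1 = -2.5 log10(F2/F1) applies.
    """
    f1 = counts1 - sky1  # sky-subtracted count, proportional to flux of star 1
    f2 = counts2 - sky2  # sky-subtracted count, proportional to flux of star 2
    return -2.5 * math.log10(f2 / f1)

# With my measured average counts and no sky subtraction:
print(round(mag_diff(1976639, 1067491), 2))  # 0.67
```

With realistic sky counts plugged in for `sky1` and `sky2`, the ratio of the subtracted counts (rather than the raw counts) is what should reproduce the program's magnitude difference.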

Also, what is the purpose of measuring the magnitudes of two stars that are close to each other?

I do not understand how this shows that the program is faking the results, if it is converting the flux ratios to magnitude differences correctly.

Thank you for any help -- Sorry for the long post.

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Magnitude of Stars - Flux, Photon Counts
