Apparent Magnitude: Understanding Star X's Brightness

In summary, the apparent magnitude of a star is a relative measure of how bright it appears to an observer. It is defined by comparing the star's brightness to a fixed reference (zero point) magnitude.
  • #1
devd
What does this statement mean: "The apparent magnitude of a star, X, is m"?

[itex] m_2 - m_1 = -2.5 \log (B_2/B_1) [/itex]
Apparent magnitudes are defined relatively, right? We can talk about differences in apparent magnitudes. If I know the ratio of the brightnesses, I can find out the difference in apparent magnitudes. Do we have some kind of reference magnitude, relative to which we can make the statement in quotes?
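A minimal numerical sketch of this relation (in Python, with made-up brightness values):

[code]
import math

def delta_magnitude(b2, b1):
    """Magnitude difference m2 - m1 from the brightness ratio B2/B1."""
    return -2.5 * math.log10(b2 / b1)

# A star delivering 100x more flux than another is 5 magnitudes "brighter",
# i.e. its magnitude is 5 units *smaller*.
print(delta_magnitude(100.0, 1.0))  # -> -5.0
[/code]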
 
  • #3
I can understand that qualitatively. But what about the quantitative statement?
 
  • #4
devd said:
What does this statement mean: "The apparent magnitude of a star, X, is m"?

[itex] m_2 - m_1 = -2.5 \log (B_2/B_1) [/itex]
Apparent magnitudes are defined relatively, right? We can talk about differences in apparent magnitudes. If I know the ratio of the brightnesses, I can find out the difference in apparent magnitudes. Do we have some kind of reference magnitude, relative to which we can make the statement in quotes?

Your quantitative expression is correct, and it is true that you need some reference to set the scale. Historically, the star Vega was assigned an apparent magnitude of 0.0 to set the zero point of the scale. Today, however, the reference is defined as a specified flux in erg s^-1 cm^-2 Hz^-1 for each passband, as shown in the table in the Wikipedia entry. Vega still comes out quite close to an apparent magnitude of 0.0 with this definition.
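A minimal sketch of how such a flux reference fixes the scale (Python; for concreteness it assumes the AB-convention zero point of 3631 Jy = 3.631e-20 erg s^-1 cm^-2 Hz^-1, rather than the band-by-band Vega-based values in that table):

[code]
import math

# Reference (zero-point) flux density. This is the AB-magnitude convention,
# 3631 Jy = 3.631e-20 erg s^-1 cm^-2 Hz^-1; the Vega-based system uses a
# slightly different value for each passband.
F_REF = 3.631e-20  # erg s^-1 cm^-2 Hz^-1

def apparent_magnitude(flux):
    """Apparent magnitude of a source with flux density `flux`
    (same units as F_REF)."""
    return -2.5 * math.log10(flux / F_REF)

print(apparent_magnitude(3.631e-20))  # -> -0.0, i.e. magnitude 0.0 by construction
print(round(apparent_magnitude(3.631e-22), 2))  # -> 5.0 (100x fainter => 5 magnitudes larger)
[/code]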

jedishrfu's statement is incorrect. Apparent magnitude describes how bright an object appears. No celestial object has an apparent magnitude brighter than the sun (obviously). I think what jedishrfu was trying to say refers to absolute magnitude, which describes the intrinsic brightness of an object. Absolute magnitude is defined as the apparent magnitude the object would have if viewed from a distance of 10 parsecs. The sun has an apparent magnitude of about -27, but an absolute magnitude of +4.83, so it would be a dim naked-eye star if it were 10 parsecs away. While it is true that most naked-eye stars are intrinsically brighter than the sun (meaning they have an absolute magnitude less than the sun's), it is not true that all naked-eye stars are intrinsically brighter than the sun. Epsilon Eridani, for example, is a nearby naked-eye star with an apparent magnitude of +3.7 but an absolute magnitude of +6.19, meaning that it is intrinsically fainter than the sun.
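A small sketch of that definition, using the distance modulus [itex] M = m - 5 \log (d/10\,\mathrm{pc}) [/itex] to check the Sun's numbers quoted above (the only assumed input beyond the post is the Sun's distance, 1 AU ≈ 4.8481×10^-6 pc):

[code]
import math

AU_IN_PC = 4.8481e-6  # 1 astronomical unit expressed in parsecs

def absolute_magnitude(m, d_pc):
    """Absolute magnitude from apparent magnitude m and distance in parsecs:
    the apparent magnitude the object would have at 10 pc."""
    return m - 5.0 * math.log10(d_pc / 10.0)

# The Sun: apparent magnitude about -26.74 as seen from 1 AU.
print(round(absolute_magnitude(-26.74, AU_IN_PC), 2))  # -> 4.83
[/code]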
 
  • #5
It's one of the many ironies of science that the thing that is easy to measure (brightness) is not considered absolute, and that absolute magnitude is a calculated (adjusted) value which requires several estimated parameters to determine (even apparent magnitude does). I expect I'm not telling you anything if I rearrange your equation to:
[itex] m' = m - 2.5 \log (B'/B) [/itex] [where m' = m_2, B' = B_2, m = m_1, etc.], which answers your question, as long as we agree on which star to use for m and B (as well as the units and the measurement method). There is no particularly good reason that brightness should arbitrarily use the visible spectrum, other than that's what we see. Why it's adjusted to remove atmospheric filtering is another question. Between any star and us are the following filters: dust, gas, and plasma; the atmosphere of the Earth; the optical properties of the telescope; the optical clarity of our eyes; and the sensitivity of the eye to different wavelengths (there are as many 'average' observers as there are 'average' families (with 2.1 children)). It's a very good idea to specify a range of wavelengths (as well as how the cut-off at each end is handled), but it's not been that long since we first figured out how to measure the flux at a given wavelength with high accuracy. Early photodetectors were notoriously inefficient at capturing photons.
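Inverting the same relation gives the brightness ratio implied by a magnitude difference; a quick sketch:

[code]
import math

def brightness_ratio(m2, m1):
    """Brightness ratio B2/B1 implied by magnitudes m2 and m1
    (inverting m2 - m1 = -2.5 log10(B2/B1))."""
    return 10.0 ** (-(m2 - m1) / 2.5)

# A magnitude +6 star (roughly the naked-eye limit) versus a magnitude 0
# star such as Vega: about 1/251 as bright.
print(brightness_ratio(6.0, 0.0))  # -> ~0.00398
[/code]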
 

1. What is apparent magnitude?

Apparent magnitude is a measure of how bright a star appears to an observer on Earth. The scale is logarithmic and descends from the historical classification of naked-eye stars from 1 to 6: smaller (or negative) numbers indicate brighter objects, larger numbers indicate dimmer ones, and modern measurements extend well beyond that original range.

2. How is apparent magnitude measured?

Apparent magnitude is measured using a device called a photometer, which measures the amount of light received from a star. This measured flux is then compared with that of reference (standard) stars, which define the zero point, to place the star on the magnitude scale.

3. What factors affect apparent magnitude?

The two main factors that affect apparent magnitude are distance and intrinsic brightness. A star that is farther away will appear dimmer, while a star that is closer will appear brighter. Intrinsic brightness, or absolute magnitude, is a measure of a star's actual brightness, regardless of its distance from Earth.

4. Can two stars with the same apparent magnitude have different intrinsic brightness?

Yes, two stars with the same apparent magnitude can have different intrinsic brightness. This is because distance plays a major role in apparent magnitude, so a closer star with a lower intrinsic brightness may appear just as bright as a farther star with a higher intrinsic brightness.
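A quick numerical illustration with two hypothetical stars, using the distance-modulus relation M = m - 5 log10(d/10 pc) and ignoring extinction:

[code]
import math

def absolute_magnitude(m, d_pc):
    """Absolute magnitude from apparent magnitude m and distance in parsecs."""
    return m - 5.0 * math.log10(d_pc / 10.0)

# Two hypothetical stars that both appear as magnitude +5.0 from Earth:
near_star = absolute_magnitude(5.0, 5.0)    # 5 pc away
far_star  = absolute_magnitude(5.0, 500.0)  # 500 pc away
print(round(near_star, 2))  # -> 6.51 (intrinsically fainter than the Sun)
print(round(far_star, 2))   # -> -3.49 (intrinsically far brighter)
[/code]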

5. How does the Sun's apparent magnitude compare to other stars?

The Sun's apparent magnitude is -26.74, making it the brightest object in our sky. However, in terms of intrinsic brightness (absolute magnitude, about +4.83), the Sun is a fairly average or even slightly dim star; many of the stars visible in the night sky are intrinsically brighter.
