Markus0003000
One of the most popular mass-loss formulas for a star, developed by D. Reimers, is:

dM/dt = -4×10^-13 η (L/(gR)) solar masses per year

where η is a free parameter close to unity, and L, g, and R are the star's luminosity, surface gravity, and radius, respectively, expressed in solar units.
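For concreteness, here is a minimal Python sketch that plugs numbers into the formula in solar units (the function name and the sample stellar parameters are my own illustrative choices, not taken from any particular star):

```python
# A minimal sketch evaluating the Reimers mass-loss rate.
# Assumes L, g, and R are expressed in solar units, as in the usual form
# of the formula; eta ~ 1 is the free parameter.

def reimers_rate(L, g, R, eta=1.0):
    """Mass-loss rate in solar masses per year (negative = mass lost)."""
    return -4e-13 * eta * L / (g * R)

# Illustrative red-giant numbers: L = 100 L_sun, M = 1 M_sun.
L, M = 100.0, 1.0
for R in (20.0, 40.0):
    g = M / R**2  # surface gravity in solar units
    print(f"R = {R:5.1f} R_sun  ->  dM/dt = {reimers_rate(L, g, R):.2e} M_sun/yr")
```

The loop recomputes g = M/R² for each radius, since the surface gravity itself depends on R.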
What I am curious about is that, according to this formula, the mass-loss rate decreases as R increases. This seems counterintuitive: as the radius increases, the density decreases and the gravitational pull at the surface weakens, so you would expect greater mass loss, not less.
Is there a qualitative reason why the star loses more mass as its radius decreases?