# Mass loss in relation to radius of a star

One of the most widely used mass-loss formulas for stars, developed by D. Reimers, is:

$$\frac{dM}{dt} = -4\times10^{-13}\,\eta\,\frac{L}{gR}\ \ M_\odot\,\mathrm{yr}^{-1}$$

where η is a free parameter of order unity, and L, g, and R are the star's luminosity, surface gravity, and radius, respectively (in solar units).

What I am curious about is that, reading the formula at face value, the mass-loss rate decreases as R increases. This seems counterintuitive: when the radius increases, the density decreases and the gravitational pull at the surface weakens, so I would expect greater mass loss.

Is there a qualitative reason why the star loses more mass as the radius decreases?


D H
Staff Emeritus
The R in the denominator is misleading, because g itself depends on R. Starting from

$$\frac{dM}{dt} = -4\cdot10^{-13} \, \eta \frac {L_{\ast}} {gR_{\ast}}$$

and substituting $g \propto M_{\ast}/R_{\ast}^2$ (with the proportionality constant absorbed by working in solar units), this becomes

$$\frac{dM}{dt} = -4\cdot10^{-13} \, \eta \frac {L_{\ast}R_{\ast}} {M_{\ast}}$$

so at fixed mass and luminosity the mass-loss rate actually *increases* with radius.
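A quick numerical sketch of the second form, with everything in solar units. The choice `eta = 1` and the red-giant values below are illustrative assumptions, not fitted numbers:

```python
def reimers_mdot(L, R, M, eta=1.0):
    """Reimers mass-loss rate in solar masses per year.

    L, R, M are the star's luminosity, radius, and mass in solar units;
    eta is the free parameter of order unity.
    """
    return -4e-13 * eta * L * R / M

# Sun (L = R = M = 1): the rate is just the prefactor, -4e-13 M_sun/yr.
sun = reimers_mdot(1.0, 1.0, 1.0)

# Illustrative red giant: L = 1000, R = 100, M = 1 (solar units),
# giving a rate ~1e5 times larger than the Sun's.
giant = reimers_mdot(1000.0, 100.0, 1.0)

# Larger radius at fixed L and M gives a *larger* mass-loss rate:
assert abs(reimers_mdot(1.0, 2.0, 1.0)) > abs(reimers_mdot(1.0, 1.0, 1.0))
```

This makes the resolution of the question concrete: once g is written in terms of M and R, the radius moves to the numerator.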