Drakkith said:
Maybe I'm missing something, but since fusion is the source of power in a star, how is luminosity not a direct result of the amount of fusion? It looks to me like the rate of fusion is dependent on the mass of the star, which then determines the luminosity of the star.
Fusion is indeed the source of the luminosity, but that does not tell you that fusion determines the luminosity-- the luminosity could still determine the fusion (and it does). Consider my analogy again-- if you had a diamond mine in your back yard, are the diamonds not the source of your spending power? Yet the rate you mine diamonds would not determine the rate you spend money, if you are the lazy type who only mines diamonds when you want to buy something. In that case, you would find that diamonds are the source of your wealth, but the rate you mine them is set by the rate you spend money, not the other way around.
That analogy only shows that the causal arrow could go either way, so to see how it goes in stars, we have to look more closely. A star is basically a leaky bucket of light. Given that the core temperature has to reach about 10 million K to get hydrogen fusing, and given the virial theorem, we can immediately estimate the radius of the star for a given mass-- the star just contracts until the characteristic particle speeds (which the virial theorem ties to the escape speed at that radius) are fast enough for H to fuse. That's the virial theorem applied to a main-sequence star.

So now we know the basic T and basic R for a given M, which means we know how much light (thermal radiation) the star holds in its volume, and we know how long that light takes to leak out of the bucket (opacity physics gives us that). Hence we know the luminosity-- even though we have not said squat about the fusion rate, or the coefficient involved in determining that rate given the T and P. So that coefficient does not determine the luminosity, and changing it hypothetically would have only a small effect on the luminosity. Instead, the core just adjusts a little, for whatever coefficient you take, to provide the necessary luminosity to balance the losses from the leaky bucket. That is the process that determines the fusion rate-- there is no way to know the fusion rate until you know what the luminosity needs to be, but you can certainly know what the luminosity needs to be without even specifying the fusion coefficient (or any detail of the fusion process other than the fusion temperature).
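To put rough numbers on that, here is a minimal order-of-magnitude sketch of the leaky-bucket estimate, with all factors of order unity dropped and the opacity $\kappa$ treated as a constant; $a$ is the radiation constant, $m_p$ the proton mass, $\ell$ the photon mean free path, and $T_{\rm fus}\sim 10^7$ K the fusion temperature:
$$ k T_{\rm fus} \sim \frac{G M m_p}{R} \;\;\Rightarrow\;\; R \sim \frac{G M m_p}{k T_{\rm fus}} \propto M $$
$$ E_{\rm rad} \sim a T_{\rm fus}^4 R^3, \qquad t_{\rm leak} \sim \frac{R^2}{\ell c} \sim \frac{\kappa \rho R^2}{c}, \qquad \rho \sim \frac{M}{R^3} $$
$$ L \sim \frac{E_{\rm rad}}{t_{\rm leak}} \sim \frac{a c\, T_{\rm fus}^4 R^4}{\kappa M} \sim \frac{a c}{\kappa}\left(\frac{G m_p}{k}\right)^{4} M^3 $$
Everything on the right-hand side is gravity, thermodynamics, and opacity-- the only place fusion entered at all was through $T_{\rm fus}$ in the virial radius, and no rate coefficient appears anywhere. That is the sense in which the luminosity (roughly $L \propto M^3$ for constant opacity) is set before you ever ask how fast the fusion goes.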
This means that, ironically, it is the extreme temperature sensitivity of fusion that makes the fusion rate the slave to the luminosity, even though that same sensitivity is often (erroneously) cited as the reason the fusion process causes the high luminosity of massive stars. Indeed, all fusion ever does in stars is cause their luminosity to reach a long-lived steady state-- if you magically turn off fusion, the luminosity of the star will increase, not decrease, as the leaking bucket causes the star to contract and release gravitational energy.

Fusion merely serves as a stabilizer that controls the evolutionary timescale-- the physics of the fusion rate never controls the luminosity, once we assert that fusion is very temperature sensitive and turns on at a given known temperature. The luminosity the star has when its core reaches that fusion temperature is what sets the fusion rate, regardless of any other detail of the fusion process (this is also why there is a single prevailing relation between the mass and luminosity of a main-sequence star, despite the fact that at some mass, p-p chain fusion is taken over by CNO-cycle fusion, a completely different process with totally different rate coefficients).
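A quick way to see that thermostat at work: suppose, just as a rough parametrization, that the total fusion luminosity scales as $L_{\rm fus} \propto \varepsilon_0 T_c^{\,n}$ with everything else held fixed, where $\varepsilon_0$ is the rate coefficient and $n \sim 4$ for the p-p chain or $n \sim 15$-$20$ for the CNO cycle. In steady state, the fusion luminosity must match the leaky-bucket luminosity $L$ that was already determined above, so
$$ T_c \propto \left(\frac{L}{\varepsilon_0}\right)^{1/n} $$
Doubling the rate coefficient $\varepsilon_0$ lowers the core temperature by a factor $2^{-1/n}$-- roughly 16% for $n=4$ and only about 4% for $n=17$-- and leaves the luminosity untouched. The more temperature sensitive the fusion is, the less the core has to adjust, which is exactly why the rate coefficient (and even the identity of the fusion chain) drops out of the mass-luminosity relation.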