Fantasist said:
It is the mass-luminosity relationship (essentially the same derivation as the one on the Wikipedia page).
Yes it is, but the Wiki derivation is horrendous: it first does the calculation completely wrong (plug in the numbers their approach gives you and you'll see how staggeringly wrong it is), and then applies a "correction" that completely eradicates the original horrendous physics and swaps in the real physics through the back door. It is a perfect example of the conceptual morass you end up in if you think you should be using surface temperature to infer luminosity. When you understand what they really did there, you'll see what I mean.
Fantasist said:
And it is not really surprising that the luminosity is basically determined only by the mass; after all, the mass of the primordial cloud is the only parameter that can possibly make any difference for star formation (assuming identical chemical composition).
It is extremely surprising that it depends only on the mass, in the sense that it is surprising it does not depend on either R or the fusion physics.
The lack of dependence on R means that if you have a radiating star that is gradually contracting (prior to reaching the main sequence), its luminosity should not change! That would be true even if the star contracted by a factor of 10, provided the opacity did not change and the internal energy transport did not shift from convection to radiation. But contracting stars do tend to start out highly convective, so they do make that transition, and that's why we generally have not noticed this remarkable absence of a dependence on R.
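That R-independence can be checked in a minimal numerical sketch of the standard random-walk (radiative diffusion) estimate, with every order-unity factor dropped; the function name and the opacity and mean-molecular-weight values here are illustrative assumptions, not a real stellar model:

```python
# Order-of-magnitude random-walk estimate of stellar luminosity,
# with every factor of order unity (4*pi/3, etc.) dropped.
G   = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23    # Boltzmann constant, J/K
m_H = 1.673e-27    # hydrogen mass, kg
a   = 7.566e-16    # radiation constant, J m^-3 K^-4
c   = 2.998e8      # speed of light, m/s

def luminosity_estimate(M, R, kappa=0.04, mu=0.6):
    """Radiative-diffusion luminosity; kappa is an assumed constant opacity (m^2/kg)."""
    T   = G * mu * m_H * M / (k_B * R)  # virial (interior) temperature from force balance
    rho = M / R**3                      # mean density
    U   = a * T**4 * R**3               # radiative energy content of the interior
    mfp = 1.0 / (kappa * rho)           # photon mean free path
    t   = R**2 / (mfp * c)              # random-walk escape (diffusion) time
    return U / t                        # L ~ energy content / escape time

M_sun, R_sun = 1.989e30, 6.96e8
L_a = luminosity_estimate(M_sun, R_sun)        # the Sun at its actual radius
L_b = luminosity_estimate(M_sun, 10 * R_sun)   # ten times larger: identical L
L_c = luminosity_estimate(2 * M_sun, R_sun)    # double the mass: L grows as M^3
```

R appears in every intermediate quantity yet cancels exactly: T^4 scales as R^-4 while the remaining factors scale as R^4. Doubling M multiplies the estimate by 2^3 = 8, the steep mass-luminosity scaling, with no fusion physics anywhere in sight.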
The lack of dependence on fusion physics means that when a star initiates fusion, nothing really happens to the star except it stops contracting. That's not necessarily what must happen: for example, when later in the star's life it begins to fuse helium, it will undergo a radical change in structure and change luminosity drastically. But the onset of hydrogen fusion does not come with any such drastic restructuring of the star, because it started out with a fairly simple, mostly radiative structure, and when fusion begins, it just maintains that same structure because all the fusion does is replace the heat that is leaking out.
Fantasist said:
It is not further surprising that fusion didn't come into it, as the assumption of 'blackbody' radiation doesn't have to care about the details of the processes by means of which radiation is created and destroyed.
Try telling that to a red giant that begins fusing helium in its core! But you are certainly right that if we get away with assuming that fusion does not affect the internal structure of the star, then that structure is indeed a kind of black box. That's how Eddington was able to deduce that internal structure before he even knew that fusion existed. Still, if you think it's not surprising that fusion doesn't matter, then not only have you learned an important lesson, you may also find it hard to read all the textbooks and online course notes that tell you the fusion physics explains the mass-luminosity relationship!
Fantasist said:
In any case, you can calculate the luminosity from the surface temperature (as determined from the spectrum), and I bet you will get a far more accurate value for it than from your mass-luminosity relationship (where, as you seem to realize yourself, you have to make certain assumptions about the stellar structure and other parameters determining the diffusion process if you want to arrive at an absolute numerical value for the luminosity).
I'm sure that's true, but it fails the objective of understanding the luminosity from first principles. We can also just measure the luminosity; that's the most accurate way yet!
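For comparison, the surface-temperature route is just the Stefan-Boltzmann law, L = 4πR²σT_eff⁴; plugging in rough solar values (T_eff ≈ 5772 K) recovers the Sun's luminosity:

```python
import math

sigma = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
R_sun = 6.96e8     # solar radius, m
T_eff = 5772.0     # the Sun's effective surface temperature, K

# Luminosity from surface area times blackbody surface flux:
L = 4 * math.pi * R_sun**2 * sigma * T_eff**4   # ≈ 3.8e26 W
```

Accurate, yes, but it takes T_eff and R as measured inputs; it explains nothing about why L has the value it does.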
Fantasist said:
That would contradict your derivation above: the time t increases with increasing radius and thus with increasing mass.
That's not what I meant by "emit light faster", I did not mean "the diffusion time is less", I meant "they emit light from their surface at a faster rate."
Fantasist said:
I don't think the fusion rate cares about the radiation lost from the star.
Well, we know that cannot be true, because the fusion rate equals the rate that radiation is lost from the star.
Fantasist said:
It is only determined by the local temperature and density.
Thank you for bringing that up, it's an important part of the mistake that many people make. You will see a lot of places that say words to the effect that "because fusion depends so sensitively on temperature, the fusion rate controls the luminosity". That's exactly backward. Because the fusion rate depends so sensitively on temperature, tiny changes in T affect the fusion rate a lot, so the fusion rate has no power to affect the star at all. After all, the thermodynamic properties of the star are not nearly as sensitive to T, so we just need a basic idea of what T is to get a basic idea of what the star is doing. But since fusion needs a very precise idea of what T is, we can always get the fusion to fall in line with minor T modifications. That's why fusion acts like a thermostat on the T, but it has little power to alter the stellar characteristics other than establishing at what central T the star will stop contracting.
If you don't see that, look at it this way. Imagine you are trying to iterate a model of the Sun to get its luminosity right. You give it an M and an R, and you start playing with T. You can get the T basically right just from the gravitational physics (the force balance), and you see that it is in the ballpark of where fusion can happen. You also get L in the right ballpark, before you say anything about fusion (as I showed). But now you want to bring in fusion, so you tinker with T. Let's say originally your T was too high, so the fusion rate was too fast and was way more than L. So you lower T just a little, and poof, the fusion rate responds mightily (this is especially true of CNO cycle fusion, more so than the p-p chain, so it works even better for stars a bit more massive than the Sun). So you don't need to change T much at all, so you don't need to update the rest of your calculation much, so you end up not changing L to reach a self-consistent solution! So we see, it is precisely the T-sensitivity of fusion that has made it not affect L much, though in many places you will see that logic exactly reversed.
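That iteration can be caricatured in a few lines. Assume (hypothetically) a CNO-like fusion rate scaling as T^20 near the operating point, with the structural L fixed by the diffusion physics; the only question is how far T must move to bring fusion into balance:

```python
def temperature_correction(fusion_overshoot, n=20):
    """Factor by which T must change so that a fusion rate scaling
    as T^n drops by the given overshoot factor (crude local power law)."""
    return (1.0 / fusion_overshoot) ** (1.0 / n)

# Initial guess overproduces fusion power by a factor of 3;
# a T^20 sensitivity means T only has to drop by about 5%:
shift = temperature_correction(3.0)
```

Even a factor-of-100 mismatch is absorbed by a roughly 21% shift in T, which is why the rest of the model, and hence L, barely needs updating: the fusion falls in line with the structure, not the other way around.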
Fantasist said:
If you put a 100% reflective mirror around the star, the temperature will steadily increase, and I don't think the fusion will regulate itself down in response. On the contrary, it will result in a fusion bomb.
Yes, 100% reflection causes a lot of physical difficulties, because you can't reach an equilibrium. But even if you just stick to 99%, you would not have much of a problem: L would still not change much.