A few years back I started a thread to argue against a common misconception about main-sequence stars: that their fusion rate sets their luminosity, in the sense that to know what the luminosity of the star will be, you first need to know what the fusion rate is, and in particular the details of the fusion physics that set that rate. I argued that you don't actually need to know much about fusion, beyond the fact that it sets in quite suddenly around 10 million kelvin, to get the luminosity of a main-sequence star fairly accurately. Furthermore, the reason for this is that the luminosity is actually what sets the fusion rate, because fusion is a self-adjusting process that will do whatever it takes to resupply whatever heat the star is losing (a toy sketch of what I mean by "self-adjusting" appears at the end of this post). Finally, the rate at which a star loses heat can be estimated fairly well without knowing much about fusion, again beyond the temperature at which it sets in.

Apparently I did not present my arguments well, because that thread was closed. I mention this only because I do not want to appear to be sidestepping the mods: this new thread can be viewed as completely independent. It involves a different basic question, one that can be posed very straightforwardly and that sidesteps the whole vexed issue of "which sets which."

The question is this: imagine that all the physics of the Sun is the same, but the rate of hydrogen fusion is uniformly doubled at every temperature and density (say, by doubling all the fusion cross sections). What does it seem like should happen to the main-sequence luminosity of the Sun? An exact answer is not needed and would be difficult to obtain; let me just ask what people think the general answer is, and why, and let that serve to address the issue in place of claims about whether fusion rates set luminosity or luminosity sets fusion rates.
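To be concrete about what I mean by "self-adjusting" above (this is just my back-of-the-envelope sketch of that one claim, not an answer to the question): take the standard result that the pp-chain energy generation rate near solar core conditions scales very steeply with temperature,

$$ \epsilon \;\propto\; C\,\rho\,T^{\nu}, \qquad \nu \approx 4, $$

where $C$ is a stand-in symbol lumping together the cross-section physics. If the heat-loss rate $L$ is pinned down by the star's structure and opacity, then steady state requires the total fusion power to equal $L$, so doubling $C$ at fixed $L$ just means the core settles at a slightly lower temperature:

$$ 2C\,T'^{\,\nu} = C\,T^{\nu} \;\Longrightarrow\; T' = 2^{-1/\nu}\,T \approx 0.84\,T \quad (\nu \approx 4). $$

The steepness of the $T^{\nu}$ dependence is what makes the adjustment so forgiving: a modest shift in core temperature absorbs a factor-of-two change in the rate coefficient. Whether $L$ itself stays essentially fixed under that shift is, of course, exactly the question being posed.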