Relationship between star radius and luminosity

  • #1
marksyncm
In a PDF presentation on star formation that I'm currently reading, I ran into the following statement:

"If we observe an increase in a star's temperature but without any changes in its luminosity, it means the star is shrinking (its radius is decreasing)"

I'm having trouble understanding this. Here's why:

Luminosity is the total energy output of a star per unit time. If I'm understanding this definition correctly, an increase in a star's temperature will increase its luminosity. Assuming this is true, the statement I quoted above seems to imply that a decrease in a star's radius, with nothing else changing, should reduce its luminosity. (I think this must be true in order for luminosity to remain constant despite an increase in temperature.)

But why would that be the case? Intuitively I have always thought that reducing a star's radius, without changing its temperature, would keep the luminosity constant. The energy output per unit area would change, but the total for the star would remain the same, my thinking went. Example:

A star has a surface area of 1,000 units and is emitting 1,000 units of energy per second (luminosity), therefore each single area unit is emitting a single energy unit. If the surface area of the star is reduced to 500 units and nothing else is changed, I thought the luminosity would still equal 1,000 energy units, only now each single area unit is emitting two energy units instead of one.

Is my line of reasoning above incorrect? Is there some intrinsic limit to how much energy a "single unit area" can output? What am I missing?

Thank you.
 

Answers and Replies

  • #2
Orodruin
marksyncm said: "The energy output per unit area would change, but the total for the star would remain the same, my thinking went."
This is incorrect. A star is quite close to a blackbody, so its energy emission per unit area depends only on its temperature, to a good approximation. If you do not change the temperature, you do not change the output per unit area.
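
A minimal numeric sketch of this point, assuming an ideal blackbody so that the Stefan-Boltzmann law L = 4 pi R^2 sigma T^4 applies (the solar radius and effective temperature below are standard values, not from the thread):

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def luminosity(radius_m, temp_k):
    """Blackbody luminosity: L = 4 * pi * R^2 * sigma * T^4."""
    return 4.0 * math.pi * radius_m**2 * SIGMA * temp_k**4

R_SUN = 6.957e8  # solar radius, meters
T_SUN = 5772.0   # solar effective (surface) temperature, kelvin

full = luminosity(R_SUN, T_SUN)
shrunk = luminosity(R_SUN / math.sqrt(2.0), T_SUN)  # surface area halved, same T

print(f"full area: L = {full:.3e} W")
print(f"half area: L = {shrunk:.3e} W")
print(f"ratio: {shrunk / full:.3f}")  # 0.500: flux per unit area is fixed by T alone
```

At fixed temperature, halving the surface area halves the luminosity. In the 1,000-unit example above, the temperature pins each area unit at one energy unit, so a 500-unit star emits 500 energy units, not 1,000.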
 
  • #3
marksyncm
Your post made me realize something: when reading "temperature," for some reason I was thinking "core temperature" instead of "surface temperature." I need to regroup my thoughts, but at least the original sentence I quoted from the presentation seems obviously correct now. Thanks.
 
  • #4
Ken G
Your intuition does have one thing right: the luminosity of a star like the Sun does not depend on its radius, in the sense that a star contracting toward the main sequence does not change its luminosity much as its radius shrinks. It then pauses to fuse its core hydrogen, and when radius changes resume, the luminosity again does not change (until it becomes a red giant, where the internal physics changes drastically, along with the luminosity). So for an important period of the Sun's evolution, its luminosity does behave quite a lot like you describe. But as you now realize, that doesn't contradict the statement you cited. Indeed, the statement is true of the pre-main-sequence phase even if you interpret it as talking about core temperature: during that phase, the star's core temperature rises, it shrinks, and its luminosity does not really change much.
 
  • #5
marksyncm
Thank you for following up, I appreciate it.

I have a follow-up question about this part of your reply: "Indeed, the statement is true of the pre-main-sequence phase even if you interpret it as talking about core temperature."

This made me doubt my previous understanding. So it is possible to increase core temperature without increasing the star's energy output? That is, doesn't going from, say, 5,000,000 K to 7,000,000 K lead to more energy being emitted from the core, even though fusion temperatures have not been reached yet? I guess I'm still not understanding where that extra energy from the core is going if it's not being emitted from the surface of the star, thereby increasing its luminosity.
 
  • #6
Ken G
Yes, that's what happens in the late stages of the pre-main-sequence phase of a star like the Sun. The way the solar core went from 5 million K to 7 million K is by gravitational contraction, so you had a higher temperature but a smaller volume. By the virial theorem, T is proportional to M/R, and the light energy content is proportional to T^4 R^3, so that's M^4 / R. M stays the same, so that's a small increase in the total light energy in the Sun.

The luminosity takes that light energy and multiplies it by the rate of escape via radiative diffusion. If the opacity per gram stays fixed (a rough approximation), the escape rate scales like R/M (the light has less distance to go, but the density is higher, and diffusion cares more about the latter than the former), so the R cancels and you end up with constant luminosity as the Sun contracts and heats in the late stages of the pre-main-sequence phase. This behavior is called the "Henyey track." (Earlier stages involve complete convection, rather than radiative diffusion, and are called the "Hayashi track.")
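
A small symbolic sketch of that cancellation, keeping only the proportionalities stated in the post (constants dropped; sympy is used here just to carry the algebra):

```python
import sympy as sp

M, R = sp.symbols("M R", positive=True)

T = M / R                    # virial theorem: T ~ M/R
light_energy = T**4 * R**3   # radiation energy content ~ T^4 * volume = M^4 / R
escape_rate = R / M          # radiative-diffusion escape rate ~ R/M

L = sp.simplify(light_energy * escape_rate)
print(L)  # prints M**3: R has cancelled out
```

R drops out, so along the Henyey track the luminosity depends only on mass, which is why it stays nearly constant while the star contracts and its core heats.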
 
