# Homework Help: Layer model

1. Oct 3, 2015

### TheMathNoob

1. The problem statement, all variables and given/known data
Over the last ~100 years, the average solar constant has increased by ~ 0.1 W/m2.
How much would the Earth’s temperature change simply due to this increase in the solar constant?

2. Relevant equations

T_ground = ( [L · (1 − a)] / [4 · e · σ] )^(1/4)

L is the incoming solar radiation at the Earth
a is the albedo
e is the emissivity
σ (sigma) is the Stefan–Boltzmann constant
We showed that a = 0.3, e = 0.6, L = 1350 W/m2 gives a realistic surface temperature.
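As a quick sanity check on those numbers, here is a short Python sketch of the formula above (assuming the standard value σ ≈ 5.67 × 10⁻⁸ W m⁻² K⁻⁴, which is not stated in the problem):

```python
# Single-layer greenhouse model: T_ground = [L(1 - a) / (4 * e * sigma)]^(1/4)
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4 (assumed standard value)

def t_ground(L, a=0.3, e=0.6):
    """Surface temperature (K) for incoming solar radiation L (W/m^2)."""
    return (L * (1 - a) / (4 * e * SIGMA)) ** 0.25

print(t_ground(1350))  # ~289 K, i.e. a realistic surface temperature
```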

3. The attempt at a solution
I attempted this by increasing the incoming solar radiation by 0.1 W/m2, so it would be 1350.1 W/m2, but I think this is wrong, because the solar constant seems like a constant in the same way the emissivity is. So I would think that I have to calculate the new L, the incoming solar radiation, in some other way.

Last edited: Oct 3, 2015
2. Oct 3, 2015

### SammyS

Staff Emeritus
You would be well advised to edit that post to put the font size back to normal (Size 4) and remove the bolding.

Last edited: Oct 3, 2015
3. Oct 3, 2015

4. Oct 3, 2015

### SammyS

Staff Emeritus
Is that 1/4 supposed to be an exponent for part, or all, of that expression?

Also: L is the solar constant.

5. Oct 3, 2015

Yes

6. Oct 4, 2015

### haruspex

Not sure what you mean. It is not some physical constant. Indeed, it probably should not be called a constant.

7. Oct 4, 2015

### HallsofIvy

Is this L the "old" or "new" value? In other words, is it calculated for the present or for 100 years ago? If it is the present value, subtract the given change, 0.1 W/m2. Calculate Tground with both the old and new values to see how much it has changed.

8. Oct 4, 2015

### TheMathNoob

In this case, L is the old value, so we have to calculate the new T by increasing L by 0.1, I think.

9. Oct 4, 2015

### SammyS

Staff Emeritus
No.

Just increase the old L value by 0.1 W/m2 .
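Putting the two suggestions together, a minimal sketch of the comparison (same formula as in post 1; assuming σ ≈ 5.67 × 10⁻⁸ W m⁻² K⁻⁴):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4 (assumed standard value)

def t_ground(L, a=0.3, e=0.6):
    """Surface temperature (K) from the single-layer model."""
    return (L * (1 - a) / (4 * e * SIGMA)) ** 0.25

# Temperature change from raising the solar constant by 0.1 W/m^2
dT = t_ground(1350.1) - t_ground(1350.0)
print(dT)  # ~0.005 K: tiny, because T scales only as L^(1/4)
```

The smallness of the answer follows from the quarter-power scaling: dT/T ≈ (1/4)(dL/L).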