1. The problem statement, all variables and given/known data

Over the last ~100 years, the average solar constant has increased by ~0.1 W/m^2. How much would the Earth's temperature change simply due to this increase in the solar constant?

2. Relevant equations

T_ground = ( [L * (1 - a)] / [4 * e * sigma] )^(1/4)

where
L is the incoming solar radiation at the Earth,
a is the albedo,
e is the emissivity,
sigma is the Stefan–Boltzmann constant.

We showed that a = 0.3, e = 0.6, L = 1350 W/m^2 gives us a realistic surface temperature.

3. The attempt at a solution

I attempted this by increasing the incoming radiation of the sun by 0.1, so it would be 1350.1 W/m^2, but I think this is wrong because the solar constant is like the constant for emissivity, so I would think that in some way, I have to calculate the new L, which is the incoming radiation of the sun.
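For what it's worth, the sensitivity of T_ground to a small change in L can be checked numerically. Below is a minimal Python sketch, assuming the equilibrium-temperature formula from part 2 with the given values a = 0.3, e = 0.6, and the standard Stefan–Boltzmann constant sigma = 5.67e-8 W m^-2 K^-4 (the function name `t_ground` is just for illustration):

```python
# Energy-balance surface temperature from part 2:
#   T = ( L * (1 - a) / (4 * e * sigma) )^(1/4)
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4


def t_ground(L, a=0.3, e=0.6):
    """Equilibrium surface temperature (K) for solar input L (W/m^2)."""
    return (L * (1 - a) / (4 * e * SIGMA)) ** 0.25


T0 = t_ground(1350.0)  # baseline solar constant
T1 = t_ground(1350.1)  # solar constant increased by 0.1 W/m^2
print(T0)       # roughly 288.7 K, a realistic surface temperature
print(T1 - T0)  # the resulting change is only a few millikelvin
```

Because T scales as L^(1/4), a fractional change dL/L produces a fractional change dT/T = (1/4) dL/L, so the temperature change is tiny: about (288.7 K / 4) * (0.1 / 1350) ~ 0.005 K.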