Is it possible to calculate the temperature of a heat source based on the temperature increase of the air a certain distance away?

I was thinking the inverse-square law could be used, because I had found sources online (like this one explaining a classroom experiment) saying that the temperature increase an object experiences from thermal radiation follows the inverse-square law.

I was thinking that if the air, say, 4000 radii away (the heat source has a radius of ~4 m; the area of effect, based on the size of the storm alone, has a radius of no less than 17,000 m), experienced a temperature increase of 10 K, then the temperature of the heat source would be

(4000^2) * 10 K = 160,000,000 K

or (assuming the air starts off at 25 degrees C = 298 K)

(4000^2) * (298 K + 10 K) = (4000^2) * 308 K ~ 5,000,000,000 K.
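The arithmetic above can be sketched as follows. This is purely illustrative of the question's own assumption (that temperature scales with the inverse square of distance in source radii); it is not a claim that this is the correct physics.

```python
# Sketch of the estimate in the question. Assumes, as the question does,
# that T_source = (distance in source radii)^2 * T_air.
ratio = 4000.0    # distance in source radii (17,000 m / ~4 m, rounded down)
delta_T = 10.0    # K, measured temperature increase of the air
T_air = 298.0     # K, ambient air temperature (25 degrees C)

# Scaling only the temperature *increase*:
T_from_increase = ratio**2 * delta_T            # 1.6e8 K

# Scaling the *absolute* air temperature after the increase:
T_from_absolute = ratio**2 * (T_air + delta_T)  # ~4.9e9 K

print(f"{T_from_increase:.3g} K, {T_from_absolute:.3g} K")
```

Note the two approaches disagree by a factor of ~30, which already hints that applying the inverse-square law directly to temperature (rather than to radiated flux) is suspect.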

Am I completely off here? I focused on heat transfer through radiation because I've read that radiation takes over as the dominant form of heat transfer at high enough temperature differences. Would it really take a temperature of 100,000,000+ K to heat the atmosphere from so far away? (I know a temperature like this would do far more than simply heat the air in real life.)