# Fictional Fireball Creates Warm Front

I was debating on another forum how hot a heat source in a cartoon would have to be to cause a temperature increase in the atmosphere. The scene in question shows a "fireball" (not actually fire) heating the air to create a warm front immediately; it clashes with an ice beam that creates its own cold front, forming a storm in the process. (It was explicitly stated that the attacks created their own warm and cold fronts.)

Is it possible to calculate the temperature of the heat source based on the temperature increase of the air a certain distance away?

I was thinking the inverse square law could be used, because I had found sources online (like one explaining a classroom experiment) saying that the temperature increase an object experiences from thermal radiation follows an inverse-square law.

I was thinking that if the air, say, 4000 source radii away (the heat source is ~4 m across; the area of effect, judging by the size of the storm alone, has a radius of no less than 17,000 m) experienced a temperature increase of 10 K, the temperature of the heat source would be

(4000^2) * 10 K = 160,000,000 K

or (assuming the air starts off at 25 degrees C = 298 K)

(4000^2) * (298 K + 10 K) = (4000^2) * 308 K ~ 5,000,000,000 K.

Am I completely off here? I focused on heat transfer through radiation because I've read that radiation takes over as the dominant form of heat transfer with high enough temperature differences. Would it take a temperature of 100,000,000+ degrees to heat the atmosphere from so far away? (I know a temperature like this would do far more than simply heat the air in real life.)
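For reference, the naive arithmetic above takes only a few lines to reproduce (a sketch of the scaling as posted, not a physically valid model; the replies below explain why temperature itself does not follow an inverse-square law):

```python
# Naive inverse-square scaling of temperature, exactly as in the question.
r_source = 4.0        # radius of the heat source in metres (from the post)
r_effect = 17_000.0   # radius of the affected region in metres
ratio = r_effect / r_source   # ~4250, rounded down to 4000 in the post

delta_t = 10.0        # temperature increase in kelvin
t_ambient = 298.0     # 25 degrees C in kelvin

naive_from_delta = 4000**2 * delta_t                # scale only the increase
naive_from_total = 4000**2 * (t_ambient + delta_t)  # scale the absolute temperature

print(f"{naive_from_delta:.3g} K")  # 1.6e+08 K
print(f"{naive_from_total:.3g} K")  # 4.93e+09 K
```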

## Answers and Replies

**mfb** (Mentor):
That doesn't work. Your first approach would give the temperature needed for a small volume of air (radius of about 4 m) that, once it reaches equilibrium with the air at the 17 km radius (edit: but only with a 4-metre shell there), leads to a temperature increase of 10 K. And that only if we assume the heat capacity stays the same at those temperatures, which is not true.
You also might have to wait days to reach such an equilibrium.

If you want radiation to heat the air, you need radiation that gets absorbed by air - but then the radiation falls off faster than an inverse square law due to this absorption. You get the most heating if your absorption length is about 17 km, so that 1/e of the initial radiation reaches that distance. To heat air by 10 degrees you need roughly 13 J/m^3; multiplied by the absorption length, you need radiation of 220 kJ/m^2. Integrated over a spherical shell with a radius of 17 km and multiplied by the factor of e, that is 2.2*10^15 J, about 500 kT of TNT equivalent - a large nuclear fission bomb. Such a bomb or similar device would turn the air around it into plasma, increasing the absorption of electromagnetic radiation and therefore reducing the radiation remaining at a distance of 17 km.

As blackbody radiation, if you want to release that energy within an hour you would need e*4000^2*220 kJ/m^2 / (3600 s) = 2.5 GW/m^2, or about 15,000 K. At this temperature, emission will be mainly in the infrared and visible range, where the atmosphere is quite transparent, so we don't get the desired absorption length. The object would have to be hotter, but not that much. It would need a powerful internal energy source to keep this temperature for an hour.
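The arithmetic in the two paragraphs above can be checked step by step. This is a sketch using the figures exactly as posted; note that a later reply in the thread corrects the 13 J/m^3 input by a factor of 1000:

```python
import math

r = 17_000.0            # absorption length = storm radius, metres
heat_density = 13.0     # J/m^3 to warm air by 10 K (as posted; corrected later in the thread)
fluence = heat_density * r                  # ~220 kJ/m^2 needed at 17 km
shell_area = 4 * math.pi * r**2             # spherical shell at 17 km
total_energy = math.e * shell_area * fluence  # ~2.2e15 J
kt_tnt = total_energy / 4.184e12            # kilotons of TNT equivalent, ~520 kT

# Blackbody flux at the ~4 m source surface, releasing the energy over one hour.
flux = math.e * 4000**2 * fluence / 3600    # ~2.7e9 W/m^2 (the post rounds to 2.5 GW/m^2)
sigma = 5.670e-8                            # Stefan-Boltzmann constant, W m^-2 K^-4
temperature = (flux / sigma) ** 0.25        # ~14,700 K, i.e. the post's ~15,000 K

print(f"{total_energy:.1e} J, {kt_tnt:.0f} kT, {temperature:.0f} K")
```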

What if the temperature increase happened in a second (or less)? Would that affect the absorption of radiation significantly?

Does this mean temperature increase falls slower than the square of the distance from the source?

**mfb** (Mentor):
> What if the temperature increase happened in a second (or less)?

Scaling up the power by a factor of 3600 means scaling up the temperature by a factor of 3600^(1/4), about 7.7. It would lead to significant UV radiation, which gets absorbed more strongly.

> Does this mean temperature increase falls slower than the square of the distance from the source?

Faster, as the intensity drops faster than 1/r^2.
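The quarter-power scaling in the reply above follows from the Stefan-Boltzmann law: radiated power per unit area is sigma*T^4, so at a fixed emitting area, scaling the power by a factor k scales the temperature by k^(1/4). A quick check:

```python
# Stefan-Boltzmann scaling: P = sigma * A * T^4, so T scales as P**(1/4)
# when the emitting area A is held fixed.
k = 3600                  # releasing the same energy in 1 s instead of 1 hour
scale = k ** 0.25
print(round(scale, 2))    # 7.75, the "about 7.7" in the reply
```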

> Faster, as the intensity drops faster than 1/r^2.
I was confused because the answer came out to be 15,000 K and 1/r^2 gave 1/16,000,000.

Why did you use 13 J/m^3 for heating the air by 10 K? I thought heating air requires around 1005 J per kg per kelvin, and a Google search gives 1.225 kg/m^3 for the density of air at sea level.
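The poster's own numbers give the volumetric figure directly, and they show the factor-of-1000 discrepancy that the next reply acknowledges:

```python
c_p = 1005.0     # specific heat of air at constant pressure, J/(kg*K)
rho = 1.225      # density of air at sea level, kg/m^3
delta_t = 10.0   # desired temperature increase, K

energy_per_m3 = c_p * rho * delta_t
print(f"{energy_per_m3:.0f} J/m^3")  # ~12,300 J/m^3, i.e. about 13 kJ/m^3, not 13 J/m^3
```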

**mfb** (Mentor):
> I was confused because the answer came out to be 15,000 K and 1/r^2 gave 1/16,000,000.
Well, that approach does not work. There is no "Kelvins radiating outwards".

Oops, forgot a factor of 1000. 2.2*10^18 J, or 500 MT - that is a gigantic fusion bomb. The blackbody temperature increases to 85,000 K (1 hour) or 650,000 K (1 s).
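With the factor of 1000 restored, the corrected figures can be reproduced with the same method as before (a sketch; small differences from the quoted values come from rounding):

```python
import math

r = 17_000.0                # absorption length = storm radius, metres
heat_density = 12_311.0     # J/m^3 for a 10 K rise: 1005 J/(kg*K) * 1.225 kg/m^3 * 10 K
fluence = heat_density * r  # ~210 MJ/m^2 needed at 17 km

total_energy = math.e * 4 * math.pi * r**2 * fluence  # ~2.1e18 J
mt_tnt = total_energy / 4.184e15                      # ~500 MT of TNT equivalent

# Blackbody temperatures for releasing the energy in 1 hour vs 1 second.
sigma = 5.670e-8
flux_1h = math.e * 4000**2 * fluence / 3600
flux_1s = flux_1h * 3600
t_1h = (flux_1h / sigma) ** 0.25   # ~82,000 K, close to the quoted 85,000 K
t_1s = (flux_1s / sigma) ** 0.25   # ~630,000 K, close to the quoted 650,000 K

print(f"{total_energy:.1e} J, {mt_tnt:.0f} MT, {t_1h:.0f} K, {t_1s:.0f} K")
```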

I think I understand this now. Thanks for clearing this up.

A 'front' in the meteorological sense is a boundary between air masses having differing temperature and humidity.
This often results in convective activity and precipitation.
Weather fronts are generally slow-moving systems and can persist for many days.
Purely radiative warming should not create a distinct boundary of that kind; it would propagate outward from the source at the speed of light (although rapidly losing intensity along the way).

I was thinking I could get a lower bound on the hypothetical temperature using radiation, because I'm pretty sure including convection would give a bigger result.