That's my simple question!
Is there some reason it shouldn't?
Hi bystander - I read that infrared could not penetrate the ocean but only caused evaporation at the surface skin. If that is true, then infrared would not be able to heat the ocean the way shortwave radiation does. Is that true? Thanks
The enthalpy of vaporization has to come from somewhere --- you don't suppose IR absorption by water might be the source?
It absolutely is the source in this example. But the energy from the incoming IR is not retained by the ocean as heat - it's released through evaporation into the atmosphere. So is it correct to think that IR cannot heat (i.e. increase the temperature of) the ocean?
No. IR does heat water, but at a very low rate - not much energy compared to the heat capacity of water.
Consider also, conduction of heat from the "warm" surface to cooler water layer below the surface. To quote a faculty member from grad school days, "Every calorie looks the same once it's off the bus." Assigning origins and destinations to energies can be misleading.
Thanks Doug. The IPCC estimates that a doubling of CO2 in the atmosphere would increase radiative forcing by 3.7 W/m2, assuming clear-sky conditions. This could only heat the top few molecules of water initially, if water is as impermeable to infrared as I understand. Then, as Bystander says, any heating at the surface which does not evaporate surface-skin molecules could be conducted to (or just mixed with) deeper levels. Could you point me in the direction of calculations as to what the magnitude of this heating might be? I completely understand that measuring this through real-world observations must be very difficult, as the calories are off the bus by then, but someone must have modelled this or tested it in a controlled environment?
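For a feel for the magnitude being asked about, here's a back-of-envelope sketch (not a model): if the full 3.7 W/m2 were absorbed entirely in a thin surface layer with no evaporation, conduction, or mixing, how fast would that layer warm? The layer depths are assumptions chosen for illustration.

```python
# Back-of-envelope estimate: warming rate of a thin water layer that
# absorbs the full CO2-doubling forcing with no losses whatsoever.
RHO = 1000.0      # density of water, kg/m^3
C_P = 4186.0      # specific heat of water, J/(kg K)
FORCING = 3.7     # IPCC radiative forcing for doubled CO2, W/m^2

def warming_rate_k_per_hour(layer_depth_m):
    """Temperature rise rate (K/hour) of a layer absorbing FORCING, no losses."""
    mass_per_m2 = RHO * layer_depth_m            # kg of water under 1 m^2
    return FORCING / (mass_per_m2 * C_P) * 3600  # K per hour

for depth in (0.001, 0.01, 1.0):  # 1 mm skin, 1 cm, 1 m
    print(f"{depth*1000:7.1f} mm layer: {warming_rate_k_per_hour(depth):.4f} K/hour")
```

A 1 mm skin would warm by roughly 3 K/hour under these (unrealistic) no-loss assumptions, which is exactly why the fate of that energy - evaporation versus conduction/mixing downward - dominates the real answer.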
The IPCC spins at best and lies at worst. Impeached.
Quick and dirty demonstration? Two liter soda bottle filled with water; measure T in the morning; leave in sun all day; measure T in late afternoon.
But sunlight is shortwave radiation! Bystander - the experiment you suggest would only tell me what I already know: that shortwave radiation heats water. I want to understand whether infrared can heat water!!
Bystander - I found the following passage on a CAGW skeptic's website (http://climaterealists.com/index.php?id=4245)
"However the effect of downwelling infrared is always to use up all the infrared in increasing the temperature of the ocean surface molecules whilst leaving nothing in reserve to provide the extra energy required (the latent heat of evaporation) when the change of state occurs from water to vapour. That extra energy requirement is taken from the medium (water or air) in which it is most readily available. If the water is warmer then most will come from the water. If the air is warmer then most will come from the air. However over the Earth as a whole the water is nearly always warmer than the air (due to solar input) so inevitably the average global energy flow is from oceans to air via that latent heat of evaporation in the air and the energy needed is taken from the water. This leads to a thin (1mm deep) layer of cooler water over the oceans worldwide and below the evaporative region that is some 0.3C cooler than the ocean bulk below."
The last sentence does seem to be validated with this paper: http://www.nature.com/nature/journal/v358/n6389/abs/358738a0.html
But is the rest fair?
Doug - I'm not sure I'm reading your graph correctly, but it does seem to show that sunlight at sea level is a mixture of UV, visible and infrared wavelengths. If so, putting a bottle of water in the sunlight and measuring the temperature change over a day isn't going to tell me anything about the effectiveness of IR at heating water.
The shortwave (visible) is going right on through. That's observation one. A day in the sun will bring the bottle up to 60-70 C. That's observation two. From the Planck radiation law, roughly 75% of the energy in sunlight is emitted at wavelengths longer than the peak-intensity wavelength of ~500 nm (green).
Dear Bystander - I appreciate your patience with me on this and I am grateful. That said, a bottle-in-the-sunlight "experiment" is a blind alley if we're trying to understand the oceans. Low evaporation rates, conduction of heat through the plastic of the bottle, and diffraction of light through the curved surfaces of the bottle all make such an experiment an extremely poor way to model the ocean-temperature/IR relationship. Furthermore, your comment on Planck radiation law calculations does not address the question of whether incoming IR causes evaporation in the ocean skin layer rather than an increase in ocean temperature.
To get back on track do you agree that this graph is representative of the absorption spectrum of liquid water?
If so, it's clear that IR does not penetrate below 1 cm, and the wavelengths of back radiation from the atmosphere penetrate much less than that (on the order of 10^-5 metres).
As per the nature article I referenced above (http://www.nature.com/nature/journal/v358/n6389/abs/358738a0.html) the top 1mm of the ocean is typically 0.3C cooler than the bulk mixed layer. Forgive me for being slow - but if most of the total IR radiation and all of the back radiation from the atmosphere penetrates less than the depth of the ocean skin layer that is cooler than the water below - how can IR increase the temperature of the ocean?
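The penetration depths in question follow from the Beer-Lambert law, I(z) = I0 * exp(-alpha * z), where the 1/e depth is 1/alpha. The absorption coefficients below are rough order-of-magnitude values read off published liquid-water absorption spectra (such as the plot referenced above) - treat them as illustrative, not authoritative:

```python
# 1/e penetration depth d = 1/alpha from the Beer-Lambert law.
# Coefficients are order-of-magnitude assumptions for liquid water.
ALPHA = {                                   # wavelength -> absorption coeff, 1/m
    "500 nm (visible)":                 0.025,
    "1 um (near IR)":                   40.0,
    "10 um (thermal IR / back radiation)": 8e4,
}

for band, alpha in ALPHA.items():
    depth = 1.0 / alpha                     # metres to attenuate to 1/e (~37%)
    print(f"{band:40s} 1/e depth ~ {depth:.2e} m")
```

On these numbers, visible light penetrates tens of metres, near IR a few centimetres, and thermal-IR back radiation only ~10 micrometres - well inside the cool ~1 mm skin layer, which is the crux of the question.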
These observations suggest to me that almost all of IR radiation incident on the ocean causes evaporation rather than an increase in temperature of the ocean. Where am I going wrong?
... not "wrong," into a "semantic ditch," perhaps. If you're going to give me all the solar radiation that penetrates further than 1 mm by the Kebes plot (wavelengths shorter than 2 μm), you've given me 80-90% of the IR. If you define IR as only that radiation which is absorbed in 1 mm or less, and ignore the 0.8-2 μm gap between visible and IR acknowledged by a specific argument, you're losing a lot of energy.
Thanks Bystander. I think I get where you're coming from. I'll give you whatever IR you want! Thanks to this discussion I think I can now refine my original question a little more clearly:
Can an increase in Atmospheric back radiation (from say increases in atmospheric concentrations of greenhouse gases) lead to increases in ocean temperatures?
For anyone else that is interested in this topic I found this set of articles (and associated comments) really useful.
As with most other discussions in climatology the answer isn't simple . . .
What is "back radiation?"
It's downward longwave radiation: IR radiated from the atmosphere down to the surface of the Earth, not IR direct from the sun.
Heat moves from (hotter/same T/cooler) body/system to (hotter/same T/cooler) body/system?
Heat moves from hot to cold obviously.
To summarise (and no doubt over-generalise!) one side of the argument seems to posit that:
solar radiation heats the ocean, but atmospheric radiation only heats the top few molecules. So increased Downward Longwave Radiation (DLR) is unable to transfer any additional heat into the bulk of the ocean; instead the energy goes into evaporating the skin layer into water vapour.
The other side of the argument seems to postulate that:
additional downward longwave radiation must alter the IR flux at the surface of the ocean, leading to more IR being trapped in lower ocean layers and maintaining a higher water temperature than would be the case with less downward longwave radiation.
So the downward longwave radiation doesn't heat the ocean, but it slows the cooling that would happen with less DLR.
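The second argument can be illustrated with a toy surface energy budget: the extra DLR doesn't need to penetrate deep, it just reduces the net longwave loss from the skin, so the bulk below cools more slowly. The skin temperature, emissivity, and DLR values below are illustrative assumptions, not measurements:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EPS = 0.97        # assumed longwave emissivity of sea water

def net_longwave_loss(t_skin_k, dlr):
    """Net longwave loss from the surface: emission minus absorbed DLR, W/m^2."""
    return EPS * SIGMA * t_skin_k**4 - EPS * dlr

t_skin = 290.0            # example skin temperature, K (assumption)
dlr_now = 340.0           # illustrative downward longwave flux, W/m^2
dlr_plus = dlr_now + 3.7  # with the forcing attributed to doubled CO2

print(net_longwave_loss(t_skin, dlr_now))   # net loss today
print(net_longwave_loss(t_skin, dlr_plus))  # smaller net loss with more DLR
```

In this sketch the ocean surface loses about 3.6 W/m2 less under the higher DLR, which is the "slowed cooling" mechanism in a nutshell; whether that shows up as a warmer bulk ocean depends on how the skin, evaporation, and mixing respond.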
And quite frankly I'm confused! What do you think?
I should also add that I'm very much aware that none of the stuff I've read on this is in the peer-reviewed literature. Wozniak et al (2013), "Light absorption in sea water", looks like it might be just what I need - but it's paywalled . . . any other suggestions would be gratefully received
Not being "flip" with you --- just wanted to be sure we're both working from the same initial set of ideas/postulates/principles.
Welcome to the wonderful world of energy "balances" in non-equilibrium systems. The system we're "analyzing" (hah!) has as heat sources the sun, ~10^-4 steradians at ~5800 K, or 1-1.3 kW/m^2 at the Earth's surface, and a crustal heat leak of 10-30 mW/m^2, which is negligible. The heat sink is 4π steradians at ~4 K, the CMB. What else do we know? Some fraction of incident solar radiation is reflected (what fraction is subject to some uncertainty); some fraction is transmitted, very little through the atmospheric "halo", but enough to illuminate an otherwise totally eclipsed moon; and some fraction is absorbed by the atmo-, hydro-, and lithospheres, exchanged by conduction, convection, and radiation, and radiated to the CMB.
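The ~10^-4 steradian and ~1.3 kW/m^2 figures hang together: the solar disc's radiance times its solid angle gives the irradiance at Earth. A quick check using standard values (solar radius and Earth-sun distance assumed, atmosphere ignored):

```python
import math

SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
T_SUN = 5800.0         # effective solar surface temperature, K
R_SUN = 6.96e8         # solar radius, m
AU = 1.496e11          # mean Earth-sun distance, m

# Radiance of the solar disc times the solid angle it subtends at Earth
# gives the top-of-atmosphere irradiance.
radiance = SIGMA * T_SUN**4 / math.pi       # W/(m^2 sr)
solid_angle = math.pi * (R_SUN / AU)**2     # ~7e-5 sr, i.e. order 10^-4
irradiance = radiance * solid_angle         # W/m^2 before atmospheric losses

print(f"solid angle of sun: {solid_angle:.2e} sr")
print(f"irradiance: {irradiance:.0f} W/m^2")
```

The result is roughly 1.39 kW/m^2 at the top of the atmosphere, which atmospheric absorption and scattering trim toward the 1-1.3 kW/m^2 quoted for the surface.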
What are the exchange rates for each mechanism among the spheres? How good are the models? How good are the measurements? How do we know which "bus" which calorie came in on?
Thanks Bystander, that makes me feel better, but I can't help feeling disappointed that certain reputable scientists describe the science as settled. It seems, then, that the suggestion within the CAGW community that the recent plateau/pause/hiatus in global mean atmospheric temperatures might be explained by the "missing heat" being "trapped" in the deep oceans is not a scientific conclusion from research but merely a hypothesis. Furthermore, it seems that proving rising atmospheric CO2 concentrations can heat the ocean is also very difficult. It remains only a hypothesis.