Relative Humidity (RH) affecting atmospheric attenuation of light?

AI Thread Summary
The discussion focuses on how relative humidity (RH) impacts atmospheric attenuation of light, particularly during sunset and moonrise/moonset scenarios. The inquiry compares conditions of 0% RH to 100% RH, emphasizing the potential variations in lux levels under these extremes. Participants note that the effects of humidity on light attenuation may vary by frequency within the visible spectrum, suggesting that different colors of light could be affected differently. There is a suggestion that absolute humidity, rather than relative humidity, might be a more significant factor in understanding light absorption and scattering. The conversation highlights the complexity of the topic and the need for further research to clarify these effects.
sistruguru
Ok, so I have another question that online searches have not been able to answer. Everything I've found references sound waves, RF signals, or electromagnetic waves in general. I know that the reason we can view sunsets and sunrises safely is atmospheric attenuation - the amount of "material" (water vapor, pollution, topography, foliage, etc.) in the path, and of course the more oblique the angle, the more of that material the light has to travel through - but I'm trying to find a publication that teases out how relative humidity (alone) affects the degree of atmospheric attenuation. Say we compare a sunset/sunrise in Arizona at 15% RH with a sunset/sunrise in SC at 90% RH. How much of an impact does that added humidity have? To make it easy with extremes, let's just compare an RH of 0% to an RH of 100%.

As always, any insight helps.
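For a rough feel for why the low elevation angle matters so much here, the sketch below applies the Beer-Lambert law with the simple plane-parallel airmass approximation. The two zenith optical depths are made-up placeholder values for a "dry" versus "humid" column, not measured figures, and the 1/cos approximation breaks down very close to the horizon, so treat it purely as an illustration of how path length amplifies whatever extra attenuation the moisture adds.

```python
import math

def transmission(tau_zenith, elevation_deg):
    """Fraction of light transmitted for a given zenith optical depth,
    using Beer-Lambert with the plane-parallel airmass m = 1/cos(zenith)."""
    zenith = math.radians(90.0 - elevation_deg)
    airmass = 1.0 / math.cos(zenith)  # diverges very near the horizon
    return math.exp(-tau_zenith * airmass)

# Placeholder zenith optical depths for a "dry" vs. "humid" column (illustrative only)
tau_dry, tau_humid = 0.25, 0.35

for elev in (90, 30, 10, 5):
    print(f"elevation {elev:2d} deg: dry {transmission(tau_dry, elev):.3f}, "
          f"humid {transmission(tau_humid, elev):.3f}")
```

The point of the sketch is only that a modest difference in optical depth at the zenith becomes a large difference in transmitted light near the horizon, because the slant path is many times longer.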
 
Just so no one thinks the difference might be negligible, I'm asking this in reference to moonlight, at moonrise/moonset, across the various phases of the moon. Of course a full moon would be the easiest to envision. So the question would be: on the night of a full moon, with the moon at 170 degrees to the horizon, would the amount of lux differ between an environment with 100% humidity and one with 0% humidity? Let's consider both locations to be a reasonable distance from any sources of air pollution, and both nearly flat, so there is minimal interference from topography or foliage.
 
^That first link seemed to be mostly focused on electromagnetic waves, correct? It only mentions light a couple of times in the article.

I used your search string and came up with several new publications I had not seen. I'll dig through those. Most seem to dance around the immediate question, but I suppose that is to be expected. Perhaps it is just a more involved answer.
 
But light is made of electromagnetic waves. It differs from radio only in frequency.

Even within the visible spectrum, the answer for red light may be different than for green, and different again for blue.

Good luck digging. You're sure to learn something interesting.
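As an illustration of that wavelength dependence, here is a small sketch comparing the relative Rayleigh scattering strength of nominal red, green, and blue wavelengths using the familiar 1/λ⁴ scaling. This describes scattering by air molecules in general rather than water vapour absorption specifically (water's main absorption bands sit in the near-infrared, with weaker bands in the red), so it only shows how strongly attenuation can vary across the visible band, not the humidity effect itself.

```python
# Relative Rayleigh scattering strength scales roughly as 1/wavelength^4,
# so shorter (bluer) wavelengths are scattered out of the beam more strongly.
wavelengths_nm = {"blue": 450, "green": 550, "red": 650}

reference = wavelengths_nm["green"]
for color, wl in wavelengths_nm.items():
    relative = (reference / wl) ** 4
    print(f"{color:5s} ({wl} nm): {relative:.2f} x the scattering of green light")
```

With these nominal wavelengths, blue is scattered roughly twice as strongly as green, and red only about half as strongly, which is why the low sun and moon look reddened.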
 
I'd be surprised if relative humidity were the key factor, unless it was so high that mist was starting to form. I think you should be asking about absolute humidity - the actual amount of water vapour in the air.

Edit: There's about twice as much water vapour in the air at 50% RH and room temperature as at 100% RH at the freezing point.
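To put rough numbers on that, here is a short sketch converting RH and temperature to absolute humidity (water vapour density) using the Magnus approximation for saturation vapour pressure and the ideal gas law. The coefficients are the commonly quoted Magnus values, so the output is approximate.

```python
import math

def saturation_vapor_pressure_pa(temp_c):
    """Magnus approximation for saturation vapour pressure over water, in Pa."""
    return 611.2 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def absolute_humidity_g_per_m3(rh_percent, temp_c):
    """Water vapour density (g/m^3) from relative humidity and temperature,
    via the ideal gas law with R_v = 461.5 J/(kg K)."""
    e = (rh_percent / 100.0) * saturation_vapor_pressure_pa(temp_c)
    temp_k = temp_c + 273.15
    return 1000.0 * e / (461.5 * temp_k)

print(absolute_humidity_g_per_m3(50, 20))   # ~8.6 g/m^3 at 50% RH, 20 C
print(absolute_humidity_g_per_m3(100, 0))   # ~4.8 g/m^3 at 100% RH, 0 C
```

Running it gives roughly 8.6 g/m³ at 50% RH and 20 °C versus about 4.8 g/m³ at 100% RH and 0 °C, consistent with the "about twice as much" figure above.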
 
Interesting, thanks for that suggestion. I'll look into that further.
 
