cragwolf
Now, the Moon should receive roughly the same energy flux (OK, a little less, since it's slightly farther away at full moon, but that makes little difference). I initially treat the Moon as a disk that intercepts part of the expanding sphere of sunlight. This disk has area pi * R^2, where R is the radius of the Moon.

The Moon then reflects this sunlight, but now I treat it as a sphere, and only half the sphere (the half we on Earth can see) is involved in reflecting the sunlight. The Earth then intercepts part of this expanding hemisphere of moonlight.

The Earth thus captures a flux of:

1400 * (pi * R^2) / (2 * pi * d^2)

where d is the distance of the Moon from the Earth.

Oh, but I forgot the albedo of the Moon, which is roughly 0.07, so the flux is actually:

1400 * 0.07 * (pi * R^2) / (2 * pi * d^2)
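The formula above can be checked numerically. The lunar radius and Earth-Moon distance below are standard values I'm plugging in myself, not numbers from the post:

```python
import math

S = 1400.0      # solar flux at the Moon, W/m^2 (from the post)
albedo = 0.07   # lunar albedo (from the post)
R = 1.737e6     # lunar radius in m (assumed standard value)
d = 3.844e8     # mean Earth-Moon distance in m (assumed standard value)

# Flux at Earth: reflected power spread over a hemisphere of radius d.
# The pi's cancel, leaving S * albedo * R^2 / (2 * d^2).
flux = S * albedo * (math.pi * R**2) / (2 * math.pi * d**2)

ratio = S / flux
print(f"moonlight flux: {flux:.2e} W/m^2")
print(f"sunlight/moonlight ratio: {ratio:.3e}")
```

With these numbers the ratio comes out very close to 1.4 million, matching the rough figure below.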

Or roughly:

1400 * (1 / 1400000)

In other words, moonlight is 1400000 times fainter than sunlight. But when I look up the magnitudes:

Sun = -26.7

Moon = -12.6

which is a difference of 14.1 magnitudes. Since each magnitude is a factor of 10^(1/2.5), that corresponds to a flux ratio of 10^(14.1/2.5), or roughly 436000 to 1. So I'm off by a factor of about 3. This difference has me worried, and I'm wondering if I did the physics wrong.
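The magnitude-to-ratio conversion can be spelled out as a quick sanity check (this is just the standard Pogson relation applied to the two magnitudes quoted above):

```python
m_sun = -26.7   # apparent magnitude of the Sun (from the post)
m_moon = -12.6  # apparent magnitude of the full Moon (from the post)

delta_m = m_moon - m_sun          # 14.1 magnitudes
ratio = 10 ** (delta_m / 2.5)     # Pogson relation: 2.5 mag per factor of 10

print(f"observed sunlight/moonlight ratio: {ratio:.3e}")
print(f"discrepancy vs. the 1.4e6 estimate: {1.4e6 / ratio:.2f}x")
```

The observed ratio is about 4.4e5, so the hemisphere-reflection estimate of 1.4e6 overshoots by a factor of roughly 3.2.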