If I place a mirror on the ground facing upward, what color will those who observe it from space see?
"Reflected light has to travel a lot farther, but I don't know if that is enough to make the reflection in the mirror turn red."
Distance does not make light turn red.
They'll see blue.
The mirror reflects what it sees.
60 miles of atmosphere is not tintless, whether you're looking up from below or down from above.
No. If you took a photograph of the sky and placed it on the ground, would you expect it to look like the mirror? If not, why not?
Aren't the colors in this photo mainly due to the colors of the objects themselves? In a mirror, the color observed should be almost entirely that of the light scattered by the atmosphere.
"For the same reason that the sky at sunset is red, I expect that the color seen by the mirror may be red, or a color between blue and red. Am I wrong?"
You are talking about the effect of Rayleigh scattering and atmospheric dust. For a long optical path, blue will be scattered away preferentially while the reds and oranges are less affected. Yes, this occurs at sunrise and sunset, when the optical path through the atmosphere is perhaps hundreds of kilometers long.
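A rough numerical sketch of this point, assuming Rayleigh scattering scales as 1/λ⁴ and borrowing the ~1% per km figure for green light quoted later in the thread; the 20 km and 300 km path lengths are illustrative guesses, not values from the posts:

```python
import math

# Fraction of direct light surviving a path through sea-level air, using
# Beer-Lambert with a Rayleigh coefficient of ~1% per km at 500 nm, scaled
# by (500/lambda)^4. Path lengths are illustrative only.
def rayleigh_coeff_per_km(wavelength_nm, k500=0.01):
    return k500 * (500.0 / wavelength_nm) ** 4

def transmission(wavelength_nm, path_km):
    return math.exp(-rayleigh_coeff_per_km(wavelength_nm) * path_km)

for path_km, label in [(20, "near-vertical path"), (300, "sunset grazing path")]:
    for wl in (450, 550, 650):  # blue, green, red
        print(f"{label:>20} {wl} nm: {transmission(wl, path_km):.2f} transmitted")
```

For the short near-vertical path the blue loses only a modest fraction (~25%), while for the hundreds-of-kilometres sunset path the blue is almost entirely scattered out and a third of the red survives, which is the asymmetry being described.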
"Aren't the colors in this photo mainly due to the colors of the objects themselves? In a mirror, the color observed should be almost entirely that of the light scattered by the atmosphere."
In a mirror, the color observed should be a combination of three things.
"Then continents would look blue seen from the ISS. Or blue-tinted. That is not seen from space."
I am pretty sure that uncorrected images of Earth from space are blue-tinted.
"There is a useful magic number to keep in your quiver of useful numbers: for mid-spectrum (green, ~500 nm) light, the Rayleigh scattering rate for air at STP is ~1% per km."
If the atmospheric effects (the blue sky) only extend to about 20 km altitude, the absorption would be no more than, say, 20%. So the contribution of what you see via the mirror could be perhaps 80% of what the space observer would see from direct scattering of the Sun. Using that crude estimate, if there were a nearby reference black area on the ground, the result from the mirror would be around 180% relative to the black area. That would probably make the mirror appear a bit, but not much, brighter than clean snow - noticeable under the right conditions.
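The arithmetic behind that estimate, written out as a sketch (the ~1% per km rate, the 20 km layer and the assumption that the reflected skylight is comparable to the directly scattered light are taken from the post above; everything else is simplification):

```python
# Back-of-envelope version of the estimate above, not a radiative-transfer
# calculation. The scattered skylight a space observer sees over a perfectly
# black patch is taken as the reference level 1.0.
scatter_per_km = 0.01        # ~1% per km for green light at STP (quoted above)
layer_km = 20                # assumed thickness of the "blue sky" layer

one_way_loss = scatter_per_km * layer_km     # ~20% lost on the trip back up
mirror_term = 1.0 - one_way_loss             # ~80%: reflected skylight that
                                             # survives the return trip

black_patch = 1.0                            # direct atmospheric scattering only
mirror_patch = black_patch + mirror_term     # scattering plus the mirror

print(f"one-way loss        : {one_way_loss:.0%}")
print(f"mirror vs black area: {mirror_patch:.0%}")  # ~180%, as in the post
```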
"Using that crude estimate, if there were a nearby reference black area on the ground, the result from the mirror would be around 180% relative to the black area."
Shadows on satellite images are very black. Shadows are really blue, since they are illuminated by the blue of the whole sky but not by the Sun. I suspect the blue is removed from the entire image by a digital haze filter.
"Shadows are really blue, ..."
I think this must be a subjective thing. If you look at the sky, away from the Sun, then depending on which precise direction you are looking you will see a 'blue' that is the result of the spectrum being tilted toward high frequencies. Look at the RGB values in any of your most impressive blue-sky photos and there is still a lot of R and G. About four times as much far-blue light is scattered as far-red light but, between those extremes, the levels of the other wavelengths follow a slightly curved line, so the integrated effect is to desaturate that 4:1 ratio. Also, we never look directly at the Sun, and the sky doesn't change colour when we stand in the shadow of a wall.
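The ~4:1 figure can be checked by evaluating the 1/λ⁴ dependence across the visible band; here "far blue" and "far red" are taken as roughly 450 nm and 650 nm (my assumption for where typical camera B and R channels peak), and the solar spectrum and sensor response are ignored:

```python
# Relative Rayleigh-scattered intensity, proportional to 1/lambda^4,
# normalized to 650 nm. Only the scattering term is shown; the source
# spectrum and the eye/sensor response are ignored.
ref = 650.0 ** -4
for wl_nm in range(450, 651, 50):
    print(f"{wl_nm} nm: {(wl_nm ** -4) / ref:.2f} x the 650 nm level")
```

The intermediate wavelengths still carry a substantial fraction of the blue level, which is the desaturation argument: the G and R channels are far from zero even for a deep blue sky.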
"Shadows on satellite images are very black."
What shadows? Shadows on objects like the craft are not subject to scatter, so they can be very contrasty. The shadows on the Earth would have to be the day/night terminator, but the shadow of the Earth on the Moon (a total eclipse) makes the Moon a dim red colour yet still very visible.
"What shadows?"
The shadows you see in images of the Earth, such as Google Earth, taken from satellites during the day.
"The shadows you see in images of the Earth, such as Google Earth, taken from satellites during the day."
Google Earth is fiddled with from beginning to end to make people watch it and to bring out the features they want. Also, the images are compressed so you can view them on low bandwidth. Entertaining, but I have often found my Times World Atlas is better (loads of hand drawing).
The sky looks blue, and that light illuminates whatever is in the shadow of objects on the Earth. If an object in the shadow is white, then it will look blue like the sky.
"Now if you're asking what the image seen in the mirror is, that's a different story."
So, someone sees something but doesn't realize it's in a mirror. What colour is it? The thread title may not be too precise, but the Summary explains the question that's really being asked. Sounds like the OP could be a successful ad man - he grabbed our attention.
"I think the black of space will make a bluish tint very hard to see."
Then why does the sky look blue against the black of space, when viewed from Earth?
"I have another question: if I have a photo of the sky taken with a smartphone, can I get physically meaningful data from the colors in the picture? Or is the RGB matrix I get not usable for anything?"
There is useful information. In particular, you get three numbers (R, G, B). The exact calibration of these numbers will be far more useful if you have a white (or known uniform gray) reference image and perhaps a "black" image. Of course, if you have a JPEG or another compression scheme it is more complicated.
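A minimal sketch of that suggestion, assuming Pillow and NumPy are available; "sky.jpg" and "white_card.jpg" are placeholder file names, and since JPEGs are gamma-encoded and compressed the numbers should be treated as rough relative values, not calibrated radiometry:

```python
import numpy as np
from PIL import Image

# Average RGB of a sky photo, normalized against a white/grey reference frame.
sky = np.asarray(Image.open("sky.jpg").convert("RGB"), dtype=float)
white = np.asarray(Image.open("white_card.jpg").convert("RGB"), dtype=float)

sky_mean = sky.reshape(-1, 3).mean(axis=0)      # mean R, G, B of the sky frame
white_mean = white.reshape(-1, 3).mean(axis=0)  # mean R, G, B of the reference

print("sky mean RGB      :", sky_mean.round(1))
print("white mean RGB    :", white_mean.round(1))
print("sky / white ratio :", (sky_mean / white_mean).round(3))
```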