Taking Pictures From Outer Space: Color or B&W?

AI Thread Summary
Satellites, referred to as probes or spacecraft, take color photographs of other planets, but these images often undergo enhancement or modification before publication. Digital photos are typically captured using monochrome sensors with colored filters, which can reduce resolution. Color calibration is crucial, especially for missions like the Mars rovers, which use calibration tools to ensure accurate color representation. Some images utilize synthetic color, focusing on specific light frequencies rather than full-color filters. NASA provides details on the wavelengths used and offers access to raw, unprocessed images for transparency.
brianthewhitie7
When satellites go to other planets to study them and take pictures, are the pictures in color or black and white? And if they are in black and white, how do they know what the color really is when adding color to the pictures?
 
Satellites orbit a single body, like Earth; they don't go from one planet to another. Instead, call them 'probes' or simply 'spacecraft.'

They do indeed take color photographs. Often the colors are enhanced or modified before their publication, however. The sunlight is so dim at Neptune, for example, that it's unlikely that any human being there would be able to detect any significant color with the eye, so it's meaningless to ask what the color there "really is."

- Warren
 
Well, I think it would be a little more precise to say that all digital photos are taken with monochrome sensors that one way or another are alternately filtered to allow one color at a time to pass through. Some cameras take 3 (or 4, with a luminance image) separate photos, mechanically alternating colored filters. The photos are combined later into a single color photo. Other cameras have a matrix of colored filters attached to the sensor to alternate colors on one sensor. The software controlling the camera knows which pixels are which colors and assigns them accordingly. Obviously, this method reduces the effective resolution of the sensor. It's called a Bayer Matrix: http://www.cyanogen.com/help/maxdslr/One-Shot_Color_and_the_Bayer_Array.htm
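To make the idea concrete, here's a toy sketch in Python of how a Bayer mosaic loses resolution: each pixel records only one color, and reconstruction has to share samples within a 2x2 cell. The RGGB layout and crude nearest-neighbor fill here are illustrative assumptions, not any real camera's pipeline (real cameras use smarter interpolation):

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate an RGGB Bayer sensor: keep one color sample per pixel."""
    h, w, _ = rgb.shape
    mono = np.zeros((h, w))
    mono[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mono[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mono[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mono[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mono

def demosaic_nearest(mono):
    """Crude demosaic: fill each 2x2 cell from its single R, two G, one B samples.
    This is where the effective resolution loss shows up."""
    h, w = mono.shape
    out = np.zeros((h, w, 3))
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = mono[y, x]
            g = (mono[y, x + 1] + mono[y + 1, x]) / 2
            b = mono[y + 1, x + 1]
            out[y:y + 2, x:x + 2] = (r, g, b)
    return out
```

A flat-colored scene survives this round trip intact; fine detail smaller than the 2x2 cell would not.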

I don't know for sure, but I would guess that, as warren implies, most space probes and telescopes have "color" sensors that use the Bayer matrix.

Most photos taken through most cameras require color calibration unless careful control of the lighting is available. For the Mars rovers, for example, color calibration is a major problem (it is the Red Planet...) and as such the rovers were sent up with a sundial with color calibration spots on it so that the colors can easily be calibrated once the photos are sent to Earth.
http://marsrovers.nasa.gov/mission/spacecraft_instru_calibr.html
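The principle behind those calibration spots can be sketched in a few lines: photograph a patch whose true color is known, compute per-channel gains that bring it back to that color, and apply the same gains to the whole image. This is a simplified white-balance model with made-up numbers, not the rovers' actual processing:

```python
def calibration_gains(observed, reference):
    """Per-channel gains so the calibration patch matches its known color.
    observed/reference are (R, G, B) triples in 0..1."""
    return tuple(ref / obs for obs, ref in zip(observed, reference))

def apply_gains(pixel, gains):
    """Correct one pixel with the gains derived from the calibration patch."""
    return tuple(min(1.0, v * g) for v, g in zip(pixel, gains))
```

For example, if a patch known to be neutral gray comes back reddish from the dusty Martian sky, the derived gains suppress red and boost blue everywhere in the frame.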

It may be beyond the scope of what you were asking, but some pictures you see use "synthetic" color - the filters used to provide the color aren't actually full-color filters, but filters that only allow a very specific single frequency of light through, such as the hydrogen alpha emission line (a specific frequency of red). Here's one famous Hubble image and how it was taken: http://www.pbs.org/wgbh/nova/origins/hubble.html
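Synthetic color amounts to assigning each narrowband exposure to a display channel. A minimal sketch, assuming the common "Hubble palette" mapping (SII to red, H-alpha to green, OIII to blue) and flattening each frame to a 1-D list of brightness values for simplicity:

```python
def normalize(channel):
    """Stretch one narrowband frame's brightness values to the 0..1 range."""
    lo, hi = min(channel), max(channel)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in channel]

def synthetic_color(s2, h_alpha, o3):
    """Stack three narrowband frames into one false-color image:
    SII -> red, H-alpha -> green, OIII -> blue (the 'Hubble palette')."""
    return list(zip(normalize(s2), normalize(h_alpha), normalize(o3)))
```

The resulting colors are meaningful (each hue traces a specific emission line) but deliberately not what the eye would see.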
 
russ_watters said:
For the Mars rovers, for example, color calibration is a major problem (it is the Red Planet...) and as such the rovers were sent up with a sundial with color calibration spots on it so that the colors can easily be calibrated once the photos are sent to Earth.
And this was a lesson hard-learned. The pictures that came back from the earlier Mars landers were manually interpreted, and for years we thought that Mars had a blue sky like Earth. That's why they put the colour calibration on later landers.
 
If you read the press releases from NASA sites, like the Cassini site, they mention the wavelengths at which the photos were taken and how they were modified to show details on the planet or its satellites. They also have a section where you can view the raw images, i.e. unprocessed versions (at reduced size).

http://saturn.jpl.nasa.gov/home/index.cfm
 