
Taking Pictures

  1. Jul 3, 2007 #1
    When satellites go to other planets to study them and take pictures, are the pictures in color or black and white? And if they are in black and white, how do they know what the color really is when adding color to the pictures?
  3. Jul 3, 2007 #2



    Satellites orbit a single body, like Earth; they don't go from one planet to another. Instead, call them 'probes' or simply 'spacecraft.'

    They do indeed take color photographs. Often, however, the colors are enhanced or modified before publication. The sunlight at Neptune, for example, is so dim that it's unlikely any human being there would be able to detect any significant color by eye, so it's meaningless to ask what the color there "really is."

    - Warren
  4. Jul 3, 2007 #3



    Well, I think it would be a little more precise to say that all digital photos are taken with monochrome sensors that, one way or another, are alternately filtered to allow one color at a time to pass through. Some cameras take 3 (or 4, with a luminance image) separate photos, mechanically alternating colored filters. The photos are combined later into a single color photo. Other cameras have a matrix of colored filters attached to the sensor to alternate colors on one sensor. The software controlling the camera knows which pixels are which colors and assigns them accordingly. Obviously, this method reduces the effective resolution of the sensor. It's called a Bayer matrix: http://www.cyanogen.com/help/maxdslr/One-Shot_Color_and_the_Bayer_Array.htm
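    To make the Bayer idea concrete, here's a minimal sketch. The RGGB pattern and the naive 2x2-block demosaic below are just one simple illustration (real cameras use fancier interpolation):

```python
def bayer_color(row, col):
    """Which filter covers sensor pixel (row, col) in an RGGB Bayer pattern."""
    if row % 2 == 0:
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'

def demosaic_2x2(mosaic):
    """Naive demosaic: each 2x2 block (R, G / G, B) becomes one RGB pixel,
    averaging the two green samples. Note the output is half the resolution
    of the sensor in each axis -- the 'reduced effective resolution' above."""
    out = []
    for r in range(0, len(mosaic), 2):
        row = []
        for c in range(0, len(mosaic[0]), 2):
            red = mosaic[r][c]
            green = (mosaic[r][c + 1] + mosaic[r + 1][c]) / 2
            blue = mosaic[r + 1][c + 1]
            row.append((red, green, blue))
        out.append(row)
    return out

# A 2x2 sensor patch: raw monochrome counts under the R, G, G, B filters.
raw = [[200, 120],
       [130, 60]]
print(demosaic_2x2(raw))  # [[(200, 125.0, 60)]]
```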

    I don't know for sure, but I would guess that, as Warren implies, most space probes and telescopes have "color" sensors that use the Bayer matrix.

    Most photos taken through most cameras require color calibration unless careful control of the lighting is available. For the Mars rovers, for example, color calibration is a major problem (it is the Red Planet, after all), and as such the rovers were sent up with a sundial bearing color-calibration spots, so that the colors can easily be calibrated once the photos are sent to Earth.
    http://marsrovers.nasa.gov/mission/spacecraft_instru_calibr.html
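    Here's a rough sketch of the idea behind calibrating against a target of known color, like those sundial spots. All the numbers are made up for illustration, and real pipelines use a full color transform rather than a simple per-channel gain:

```python
def channel_gains(measured, reference):
    """Per-channel gains that map a measured calibration patch to its
    known reference color."""
    return tuple(ref / meas for meas, ref in zip(measured, reference))

def apply_gains(pixel, gains):
    """Apply the correction gains to one (R, G, B) pixel."""
    return tuple(v * g for v, g in zip(pixel, gains))

# Suppose a patch known to be neutral gray, (128, 128, 128), comes back
# reddish in the raw image because dust and lighting skewed the channels.
measured_patch = (160, 120, 100)
gains = channel_gains(measured_patch, (128, 128, 128))

# Correcting the patch itself recovers the known gray; the same gains
# would then be applied to the whole image.
print(apply_gains(measured_patch, gains))  # approximately (128, 128, 128)
```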

    It may be beyond the scope of what you were asking, but some pictures you see use "synthetic" color - the filters used to provide the color aren't actually full-color filters, but filters that only allow a very specific single frequency of light through, such as the hydrogen alpha emission line (a specific frequency of red). Here's one famous Hubble image and how it was taken: http://www.pbs.org/wgbh/nova/origins/hubble.html
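    The "synthetic" color idea boils down to assigning each narrowband exposure to a display channel. Here's a minimal sketch; the mapping shown (S II to red, H-alpha to green, O III to blue) is a common convention sometimes called the "Hubble palette," and the intensity values are invented:

```python
def synthetic_color(s2, h_alpha, o3):
    """Combine three narrowband intensity maps (2D lists, same shape)
    into one RGB image by assigning each to a display channel."""
    return [[(s, h, o) for s, h, o in zip(row_s, row_h, row_o)]
            for row_s, row_h, row_o in zip(s2, h_alpha, o3)]

# Three tiny 1x2 narrowband exposures of the same patch of sky.
s2      = [[10, 20]]    # sulfur II line -> shown as red
h_alpha = [[200, 180]]  # hydrogen alpha line -> shown as green
o3      = [[50, 40]]    # oxygen III line -> shown as blue
print(synthetic_color(s2, h_alpha, o3))  # [[(10, 200, 50), (20, 180, 40)]]
```

    The displayed colors are therefore a visualization of which gases are emitting where, not what the eye would see.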
  5. Jul 3, 2007 #4



    And this was a lesson hard-learned. The pictures that came back from the earlier Mars landers were manually interpreted, and for years we thought that Mars had a blue sky like Earth's. That's why they put colour calibration targets on later landers.
  6. Jul 4, 2007 #5
    If you read the press releases from NASA sites, like the Cassini site, they mention the wavelengths at which the photos were taken and how they were modified to show details on the planet or its satellites. They also have a section where you can view the raw images, i.e. unprocessed versions (at reduced size).
