Taking Pictures From Outer Space: Color or B&W?

  • Context: High School
  • Thread starter: brianthewhitie7
  • Tags: Pictures

Discussion Overview

The discussion centers on the nature of photographs taken by satellites and space probes, specifically whether they are in color or black and white, and how color is determined and represented in these images. It touches on technical aspects of imaging technology, calibration challenges, and the implications of light conditions on color perception in space.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants assert that satellites take color photographs, while others clarify that many images are initially captured in black and white and then color is added through various methods.
  • One participant explains that digital photos are taken with monochrome sensors that use filters to capture colors sequentially, either through mechanical means or a Bayer matrix.
  • Concerns about color calibration are raised, particularly in relation to the Mars rovers, which were equipped with calibration tools to ensure accurate color representation.
  • Another participant notes that some images use synthetic color, where filters allow specific frequencies of light to pass through, rather than full-color filters.
  • References to NASA's practices indicate that they provide details on the wavelengths of images and modifications made for clarity, as well as access to raw, unprocessed images.

Areas of Agreement / Disagreement

Participants express differing views on the nature of color in space photography, with some asserting that color images are produced while others emphasize the challenges and methods of achieving accurate color representation. The discussion remains unresolved regarding the implications of these methods on the perceived colors of celestial bodies.

Contextual Notes

Limitations include the dependence on specific imaging technologies and calibration techniques, as well as the influence of lighting conditions on color perception in space environments.

brianthewhitie7
When satellites travel to other planets to study them and take pictures, are the pictures in color or black and white? And if they are in black and white, how do they know what the color really is when adding color to the pictures?
 
Satellites orbit a single body, like Earth; they don't go from one planet to another. Instead, call them 'probes' or simply 'spacecraft.'

They do indeed take color photographs. Often the colors are enhanced or modified before publication, however. The sunlight is so dim at Neptune, for example, that it's unlikely that any human being there would be able to detect any significant color with the eye, so it's meaningless to ask what the color there "really is."
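(A rough back-of-the-envelope check, not from the original post: sunlight falls off with the square of distance from the Sun, and Neptune orbits at roughly 30 AU.)

```python
# Inverse-square estimate of sunlight at Neptune relative to Earth.
d_neptune_au = 30.0                          # Neptune's approximate distance from the Sun, in AU
relative_brightness = 1.0 / d_neptune_au**2  # irradiance scales as 1/d^2
print(relative_brightness)                   # ~0.0011, i.e. about 0.1% of Earth's sunlight
```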

- Warren
 
Well, I think it would be a little more precise to say that all digital photos are taken with monochrome sensors that, one way or another, are alternately filtered to allow one color at a time to pass through. Some cameras take 3 (or 4, with a luminance image) separate photos, mechanically alternating colored filters; the photos are combined later into a single color photo. Other cameras have a matrix of colored filters attached to the sensor to alternate colors on one sensor. The software controlling the camera knows which pixels are which colors and assigns them accordingly. Obviously, this method reduces the effective resolution of the sensor. It's called a Bayer matrix: http://www.cyanogen.com/help/maxdslr/One-Shot_Color_and_the_Bayer_Array.htm
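To make the per-pixel color assignment concrete, here is a minimal Python sketch. The RGGB layout is an assumed convention (layouts vary by camera), and real demosaicing interpolates the missing values rather than downsampling like this:

```python
import numpy as np

def demosaic_rggb_halfres(raw):
    """raw: 2-D array of sensor counts with even height and width.
    Returns an RGB image at half the sensor's linear resolution."""
    r  = raw[0::2, 0::2]   # pixels behind red filters
    g1 = raw[0::2, 1::2]   # green filters, even rows
    g2 = raw[1::2, 0::2]   # green filters, odd rows
    b  = raw[1::2, 1::2]   # pixels behind blue filters
    g = (g1.astype(float) + g2.astype(float)) / 2.0
    return np.dstack([r, g, b])

# Each 2x2 block of sensor pixels yields one color pixel, which is the
# effective-resolution cost mentioned above.
raw = np.arange(16, dtype=float).reshape(4, 4)  # fake 4x4 sensor readout
print(demosaic_rggb_halfres(raw).shape)         # (2, 2, 3)
```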

I don't know for sure, but I would guess that, as Warren implies, most space probes and telescopes have "color" sensors that use the Bayer matrix.

Most photos taken through most cameras require color calibration unless careful control of the lighting is available. For the Mars rovers, for example, color calibration is a major problem (it is the Red Planet...), and as such the rovers were sent up with a sundial bearing color calibration spots so that the colors can easily be calibrated once the photos are sent to Earth.
http://marsrovers.nasa.gov/mission/spacecraft_instru_calibr.html
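A hedged sketch of the underlying calibration math, not the rover team's actual pipeline: photograph patches of known color on the calibration target, then solve for a 3x3 matrix that maps the measured RGB values back to the known ones. The patch numbers below are invented purely for illustration:

```python
import numpy as np

measured = np.array([[0.90, 0.35, 0.30],   # patches as photographed under Mars lighting
                     [0.55, 0.50, 0.20],
                     [0.40, 0.25, 0.45],
                     [0.95, 0.70, 0.55]])
known    = np.array([[1.0, 0.0, 0.0],      # lab-measured values of the same patches
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [1.0, 1.0, 1.0]])

# Least-squares fit: find M such that measured @ M best approximates known.
M, *_ = np.linalg.lstsq(measured, known, rcond=None)

def calibrate(image):
    """Apply the correction matrix to an H x W x 3 image in [0, 1]."""
    return np.clip(image @ M, 0.0, 1.0)
```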

It may be beyond the scope of what you were asking, but some pictures you see use "synthetic" color: the filters used to provide the color aren't actually full-color filters, but narrowband filters that only allow a very specific single frequency of light through, such as the hydrogen alpha emission line (a specific frequency of red). Here's one famous Hubble image and how it was taken: http://www.pbs.org/wgbh/nova/origins/hubble.html
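In code, the synthetic-color step is just a channel assignment. The SII / H-alpha / OIII mapping below is the common "Hubble palette" convention, an assumption here rather than the recipe for any particular image:

```python
import numpy as np

def synthetic_color(sii, h_alpha, oiii):
    """Each input: a registered 2-D narrowband exposure normalized to [0, 1].
    Returns an H x W x 3 display image."""
    return np.dstack([sii, h_alpha, oiii])  # R <- SII, G <- H-alpha, B <- OIII
```

Note that both the H-alpha and SII emission lines are actually red light, so the greens and blues in such an image are representative rather than literal.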
 
russ_watters said:
For the Mars rovers, for example, color calibration is a major problem (it is the Red Planet...), and as such the rovers were sent up with a sundial bearing color calibration spots so that the colors can easily be calibrated once the photos are sent to Earth.
And this was a lesson hard-learned. The pictures that came back from the earlier Mars landers were manually interpreted, and for years we thought that Mars had a blue sky like Earth. That's why they put the colour calibration on later landers.
 
If you read the press releases from NASA sites, like the Cassini site, they mention the wavelengths at which the photos were taken and how they were modified to show details on the planet or its satellites. They also have a section where you can view the raw images, i.e. unprocessed versions (at reduced size).

http://saturn.jpl.nasa.gov/home/index.cfm
 
