How is color added to Hubble images?

  • Context: Undergrad
  • Thread starter: Phys12
  • Tags: Color, Hubble Images
SUMMARY

The Hubble Space Telescope captures images in black and white, utilizing color filters to isolate specific wavelengths of light. These filtered images are then combined to create a full-color representation, allowing for the visualization of both visible and non-visible wavelengths. The process involves taking multiple greyscale images with different filters and merging them to retain comprehensive data. This method enables astronomers to present information in a way that is interpretable for human viewers, often using false color assignments for wavelengths beyond human perception.

PREREQUISITES
  • Understanding of Hubble Space Telescope imaging techniques
  • Knowledge of color filtering and wavelength isolation
  • Familiarity with image processing software, such as Photoshop
  • Basic concepts of light and photon detection
NEXT STEPS
  • Research Hubble's Wide Field Planetary Camera specifications and capabilities
  • Explore the principles of color filtering in astronomical imaging
  • Learn about false color imaging techniques in astronomy
  • Investigate the role of spectrometry in determining object composition
USEFUL FOR

Astronomers, astrophotographers, and anyone interested in the techniques behind astronomical imaging and color representation in space photography.

Phys12
http://hubblesite.org/reference_desk/faq/answer.php?cat=topten&id=93

So, I found out that the images taken by the Hubble Space Telescope are not color but black and white, and colors are added to them later. I thought that the way you figure out what colors are present in your object(s) of interest is by looking at its spectrum. But how do you find the colors if you don't have a camera capable of recording them? Is it the case that the imager in the Hubble is black and white, while the spectrometer is not, and that we use the information we get from the spectrometer to fill in the colors on the images that we get from the imager?
 
Phys12
Thanks!

The article mentioned two things which I am unclear about. If I understand it correctly, it said that the telescope does take black and white images, and also that it has color filters to get only part of the visible spectrum at a time and then combines them all. Then why do you need the black and white images in the first place if you can get colored pictures from the different filters?
 
Phys12 said:
Then why do you need the black and white images in the first place if you can get colored pictures from the different filters?
The camera is like one of the three types of cones in our eyes - it can only discern how many photons hit the detector. If you were to shine unfiltered light onto the detector, it'd register a true greyscale image, i.e. it'd return bright spots where lots of photons of any wavelength hit, and dark spots where there were just a few.
But if you filter the incoming light so that only red, only green, or only blue light can pass to the detector, then you can imitate what our eyes do. Our eyes give our brains information about colour by having one type of cone for each of the three colours (RGB), each registering only how many photons of a specific kind hit it, and then combining the data from the three types of detectors. The difference with the telescope is that, instead of three actual cameras, each with its own filter, you have one camera taking the same picture three times (or more) with different filters.
Each time, regardless of which filter you have on, the information the camera returns is the same: how many photons hit the detector. Just as with the cones in our eyes, the information that we interpret as colour comes about only from combining the three greyscale images.
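To make the combining step concrete, here is a minimal sketch in Python of stacking three filtered greyscale exposures into one RGB image. The filenames, the percentile stretch, and the use of astropy/matplotlib are illustrative assumptions, not Hubble's actual pipeline.

```python
import numpy as np
from astropy.io import fits
import matplotlib.pyplot as plt

def load_normalized(path):
    """Load one greyscale exposure and stretch its intensities to [0, 1]."""
    data = fits.getdata(path).astype(float)
    lo, hi = np.percentile(data, [1, 99])   # clip outliers for display
    return np.clip((data - lo) / (hi - lo), 0, 1)

# Three exposures of the same field, taken through red, green and blue
# filters (hypothetical filenames).
r = load_normalized("field_red.fits")
g = load_normalized("field_green.fits")
b = load_normalized("field_blue.fits")

# Each plane is greyscale on its own; colour appears only once the planes
# are assigned to the R, G and B channels and stacked.
rgb = np.dstack([r, g, b])
plt.imshow(rgb)
plt.axis("off")
plt.show()
```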
 
Phys12 said:
Thanks!

The article mentioned two things which I am unclear about. If I understand it correctly, it said that the telescope does take black and white images, and also that it has color filters to get only part of the visible spectrum at a time and then combines them all. Then why do you need the black and white images in the first place if you can get colored pictures from the different filters?
Because each of the color filters only lets through a limited range, and combining them does NOT give the same amount of information as the full gray-scale image. The filtered (narrowband) images are somehow processed/overlaid with the gray-scale (broadband) image to give the color while retaining the full information. Also, as the article notes, some of the filters used are outside the range of human vision, so false colors are added.
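phinds says "somehow"; one common way amateurs do this overlay is "luminance layering" (LRGB): keep the brightness detail of the broadband greyscale exposure and take only the hue from the filtered composite. A hedged sketch of that idea follows; the function name, array names, and the HSV-based approach are assumptions for illustration, not the exact processing phinds describes.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def lrgb_combine(luminance, rgb):
    """Keep hue/saturation from the filtered colour composite, but replace
    its brightness with the full-information broadband (greyscale) exposure.
    Both inputs are assumed aligned, as floats in [0, 1]."""
    hsv = rgb_to_hsv(rgb)
    hsv[..., 2] = luminance   # V channel <- broadband greyscale data
    return hsv_to_rgb(hsv)

# Example with synthetic stand-in data for a 64x64 field.
rgb = np.random.rand(64, 64, 3)   # filtered colour composite
lum = np.random.rand(64, 64)      # broadband greyscale exposure
combined = lrgb_combine(lum, rgb)
```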
 
One point that may not be quite clear yet is that the colours of the final image need not be the colours you would see if you looked through the telescope, or if the telescope camera took colour pictures like a normal camera. The colours are used to help present the information for the human viewer and can be whatever we choose.

As phinds says, invisible colours like UV and IR can be given visible colours, so that we can see these invisible images.

The colours may not even relate to the wavelengths of the received light in any way. One could have an image obtained at a single wavelength (or a narrowband of wavelengths) then map intensity to different colours, as is done with IR imaging in the mundane context of measuring heat loss in buildings.

In fact all this talk of black and white or greyscale "images" is really acknowledging that what is recorded is always just an array of intensity values.
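As a tiny illustration of that last point, here is one way (an assumption, using matplotlib; the data is synthetic) to map a single-wavelength intensity array to colours, the way building-survey IR cameras do:

```python
import numpy as np
import matplotlib.pyplot as plt

# A stand-in for one narrowband exposure: just an array of intensity values.
intensity = np.random.rand(256, 256)

# The colours below encode intensity, not wavelength: a false-colour map.
plt.imshow(intensity, cmap="inferno")
plt.colorbar(label="relative intensity")
plt.axis("off")
plt.show()
```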
 
Phys12 said:
The article mentioned two things which I am unclear about. If I understand it correctly, it said that the telescope does take black and white images, and also that it has color filters to get only part of the visible spectrum at a time and then combines them all. Then why do you need the black and white images in the first place if you can get colored pictures from the different filters?
Please note: there is no such thing as a color digital camera sensor. All of them, including the ones you own (I assume you own several), take black and white photos through colored filters and then use software to apply colors after taking the picture. Astronomers just separate the process and do it semi-manually, while your cameras do it for you. It's for control. This photo is narrowband red (Ha) only:
[Image: 2017-06-29 M16-Ha.jpg]


This is Ha, G, B, but clearly I needn't have bothered with the green and blue:

[Image: Horsehead-HaRGB.jpg]


When I (an amateur) take a picture of a red nebula like the Horsehead, I can use a broadband filter (like the normal red filter on your cameras) or a narrowband filter to cut down on extraneous noise from light pollution.

A caveat, though: you can also do "false color" assignments. All infrared and radio telescope photos are false color, of course, but it is sometimes useful for optical-range images as well.

[edit] For maybe a more direct answer to the question:
For these photos I saved each exposure with a name indicating the filter used, then combined/assigned them to channels in Photoshop. I now have somewhat more automated equipment and software.
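For readers without Photoshop, here is a minimal Python sketch of that same channel assignment: the narrowband Ha frame becomes the red channel, while the broadband green and blue frames keep their own. The filenames and the simple min-max stretch are illustrative assumptions, not russ_watters' actual workflow.

```python
import numpy as np
from astropy.io import fits

def stretch(img):
    """Normalize one exposure to [0, 1] for display."""
    img = img.astype(float)
    return (img - img.min()) / (img.max() - img.min())

# Hypothetical per-filter files, saved with names indicating the filter.
ha = stretch(fits.getdata("horsehead_Ha.fits"))   # narrowband hydrogen-alpha
g = stretch(fits.getdata("horsehead_G.fits"))     # broadband green
b = stretch(fits.getdata("horsehead_B.fits"))     # broadband blue

# Assign Ha to the red channel: an "HaRGB"-style composite.
hargb = np.dstack([ha, g, b])
```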

Edit2:
This handbook has a lot of detail about how the Hubble Wide Field Planetary Camera worked:

https://www.google.com/url?sa=t&sou...FjABegQIBxAB&usg=AOvVaw3W-NT3LssCadXbdQwtAGh9
 
