
How is color added to Hubble images?

  1. May 22, 2018 #1
http://hubblesite.org/reference_desk/faq/answer.php?cat=topten&id=93

So, I found out that the images taken by the Hubble Space Telescope are not in color but black and white, and colors are added to them later. I thought that the way you figure out what colors are present in your object(s) of interest is by looking at their spectrum. But how do you find that if you don't have a camera capable of recording colors? Is it the case that the imager in the Hubble is black and white, while the spectrometer is not? And do we use the information we get from the spectrometer to fill in the colors on the images that we get from the imager?
     
  3. May 22, 2018 #2

    phinds

    Gold Member

  4. May 22, 2018 #3
    Thanks!

The article mentioned two things which I am unclear about. If I understand it correctly, it said that the telescope takes black-and-white images and also that it has color filters to capture only part of the visible wavelength range at a time, and then it combines them all. Then why do you need the black-and-white images in the first place if you can get color pictures from the different filters?
     
  5. May 22, 2018 #4

    Bandersnatch

    Science Advisor
    Gold Member

The camera is like a single one of the three types of cones in our eyes (cones, not rods, are the colour receptors) - it can only discern how many photons hit the detector. If you were to shine unfiltered light onto the detector, it'd register a true greyscale image. I.e., it'd return bright spots where lots of photons of any wavelength hit, and dark spots where there were just a few.
But if you filter the incoming light, so that only red, only green, or only blue light can pass to the detector, then you can imitate what our eyes do. Our eyes give our brains information about colour by having one type of cone for each of the three colours (RGB), each registering only how many photons of a specific kind hit it, and then combining the data from the three types of detectors. The difference with the telescope is that, instead of three actual cameras, each with its own filter, you have one camera taking the same picture three times (or more) with different filters.
Each time, regardless of which filter you have on, the information the camera returns is the same: how many photons hit the detector. Just as with the cones in our eyes, the information that we interpret as colour comes about only from combining the three types of greyscale images.
     
  6. May 22, 2018 #5

    phinds

    Gold Member

Because each of the color filters only lets through a limited range, and combining them does NOT give the same amount of information as the full grey-scale image. The filtered (narrowband) images are somehow processed/overlaid with the grey-scale (broadband) image to give the colour while retaining the full information. Also, as the article notes, some of the filters used are outside the range of human vision, so colors are assigned to those.
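One common way this overlay is done in amateur processing (a sketch of my own, not necessarily the exact Hubble pipeline) is luminance layering: keep the colour ratios from the filtered frames but rescale each pixel's brightness to match the detail-rich broadband frame. With made-up NumPy arrays:

```python
import numpy as np

# Hypothetical data: a full-detail broadband (luminance) frame, and a
# colour composite built from three filtered exposures.
lum = np.random.rand(4, 4)        # broadband greyscale, full information
rgb = np.random.rand(4, 4, 3)     # colour from the filtered exposures

# Rescale each pixel's colour so its brightness matches the broadband
# frame, preserving the per-pixel colour ratios (hue).
brightness = rgb.mean(axis=-1, keepdims=True)
lrgb = rgb / np.clip(brightness, 1e-6, None) * lum[..., None]
lrgb = np.clip(lrgb, 0.0, 1.0)

print(lrgb.shape)  # (4, 4, 3)
```

The result carries the broadband image's intensity information with the narrowband images' colour.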
     
  7. May 23, 2018 #6

    Merlin3189

    Homework Helper
    Gold Member

    One point that may not be quite clear yet, is that the colours of the final image need not be the colours you would see if you looked through the telescope or if the telescope camera took colour pictures like a normal camera. The colours are used to help present the information for the human viewer and can be whatever we choose.

As phinds says, invisible light like UV and IR can be assigned visible colours, so that we can see these otherwise invisible images.

The colours may not even relate to the wavelengths of the received light in any way. One could have an image obtained at a single wavelength (or a narrow band of wavelengths) and then map intensity to different colours, as is done with IR imaging in the mundane context of measuring heat loss in buildings.

    In fact all this talk of black and white or greyscale "images" is really acknowledging that what is recorded is always just an array of intensity values.
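That intensity-to-colour mapping is just a lookup from each intensity value to an arbitrary colour. Here is a toy sketch (my own illustration, with a deliberately crude blue-to-red "heat map"; real tools use richer colormaps):

```python
import numpy as np

# A single-wavelength exposure is just an array of intensity values.
intensity = np.linspace(0.0, 1.0, 16).reshape(4, 4)

def to_false_color(im):
    """Map intensities to RGB: low -> blue, high -> red."""
    r = im                   # red grows with intensity
    g = np.zeros_like(im)    # no green in this toy colormap
    b = 1.0 - im             # blue fades with intensity
    return np.stack([r, g, b], axis=-1)

fc = to_false_color(intensity)
print(fc.shape)  # (4, 4, 3)
```

The colours in the output have nothing to do with wavelength; they only encode how bright each pixel was, exactly as in a building heat-loss image.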
     
  8. May 23, 2018 #7

    russ_watters


    Staff: Mentor

Please note: there is no such thing as a color digital camera sensor. All of them, including the ones you own (I assume you own several), take black and white photos through colored filters and then use software to apply colors after taking the picture. Astronomers just separate the process and do it semi-manually, while your cameras do it for you; doing it manually gives more control. This photo is narrowband red (Hα) only:
[Attached image: 2017-06-29 M16-Ha.jpg - M16 in Hα narrowband]

This is Hα, G, B, but clearly I needn't have bothered with the green and blue:

[Attached image: Horsehead-HaRGB.jpg - Horsehead Nebula, Hα+RGB composite]

    When I (an amateur) take a picture of a red nebula like the Horsehead, I can use a broadband filter (like the normal red filter on your cameras) or a narrowband filter to cut down on extraneous noise from light pollution.

Caveat though: you can also do "false color" assignments. All infrared and radio telescope photos are false color, of course, but it is sometimes useful for optical-range images too.

    [edit] For maybe a more direct answer to the question:
    For these photos I manually saved the photos with names according to the filter used, then combined/assigned them in Photoshop. I now have somewhat more automated equipment and software.

    Edit2:
    This handbook has a lot of detail about how the Hubble Wide Field Planetary Camera worked:

    https://www.google.com/url?sa=t&sou...FjABegQIBxAB&usg=AOvVaw3W-NT3LssCadXbdQwtAGh9
     
    Last edited: May 23, 2018
  9. May 28, 2018 #8

    rbelli1

    Gold Member
