
Stargazing: Exposures and stacking

  1. Feb 8, 2017 #1

    sophiecentaur

    Science Advisor
    Gold Member

    I have a big problem with dynamic range in my DSLR pictures of the sky. The contrast range of an 8-bit picture from a DSLR is 256 levels, which is a range of around 6 magnitudes. To look at very faint objects, I have to expose a picture so that a faint object sits a reasonable number of levels above black, if the statistics of stacking are going to help. Any picture - particularly a wide-field picture - is going to include mag 2 or 3 stars, which are going to burn out. The stacking process can yield a bigger contrast range if it gives a 16-bit image, but how do I get rid of the gross white blobs? Do I really just have to edit them out and insert those stars from a shorter exposure? I guess the answer has to be Yes. But some of those bright stars are bang in the middle of some nebulae. ???? What must I do to make the resulting picture look like 'the truth'?
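
    (For reference, an N-bit linear scale spans 2.5·log10(2^N) magnitudes, which is where the 'around 6 magnitudes' comes from. A quick check in Python, assuming nothing beyond the magnitude definition:

        import math

        # Magnitude span covered by an N-bit linear image:
        # delta_m = 2.5 * log10(2**N)
        for bits in (8, 12, 14, 16):
            levels = 2 ** bits
            print(f"{bits:2d} bits: {levels:6d} levels = {2.5 * math.log10(levels):.1f} mag")

    So 8 bits give 6.0 mag, 12 bits 9.0 mag, 14 bits 10.5 mag and 16 bits 12.0 mag.)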
     
  3. Feb 8, 2017 #2

    Drakkith

    Staff: Mentor

    The only way to deal with this that I know of is to do exactly what you're already doing. As long as you're not burning out your pixel counts (maxing them out), you can edit the image to bring the brightness of the stars down and keep the surrounding detail. If your exposures are too long and you're maxing out your pixel counts around these stars, then all that detail is lost forever. Maybe @russ_watters knows a better way.
     
  4. Feb 8, 2017 #3

    sophiecentaur

    Science Advisor
    Gold Member

    You mean 'Curves'? But that's too late to deal with the real problem, afaics. For the brighter stars, the image gets bigger in proportion to the brightness (the sin(x)/x profile gets clipped further and further down, plus whatever happens 'electronically' on the sensor). The more of that you're prepared to put up with, the dimmer the wanted object that can be recorded. The basic limit of 256 levels is always there (at least in a DSLR's 8-bit output). I guess what's needed is a sensor with a logarithmic response. I imagine someone's going to tell me that there is one.
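
    In the meantime the logarithmic response can be faked in software on the linear data. A sketch of the sort of thing I mean (numpy assumed; the softening figure is just illustrative):

        import numpy as np

        def asinh_stretch(linear, softening=1e-4):
            """Roughly linear near zero, logarithmic for bright pixels.

            'linear' is the image normalised to 0..1 (e.g. ADU / max ADU).
            """
            return np.arcsinh(linear / softening) / np.arcsinh(1.0 / softening)

        img = np.array([1.0, 1e-4])   # a bright star and a faint nebula pixel
        print(asinh_stretch(img))     # ~[1.0, 0.089]: faint end lifted ~900x

    That keeps the faint stuff visible without the bright star wrapping round, but of course it can't recover anything the sensor has already clipped.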
     
  5. Feb 8, 2017 #4

    Drakkith

    Staff: Mentor

    In a single exposure perhaps. But if you're stacking you can put together any number of images to bring out the finer details without burning out your image around the bright stars. If you can see this image of the Carina Nebula, there's a very bright star (Eta Carinae) just above the 'corner' formed by the dust. I was able to bring out the details surrounding the star by altering the brightness and contrast curves and/or performing some other digital processing. Otherwise it would have been just a huge bright spot.

    As far as I know, you want your camera to have a linear response across most of its range. But I don't use a DSLR, so I don't know if things are different there.
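
    The statistics are easy to demonstrate: averaging N frames cuts the random noise by √N while the signal stays put. A toy numpy sketch with made-up numbers:

        import numpy as np

        rng = np.random.default_rng(0)
        signal = 5.0                    # faint source, counts per frame
        noise = 10.0                    # per-frame random noise, counts

        frames = signal + rng.normal(0, noise, size=(100, 64, 64))
        stack = frames.mean(axis=0)     # average of 100 frames

        print(signal / noise)           # single-frame SNR ~ 0.5
        print(signal / stack.std())     # stacked SNR ~ 5, i.e. sqrt(100) better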
     
  6. Feb 8, 2017 #5
    The main problem with star colors in wide-angle astrophotography is the amazing light-gathering capability of a fast, wide lens combined with a digital sensor. Add the fact that all the light from a star is concentrated onto one or just a few pixels, and a bright star will blow out those pixels in seconds. It gets worse, since most affordable digital sensors have small pixels with shallow electron wells, so they saturate pretty quickly.

    Any decent DSLR should be able to output a 12 or 14 bit RAW image.

    The closest thing available is old-school photographic film. It has an S-shaped response curve, but even that is not really enough to control star color in wide-angle photography.
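
    To put rough numbers on it (all figures illustrative, not specs for any real camera):

        # Back-of-envelope time for a bright star to saturate one pixel.
        full_well = 30_000      # electrons; a smallish DSLR pixel, assumed
        photon_rate = 2.0e5     # photons/s onto one pixel from a bright star
                                # through a fast, wide lens (assumed)
        qe = 0.5                # quantum efficiency, assumed

        print(full_well / (photon_rate * qe))   # ~0.3 s to saturation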
     
  7. Feb 8, 2017 #6
    The traditional method of controlling the stars in an image with a large dynamic range is to take images with two or more different exposure times and then combine them. This can be done in the image editing software or sometimes directly in the stacking software. Many of the best images of the Orion nebula are made from combining a series of 5 sec, 30 sec and 5 min exposures or similar.
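
    In code the combination boils down to scaling each frame by its exposure time and taking each pixel from the longest exposure that hasn't saturated it. A minimal sketch of the idea (numpy assumed; the names and the saturation threshold are illustrative):

        import numpy as np

        def hdr_combine(frames, exp_times, sat_level=0.95 * 65535):
            """Merge exposures of different lengths into one linear image.

            frames:    list of 2-D arrays, shortest exposure first.
            exp_times: matching exposure times in seconds.
            """
            out = frames[0].astype(float) / exp_times[0]    # flux = counts/time
            for img, t in zip(frames[1:], exp_times[1:]):
                ok = img < sat_level                 # pixels not blown out
                out[ok] = img[ok].astype(float) / t  # prefer the longer exposure
            return out    # linear flux; stretch afterwards for display

    For the Orion series above, that would be something like hdr_combine([f5, f30, f300], [5, 30, 300]).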
     
  8. Feb 8, 2017 #7

    sophiecentaur

    Science Advisor
    Gold Member

    That's an interesting comment and it could explain quite a lot. My K10D has files of about 16 MB (10 Mpixel DNG) and my K2S around 20 MB (20 Mpixel DNG), which implies that it's not just 8 bits per pixel - or any simple relationship. The K2S is pretty up to date, so I'd expect something near optimum.
    Do you have a reference for where the bit depth comes from in typical coding?
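
    (The arithmetic seems to back that up. A quick check, treating the DNG as one sample per Bayer pixel:

        # Uncompressed size of an N-bit Bayer RAW: one sample per pixel.
        for mpix, bits in [(10, 12), (20, 12)]:
            mb = mpix * 1e6 * bits / 8 / 1e6
            print(f"{mpix} MP x {bits} bit ~ {mb:.0f} MB before compression")

    10 MP at 12 bits is ~15 MB, close to my 16 MB K10D files once metadata is added; 20 MP at 12 bits would be ~30 MB uncompressed, so the 20 MB K2S files presumably use lossless compression.)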
     
  9. Feb 8, 2017 #8

    sophiecentaur

    Science Advisor
    Gold Member

    So you're saying that TIFF files of different brightnesses (with more bits per pixel) are easier to combine and produce a bigger contrast ratio. That makes sense, but I'd need some play time to do it convincingly.
    I have just ordered Make Every Photon Count and will devour it when it arrives. It comes highly recommended.
     
  10. Feb 8, 2017 #9

    Drakkith

    Staff: Mentor

    Brightness is essentially photons per pixel (or electrons per pixel, or counts per pixel), not bits per pixel.

    I've not heard of this before. I'll have to look into it. :biggrin:
     
  11. Feb 9, 2017 #10
    Looking at the specs, it should be a 12-bit monochrome RAW format stored using lossless compression.
    There are many, but they are usually quite wordy and tailored for CCD cameras. Basically, a pixel on a digital camera (CMOS/CCD) is a tiny solar cell that can store a number of electrons (AFAIK 1 electron per detected photon). The maximum number of electrons a pixel can store is called the well depth, and depending on the technology used and the size of the pixel this number is in the thousands to hundreds of thousands. If the well is full, no more data can be captured and the pixel is saturated/blown out.

    When the pixel is read, the voltage from the electrons in the well is amplified and measured by an ADC (analog-to-digital converter). The ADC outputs a digital number (ADU) of N bits (22 bits for your K2S, which is an odd choice since it is much larger than necessary). Each digital unit (ADU) can be converted back to electrons by multiplying it by the gain of the ADC (electrons/ADU).

    This data must be stored. In a camera made especially for astronomy or other scientific imaging, this just means the information from the ADC is stored as-is. For a DSLR the situation is different: even the RAW images are somewhat processed using proprietary algorithms and then stored at whatever bit depth the RAW format uses (for both your cameras, 12 bit). This is still mostly a number proportional to the number of photons detected per pixel. Color conversion is actually done later, in the image processing/stacking software; for a DSLR that process is called debayering (after the Bayer mask in front of the sensor, which makes it possible to reconstruct a color image).

    For a JPG a lot more processing is done - debayering to get a color image, gamma curves, sharpening, noise reduction, etc. - and the result is then compressed to 8-bit JPG. (This is why you should not use JPG if you are into serious astroimaging: you want to stack the linear RAW files and then control this process yourself.)
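
    The ADU-to-electron conversion is a one-liner; with made-up numbers (read the real gain from the camera spec sheet, or measure it):

        import numpy as np

        gain = 1.5                            # electrons per ADU, assumed
        adu = np.array([100, 2000, 4095])     # 12-bit RAW pixel values

        print(adu * gain)                     # [ 150.  3000.  6142.5] electrons
        # 4095 is the 12-bit ceiling, so that last pixel may be saturated.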
     
  12. Feb 9, 2017 #11

    davenn

    Science Advisor
    Gold Member

    don't convert to TIFF and then stack/edit ... keep them as RAW and do all the processing (stacking/editing) on the RAW files

    and also as glappkaeft said

    NEVER convert to jpg before stacking and editing


    Dave
     
  13. Feb 9, 2017 #12

    Chronos

    Science Advisor
    Gold Member

    It's a good idea, and it saves time, to sort out your dark-frame balance before stacking images.
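
    A common recipe is to build one master dark (per-pixel median of many darks taken at the same exposure/ISO/temperature) and subtract it from every light frame before stacking. A minimal numpy sketch:

        import numpy as np

        def calibrate(lights, darks):
            """Subtract a master dark from each light frame.

            lights, darks: 3-D arrays shaped (frame, y, x).
            """
            master_dark = np.median(darks, axis=0)   # median rejects outliers
            return lights.astype(float) - master_dark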
     
  14. Feb 9, 2017 #13

    sophiecentaur

    Science Advisor
    Gold Member

    I was referring to the number of levels that the ADC can resolve for each sensor element. For a monochrome camera that would be more easily related to a final file size - if there were no compression.
     
  15. Feb 9, 2017 #14

    sophiecentaur

    Science Advisor
    Gold Member

    Absolutely. I would never ever ever ever do that - even with happy family snaps.

    OK, that could make sense. I went to TIFF because N4 didn't seem capable of producing a proper colour image from my RAW files. I assumed that it wasn't making sense of the file metadata to do the right debayering. But I do have a query about what you say: if N4 shifts images before stacking, how can one be sure that the correct pixels (on the Bayer matrix) get added together? If you look at the un-debayered file, it just appears as an array of little squares. Does the shifting take this into account? (I already have doubts about the debayering of my Pentax files; all other photo software gets it right.) Is anything lost in the conversion from RAW to uncompressed TIFF?
     
  16. Feb 9, 2017 #15

    davenn

    Science Advisor
    Gold Member

    ahhhh OK ... N4 (I assume that is Nebulosity 4). I haven't really played with that prog much, other than dabbling with the trial version ... the price to buy was a little steep for me ... Maybe it doesn't handle the Pentax .DNG files?

    There isn't any lossy compression in TIFF files ... but you end up with a file that isn't as editable as the original RAW file, as things like colour balance, white balance and a few other settings are fixed at the time of conversion from RAW to TIFF. You just don't get the artifacts introduced by the heavy compression when going from RAW to JPG.


    Dave
     
  17. Feb 9, 2017 #16

    sophiecentaur

    Science Advisor
    Gold Member

    The TIFF files are massive; 16-bit RGB, I think. N4 wouldn't even look at my PEF raw-format files, so I just use DNG, the more generic format. I am surprised at what N4 seems to do with them, because PS and other processing packages make perfect sense of them.
    Have you a comment about the effect of moving the images about for stacking? Are the quanta of movement bigger than the sensor element spacing, then?

    Anyway, when you take a number of different-exposure images, how do you fit them together? I suppose PS masks could allow the 'unburnt' star images in the shorter exposures to replace the burnt-out ones, but the burnt-out ones are bigger, so do you have to do some feathered-selection business? Then the background could look funny around the doctored bright stars. I have seen some clever images of a full moon with Saturn peeking around from behind it. That must require quite a bit of jiggery-pokery, I imagine.
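
    I suppose something like the feathering could be automated: mask the saturated cores in the long exposure, blur the mask so its edge is soft, and blend in the (scaled) short exposure through it. A sketch of what I have in mind (numpy/scipy assumed; thresholds illustrative):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def patch_stars(long_exp, short_exp, scale, sat_frac=0.95, feather_px=5):
            """Replace blown-out star cores in long_exp with scaled short_exp.

            scale: exposure-time ratio long/short, so short_exp * scale is in
                   the same linear units as long_exp. Figures are illustrative.
            """
            mask = (long_exp >= sat_frac * long_exp.max()).astype(float)
            mask = np.clip(gaussian_filter(mask, feather_px), 0.0, 1.0)
            return (1.0 - mask) * long_exp + mask * short_exp * scale

    Whether the background around the doctored stars looks natural would then come down to getting the scale factor and the feather width right.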
     
  18. Feb 9, 2017 #17

    sophiecentaur

    Science Advisor
    Gold Member

    BTW I just got that book in the post. It seems to cover quite a lot of what I need with actual pictures of the various setups the guy has used. The downside is £££££, though. haha
    Must finish my work in the garden before I get stuck into it.
     
  19. Feb 9, 2017 #18

    sophiecentaur

    Science Advisor
    Gold Member

    There is an option for TIFF compression in my Aperture (OS X).
     
  20. Feb 9, 2017 #19

    russ_watters

    Staff: Mentor

    Not much beyond what has already been said:
    No matter what, you need better than 8-bit color depth; you lose way too much with such a low depth. Some of the best parts of my pictures you can't even see until you stretch them. Higher bit depth on the originals and shorter (or different-length) exposures make for higher overall dynamic range. I've had trouble working with different exposure lengths, though, specifically because it is hard to overlay them when the stars are different sizes.

    Also, there are software tools, such as Photoshop actions, to shrink blown-out stars, but I often think the blown-out stars add artistic flair, so I tend to leave them.
     
  21. Feb 9, 2017 #20

    russ_watters

    Staff: Mentor

    My suspicion is that such photos are composites of separately exposed and processed images, combined after the fact. That is how I sometimes do planets with moons: I literally just cut and paste the processed photo of the planet into the processed photo of the moons.
     