# Stargazing: Deep-Space Imaging and Stacking Images for Less Noise

1. May 23, 2016

### davenn

hi guys
Today I would like to show the advantages of stacking a set of exposures over just a single shot
There are various ways to stack individual exposures. A number of programs are available, ranging
from free to quite expensive (several hundred US$):

- Free: DSS (Deep Sky Stacker) http://deepskystacker.free.fr/english/download.htm
- Subscription: Photoshop (PS) and Lightroom (LR) CC come as a subscription these days and are quite affordable, so not only do you get stacking features in PS, you also have very powerful editing software
- A little more expensive: Nebulosity http://www.stark-labs.com/nebulosity.html
- Most expensive: PixInsight, around US$250 https://pixinsight.com/

Both PixInsight and Nebulosity have reasonably steep learning curves.
I have dabbled with both and in the end gone back to Deep Sky Stacker.
I have done a couple of stacks in PS; they were sort of OK.
The trick with all the software is learning the art of stretching the RGB colour curves to bring out all the details. Something I am still learning.

Let's take a practical example and see the difference between stacking and a single image.
One of the major purposes of doing this is to improve the signal-to-noise ratio. The noise is
primarily sensor thermal noise that starts to come through because there isn't enough signal to overcome it; we are imaging in very low light and usually at high ISO settings (> 1000).
Noise is random in its appearance and location from exposure to exposure, but the signal -- the stars, nebulae etc -- isn't. When the exposures are stacked, the signal is reinforced while the noise tends to cancel out.
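This averaging effect is easy to demonstrate numerically. A minimal sketch (Python with NumPy; the signal and noise levels are made-up numbers for illustration, not measurements from the frames below):

```python
import numpy as np

rng = np.random.default_rng(42)
signal = 100.0        # "true" pixel value (stars, nebula)
noise_sigma = 20.0    # per-frame random noise (thermal + read noise)
n_frames = 6

# Six exposures of the same pixels: fixed signal plus fresh random noise each time
frames = signal + rng.normal(0.0, noise_sigma, size=(n_frames, 100_000))

single = frames[0]               # one exposure on its own
stacked = frames.mean(axis=0)    # average of all 6, as a stacker would do

print(f"single-frame noise: {single.std():.2f}")
print(f"stacked noise:      {stacked.std():.2f}")  # roughly 20 / sqrt(6) ~ 8.2
```

The stacked noise drops by roughly the square root of the number of frames, which is why 6 exposures look so much cleaner than 1.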

This first image is a single 30 sec exposure with a 400mm lens at f5.6 and ISO 3200
The camera is a Canon 700D that I have modded by removing one of the filter elements
that limits IR/UV and colour range ... it improves the sensitivity to the red end of the spectrum

The 2 main objects are: to the left, M20, the Trifid Nebula, and to the right, M8, the Lagoon Nebula

You can see there is a bit of a pinkish hue to the image as a result of the extra red sensitivity

Now here are 6 separate exposures (all same settings) taken one after the other and stacked in DSS

The result is glaringly obvious. I have been able to keep the red hue under control and have been able to
achieve a much better contrast between the nebulae, stars and the dark background. Also, if you look closely, you can see a definite reduction in the fuzziness (the noise) across the whole image.
The overall result is much better detail in the image

I will address more stacking details in another post in this thread

Other questions and comments are welcome

cheers
Dave

2. May 23, 2016

### Andy Resnick

I'm currently struggling with white balance/colorimetry in my stacks and would appreciate any tips. Stacking images tends to result in desaturation, and I can't seem to create a robust post-processing algorithm to restore proper colors. Any tips would be helpful....

3. May 23, 2016

### davenn

WB is difficult. Generally I look at star colour to check whether the overall colour range of the stars is even.
This is mostly done with small adjustments of the colour temperature setting.

Before we go further, what stacking and editing software are you using ?

Dave

4. May 23, 2016

### Andy Resnick

The free stuff: Deep Sky Stacker for stacking and the 32-to-16-bit 'mixdown', then ImageJ for everything else. I'm getting better with the color balance (see recent post), but it's all ad hoc and so my panoramas don't match very well.

5. May 23, 2016

### phinds

Very cool, Dave.

6. May 23, 2016

### Staff: Mentor

I use Registax, though it is no longer being updated and has some issues with large files (if you are using video...).

Nice pic, btw.

7. May 23, 2016

### Staff: Mentor

When you stack, you are left with a grey background due to your skyglow (if that's what you are referring to...). You can use the "levels" function in photoshop to just cut off below a certain grey level, making the background black.
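That black-point cut can be sketched in a few lines. This is a hypothetical NumPy stand-in for the Photoshop "levels" black slider, not Photoshop's actual implementation:

```python
import numpy as np

def set_black_point(img, black, white=255):
    """Mimic a 'levels' black-point cut: everything at or below `black`
    becomes 0, and the remaining range is stretched back out to 0-255."""
    img = img.astype(np.float64)
    out = (img - black) / (white - black) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

# A stacked frame whose skyglow sits around grey level 40:
frame = np.array([10, 40, 41, 120, 255], dtype=np.uint8)
print(set_black_point(frame, black=40))  # skyglow pixels go to 0, black background
```

The trade-off is that any faint detail sitting at or below the chosen black level is clipped away along with the skyglow.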

8. May 24, 2016

### davenn

Registax is great for planetary image stacking; it's specifically designed for loading hundreds to thousands of video frames
and generating good sharp images of planets .... its huge popularity attests to its capability of doing this well.
I personally haven't played with it, tho have seen the good results of those that have.
It's amazing what can be done with a telescope and a webcam (or similar small camera).
many astro shops have low cost cameras for this process as well
eg ...
http://www.bintel.com.au/Astrophoto...-USB-Eyepiece-Camera-II/1487/productview.aspx

D

Last edited: May 24, 2016
9. May 24, 2016

### davenn

Deep Sky Stacker works very well for the stacking process. I haven't heard of the ImageJ image-processing software.

Yes, I can see the colour balancing problems you are having. It really shouldn't be that green, huh.
The galaxy and all your stars are a bit off-colour ... experiment with the colour temperature settings a bit,
and be careful not to push the vibrance or saturation .....

Yes, it's natural for the stacked set to look very bland and washed out ... don't worry about that.
Now, I assume you are stacking RAW image files from your camera? If not, you should be
(don't use jpgs ... but you are probably aware of that ... just checking).

Here's the 5-image stack of the above image before being exported to Lightroom.
As you can see it looks pretty blah, and it has more detail than some stacks do.

99% of us don't do colour stretching in DSS either. Rather, save the TIFF file and open it in your favourite image editor.
Save it like this ....

time for bed ... hopefully tomorrow nite I can delve into the use of DSS a little more and give links to some good tutorials

Dave

Last edited: May 24, 2016
10. May 24, 2016

### Andy Resnick

Kinda- the skyglow is not grey, it's reddish (from light pollution) and varies from night to night. My flats, on the other hand, are bluish since I acquired them on an overcast night for better uniformity. In the end, my stacked histograms are not even Gaussian/Poissonian but can exhibit strong skewness. What I end up doing is cranking up the saturation to help me fine-tune the color balance on the background (ideally) to a neutral grey, then relaxing the saturation to get the last bit.

The flat doesn't totally even out the background, either- it gets close, but not close enough to do a simple threshold.

What I should do is post some screen grabs of DSS so you can see my issues.... It's running now, so maybe later today.

11. May 24, 2016

### Andy Resnick

ImageJ used to be called NIH Image, and Fiji is a bundled version. It's awesome. I do stack RAW images; what I should do is post some DSS screengrabs to make my 'problems' more clear.

Since I'm colorblind, I can't rely on my eyes for any color adjustments- I have to resort to 'instrument flying', as they say. I use ImageJ for all color tweaks, since it provides all the quantitation I need.

12. May 24, 2016

### glappkaeft

Not everyone who dabbles in astrophotography lives (or has their telescope) in an area with significant light pollution; in fact many of us make a large effort to avoid light pollution. ;) In that case airglow (green) and zodiacal light (neutral) are the largest contributors to sky glow.

13. May 24, 2016

### Andy Resnick

Ok, here are some screengrabs- these are not my best, but they are not the worst, either. I don't claim to be an expert user, by the way- let me know if I'm totally off-base. Typically, after stacking I get this:

Note the extreme slenderness (?) of the background histogram as compared to davenn's. I don't understand the broad curve (in green); the actual green histogram is lying on top of the red and blue ones. Since I can't work with 32-bit images, the first step is to compress this into 16 bits. What I do is a version of Dolby noise reduction: I compress at the high end to maximize separation between the background and faint objects:

Some explanation is in order- there is a lot going on here. First, the obvious, is the spatial chromatism- this is (likely) caused by the chromatic differences between my flat and data images. Generally, I try and tweak the color sliders to make the bright stars white (colorless), but it's really hard to do this accurately, given the lack of fine control on the slider bars. The best I can manage will leave the center close to grey and the edges a distinct blue color.

Next is the threshold-like appearance of the luminance map. This is the intensity compression: my images often have objects spanning 10 or 11 magnitudes (or more, if there's a bright star in the field of view). I can reliably image down to magnitude-15 objects, so I want the background to be around magnitude 17 or so. That intensity range (magnitude 5 through 17) is very nearly 16 bits' worth. The mapping should not be linear, but should preserve the logarithmic scale as best as possible. Remember, your screen can only display 256 discrete grey values- so I still have a lot more compression to perform. After saving this 16-bit image (16 bits per channel) I import it into ImageJ for further processing. To give you an idea of the SNR of this image, here's a linescan across the diagonal, where I've nicked a couple of stars and the center of M51:
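The "12 magnitudes is nearly 16 bits" estimate follows directly from Pogson's rule: each magnitude is a flux factor of 10^0.4 (about 2.512). A quick check:

```python
import math

bright_mag, faint_mag = 5, 17
# Pogson's rule: flux ratio between two magnitudes is 10^(0.4 * delta_mag)
ratio = 10 ** (0.4 * (faint_mag - bright_mag))

print(f"intensity ratio: {ratio:,.0f}")          # ~63,000
print(f"bits needed:     {math.log2(ratio):.2f}")  # just under 16 bits
```

A 16-bit channel holds 65,536 levels, so a magnitude 5 to 17 scene only just fits when mapped linearly, which is why the compression has to be nonlinear.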

The luminance is fairly constant across the image, which is good. ImageJ has a background subtraction routine built-in, and after that step I get this intensity profile:

But the image will look terrible because I haven't (yet) compressed it to 8-bits:

So the next compression step is similar to the 32- to 16-bit compression, but involves more tweaking and fine-tuning. I try and work with the 16-bit image stack as much as possible, but it's all squishy and I come up with different algorithms all the time, but I can eventually get something like this:

This is about as far as I can go with ImageJ- there's some additional room for improvement (for example, I can work with a HSB stack and operate on the brightness component to further suppress background), but that's about it. There's still too much green in the midtones, but it's really hard to correct since the whites are white and the blacks are black. What's not apparent on these downsized images is the posterization that occurs way back when I compress to 16 bits per channel. I don't have any good examples to show here, but basically, what appears to be a step change of 1 or 2 grey values on the monitor actually corresponds to intensity changes of 500 or 1000 in the 16-bit image; background subtraction fails on these 'steps'.
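One common way to do that kind of log-preserving compression (a sketch of the general technique, not necessarily the exact algorithm described above) is a logarithmic remap of each 16-bit channel:

```python
import numpy as np

def log_stretch_to_8bit(img16):
    """Compress a 16-bit channel to 8 bits with a logarithmic mapping,
    so faint objects are not crushed the way a linear rescale would."""
    img = img16.astype(np.float64)
    # log1p handles zero-valued (pure black) pixels cleanly
    out = 255.0 * np.log1p(img) / np.log1p(65535.0)
    return np.round(out).astype(np.uint8)

channel = np.array([0, 100, 1000, 10000, 65535], dtype=np.uint16)
print(log_stretch_to_8bit(channel))  # faint values get far more output range
```

Note how the bottom decade of input (0-100) gets over 100 of the 255 output levels, while the top decade gets about 40; this is what keeps faint nebulosity visible next to bright stars, at the cost of the posterization steps mentioned above.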

That's the sum total of my post-processing knowledge......

14. May 24, 2016

### davenn

Andy
I'm interested in specifically how are you doing your "flats" and how many did you use ?

Actually taking a step back .... what camera are you using for your imaging ? a DSLR or an astrophotography specific camera ... make model of either ?
Either way, the process is basically the same

1) ensure you are taking the flats with the zoom (focal length), aperture and focus the same as for your actual imaging

then use either of these two methods....
1. Take a clean white t-shirt. Drape it over the lens or lens hood or front of scope. Smooth it out. Shoot a few frames, rotate the shirt, shoot a few more. Obviously if you’re doing this at night, you’ll need a uniform light source – the colour temperature setting doesn’t matter much.
2. Select a uniform white or grey display on your iPad, iPhone, Mac or laptop computer. Hold the tablet up against the lens or front of telescope – making sure the lens is completely covered by the display and take several exposures. Rotate the camera or light source to avoid hot spots.

For that image above I didn't use any darks, flats or bias frames
I occasionally use darks, I should more often as it would help remove hot pixel dots.
Generally I find the vignetting and field flattener adjustments in Lightroom solve most uneven lighting across the frame
Bias frames I have never bothered with

OK that kinda explains the green hue to your M51 image ... must make image processing quite difficult
Actually that last image of M51 in your last post looks so much better, colour wise

15. May 24, 2016

### davenn

OK, and sometimes before stretching you may not see much more. Again, don't panic too much, and don't do any processing in DSS
as you obviously started to do in the second screenshot. Rather, save the file as a 16-bit TIFF with the settings I showed in the screenshot in my earlier post.

The 32 bit files, are they coming from your camera ?

By any chance, do you have a website you could upload a saved 16-bit TIFF from DSS to, so I could download it and have a play?

Dave

16. May 24, 2016

### Andy Resnick

Lots of questions...

Good question- I've been playing with flat frames for a while, trying to get good results. I have to use flat field correction, otherwise I give up the outer 30% of my frame. I have 3 sets of flats, 1 set when I image at 800mm and 2 when I image at 400mm, each set of flats is around 40 images. The reason I have 2 sets at 400mm is because one set undercorrects and the other overcorrects- when I average the two stacked results, the background is *significantly* easier to deal with. Yes, yes I know to take the flats with the same aperture settings, etc. I've tried with all kinds of light sources as well.

I'm using a Nikon D810 (not D810A). The lens is mounted onto my motorized tripod.

This absolutely does not work (for me). I think it's because the radiance of a laptop/LED TV is not the same as sky radiance- the angular distribution of emitted light isn't the same, so the off-axis bits are illuminated differently. My best results come from imaging a heavily overcast sky (at night).

I don't bother with bias or darks, either. The flats definitely help with vignetting, tho- at f/2.8 the effect has to be corrected.

The 32-bit (per channel) files are output from DSS. It's worth a try to do all my post-processing in ImageJ, why not? I think the easiest way for me to get you any files is Dropbox- PM me with an email and I'll set it up.

17. May 24, 2016

### davenn

OK all cool :)

nice :)

OK, interesting .... so have you tried the white tee-shirt method, illuminating it?
You can do that at home any time. You can also aim the camera and lens at a white/off-white wall and do flats that way.

It's difficult to imagine how you get a balanced illumination that way ---- try the white tee or white wall as a comparison.

Darks are good to do; as said, they get rid of hot pixels in the final stacked image.

Just save the TIFFs as 16-bit from DSS; that saves having to do the 32-to-16 conversion later on and negates any problems that conversion may be causing.

Done

Dave

18. Jun 1, 2016

### Andy Resnick

Update- davenn and I corresponded a bit, which I found immensely helpful (thanks!). One improvement I made is a more stringent cutoff for 'acceptable' images. I made two cutoffs- one based on the overall DSS score and the other based on the 'full width half max' (FWHM) output by DSS. I don't know the exact DSS algorithms used to compute these quantities, but the former relates to (among other things) how round the stars are- how much RA drift occurs in a frame- and the latter to how much blurring/seeing affects an image. The cutoff scores are likely scene, lens, and sensor dependent; for my M13 images the score cutoff is 200 and the FWHM cutoff is 4.7 pixels using an 800/5.6 lens. The 'worst' image looks like this:
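The two-cutoff selection can be sketched as a simple filter. The filenames and scores below are invented for illustration (DSS reports its own scores in the file list; it is not driven by a script like this):

```python
# Hypothetical per-frame metadata: (filename, DSS score, FWHM in pixels)
frames = [
    ("m13_001.nef", 245.0, 4.1),
    ("m13_002.nef", 180.0, 4.3),   # rejected: score below cutoff
    ("m13_003.nef", 260.0, 5.2),   # rejected: FWHM above cutoff
    ("m13_004.nef", 310.0, 3.8),
]

SCORE_CUTOFF = 200.0   # the cutoffs quoted above for the 800/5.6 M13 frames
FWHM_CUTOFF  = 4.7

keepers = [name for name, score, fwhm in frames
           if score >= SCORE_CUTOFF and fwhm <= FWHM_CUTOFF]
print(keepers)
```

Requiring both cutoffs to pass is stricter than either alone, which is what drives the keep rate down but makes the surviving frames much more consistent.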

Fortunately, 10 hours of imaging yielded about 200 acceptable frames- I'm not proud of the massive inefficiency (5%!)- but I gained a robust method to convert 32-bit/channel DSS images to 8-bit/channel RGB TIFFs. This is what I ended up with:

I got this without any tweaking, only 'automatic' per-channel brightness/contrast settings.

Now that I'm getting the hang of PEC (periodic error correction), I expect my efficiency to increase substantially. Which is good, since the Ring nebula (M57) is coming into position.

19. Jun 1, 2016

### Staff: Mentor

Yeah, if you've put in 10 hours of imaging and only 5% of the images are useable then something's wrong. What mount are you using?

20. Jun 1, 2016

### Andy Resnick

I made it sound worse than it is- I've already doubled the amount of time I can 'reliably' acquire a single frame, and I included *total* outdoor time- from setup to teardown. Each frame requires a few seconds to allow the shutter ring-down to dissipate prior to exposure. On a good night I can acquire about 200 images in 90 minutes (actual image acquisition time)- Dr. Resnick needs to be in bed around midnight. I'm generally looking near the celestial equator, so everything is worst-case: earth's rotation is 15 arcsec/s, each pixel of my sensor covers 1.25 arcsec (800mm lens), so without a tracking mount I can only acquire for about 0.1 seconds before I get star trails. As of yesterday I am able to 'reliably' (keeper rate of 20%) acquire for 10 s, meaning I've compensated for nearly 99% of the earth's rotation. After another few rounds of PEC I'll see whether I am approaching marginal returns.

I'm *very* happy with my mount (Losmandy GM-8). Trivial to set up and align.

Going to a shorter lens (400mm), I should be able to acquire for 20 or 30 seconds at a time, with a proportional increase in efficiency.