# Astrophotography limitations


#### Andy Resnick

This may be discussed elsewhere, but I thought I could share some results I obtained during the course of trying to photograph YU55 and some other faint objects. The background references are Roggemann and Welsh, "Imaging Through Turbulence", and David Saint-Jacques' Ph.D. dissertation, which I downloaded at: http://www.google.com/url?sa=t&rct=...sg=AFQjCNFUNLv0Lq32L73kmxBlwhRbfjZgeg&cad=rja

The dissertation is quite well-written, and clearly explains the nature of the problem (clear air turbulence). One chapter is devoted to 'field tests', and that's where I got a lot of this information.

First, all you need is an image of stars- leave the shutter open as long as you like, but do not over-expose. Here's a crop of my image (2 second exposure time):

http://img339.imageshack.us/img339/6885/dsc16581.png

The image was scaled up 400% (without interpolation), and I performed a linescan across two of the stars to figure out how big the 'Airy disk' is- here it's about 4 pixels FWHM.

Step 1: figure out the angular FOV per pixel. This image was acquired using a 400mm lens on the 35mm image format, for a total view angle of q = 2*arctan(d/2f) = 2*arctan(36mm/800mm) = 5.1 degrees. Dividing that by the sensor's horizontal pixel count, each pixel covers about 3 arcsec.
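As a sanity check, here's the Step 1 arithmetic in Python. Note the 6000-pixel sensor width is an assumed value (the post doesn't state the camera's resolution), so substitute your own camera's pixel count:

```python
import math

# Step 1 sketch. ASSUMPTION: a ~6000-pixel-wide sensor; substitute your
# own camera's horizontal pixel count.
focal_length_mm = 400.0
sensor_width_mm = 36.0       # 35mm-format frame width
n_pixels = 6000              # horizontal pixel count (assumed)

fov_deg = math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
arcsec_per_pixel = fov_deg * 3600 / n_pixels

print(f"FOV: {fov_deg:.2f} deg, {arcsec_per_pixel:.1f} arcsec/pixel")
```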

Result #1: My point spread function / Airy disk diameter is about 6 arcsec, based on the faint star.

Step 2: figure out the entrance pupil diameter. This is the focal length divided by the f-number, and is close to the physical diameter of the front lens element. On my lens, this is 140mm.

Step 3: now calculate the theoretical Airy disk diameter- q = 2*arcsin(1.22*l/d), where l is the wavelength of light. I use l = 500 nm (green), giving q of roughly 1.8 arcsec.
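The same check for Step 3; with l = 500 nm and d = 140 mm the formula comes out near 1.8 arcsec:

```python
import math

RAD_TO_ARCSEC = 3600 * 180 / math.pi

def airy_disk_arcsec(wavelength_m, aperture_m):
    """Angular diameter of the Airy disk, first minimum to first minimum:
    q = 2*arcsin(1.22*lambda/d), converted to arcseconds."""
    return 2 * math.asin(1.22 * wavelength_m / aperture_m) * RAD_TO_ARCSEC

# 500 nm light through a 140 mm entrance pupil:
print(airy_disk_arcsec(500e-9, 0.140))
```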

Result #2- my seeing conditions limit what my lens can resolve. This is why I stopped trying to image faint stars/nebulae at 800mm- the only effect is to decrease the amount of light hitting the sensor, meaning I have to leave the shutter open longer, which increases the 'star trails'.

Step 4: calculate the coherence length 'r'. This number is the length over which the incoming wavefront fluctuates by 1 radian, and is a measure of how good the seeing conditions are- the larger 'r' is, the quieter the upper atmosphere, and the larger the telescope aperture can usefully be. The relevant formula is r = l/q, where q is the angular extent of the star image (in radians).

Result #3: r = 17mm for my site, when this image was acquired. For reference, 'normal' seeing conditions for astronomy have q = 1 arcsec and r = 100 mm. This is an important result- r is a measure of the maximum useful entrance pupil diameter, so I could operate my lens all the way down to f/4 without affecting the amount of resolvable detail. That is, the real advantage of having a large entrance pupil is the ability to *detect* faint objects, not to resolve them. There is no point in me getting a 12" telescope, for example- a 12" POS shaving mirror would work just as well at my site.
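Step 4 in code- the only subtlety is converting the blur angle from arcseconds to radians:

```python
import math

ARCSEC_TO_RAD = math.pi / (180 * 3600)

def fried_length_m(wavelength_m, blur_arcsec):
    """Coherence (Fried) length r = lambda/q, with the observed blur q
    expressed in radians."""
    return wavelength_m / (blur_arcsec * ARCSEC_TO_RAD)

print(fried_length_m(500e-9, 6.0) * 1000)   # my site: ~17 mm
print(fried_length_m(500e-9, 1.0) * 1000)   # 'normal' seeing: ~100 mm
```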

Step 5: calculate the maximum acquisition time before star trails appear. Without any tracking mount, this is given by the angular size of a pixel divided by the angular drift speed of the stars- which depends on the declination of the target. For my lens, looking at the Ring Nebula (about 5 arcsec/sec), this comes to about 1 second.
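And Step 5- the drift rate is a parameter here (at declination delta the sidereal drift is roughly 15*cos(delta) arcsec/s); the example uses the ~5 arcsec/s figure quoted above for my target:

```python
def max_trail_free_exposure_s(arcsec_per_pixel, drift_arcsec_per_s):
    """Longest untracked exposure before a star drifts across one pixel.
    At declination delta the sidereal drift is roughly 15*cos(delta) arcsec/s."""
    return arcsec_per_pixel / drift_arcsec_per_s

# 3 arcsec/pixel and the ~5 arcsec/s drift quoted for the target:
print(max_trail_free_exposure_s(3.0, 5.0))   # 0.6 s, i.e. 'about 1 second'
```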

Step 6: calculate the coherence time- this depends on the windspeed, and is a measure of how fast the wavefront varies. The full calculation is fairly complex, but an approximation is given by t = 2.6*(r/v)*(1 + 8*(d/r)^1.7)^0.1. Estimating v = 10 m/s and using the values for d and r above, I get t = 1/15 second. This means any acquisition time over 1/15 second will time-average out the temporal variations, leaving only the blurred average image. Shorter acquisition times can be used (IIRC) to perform wavefront reconstruction, and this sets the limit on adaptive optics correction schemes.

Final result- For bright objects, I am better off using a shorter focal length lens (85mm, 12 arcsec/pixel), as my maximum trail-free acquisition time is much longer (2-3 seconds), and because the seeing conditions are so poor, I do not lose very much resolution.

Edit- for the case where the exposure time is much less than the coherence time (e.g. for very bright objects like the moon, Jupiter, Space Station, etc.), the spatial blurring is 'frozen', resulting in an improvement in image quality- the aberration can be considered as a spatially-varying magnification. This means I do gain some improvement by working at 800mm. Also, the technique of image stacking will work best if the frame rate is high- a generic 30 fps video camera would most likely be ideal for me to use.

Last edited by a moderator:

Hey Andy,

Good work on capturing the asteroid last week! What was its magnitude at the time of your image?

Am I correct that your calculations lead you to a final minimum integration time of 1/15th of a second? Is there more information to be gained from them? I ask only because there are many factors that go into the selection of an integration time, and I find that the dominant ones are typically simple signal-processing limitations like quantization noise.

I also use track-and-stack methods to solve all kinds of problems in my own astrophotography. Good to know there's another sky shooter on PF!

- Warren

Thanks! I didn't image the asteroid, though- I was looking in the right direction at the right time, but I don't have clear evidence to verify a capture. At the time, YU55 was about magnitude 11- it was near the time of closest approach, located near epsilon Delphini.

The stars in the image above are: TYC 1091-166-1 (the faint one, magnitude 11); BD+10 4287 (the upper one, a double star- the bright component is magnitude 10, the faint one magnitude 12); and the lower one is HD 194671, magnitude 9.

As for the integration time, you are right- there are many issues involved. The calculation I performed was for seeing conditions only. For comparison, I capture Jupiter at 1/60s ISO 100 and the moons at 1/6s ISO 400- sensor noise is not really an issue there.

Get anything good lately?

Good info, Andy! If I can ever manage to pull together the time and resources and ambition to build an observatory to house my refractor, I can try putting my Canons to work.

Definitely very interesting stuff, Andy. I've never done a scientific analysis of seeing conditions before. I just pick my nights as carefully as I can and hope for the best. Now you'll have me looking much more closely at the weather forecasts, though!

Here's one I took recently of the Andromeda Galaxy:

Equipment: Celestron C11 SCT, HyperStar III, Nikon D90; ~500 mm f/2 Schmidt-camera configuration; approximately 1.5 hours of integration time, stacked from many 15 second subframes.

- Warren

whoa...

Wow, nice. You did that with a lot of 15 second exposures?

Thanks, guys. Yes, this is a composite of something like 400 individual 15-second exposures. I use this technique -- track-and-stack -- for a couple of reasons:

- My mount is of mediocre quality, and I don't bother to align in any way at all. It has field rotation, flexure, plenty of periodic error, you name it -- yet it performs quite well over short periods of time. I try to trade complexity in the field for complexity in software. I usually only have a few hours under the stars, and I hate twiddling knobs in the dark, but I don't mind letting my computer chug for a couple days on 15 GB of raw data.

- I can throw out the lowest-quality 30% of the sub-exposures and instantly eliminate time-limited defects like airplanes, satellites, periods of poor seeing, etc. that would ruin long exposures. When used to weed out periods of poor seeing, this technique is sometimes called "lucky imaging."

- I have much better control over signal quality. Short exposures eliminate the possibility of saturation, and I can keep taking additional exposures indefinitely, continually improving the SNR and dynamic range of my images. An object like M31 has absolutely enormous dynamic range -- probably at least 140 dB in the image above -- and this technique allows me to capture all of it, even with a consumer-grade 12-bit camera with ~60 dB of dynamic range.

If you'd like to see more examples, I have a few more at http://www.warrencraddock.com/photo/dark#M8.

- Warren

chroot motivated me to try image stacking- here's a single frame of the Ring Nebula (400mm, 0.8 s, ISO 3200):

http://img403.imageshack.us/img403/559/singley.png

and the result after averaging 24 images:

http://img171.imageshack.us/img171/2461/avgstack1.png

How do you combine your images? Averaging them, summing them, or what?

Nice work, Andy! It's really thrilling to see what you can get with even just a couple of dozen subframes stacked together. The individual subframes are barely recognizable, but when you add 'em up -- WOW!

- Warren

How do you combine your images? Averaging them, summing them, or what?

Averaging and summing are actually the same process; averaging is just summation with a normalization step at the end. Summing images is definitely the first place to start with this kind of imaging. I use a somewhat more sophisticated kind of stacking called "kappa-sigma clipping," which attempts to discard noisy pixels before the summation.
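Here's a minimal single-pixel sketch of kappa-sigma clipping in pure Python (toy numbers; a real stacker does this for every pixel position across the registered frames):

```python
import statistics

def kappa_sigma_stack(pixel_values, kappa=2.5, iterations=3):
    """Combine one pixel's samples across sub-frames: iteratively discard
    values more than kappa standard deviations from the mean, then average
    the survivors. A hot pixel or satellite streak in one frame gets
    rejected instead of polluting the sum."""
    values = list(pixel_values)
    for _ in range(iterations):
        if len(values) < 3:
            break
        mean = statistics.mean(values)
        sigma = statistics.pstdev(values)
        kept = [v for v in values if abs(v - mean) <= kappa * sigma]
        if len(kept) == len(values):
            break
        values = kept
    return statistics.mean(values)

# 20 subframes of a pixel near 100 counts, one hit by a cosmic ray:
samples = [98, 101, 99, 102, 100, 97, 103, 100, 99, 101,
           100, 98, 102, 99, 101, 100, 4000, 97, 100, 102]
print(kappa_sigma_stack(samples))   # close to 100; the outlier is rejected
```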

- Warren

Averaging and summing are actually the same process; averaging is just summation with a normalization step at the end. Summing images is definitely the first place to start with this kind of imaging. I use a somewhat more sophisticated kind of stacking called "kappa-sigma clipping," which attempts to discard noisy pixels before the summation.

- Warren

Ah, ok. I've read a big book on image processing, but I don't really have much experience as I've just gotten into this hobby and have to drive 20-30 minutes at minimum to get out of the middle of the city.

If I have a lot of background noise from light pollution and such, is it better to sum them or to average them or what?

How do you combine your images? Averaging them, summing them, or what?

I just did it the dumb way- in ImageJ, I cropped and aligned each frame by hand and then converted the images into a stack. Then I did a 'z-projection', displaying the average intensity. 25-30 images is about the maximum I can deal with this way, due to the laborious first step. There's a free program I'm going to try out soon called RegiStax (http://www.astronomie.be/registax/); hopefully it will automate that step for me.

For me, getting the images into a stack is the hard part. Everything after that: background subtraction, despeckle, etc. can all be done easily and quickly in an automated fashion prior to the z-projection.
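For what it's worth, the alignment step can be roughed out in code. This sketch registers frames on their single brightest pixel- a crude stand-in for real star registration (actual tools use many stars and cross-correlation), so treat it as an illustration of the idea only:

```python
def brightest_pixel(frame):
    """Return (row, col) of the brightest pixel -- a crude stand-in for
    real star registration."""
    best = max((v, r, c) for r, row in enumerate(frame)
                         for c, v in enumerate(row))
    return best[1], best[2]

def align_and_average(frames):
    """Shift every frame so its brightest pixel lands on the first frame's,
    then average the overlapping pixels (a 'z-projection' by mean)."""
    r0, c0 = brightest_pixel(frames[0])
    h, w = len(frames[0]), len(frames[0][0])
    out = [[0.0] * w for _ in range(h)]
    for frame in frames:
        r, c = brightest_pixel(frame)
        dr, dc = r0 - r, c0 - c
        for i in range(h):
            for j in range(w):
                si, sj = i - dr, j - dc
                if 0 <= si < h and 0 <= sj < w:
                    out[i][j] += frame[si][sj] / len(frames)
    return out

f1 = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
f2 = [[0, 0, 0], [0, 0, 9], [0, 0, 0]]   # same star, drifted one pixel
stacked = align_and_average([f1, f2])
print(stacked[1][1])   # 9.0: the star adds coherently after alignment
```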

Nice work, Andy! It's really thrilling to see what you can get with even just a couple of dozen subframes stacked together. The individual subframes are barely recognizable, but when you add 'em up -- WOW!

- Warren

Thanks! I have to admit I was really surprised at how much improvement there is, also. The Orion Nebula is coming up in an hour or so; I'm looking forward to trying this again!

Oh wow, yeah getting a program to do it is WAY easier. My SBIG camera came with CCDops, which had a "track and stack" which I used before I started trying autoguiding. It would take multiple short exposures and automatically stack them into the final image which you would then save.

RegiStax is good; I tend to use DeepSkyStacker more, though. It's very powerful, but its interface is very complex and has a lot of 'gotchas.' Good luck with your photos!

- Warren

Gotchas?

RegiStax is good; I tend to use DeepSkyStacker more, though. It's very powerful, but its interface is very complex and has a lot of 'gotchas.' Good luck with your photos!

- Warren

I just installed RegiStax and DeepSkyStacker, they are chugging on the original images as I type this. Last night I found Lynkeos, which is written for the Mac OS, and it worked extremely well on both the Ring nebula and Jupiter; unfortunately heavy clouds rolled in before I could get the Orion nebula.

Here's a link to some of my photos. Most of these are just 1-3 shots of just a couple of minutes through RGB filters. The ugliest ones were through my Meade DSI II one-shot color camera. I have since upgraded.

Ugh, thought I could create an album on imageshack, but I guess it doesn't work like I thought. Forgive the list of links.

http://img820.imageshack.us/img820/7766/ngc2359.jpg
http://img406.imageshack.us/img406/2097/c312.jpg
http://img194.imageshack.us/img194/8426/moonha.jpg
http://img21.imageshack.us/img21/8310/m272.jpg
http://img88.imageshack.us/img88/441/c33dk.jpg
http://img221.imageshack.us/img221/916/m16frost.jpg
http://img20.imageshack.us/img20/9936/m1011.png
http://img7.imageshack.us/img7/5691/m12ar.jpg
http://img51.imageshack.us/img51/7328/m17p.jpg
http://img202.imageshack.us/img202/5452/m271.png
http://img708.imageshack.us/img708/4899/m33c.jpg
http://img64.imageshack.us/img64/3486/m422.jpg
http://img638.imageshack.us/img638/1326/m511.png
http://img856.imageshack.us/img856/5379/c651.jpg
http://img827.imageshack.us/img827/9994/c63i.jpg
http://img155.imageshack.us/img155/5747/c34j.jpg
http://img43.imageshack.us/img43/4186/c023s.jpg
http://img849.imageshack.us/img849/3848/c011l.jpg
http://img269.imageshack.us/img269/4733/m01ux.jpg

Here's a link to some of my photos. Most of these are just 1-3 shots of just a couple of minutes through RGB filters. The ugliest ones were through my Meade DSI II one-shot color camera. I have since upgraded.

Nice! The clouds parted this evening, so I tried M57 again:
http://img26.imageshack.us/img26/246/sum12.png

20 images, quickly cropped and processed with Lynkeos (that one seems to work best for me). Start to finish- tripod setup until final image- was about 10 minutes.

This should give you an idea how wretched the seeing is here- this is what the Andromeda galaxy looks like from my backyard, under fairly good conditions- low humidity, clear skies, etc. It took me longer to find than to acquire and process the images:

http://img854.imageshack.us/img854/4993/averagecrop.png

So sad, Andy! On a clear dark night here, you just can't ignore Andromeda. Unfortunately, we have a couple of 15-acre greenhouses to our south, producing tomatoes, and when they decide to flip on the grow-lights, the sky turns to crap.

Here's M51 from my front yard about 2 hours ago. Only 5 minutes of exposure through each filter of RGB. I took pictures of about 4 more galaxies, but all were crap thanks to the light pollution.

http://img855.imageshack.us/img855/3134/m51j.jpg

So sad, Andy! On a clear dark night here, you just can't ignore Andromeda. Unfortunately, we have a couple of 15-acre greenhouses to our south, producing tomatoes, and when they decide to flip on the grow-lights, the sky turns to crap.

Here's M51 from my front yard about 2 hours ago. Only 5 minutes of exposure through each filter of RGB. I took pictures of about 4 more galaxies, but all were crap thanks to the light pollution.

Yeah, living in a large city at sea level isn't the best place for astronomy... at least I know not to waste my money on a telescope. I couldn't do a 5 minute exposure even at ISO 100 with a tracking mount- the light pollution would completely wash out the image. Even so, it's remarkable that the technology to take these images, once available only to a few, is now commonplace.

I'm calling this 'building my skills' for when I can take advantage of better conditions...

Edit- I figured out a decent workflow: after acquiring the images (about 20-30 at ISO3200, 0.8" shutter), I open the series in Microsoft picture manager and batch crop out the central third of the frame to reduce the file size and omit the vignetted portion. Then, I open the cropped series in DeepSky and let it register and add the images. I'll do a final color correction (the red channel is usually too bright) and I'm done. Here's the set of M31, M57, and M42 stacked images (downsized to fit here)

http://img847.imageshack.us/img847/5540/andromedac.png

http://img337.imageshack.us/img337/5928/ringh.png

http://img210.imageshack.us/img210/6102/orionw.png

Andy, really nice work! Those last images show excellent focus and pretty good dynamic range. I suspect there's actually a lot more data in your files than you're currently bringing out with your processing.

Drakkith, you are a master! Thanks for posting all those images in post #20, I really enjoyed looking through them. My favorite is your single-channel shot of the Eagle nebula. Beautiful!

Andy Resnick said:
Yeah, living in a large city at sea level isn't the best place for astronomy... at least I know not to waste my money on a telescope. I couldn't do a 5 minute exposure even at ISO 100 with a tracking mount- the light pollution would completely wash out the image.

This is largely a myth -- light pollution is not nearly as big a deal as it is made out to be, at least if your goal is (like mine) simply to make pretty pictures. If your goal is to take scientifically meaningful photos, though, light pollution really does become a problem.

I live in San Francisco, and only need to drive out 15-20 minutes to places that are dark enough for my astrophotography. The photo of Andromeda that I posted above was taken under skies with so much light pollution that I could have read a newspaper. I have to contend with enormous light domes over both San Francisco and San Jose, with dozens and dozens of international flights cutting across the sky, and, yes, even fog.

How do I handle it? By carefully selecting my exposure times, stacking lots of photos, and taking advantage of some convenient properties of light pollution. The photons from light pollution are scattered at random by the atmosphere, and essentially add shot noise to the images. If I tried to take 5-minute exposures, the integrated noise would saturate my sensor, and I would have no chance of combating it. Instead, I take a large number of much shorter exposures -- I rarely go above 30 seconds even in perfect conditions -- and let the noise average out in post-processing. This also eliminates tracking errors, periods of bad seeing, etc.

Astrophotography is really just an exercise in signal processing. The goals are to obtain high signal-to-noise ratio and high dynamic range. Exposure duration and total integration time are not goals in themselves, at least not with digital cameras -- that's old-school film thinking! I use my camera at ISO 800, where its noise figure is smallest. I take good temperature-correlated dark and offset frames. I shoot everything in RAW, so I can take advantage of the full 12-bit dynamic range of my sensor. Most consumer-grade cameras have only 60 dB of dynamic range, but many astronomical targets require 100+ dB of dynamic range to "look good" in a pretty photo. That means it is impossible to make pretty photos with single long exposures on consumer-grade cameras!

So, for many reasons... stack, stack, stack, my friends. You still need to nail your focus, but stacking can solve literally every other problem you're likely to encounter.

- Warren

Philip Lacroix
So true, Warren. One of the most competent astrophotographers around these days is Greg Parker, and he lives in foggy, crappy-weather England with way too many people. His partner in crime is Noel Carboni, who lives in Florida and does the post-processing. If you have never seen their book "Star Vistas", please try to dig up a copy. Simply the most stunning book of astrophotography available, with very nice, cogent explanations of what is shown in each image.

Quite a few people were aware of their project, even as it was underway. As a result, one of the three forewords was written by Sir Arthur C. Clarke before his death. The other two forewords were written by Sir Patrick Moore and Dr. Brian May. Yes, THAT Brian May.

When Amazon started dropping the prices, I bought extra copies for my new neighbor, my older neighbors' grand-daughters, and my niece. ISBN: 978-0-387-88435-6, in case you want to see if there are some odd copies left here and there. It's a large coffee-table book published by Springer in a full-bleed format.

Chroot, while I agree that light pollution doesn't completely kill astrophotography, it does severely hamper it. From my buddy's house, about 20 minutes out from me, I could take the same exposures of M51 I just posted and they would look phenomenally better thanks to much greater signal to noise ratio. In the end I think it just comes down to having to spend a much greater amount of time actually getting the pictures, AKA more exposure time. After all, light pollution doesn't block photons from getting to your camera from whatever you are imaging, it just adds noise. Lots and lots of noise.

Edit: By the way, if you are referring to this pic: http://img221.imageshack.us/img221/916/m16frost.jpg then I hope you noticed my crazy ring of frost around the camera window. I think it makes the image look pretty crazy! That's why I posted it lol.

Here's a good example of light pollution. If I had been out of the city, it would be MUCH easier to see M101. I'm going to try taking more exposures when I have the time, so we'll see.
Edit: Hmm...should have cropped it, but I didn't think about it and I don't have the time right now as I have to be off to work!

http://img442.imageshack.us/img442/3473/m101.jpg

they would look phenomenally better thanks to much greater signal to noise ratio.

You can obtain whatever SNR you desire under any sky! You just have to adjust your methods.

In the end I think it just comes down to having to spend a much greater amount of time actually getting the pictures, AKA more exposure time.

I'm afraid that's wrong, too. Increasing exposure time does not increase SNR! What increases SNR is obtaining a larger number of independent observations of your desired signal -- i.e. short sub-exposures -- so you're not integrating much of the noise in each sub-exposure. When you average N sub-exposures, the SNR increases as $\sqrt{N}$.
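That $\sqrt{N}$ behavior is easy to check numerically. Here's a toy simulation -- made-up numbers, a constant signal with fixed Gaussian noise per frame, nothing camera-specific:

```python
import random
import statistics

random.seed(1)

def snr_after_stacking(n_frames, signal=10.0, noise_sigma=5.0, n_pixels=4000):
    """Average n_frames noisy observations of the same pixel value, over
    many pixels, and estimate the resulting signal-to-noise ratio."""
    residuals = []
    for _ in range(n_pixels):
        stacked = statistics.mean(
            signal + random.gauss(0, noise_sigma) for _ in range(n_frames))
        residuals.append(stacked - signal)
    return signal / statistics.pstdev(residuals)

# SNR roughly doubles each time the frame count quadruples:
for n in (1, 4, 16):
    print(n, round(snr_after_stacking(n), 1))
```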

After all, light pollution doesn't block photons from getting to your camera from whatever you are imaging, it just adds noise. Lots and lots of noise.

Yes! That's correct. Think of it this way: no matter where you are, during a one-hour period, the same number of signal photons, those from your deep-sky target, will hit your sensor. The signal remains the same, no matter how light polluted your site. So... you don't need longer exposures to obtain more signal. All the signal photons you need are already present, even under a severely light-polluted sky. What you need is some way to observe the noise more carefully, so that it can be removed -- and that means shortening the duration of your sub-exposures. If I could, I'd build a rig that could take a million 1/4000th second sub-exposures, and it'd peer right through the densest pea-soup light pollution on Earth.

Edit: By the way, if you are referring to this pic: http://img221.imageshack.us/img221/916/m16frost.jpg [Broken] then I hope you noticed my crazy ring of frost around the camera window. I think it makes the image look pretty crazy! That's why I posted it lol.

I actually didn't notice the frost! I was paying too much attention to the beautiful pillars of creation. It's one of my favorite nebulae, and you did a great job with it!

- Warren

I'm afraid that's wrong, too. Increasing exposure time does not increase SNR! What increases SNR is obtaining a larger number of independent observations of your desired signal -- i.e. short sub-exposures -- so you're not integrating much of the noise in each sub-exposure. When you average N sub-exposures, the SNR increases as $\sqrt{N}$.

Increasing the exposure times increases the signal gained. In the case of light pollution it also increases that, but either way the end result is the same. If your sky is 10,000 counts/min and your object is 1,000 counts/min, then doubling exposure times doubles both the noise and the signal. (I think that's because light pollution isn't normal noise- it is unwanted signal that increases linearly.) However, because the skyglow and the signal add together on your chip, I THINK you can subtract it with long enough exposures or more exposures. The readout noise, shot noise, dark noise, etc., all increase as the square root of the exposure time, not linearly, so longer exposures and more of them will definitely increase image quality.

Consider this: After 1 minute the pixels that get both the signal and the skyglow are at about 11,000 with the background at 10,000 (with the standard noise from dark current, readout, etc.). Up the exposure to 5 minutes and you will get on average a value of 55,000 for the object and 50,000 for the background. So your SNR between the object and the background is the same, but the difference is now 5,000, giving you much greater range.

Averaging multiple images keeps the range the same, but it cuts back on shot noise, which comes from the random arrival of photons from the object. So you want BOTH longer exposure time and multiple images.

If I could, I'd build a rig that could take a million 1/4000th second sub-exposures, and it'd peer right through the densest pea-soup light pollution on Earth.

After combining the images, the resulting image would be of similar quality to a hundred 4-second exposures or ten 40-second exposures. Worse, actually, as at that low an exposure time your images would be overwhelmingly dominated by noise. While you can subtract most of it, you would never be able to get all of it.

Increasing the exposure times increases the signal gained. In the case of light pollution it also increases that, but either way the end result is the same. If your sky is 10,000 counts/min and your object is 1,000 counts/min, then doubling exposure times doubles both the noise and the signal.

Right -- if you double the exposure time, you double both the signal and the noise, so your SNR stays the same and you gain nothing. Sure, you get a brighter image, but you're no more able to separate the signal from the noise, and that's what astrophotography is all about. At some point you'll saturate your sensor, and that'll be it. You're toast.

Correct me if I'm wrong, but I think you agree that shift-and-add (http://en.wikipedia.org/wiki/Shift-and-add) produces better SNR (and therefore better final photos) than individual long exposures, but the targets must be sufficiently bright.

However, because the skyglow and the signal add together on your chip, I THINK you can subtract it with long enough exposures or more exposures. The readout noise, shot noise, dark noise, etc., all increase as the square root of the exposure time, not linearly, so longer exposures and more of them will definitely increase image quality.

Some of these sources of noise are fixed (the same for any exposure of any length), like readout noise. Others, like dark current, are actually linear with exposure time.

Consider this: After 1 minute the pixels that get both the signal and the skyglow are at about 11,000 with the background at 10,000 (with the standard noise from dark current, readout, etc.). Up the exposure to 5 minutes and you will get on average a value of 55,000 for the object and 50,000 for the background. So your SNR between the object and the background is the same, but the difference is now 5,000, giving you much greater range.

Again, the ratio of your signal to your noise remains constant, so you've not gained much. You've spread your signal over a greater number of codes, so you've reduced the quantization noise, but it comes at a tremendous cost. Those long exposures will put a much greater burden on your mount (which has tracking error) and your seeing conditions (which change constantly). Very few amateur telescopes have tracking error that is acceptable for even one minute unguided exposures.

So, here's the bottom line: you definitely want to stack exposures to increase SNR and dynamic range. This method is used to great effect by professional astronomers all over the world. You want your exposures to be short enough to eliminate any visible tracking or seeing error, but not so short that they become dominated by the readout or quantization noise of your sensor. Somewhere in between, there's a sweet spot that will produce the optimal output. For me, with my equipment, that sweet spot is in the 10-30 second range. I hope that my photographs speak for themselves, and demonstrate the advantages of the method.
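To see where that sweet spot comes from, here's a toy noise budget for a fixed one-hour total integration split into N subs. All the rates below are invented for illustration (counts/s per pixel, plus a per-frame read noise); only the shape of the curve matters:

```python
import math

def stacked_snr(total_s, n_subs, signal_rate=5.0, sky_rate=50.0, read_noise=8.0):
    """SNR for n_subs equal sub-exposures totalling total_s seconds.
    Photon (shot) noise depends only on the total time; read noise is
    paid once per frame, which is what penalizes very short subs."""
    signal = signal_rate * total_s
    shot_var = (signal_rate + sky_rate) * total_s
    read_var = n_subs * read_noise ** 2
    return signal / math.sqrt(shot_var + read_var)

# One hour total, sliced ever more finely:
for n in (1, 10, 100, 1000, 10000):
    print(n, round(stacked_snr(3600.0, n), 1))
```

Shot noise alone is indifferent to how the hour is sliced, so the curve is flat for small N; read noise (and, not modeled here, quantization noise) punishes very short subs, while saturation, tracking error, and seeing punish very long ones- hence a sweet spot in between.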

- Warren

Correct me if I'm wrong, but I think you agree that shift-and-add (http://en.wikipedia.org/wiki/Shift-and-add) produces better SNR (and therefore better final photos) than individual long exposures.

To my knowledge it does not- not until the total exposure time of the stacked images exceeds that of the longer exposures.

Some of these sources of noise are fixed (the same for any exposure of any length), like readout noise. Others, like dark current, are actually linear with exposure time.

Yes, and one of the reasons you need long exposures or many stacked images is shot noise, correct? I thought that, because of the randomness of shot noise, a long exposure ends up just like multiple short exposures if the total exposure times are the same.

Again, the ratio of your signal to your noise remains constant, so you've not gained much. You've spread your signal over a greater number of codes, so you've reduced the quantization noise, but it comes at a tremendous cost.

How have you not gained much? Your shot noise has dropped thanks to the extra signal. I'm just wondering how effective it is vs the increased background noise.

Those long exposures will put a much greater burden on your mount (which has tracking error) and your seeing conditions (which change constantly). Very few amateur telescopes have tracking error that is acceptable for even one minute unguided exposures.

Sure, but let's leave that for another discussion.

So, here's the bottom line: you definitely want to stack exposures to increase SNR and dynamic range. This method is used to great effect by professional astronomers all over the world. You want your exposures to be short enough to eliminate any visible tracking or seeing error, but not so short that they become dominated by the readout or quantization noise of your sensor. Somewhere in between, there's a sweet spot that will produce the optimal output. For me, with my equipment, that sweet spot is in the 10-30 second range. I hope that my photographs speak for themselves, and demonstrate the advantages of the method.

I pretty much agree with you; however, assuming you have good tracking and guiding, 5-10 minute exposures seem to be just as good as stacking shorter ones. Heck, you're still going to stack the 5-10 minute ones anyway. The advantage of shorter exposures compared to long ones (5-10 minute exposures compared to an hour long) is that you can get rid of any frames that have satellites going through them, had a mount error, etc. That, and it keeps saturation from occurring.

My only question is whether getting many exposures of something helps to counteract light pollution or not. I think so, but I'm not sure.

Excellent photo by the way Chroot!

<snip>

So, for many reasons... stack, stack, stack, my friends. You still need to nail your focus, but stacking can solve literally every other problem you're likely to encounter.

- Warren

I understand- that makes a lot of sense. I also found a site discussing stacking, and they discussed the rule of diminishing returns- for example, if my SNR goes up to (say) 10 by stacking 50 frames, I'd need to stack about 200 to get up to 20, around 450 to get an SNR of 30, etc.
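Under a pure sqrt(N) law the diminishing returns work out like this (quick sketch- the per-frame SNR is just inferred from the 50-frame example, so treat the numbers as illustrative):

```python
import math

def frames_needed(target_snr, single_frame_snr):
    """SNR of an average grows as sqrt(N), so N = (target / per-frame)^2."""
    return round((target_snr / single_frame_snr) ** 2)

# If 50 stacked frames give SNR 10, one frame gives SNR 10/sqrt(50):
per_frame = 10 / math.sqrt(50)
for target in (10, 20, 30):
    print(target, frames_needed(target, per_frame))
```

Each fixed step in SNR costs quadratically more frames, which is exactly the diminishing-returns pattern the site describes.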

My main limitation is the lack of a tracking mount- my exposures have to be short enough to prevent motion blur, which forces a high camera gain. That's also ok, because the gain noise, like any other noise, is statistical and washes out with stacking.