Astrophotography limitations

by Andy Resnick
Tags: astrophotography, limitations
Andy Resnick
#1
Nov14-11, 09:47 AM
Sci Advisor
P: 5,523
This may be discussed elsewhere, but I thought I could share some results I obtained during the course of trying to photograph YU55 and some other faint objects. The background references are Roggemann and Welsh, "Imaging Through Turbulence", and David Saint-Jacques' Ph.D. dissertation, which I downloaded at: http://www.google.com/url?sa=t&rct=j...fjZgeg&cad=rja

The dissertation is quite well-written, and clearly explains the nature of the problem (clear air turbulence). One chapter is devoted to 'field tests', and that's where I got a lot of this information.

First, all you need is an image of stars- leave the shutter open as long as you like, but do not over-expose. Here's a crop of my image (2 second exposure time):



The image was scaled up 400% (without interpolation), and I performed a linescan across two of the stars to figure out how big the 'Airy disk' is- here it's about 4 pixels FWHM.

Step 1: figure out the angular FOV per pixel. This image was acquired using a 400mm lens, 35mm image format, for a total view angle of q = 2*arctan(d/2f) = 2*arctan(36mm/800mm) = 5.1 degrees. Dividing that by the sensor's horizontal pixel count, each pixel covers about 3 arcsec.
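As a sanity check, Step 1 can be computed directly. Note that the ~6,000-pixel horizontal sensor resolution below is an assumption on my part (it reproduces the ~3 arcsec/pixel figure), not something stated in the post:

```python
import math

# Step 1 sanity check: angular field of view for a 400 mm lens on a
# 36 mm-wide (full-frame) sensor, then arcsec per pixel.
# The 6000-pixel sensor width is an assumption, not from the post.
focal_mm = 400.0
sensor_width_mm = 36.0
pixels_across = 6000  # assumed horizontal resolution

fov_deg = 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_mm)))
arcsec_per_pixel = fov_deg * 3600 / pixels_across

print(f"FOV: {fov_deg:.1f} deg, {arcsec_per_pixel:.1f} arcsec/pixel")
```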

Result #1: My point spread function/ Airy disk diameter is about 6 arcsec, based on the faint star.

Step 2: figure out the entrance pupil diameter. This is the focal length divided by the f-number, and is close to the physical diameter of the front lens element. On my lens, this is 140mm.

Step 3: now calculate the theoretical airy disk diameter- q = 2*arcsin(1.22*l/d), where l is the wavelength of light. I use l = 500 nm (green), giving me q = 1.6 arcsec.
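Steps 2 and 3 in code: evaluating the formula as written, with l = 500 nm and d = 140 mm, gives roughly 1.8 arcsec, the same ballpark as the figure quoted in the post:

```python
import math

# Steps 2-3: entrance pupil and theoretical Airy disk diameter.
wavelength_m = 500e-9   # green light
pupil_d_m = 0.140       # entrance pupil = focal length / f-number

# Full angular diameter of the Airy disk (twice the first-minimum radius).
airy_rad = 2 * math.asin(1.22 * wavelength_m / pupil_d_m)
airy_arcsec = math.degrees(airy_rad) * 3600

print(f"Airy disk diameter: {airy_arcsec:.1f} arcsec")
```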

Result #2- my seeing conditions limit what my lens can resolve. This is why I stopped trying to image faint stars/nebulae at 800mm- the only effect is to decrease the amount of light hitting the sensor, meaning I have to leave the shutter open longer, which increases the 'star trails'.

Step 4: calculate the coherence length 'r' (the Fried parameter). This number is the length over which the incoming wavefront fluctuates by 1 radian, and is a measure of how good the seeing conditions are- the larger 'r' is, the quieter the upper atmosphere, and the larger the telescope aperture can usefully be. The relevant formula is r = l/q, where q is the angular extent of the star image.

Result #3: r = 17mm for my site, when this image was acquired. For reference, 'normal' seeing conditions for astronomy have q = 1 arcsec and r = 100 mm. This is an important result- r is a measure of the maximum useful entrance pupil diameter, so I could stop my lens all the way down to f/4 without affecting the amount of resolvable detail. The real advantage of a large entrance pupil is the ability to *detect* faint objects, not to resolve them. There is no point in me getting a 12" telescope, for example- a 12" POS shaving mirror would work just as well at my site.
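Step 4's formula is a one-liner; plugging in the ~6 arcsec blur measured from the star images reproduces the 17 mm figure:

```python
import math

# Step 4: coherence length r = lambda / q, where q is the measured
# angular blur of a star image (~6 arcsec here).
wavelength_m = 500e-9
blur_arcsec = 6.0
blur_rad = math.radians(blur_arcsec / 3600)

r_mm = wavelength_m / blur_rad * 1000
print(f"coherence length r ~ {r_mm:.0f} mm")
```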

Step 5: calculate the maximum untrailed acquisition time. Without a tracking mount, the longest exposure that avoids star trails is the angular size of a pixel divided by the angular drift rate of the stars, which depends on the star's declination. For my lens, looking at the Ring Nebula (5 arcsec/sec), this comes to about 1 second.
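Step 5 as a quick calculation. In general a star drifts at about 15.04*cos(declination) arcsec/s; the ~5 arcsec/sec rate for the target is the post's figure, adopted here as-is:

```python
# Step 5: longest untracked exposure before a star trails across one pixel.
# A star's drift rate is ~15.04*cos(declination) arcsec/s (sidereal rate);
# the post uses ~5 arcsec/s for its target, which is adopted unchanged here.
pixel_scale_arcsec = 3.0       # from Step 1
star_rate_arcsec_s = 5.0       # post's figure for the target's sky motion

t_max_s = pixel_scale_arcsec / star_rate_arcsec_s
print(f"max untracked exposure ~ {t_max_s:.1f} s")
```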

Step 6: calculate the coherence time- this depends on the wind speed, and is a measure of how fast the wavefront varies. The calculation is fairly complex, but an approximation is given by: t = 2.6*(r/v)*(1+8*(d/r)^1.7)^0.1. Estimating v = 10 m/s and using the values for d and r above, I get t = 1/15 second. This means any acquisition time over 1/15 second will time-average the temporal variations, leaving only the blurred average image. Shorter acquisition times can be used (IIRC) to perform wavefront reconstruction, and this sets the limit on adaptive-optics correction schemes.

Final result- for faint objects, I am better off using a shorter focal length lens (85mm, 12 arcsec/pixel), as my maximum untrailed acquisition time is much longer (2-3 seconds), and because the seeing conditions are so poor, I do not lose very much resolution.

Edit- for the case where the exposure time is much less than the coherence time (e.g. for very bright objects like the moon, Jupiter, Space Station, etc.), the spatial blurring is 'frozen', resulting in an improvement in image quality- the aberration can be treated as a spatially-varying magnification. This means I do gain some improvement by working at 800mm. Also, the technique of image stacking ('lucky imaging') works best when the frame rate is high- a generic 30 fps video camera would most likely be ideal for me to use.
chroot
#2
Nov14-11, 08:54 PM
Emeritus
Sci Advisor
PF Gold
P: 10,427
Hey Andy,

Good work on capturing the asteroid last week! What was its magnitude at the time of your image?

Am I correct that your calculations lead you to a final minimum integration time of 1/15th of a second? Is there more information to be gained from them? I ask only because there are many factors that go into a selection of integration time, and I find that the dominant ones are typically simple signal-processing limitations like quantization noise.

I also use track-and-stack methods to solve all kinds of problems in my own astrophotography. Good to know there's another sky shooter on PF!

- Warren
Andy Resnick
#3
Nov15-11, 10:01 AM
Sci Advisor
P: 5,523
Thanks! I didn't image the asteroid, though- I was looking in the right direction at the right time, but I don't have clear evidence to verify capture. At the time, YU55 was about magnitude 11- it was near the time of closest approach, located near Delphinus epsilon.

The stars in the image above are: TYC 1091-166-1 (the faint one, magnitude 11), BD+10 4287 (the upper one is a faint double star, the bright one is magnitude 10, the faint one magnitude 12), and the lower one is HD 194671, magnitude 9.

As for the integration time, you are right- there are many issues involved. The calculation I performed was for seeing conditions only. For comparison, I capture Jupiter at 1/60s ISO 100 and the moons at 1/6s ISO 400- sensor noise is not really an issue there.

Get anything good lately?

turbo
#4
Nov15-11, 11:01 AM
PF Gold
P: 7,363
Good info, Andy! If I can ever manage to pull together the time and resources and ambition to build an observatory to house my refractor, I can try putting my Canons to work.
chroot
#5
Nov15-11, 01:19 PM
Emeritus
Sci Advisor
PF Gold
P: 10,427
Definitely very interesting stuff, Andy. I've never done a scientific analysis of seeing conditions before. I just pick my nights as carefully as I can and hope for the best. Now you'll have me looking much more closely at the weather forecasts, though!

Here's one I took recently of the Andromeda Galaxy:



Equipment: Celestron C11 SCT, HyperStar III, Nikon D90; ~500 mm f/2 Schmidt-camera configuration; approximately 1.5 hours of integration time, stacked from many 15 second subframes.

- Warren
Andy Resnick
#6
Nov15-11, 02:06 PM
Sci Advisor
P: 5,523
whoa....
Drakkith
#7
Nov15-11, 06:48 PM
Mentor
P: 11,868
Wow, nice. You did that with a lot of 15 second exposures?
chroot
#8
Nov15-11, 07:26 PM
Emeritus
Sci Advisor
PF Gold
P: 10,427
Thanks, guys. Yes, this is a composite of something like 400 individual 15-second exposures. I use this technique -- track-and-stack -- for a couple of reasons:

- My mount is of mediocre quality, and I don't bother to align in any way at all. It has field rotation, flexure, plenty of periodic error, you name it -- yet it performs quite well over short periods of time. I try to trade complexity in the field for complexity in software. I usually only have a few hours under the stars, and I hate twiddling knobs in the dark, but I don't mind letting my computer chug for a couple days on 15 GB of raw data.

- I can throw out the lowest-quality 30% of the sub-exposures and instantly eliminate time-limited defects like airplanes, satellites, periods of poor seeing, etc. that would ruin long exposures. When used to weed out periods of poor seeing, this technique is sometimes called "lucky imaging."

- I have much better control over signal quality. Short exposures eliminate the possibility of saturation, and I can keep taking additional exposures indefinitely, continuing to improve the SNR and dynamic range of my images. An object like M31 has absolutely enormous dynamic range -- probably at least 140 dB in the image above -- and this technique allows me to capture all of it, even with a consumer-grade 12-bit camera with ~60 dB of dynamic range.
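A rough numerical illustration of why stacking keeps paying off (toy numbers, not from any real frame): averaging N frames cuts random noise by about sqrt(N), so a faint signal climbs out of the noise floor.

```python
import random, math

# Toy model: a faint, fixed per-frame signal buried in random noise.
# Averaging N frames leaves the signal intact while the noise shrinks
# by ~sqrt(N), so stacked SNR keeps improving with more subframes.
random.seed(42)
signal = 10.0          # per-frame signal level (arbitrary units)
noise_sigma = 50.0     # per-frame random noise, swamping the signal
n_frames = 400

stack = [signal + random.gauss(0, noise_sigma) for _ in range(n_frames)]
mean = sum(stack) / n_frames
stacked_noise = noise_sigma / math.sqrt(n_frames)   # expected noise after stacking

print(f"single-frame SNR ~ {signal/noise_sigma:.2f}, "
      f"stacked SNR ~ {signal/stacked_noise:.1f}")
```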

If you'd like to see more examples, I have a few more on my website.

- Warren
Andy Resnick
#9
Nov15-11, 07:26 PM
Sci Advisor
P: 5,523
chroot motivated me to try image stacking- here's a single frame of the Ring Nebula (400mm, 0.8", ISO 3200):



and the result after averaging 24 images:



Not bad!
Drakkith
#10
Nov15-11, 07:28 PM
Mentor
P: 11,868
How do you combine your images? Averaging them, summing them, or what?
chroot
#11
Nov15-11, 07:30 PM
Emeritus
Sci Advisor
PF Gold
P: 10,427
Nice work, Andy! It's really thrilling to see what you can get with even just a couple of dozen subframes stacked together. The individual subframes are barely recognizable, but when you add 'em up -- WOW!

- Warren
chroot
#12
Nov15-11, 07:35 PM
Emeritus
Sci Advisor
PF Gold
P: 10,427
Quote by Drakkith:
How do you combine your images? Averaging them, summing them, or what?
Averaging and summing are actually the same process; averaging is just summation with a normalization step at the end. Summing images is definitely the first place to start with this kind of imaging. I use a somewhat more sophisticated kind of stacking called "kappa-sigma clipping," which attempts to discard noisy pixels before the summation.
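A minimal sketch of kappa-sigma clipping for a single pixel position across a stack of subframes (toy values; real stackers apply this per-pixel over the whole image):

```python
import statistics

# Kappa-sigma clipping for one pixel position across a stack: values
# beyond kappa standard deviations of the mean are discarded
# (iteratively), then the survivors are averaged.
def kappa_sigma_mean(values, kappa=2.0, iterations=3):
    vals = list(values)
    for _ in range(iterations):
        m = statistics.mean(vals)
        s = statistics.pstdev(vals)
        if s == 0:
            break
        kept = [v for v in vals if abs(v - m) <= kappa * s]
        if len(kept) == len(vals) or not kept:
            break
        vals = kept
    return statistics.mean(vals)

# A satellite trail spikes one subframe's pixel; clipping rejects it.
pixel_stack = [100, 102, 98, 101, 99, 4000]
print(kappa_sigma_mean(pixel_stack))  # ~100, the 4000 outlier is discarded
```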

- Warren
Drakkith
#13
Nov15-11, 08:34 PM
Mentor
P: 11,868
Quote by chroot:
Averaging and summing are actually the same process; averaging is just summation with a normalization step at the end. Summing images is definitely the first place to start with this kind of imaging. I use a somewhat more sophisticated kind of stacking called "kappa-sigma clipping," which attempts to discard noisy pixels before the summation.

- Warren
Ah, ok. I've read a big book on image processing, but I don't really have much experience as I've just gotten into this hobby and have to drive 20-30 minutes at minimum to get out of the middle of the city.

If I have a lot of background noise from light pollution and such, is it better to sum them or to average them or what?
Andy Resnick
#14
Nov15-11, 08:56 PM
Sci Advisor
P: 5,523
Quote by Drakkith:
How do you combine your images? Averaging them, summing them, or what?
I just did it the dumb way- in ImageJ, I cropped and aligned each frame by hand and then converted the images into a stack. Then I did a 'z-projection', displaying the average intensity. 25-30 images is about the maximum I can deal with this way, due to the laborious first step. There's a free program I'm going to try out soon called RegiStax (http://www.astronomie.be/registax/), hopefully it will automate that step for me.

For me, getting the images into a stack is the hard part. Everything after that: background subtraction, despeckle, etc. can all be done easily and quickly in an automated fashion prior to the z-projection.
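The manual align-then-average workflow can be sketched in a few lines. This toy version aligns frames on their brightest pixel before averaging (the "z-projection" step); tools like RegiStax do the alignment with sub-pixel cross-correlation, but the idea is the same:

```python
# Toy align-and-average: locate the brightest pixel in each frame,
# shift frames so those peaks coincide with the reference frame's peak,
# then average the aligned stack.
def peak(frame):
    # (row, col) of the brightest pixel
    return max(((v, r, c) for r, row in enumerate(frame)
                for c, v in enumerate(row)))[1:]

def align_and_average(frames):
    ref_r, ref_c = peak(frames[0])
    h, w = len(frames[0]), len(frames[0][0])
    out = [[0.0] * w for _ in range(h)]
    for frame in frames:
        pr, pc = peak(frame)
        dr, dc = ref_r - pr, ref_c - pc     # shift needed to match reference
        for r in range(h):
            for c in range(w):
                sr, sc = r - dr, c - dc
                if 0 <= sr < h and 0 <= sc < w:
                    out[r][c] += frame[sr][sc]
    return [[v / len(frames) for v in row] for row in out]

# Two 3x3 frames of the same "star", offset by one pixel:
f1 = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
f2 = [[0, 0, 0], [0, 0, 9], [0, 0, 0]]
avg = align_and_average([f1, f2])
print(avg[1][1])  # the star lands on the same pixel in both frames
```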
Andy Resnick
#15
Nov15-11, 09:00 PM
Sci Advisor
P: 5,523
Quote by chroot:
Nice work, Andy! It's really thrilling to see what you can get with even just a couple of dozen subframes stacked together. The individual subframes are barely recognizable, but when you add 'em up -- WOW!

- Warren
Thanks! I have to admit I was really surprised at how much improvement there is, also. The Orion Nebula is coming up in an hour or so; I'm looking forward to trying this again!
Drakkith
#16
Nov15-11, 09:02 PM
Mentor
P: 11,868
Oh wow, yeah, getting a program to do it is WAY easier. My SBIG camera came with CCDOps, which had a "track and stack" mode that I used before I started trying autoguiding. It would take multiple short exposures and automatically stack them into the final image, which you would then save.
chroot
#17
Nov15-11, 09:14 PM
Emeritus
Sci Advisor
PF Gold
P: 10,427
RegiStax is good; I tend to use DeepSkyStacker more, though. It's very powerful, but its interface is very complex and has a lot of 'gotchas.' Good luck with your photos!

- Warren
Drakkith
#18
Nov15-11, 10:03 PM
Mentor
P: 11,868
Gotchas?

