Welp. This turned out better than expected. Taken Friday night from my back patio. Jupiter is at opposition this week (the peak of opposition is August 19-20th), so now's a pretty good time to get out and see it. Acquisition and processing details below.
Figure 1. Jupiter
Equipment:
Meade 10" LX200-ACF (telescope)
Tele Vue 4x Powermate (basically a 4x Barlow lens)
ZWO Atmospheric Dispersion Corrector (ADC)
ZWO filter wheel with Astronomik RGB filter set
ZWO ASI290MM (monochrome camera)
Software:
FireCapture
Autostakkert!
Registax
WinJUPOS
Gimp
Midpoint timestamp: 2021-08-14 07:20.3 UT
Total integration time: 9 minutes, 50% of frames kept, no normalization.
So, this image was taken on Friday night, but it really started with me throwing away all the data taken Thursday night, which was garbage due to inadequate dew management. About a terabyte of data down the drain. "I don't need a dew shield," I had said. "I've got my active dew heater strap going. That should be enough," I told myself. "I'll be fine." I was an idiot.
So anyway, back to Friday night. This time I set up with proper dew prevention (Fig. 2).
Figure 2. Telescope
Using FireCapture software, I took several sequences alternating the Red, Green, and Blue filters: 1-minute videos per filter, in 18-minute sequences. In other words, I would capture RGBRGBRGBRGBRGBRGB, where each letter corresponds to a 1-minute, uncompressed video. I only planned on processing 9 minutes of video data, but these 18-minute sequences give me some flexibility to choose the best 9-minute window within any given sequence. In between sequences I re-leveled the ADC and refocused. Five sequences were taken. About 2/3 of a terabyte of Jupiter data was captured that night.
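If it helps to picture the cadence, here's a rough sketch (in Python, just for illustration, not anything FireCapture runs) of how one 18-minute sequence breaks down. It assumes back-to-back clips with no filter-change or refocus overhead, which isn't quite true in practice, and the start time is made up:

```python
from datetime import datetime, timedelta

# Rough sketch of one 18-minute capture sequence (not FireCapture itself).
# Assumes back-to-back clips with no filter-change or refocus overhead.
CLIP = timedelta(minutes=1)
FILTERS = ["R", "G", "B"]
PASSES = 6  # 6 passes x 3 filters x 1 minute = 18 minutes

def sequence_plan(start):
    """Yield (filter, clip start, clip midpoint) for one sequence."""
    t = start
    for _ in range(PASSES):
        for f in FILTERS:
            yield f, t, t + CLIP / 2  # the midpoint is what WinJUPOS cares about later
            t += CLIP

# Made-up start time, purely for the example
for f, t0, mid in sequence_plan(datetime(2021, 8, 14, 7, 11)):
    print(f"{f}  start {t0:%H:%M} UT  midpoint {mid:%H:%M:%S} UT")
```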
By a stroke of luck, the best seeing for the evening occurred somewhat early (before Jupiter crossed the meridian), when the Great Red Spot (GRS) was visible. I'll take that!
All the videos were processed using Autostakkert! software, keeping 50% of frames, no normalization.
I should note that Autostakkert! gives the option to output an additional, sharpened image, with "conv" in the filename, alongside the regular, unsharpened image. Regardless of what Dylan O'Donnell might suggest for his workflow, I don't suggest using the "conv" images in your subsequent processing; they are only a quick-and-dirty sharpening meant for evaluation purposes. Wavelet sharpening, such as Registax's, does a much better job. That said, the "conv" images Autostakkert! produces are great at helping you choose which outputs to keep and move forward with, and which to throw away. My point is, once you decide which files you want to keep (and by all means use the "conv" images to help you choose), don't use the "conv" images for further processing; use the unsharpened images moving forward.
So anyway, after evaluating all the processed data, I chose a 9-minute window in my third sequence, centered around 07:20.3 UT, for further processing. This window includes three images with the Red filter, three with the Green filter, and three with the Blue filter, alternating RGBRGBRGB.
I then individually sharpened the images using Registax wavelet sharpening, keeping the same settings for each of the three images of a given color filter, although the settings differed between color filters. (To be clear, all the images are still black-and-white at this point; it's just that three images were taken with the Red filter, three with the Green, and three with the Blue.) I was fairly aggressive with the sharpening in this step, knowing that the noise would be reduced a little in the upcoming WinJUPOS derotation-and-combination step.
You may be asking, "why did you start with nine 1-minute videos instead of three 3-minute videos?" That's because Jupiter rotates really fast; its day is only about 10 hours. Any video over about a minute or so will begin to blur when processed due to Jupiter's rotation.
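For a rough sense of scale, here's the back-of-the-envelope math (rounded, generic numbers, not measurements from my setup): Jupiter spans roughly 49 arcseconds near opposition and turns about 0.6 degrees per minute, so a feature near the center of the disk drifts by roughly a quarter arcsecond every minute.

```python
import math

# Back-of-the-envelope numbers (rounded assumptions, not measured values)
rotation_period_h = 9.9          # Jupiter's day is roughly 10 hours
apparent_diameter_arcsec = 49.0  # Jupiter's apparent size near opposition

rate_deg_per_min = 360.0 / (rotation_period_h * 60.0)  # ~0.6 deg/min

def smear_arcsec(minutes):
    """Apparent drift of a feature at the center of the disk after `minutes`."""
    dtheta = math.radians(rate_deg_per_min * minutes)
    return (apparent_diameter_arcsec / 2.0) * math.sin(dtheta)

for m in (1, 3):
    print(f"{m} min of rotation -> ~{smear_arcsec(m):.2f} arcsec of smear")
# ~0.26" after 1 minute, ~0.78" after 3 minutes -- a 10-inch scope only
# resolves about half an arcsecond, so a 3-minute clip visibly blurs fine detail.
```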
Once you have sharpened images, you can use WinJUPOS's "De-rotation of Images" tool to combine several of them. This tool warps individual images in a way that mimics Jupiter's rotation, and then combines them into a single image synchronized to the target timestamp.
By the way, recall that I used FireCapture to initially capture the raw data in the form of uncompressed videos. FireCapture has a setting that automatically puts the midpoint timestamp of each video in the filename, following the WinJUPOS file naming convention. I keep this same filename convention all through the processing (through Autostakkert!, Registax, etc.). That makes it really easy to keep track of exactly when the videos were taken, and WinJUPOS automatically sets up the time by reading the images' filenames. So using the WinJUPOS naming convention for all your files makes the workflow really easy, and it all starts with checking a single checkbox in FireCapture's filename settings.
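To show what that buys you, here's an illustrative sketch of pulling the midpoint time back out of a WinJUPOS-style filename. The yyyy-mm-dd-hhmm_t block (where t is tenths of a minute) is the part WinJUPOS actually reads; the filter letter and everything after it in the example filename are placeholders that will vary with your FireCapture settings:

```python
import re
from datetime import datetime, timedelta

# The yyyy-mm-dd-hhmm_t timestamp block of the WinJUPOS naming convention,
# where the digit after the underscore is tenths of a minute.
PATTERN = re.compile(r"(\d{4})-(\d{2})-(\d{2})-(\d{2})(\d{2})_(\d)")

def midpoint_from_filename(name):
    """Recover the capture midpoint encoded in a WinJUPOS-style filename."""
    m = PATTERN.search(name)
    if m is None:
        raise ValueError(f"no WinJUPOS timestamp found in {name!r}")
    year, month, day, hour, minute, tenths = map(int, m.groups())
    return datetime(year, month, day, hour, minute) + timedelta(seconds=6 * tenths)

# Example filename; the "-R-Jup.png" part is just a placeholder
print(midpoint_from_filename("2021-08-14-0720_3-R-Jup.png"))
# -> 2021-08-14 07:20:18, i.e. 07:20.3 UT, matching the image midpoint
```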
If you're new to WinJUPOS, there is a bit of a learning curve, but it's really easy to work with once you get the hang of it.
So at this point in the process, I have three monochrome images: one taken with the Red filter, one with the Green, and one with the Blue. I then used WinJUPOS's "De-rotation of R/G/B frames" tool to combine the three images into a single color image. Note that I didn't really do any de-rotation in this step, since all of that was done in the previous step; I just used this tool to do the combination.
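Conceptually, that combination step just stacks the three registered monochrome frames into the R, G, and B channels of one image. Here's a minimal sketch of only that stacking (placeholder filenames; it assumes the frames are already derotated, aligned, and the same size, and WinJUPOS of course does more than this under the hood):

```python
import numpy as np
from PIL import Image

def combine_rgb(red_path, green_path, blue_path, out_path):
    """Stack three aligned monochrome frames into one RGB image."""
    # Real stacked frames are typically 16-bit; convert("L") simplifies to 8-bit here.
    channels = [np.asarray(Image.open(p).convert("L"))
                for p in (red_path, green_path, blue_path)]
    rgb = np.dstack(channels)  # H x W x 3 array, in R, G, B order
    Image.fromarray(rgb, mode="RGB").save(out_path)

# Placeholder filenames, purely for illustration
combine_rgb("jup_R.png", "jup_G.png", "jup_B.png", "jup_RGB.png")
```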
For the images in my previous posts I used Gimp to combine the three channels into a single color image, but this time I used WinJUPOS. I think the team at WinJUPOS might have made some improvements in this area since last year, because it worked great this time.
Then I went back into Registax for a second time, and gave the color image one final round of wavelet sharpening.
To end the processing, I used Gimp for final adjustments (color curves, contrast, saturation, etc.).