Stargazing: Is it nebulosity or an artefact?

  1. Dec 12, 2017 #1

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    I bought a 'nebula filter' at some great cost and the attached image was the result of four two-minute exposures on a recent clear night. I'm clearly struggling with getting the optimal exposure and ISO gain settings on my DSLR, but the Orion Nebula came out fairly convincingly. There's plenty of detail in the main nebula that I don't see on the straight snapshot with similar (obvs lower) exposures, so I believe the picture. So what is going on around the star down and to the right (Hatsia?)? There's a definite disc around it which is not present on other stars of equal apparent magnitude. Looking at similar images I have of the Pleiades, I can see the same effect on some stars but not on others. What's the opinion as to what the image is showing?
    [Attached image: stack 1 green.jpg]
     
  3. Dec 12, 2017 #2

    davenn

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    The Pleiades are surrounded by nebulosity

    For your photo above, it would be good to see a closer-in pic of the star in question.
    So a same-sized pic but zoomed in on that star .... I assume this current pic was significantly reduced in size?
    The overall pic is too small to tell if you have a processing halo or true nebulosity.
    The nebulosity in the Orion region is quite expansive, and my pick is that it is showing nebulosity.

    Dave
     
  4. Dec 12, 2017 #3

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    I will sort that out. The jpeg I started with is big but, by the time it got to you, it had lost a lot of detail. Dunno why. :frown:
    It seems to vary from star to star. I have seen pics of the Pleiades, but I have not yet achieved that vast cloudiness that can be obtained.
    Just cooking dinner but I will try to get back after the Beef Hotpot. Yum yum.
     
  5. Dec 12, 2017 #4

    stefan r

    User Avatar
    Gold Member

    Hatsya

    This image from Wikimedia Commons shows plenty of nebulosity. It is rotated about 90 degrees from your picture.
    [Image from Wikimedia Commons]

    I suspect you are looking at doubly ionized oxygen. Your "nebula filter" might be optimized for that purpose. I do not know whether Iota Orionis ionizes the oxygen from the Orion Nebula or whether that is Iota's own stellar wind. If it is more blue than green it is reflected light.

    The Pleiades have nebulosity:
    There are no stars in your picture with magnitude equal to Iota. Wikipedia puts Iota's apparent visual magnitude at 2.7. The Trapezium is listed as 4.0. The entire Orion Nebula is also listed as 4.0. The display screen pixel will be saturated, so it is not an accurate measure of magnitude.

    Compare to Vega. If it is an artefact of your camera then Vega should show it too.
     
  6. Dec 12, 2017 #5

    stefan r

    User Avatar
    Gold Member

    [Attached image: 800px-Pleiades_large.jpg]
     
  7. Dec 12, 2017 #6

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    [Attached image: is this nebulosity.jpg]

    This is a small part of the image. I guess the pale disc must be an artefact.
    The NASA image of the Pleiades is very impressive and subjectively nice to look at, but the artefacts, imo, spoil it. The diffraction spikes and circles don't help anyone to understand what's actually there. They're trying to show the nebulosity at the same time as very bright stars and the contrast is too great.
    There is so much to get right for good astro images. I am working at it.
    PS The hotpot was smashing. I can hardly move now!
    PPS The filter is an OIII filter, 8 nm wide; very revealing of some details.
     
  8. Dec 13, 2017 #7

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor

    I agree. I often generate that artifact as well when compressing the original 16-bit (or 32-bit) per-channel image down into an 8-bit-per-channel image, by over-doing the gamma correction (using gamma values << 1) in an effort to selectively boost the low-intensity parts of an image.
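    As an illustration of that mechanism (a minimal NumPy sketch with made-up numbers, not the actual processing chain): a synthetic star profile whose faint skirt sits at roughly 2% of the peak becomes a clearly visible grey disc once a strong gamma stretch is applied and the result is squashed into 8 bits.

        import numpy as np

        # synthetic star: a narrow Gaussian profile stored as 16-bit counts
        x = np.linspace(-1, 1, 201)
        star = np.exp(-(x / 0.05) ** 2)                 # normalised radial profile
        counts16 = np.clip(star * 65535, 0, 65535)      # 16-bit counts, peak at full scale

        gamma = 0.2                                     # gamma << 1 to lift faint detail
        stretched = (counts16 / 65535.0) ** gamma
        counts8 = np.round(stretched * 255).astype(np.uint8)

        # the skirt at ~1.8% of the peak maps to ~115/255: a pale disc around the white core
        print(counts8[100], counts8[110])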
     
  9. Dec 13, 2017 #8

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    How does this argument sound? Limitations of the sensor mean that the luminance is clipped to form a white disc, with skirts falling off where the sensor can handle them. According to the Airy pattern (see link), the first secondary maximum (sidelobe) is at a level of about 0.0175 of the peak. That corresponds to a magnitude difference of about 4. In my image, if the disc around the star with the artefact corresponds to the first sidelobe of the Airy pattern, and the luminance level is clipped at 255 255 0, then the skirts of the main peak fall away at about half way out to the edge of the disc artefact. The magnitude difference between that star and other stars which are only just visible could be about 4, and that is the magnitude difference between the peak and the level of the first Airy sidelobe. So the apparent visibility of this artefact is about the same as the visibility of the faintest stars in the picture. So, could that mean that the contrast range the sensor can reproduce (for pixels which are close together) is only about 1.75%? Or could it be some extra characteristic imperfection of the lens (supposed to be ED apochromatic)?
    The original image was DNG and the artefact was visible in PS, before it was turned into a jpeg.
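    A quick check of the magnitude arithmetic above (a sketch using only the quoted ~1.75% Airy-sidelobe figure, nothing measured from the image):

        import math

        ratio = 0.0175                       # first Airy ring relative to the central peak
        delta_m = -2.5 * math.log10(ratio)   # magnitude difference for that intensity ratio
        print(round(delta_m, 1))             # prints 4.4, i.e. "about 4" as stated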
     
  10. Dec 13, 2017 #9

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor

    I've gone around and around on this question, and manage to confuse myself every time, so I'll reply as best I can and highlight where I fall short:

    The question is simple: What is the maximum obtainable dynamic range of my (stacked and processed) image? In other words, what is the maximum range of magnitudes I can obtain in a single (stacked and processed) image?

    It comes down to the noise floor and the number of bits available (discretization of the signal). Let's just consider a single channel to keep this 'simple'. Displays are 8-bit and RAW is (say) 14-bit. If the signal is at 100% of full scale at magnitude 0 (Vega), this translates to a signal-to-noise ratio of 1 being reached at magnitude 6 (8-bit) and 10.5 (RAW). This assumes there is no noise in the system: the faintest signal has an intensity value of '1' in either case. If we say there is background light, thermal noise, etc., and say SNR = 1 at 5% of full scale (which is way better than I get), the minimum magnitudes are about the same.
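    For reference, the bit-depth-to-magnitude conversion being used here is just 2.5*log10(2^bits); a one-line sketch, assuming full scale at magnitude 0 and a noise floor of one count:

        import math

        def magnitude_span(bits):
            # magnitudes between full scale and a signal of one count
            return 2.5 * math.log10(2 ** bits)

        print(magnitude_span(8))    # ~6.0  -> 8-bit display
        print(magnitude_span(14))   # ~10.5 -> 14-bit RAW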

    Caution: I'm not sure I believe this. It's not clear to me why adding bits would seem to make the sensor 'more sensitive', because the incoming intensity hasn't changed. It seems to be that I can more finely reject the noise floor with more bits available.

    Stacking and averaging many frames generates an image with more bits: the most I have ever generated is a 24-bit image (1024 14-bit images). According to the above, this means I have an image that can potentially span 18 magnitudes, which seems to strain credibility. That said, I can reliably obtain clear images of magnitude 15 stars once I have accumulated about 300 images (a 22-bit image, for which the calculation returns a 16.5 magnitude floor).

    So as a practical matter, it seems that I can generate images containing a range of up to 18 stellar magnitudes. There's obviously a bottom end based on the received intensity, but I have a really hard time calculating it, even starting with '1 photon per frame'.

    The artifacts occur when I 'squish' the 22-bit image into 8 bits. Ideally, I'd like to map the 18-magnitude span (a brightness ratio of 1:14552000) onto a 1:255 scale, but as you can see, there will be artifacts. One critical consideration to minimize the 'skirt' is to avoid clipping at both the top and the bottom: keep your noise floor just barely above 'zero', and only a few of the brightest stars should be at 100% of full scale.
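    A minimal sketch of that 'squish' step (hypothetical function and numbers, not the actual pipeline): map a high-bit-depth stacked frame into 8 bits while keeping the noise floor just above zero and letting only the brightest pixels reach 255.

        import numpy as np

        def squish_to_8bit(stack, floor, ceiling, gamma=0.4):
            # 'floor' ~ measured background level, 'ceiling' ~ brightest star value
            scaled = np.clip((stack - floor) / (ceiling - floor), 1e-6, 1.0)
            # gamma < 1 lifts faint detail; the 1e-6 lower clip keeps pixels just above zero
            return np.round(255 * scaled ** gamma).astype(np.uint8)

        # toy data standing in for a 22-bit stacked frame
        stack = np.random.default_rng(0).uniform(0, 2 ** 22, size=(100, 100))
        img8 = squish_to_8bit(stack, floor=500.0, ceiling=2.0 ** 22)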

    Imaging Orion is particularly difficult due to the dynamic range present: I typically have to choose between blowing out the Trapezium or not getting the full glorious nebula.

    Does that help?
     
  11. Dec 13, 2017 #10

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    It's just that the noise level after stacking is lower than the peaks of the noise that occur on a single frame. The 'median' option takes the median value of samples on all the frames, which reduces the pk-pk noise excursion over the area and improves the SNR. The actual signal from n frames is n times the signal from one frame but the noise level (depending on the algorithm) is much less than n times. The 'sensitivity' increase that you refer to is due to the summation of many samples. It's a form of temporal filtering / bandwidth reduction.
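    A toy illustration of that point (made-up numbers, not real frames): the same faint pixel value buried in noise, median-stacked across 64 frames.

        import numpy as np

        rng = np.random.default_rng(1)
        signal = 5.0                                          # true pixel value
        frames = signal + rng.normal(0.0, 20.0, (64, 1000))   # 64 noisy frames of 1000 pixels

        single = frames[0]                     # one frame: the signal is swamped by noise
        stacked = np.median(frames, axis=0)    # median across the stack of frames

        # noise drops by roughly sqrt(64) (slightly less for a median than for a mean)
        print(single.std(), stacked.std())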
    Only the top is 'clipped'; the peak value for one frame is limited, but the bottom doesn't 'clip' as such: if you choose to increase the number of frames, a signal near the bottom just averages between random 0 and 1 values over the total number of frames.
    You can ameliorate the problem if you fool around with the gain (curves) over the input range. AP is more of a pig than most regular daylight photography, where, when contrast is a problem, people use fill-in flash or reflector screens to fill in the shadows. They used to use 'soft' film for high-contrast subjects, but we have a knob to twiddle nowadays.
     
  12. Dec 13, 2017 #11

    davenn

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    The best way (actually, pretty much the only way) to get a good full image is to expose for the Trapezium and the nebula separately and blend the images in Photoshop or similar.
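    Not the Photoshop workflow itself, but a sketch of the blending idea in NumPy (hypothetical function; assumes two aligned exposures scaled 0..1): use the short exposure where the long one is blown out, and the long exposure everywhere else.

        import numpy as np

        def blend_exposures(short_exp, long_exp, threshold=0.9, softness=0.05):
            # mask goes to 1 where the long exposure approaches saturation
            mask = np.clip((long_exp - threshold) / softness, 0.0, 1.0)
            return mask * short_exp + (1.0 - mask) * long_exp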
     
    Last edited: Dec 13, 2017
  13. Dec 13, 2017 #12
    I would also suggest that you remove the filter, clean the camera lens, then try to take another image (weather permitting) and see if the artifact disappears. Halos can be caused by a speck of dust or oil on the filter or the camera lens at that particular location in the image.
     
  14. Dec 14, 2017 #13

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    This raises another issue. Photographers are only too happy to consider cleaning lenses (carefully, of course). Astronomers seem against the whole idea of touching the lens. I recently washed the 10-inch mirror on my Newtonian and it was NO BIG DEAL. I took care and avoided using a scrubbing brush or pan scourer. The result was that a load of dust went away, but the surface was still a bit 'patchy'. Nobody died. A visual inspection (LED torch at night) of my 80 mm ED lens looks fine, so I won't touch it yet. Why should I be scared to clean it?
    I am capable of cleaning my DSLR sensor with the appropriate pads and liquid. Every time a lens is changed, there is a chance of grot getting on the surface, so cleaning becomes essential.
    The thing about this particular artefact seems to be that it's either there on a particular star or not there on the surrounding stars. I guess it's very magnitude dependent.
     
  15. Dec 14, 2017 #14

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor

    What I mean by 'clipping at the bottom' is to avoid having any pixels with value '0' in the image. For me, this happens during post-processing, trying to get rid of that last bit of non-uniform background.
     
  16. Dec 14, 2017 #15

    Andy Resnick

    User Avatar
    Science Advisor
    Education Advisor

    I think you are right...
     
  17. Dec 14, 2017 #16

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    I see what you mean now. But it's basically a subjective issue as to where you decide to cut the 'grass' at the bottom. The random noise gets lower and lower (relative to white level) as you filter more and more (more and more frames). There will always be some noise, and always some stars that are only just discernible above that noise, and you will lose them below your chosen black level. You really are making the system more sensitive with the processing. It's a bit like a single-bit ADC which, sampling at a high enough rate, can deliver as many quantised levels as you like. Oversampling makes things better.
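    A toy example of that oversampling idea (made-up numbers): a steady level of 0.3 LSB read through a 1-bit quantiser with random noise (dither) added. Any single sample is just 0 or 1, but the average over many samples recovers the level.

        import numpy as np

        rng = np.random.default_rng(2)
        signal = 0.3                                    # constant level, below 1 LSB
        noise = rng.uniform(-0.5, 0.5, 100_000)         # dither spanning one LSB
        samples = (signal + noise > 0.5).astype(float)  # 1-bit quantiser output
        print(samples.mean())                           # ~0.3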
    I have a slight problem with that approach because AP'ers produce a vast range of versions of any given astronomical object. It's more like the Impressionist School of AP. But then I have to admit that I never present a normal photograph without tinkering somewhat with the gain, levels and curves (and colour balance, probably). Also, I cannot resist removing zits and blotches from faces. PS is so clever at that sort of "repair" that it's hard to know where to stop. But I do like to think that the scene / face will still be very recognisable, whereas many AP images have false colour in order to show the features better. Google "Orion Images" and you will get every colour imaginable in the selection that you are given.
     
  18. Dec 14, 2017 #17

    davenn

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    I don't understand your problem: when you view it optically through your scope, you can see the nebula and the Trapezium clearly.
    Cameras have great difficulty with the huge dynamic range that the Orion Nebula presents. All that merging two differently timed exposures is doing is giving a view like the one you would see with your eyes, where the core / Trapezium isn't blown out.

    I'm not talking about colour/false colour renditions. Forget the colour. So do 2 exposures of different times using grey scale. I'm talking purely about getting an image that presents a view that is well exposed across the range from brightest areas to darker areas :smile:


    Dave
     
  19. Dec 14, 2017 #18

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    I guess I was introducing another issue with false colour but it is true that published pictures of many DSOs tend to vary wildly. Only a few of those pictures are like what you can actually see in your eyepiece. Perhaps you, Dave, make an effort to produce images like that but most others definitely do not.

    What PS tool do you use for that? I use the stacking tool with just the Median stacking rule. I have tried to merge a straight image of M42 with an OIII filter image, which has some extra detail, but, using layers, I couldn't get a 'better' picture. Is it done by using masks, or is it already done for you by some bolt-on?
    I know that with a TV display contrast ratio of, say, 5000:1, it is possible to produce higher-contrast images than a single RAW camera image. I just haven't.
    EDIT: PS, when I turn the gain down on my OIII-filtered image, I can actually see the Trapezium but, as you say, the faint details get lost and the impact of the shot is lost.
     
    Last edited: Dec 14, 2017
  20. Dec 15, 2017 #19

    davenn

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    Ohhhh for sure, I see some really awesome ones with a well balanced colour spread
    and I see some dreadful ones that are so very over-cooked in post-processing,
    and in those cases what really annoys me is that those people are trying to pass them (that look) off as natural
    ... nothing could be further from the truth .... a nice look for art maybe, but definitely not a good rendition of the object

    I do try, am not a perfectionist, but do try to have them look as natural as possible (broad colour range).

    Your OIII filter will always produce an unnatural green rendition, as above; it can't be helped, it's just the way they are.

    Orion neb. has quite a wide range of colours compared to most other nebulae, which are predominantly red.
    There's only a handful that have 2 or more colours, the Trifid being one of them, with strongly contrasting red and blue

    Looking back through your posts, I didn't see any other imaging details other than the 4 x 2 min exposures.
    A total of 8 minutes of exposure time is quite reasonable, and if you had done that much without the filter
    you would have captured much more of the finer detail of the nebula.

    Difficult with 2 very different images where one of them is at the green end of the spectrum :wink:

    Layers or masks could be used .... I would normally use layers and blend the images and then use the Dodge and Burn tool
    to bring out the parts of the wanted layer



    Dave
     
  21. Dec 16, 2017 #20

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member
    2017 Award

    What a good idea. I can cope with D&B!!

    That OIII filtered image was obviously not a serious contender for realism - lol. My question was more specifically about that artefact.
    On the same night, I took the attached straight image, which I have only attacked with curves and levels. It is pretty convincing, I think, but I can see the need for longer exposure time with my ED80 scope.

    As for 'realism', I guess that only a few, if any, observers actually see nebulae as the AP images suggest. So AP (from me up to Hubble) is really a world of its own. It's a disappointment for almost every budding astronomer that they look in their shiny new scope and see fuzzy and largely colourless images. But, of course, there's nothing quite like the fantastic buzz of seeing the Jovian moons or a star cluster through the window of your own personal spaceship in your own back garden. That experience is at least as good as your best final photoshopped version of an hour's (day's?) worth of exposures of some faint DSO.
    [Attached image: Stacked 1.jpg]
     