Diffraction Effects and Artifacts in Telescopes like the JWST

Summary: The hexagonal spike pattern seen around bright stars in JWST images is a diffraction artifact of the telescope's optics. All stars produce the same artifacts, but they are most noticeable around bright stars because of saturation and image processing; around dimmer stars the spikes simply blend into the background. How prominent the artifacts appear also depends strongly on how the raw data is processed.
  • #91
Devin-M said:
Does it mean that only photons of the same color/ wavelength interfere with each other?
The mechanism of interference relies on two (or more) sources of precisely the same wavelength. If a red photon is detected at the same time as a blue photon, the interference patterns associated with the two wavelengths will be totally different; they are independent.

I suggest you read up on the mechanism of interference and diffraction, and you will find that photons do not come into it. It is always described in terms of waves. (I think this has already been pointed out in this thread.)
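If it helps to see that independence concretely, here is a minimal sketch (Python/NumPy, with an illustrative slit separation, not any real apparatus): each wavelength forms its own two-slit fringe pattern, and a detector just sees the sum of the two intensities, with no cross term between red and blue.

```python
import numpy as np

d = 50e-6                                   # slit separation (m), illustrative value
theta = np.linspace(-0.02, 0.02, 2001)      # viewing angle (rad)

def fringes(wavelength):
    """Ideal two-slit intensity pattern for one wavelength."""
    return np.cos(np.pi * d * np.sin(theta) / wavelength) ** 2

red = fringes(650e-9)
blue = fringes(450e-9)
total = red + blue   # intensities simply add; the two patterns stay independent

# Each colour's pattern has true dark fringes, but the summed pattern does
# not: the red and blue minima fall at different angles.
print("red min:", red.min(), "blue min:", blue.min(), "sum min:", total.min())
```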
 
  • #92
Is it true that the dark areas in the spikes are destructive interference, and that this is only possible because starlight has both spatial and temporal coherence? (In other words, if a point source did not have spatial and temporal coherence, would the dark areas in the spikes vanish?)

[attached image: diffraction spikes]
 
  • #93
How could the spatial coherence not be perfect for a point source? Do you know what coherence means?
 
  • #94
Well, according to the source below, light from the Sun isn't coherent but light from the stars apparently is. Presumably sunlight would be coherent if viewed from a great distance, but I'm not sure why a non-coherent source becomes coherent with increasing distance.

https://www.mr-beam.org/en/blogs/news/kohaerentes-licht
 
  • #95
The light from a star is hardly monochromatic, so how can it be “coherent”? Diffraction fringes are not visible unless you filter the light.
But coherence is a quantity with a continuum of values; everything from excellent to zero.
 
  • #96
I thought destructive interference was causing the dark regions in the spikes, but isn’t coherence a prerequisite for destructive interference?
 
  • #97
Devin-M said:
I thought destructive interference was causing the dark regions in the spikes, but isn’t coherence a prerequisite for destructive interference?
Variations, not just minima, require some degree of coherence, but it's not just on/off. As coherence decreases, the patterns become less distinct. You'll get a better handle on this if you avoid taking single comments that you hear or read and instead try to follow the details of the theory. It may be that the argument you heard relates to twinkling stars. Fair enough, but the details count, and a statement about one situation may not cover all cases.
 
  • #98
In this video he’s able to obtain a destructive interference pattern with just sunlight & a double slit… (3:29)
 
  • #99
Devin-M said:
In this video he’s able to obtain a destructive interference pattern with just sunlight & a double slit… (3:29)
And your point is?
The equipment he uses introduces some coherence with a collimating slit, which is what allows the diffraction effects to form. You don't only get diffraction effects with a high-quality laser.
And I think we have gone past the stage, in this discussion, where we should be quoting links showing ripples on water. To discuss the limitations imposed on the basic calculations of diffraction, you need to introduce the concept and effects of coherence.
The coherence length of a source is the mean length of the individual wave trains of light it emits: very long for a good laser, short for a discharge tube, and even shorter for a hot object. Interference will only occur when the path differences involved are shorter than the coherence length. With sunlight, that limits you to paths close to boresight.
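To put rough numbers on that, here is a sketch using the standard rule of thumb L_c ~ λ²/Δλ (the exact prefactor depends on the line shape; the bandwidths below are typical order-of-magnitude values, not measurements from this thread):

```python
# Coherence length estimate L_c ~ lambda^2 / delta_lambda for a few sources.
sources = {
    "HeNe laser":       (633e-9, 2e-12),    # ~GHz linewidth
    "discharge lamp":   (589e-9, 0.05e-9),  # a sodium-like spectral line
    "sunlight (white)": (550e-9, 300e-9),   # the full visible band
}
for name, (lam, dlam) in sources.items():
    print(f"{name}: L_c ~ {lam**2 / dlam:.2e} m")
# -> roughly 0.2 m, 7 mm, and 1 micron respectively: interference only
#    survives for path differences shorter than these.
```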
 
  • #100
Starlight is coherent over the size of your setup. That includes sunlight from a given direction, but the unfiltered Sun subtends an angle of about half a degree, which usually makes the blob of sunlight much wider than any interference pattern.
Devin-M said:
Does it mean that only photons of the same color/ wavelength interfere with each other?
Light is linear so you can always treat interference as something that happens photon by photon (the double-slit experiment has been done with single-photon sources, and also with electrons). If each photon only interferes with itself then it's obvious why different colors (different wavelengths) can have different patterns.
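To put rough numbers on the spatial-coherence point above, here is a back-of-envelope van Cittert-Zernike estimate (transverse coherence width ~ λ/θ, where θ is the source's angular size; the angular sizes below are typical values, not tied to any particular star):

```python
import math

lam = 550e-9                     # visible wavelength (m)
theta_sun = math.radians(0.53)   # Sun's angular diameter, ~0.53 degrees
theta_star = 1e-3 / 206265       # a 1-milliarcsecond star, in radians

print(f"Sun:  coherence width ~ {lam / theta_sun * 1e6:.0f} micron")
print(f"star: coherence width ~ {lam / theta_star:.0f} m")
# -> ~60 microns for the Sun vs ~100 m for the star: the star's light is
#    coherent across an entire aperture mask, the Sun's is not.
```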
 
  • #101
mfb said:
If each photon only interferes with itself
For a 'bright', highly coherent source there would be many photons involved and, although I really don't like this, you could say that many of the photons will 'interfere with each other'.
I always have a problem with attempts to interpret wave phenomena in terms of photons; doesn't it call for a really strange concept of the coherence of a single photon? The photon can only actually 'arrive' at one place, which is not a pattern. But the place where it arrives could be at a peak (or a null?) of the theoretical pattern, and the value there will be one.

A stream of single photons will form a pattern, over time. What would determine the spread of the diffraction pattern from the ideal case? Perhaps it would have to be the bandwidth of the original light source, before the 'photon gate' is operated.
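As a sketch of how that stream builds a pattern, here is a minimal Monte Carlo (Python/NumPy, illustrative geometry only, not any specific experiment): each simulated photon lands at one point, drawn from the classical double-slit intensity used as a probability density, and the fringes only emerge in the accumulated histogram.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 1000)                      # detector coordinate (arbitrary units)
intensity = np.cos(3 * x) ** 2 * np.sinc(x) ** 2  # fringes under a single-slit envelope
p = intensity / intensity.sum()                   # normalize to a probability distribution

for n in (10, 1000, 100_000):                     # photons detected so far
    hits = rng.choice(x, size=n, p=p)             # each photon arrives at one point
    counts, _ = np.histogram(hits, bins=50)
    print(n, "photons -> central bin counts:", counts[20:30])
# With 10 photons the 'pattern' is just scattered dots; by 100k detections
# the histogram reproduces the classical intensity curve.
```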
 
  • #102
I’m amazed by the “volume” of photons it took to produce the pattern. It was a 5-minute exposure, so we’re talking about a “beam length” of about 55.9 million miles. Considering the aperture diameter was 66 millimeters, that gives a “beam volume” of roughly 0.07 cubic miles of photons to produce the image.
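A quick check of that arithmetic (Python, using the stated 5-minute exposure and 66 mm aperture diameter, and treating the collected light as a cylinder of length c·t):

```python
import math

c = 2.998e8            # speed of light (m/s)
t = 5 * 60             # exposure time (s)
r = 0.066 / 2          # aperture radius (m)

length_m = c * t                        # ~9.0e10 m of "beam"
volume_m3 = math.pi * r**2 * length_m   # cylinder volume

mile = 1609.344
print(f"beam length ~ {length_m / mile / 1e6:.1f} million miles")   # ~55.9
print(f"beam volume ~ {volume_m3 / mile**3:.2f} cubic miles")       # ~0.07
```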
 
  • #103
Devin-M said:
I’m amazed by the “volume” of photons it took to produce the pattern. It was a 5-minute exposure, so we’re talking about a “beam length” of about 55.9 million miles. Considering the aperture diameter was 66 millimeters, that gives a “beam volume” of roughly 0.07 cubic miles of photons to produce the image.
That tells you how much energy is in that space, but I think that looking at it as a spray of photons is not in the spirit of QM. There is no way that any particular photon is actually inside that cone, as photons have no extent nor position. Only when a photon interacts with a sensor element can you say when and where the photon existed.
 
  • #104
sophiecentaur said:
Only when a photon interacts with a sensor element can you say when and where the photon existed.
Can we say the photons in the middle spike more likely than not went through the set of slits on the left half of the Bahtinov mask?
[attached images: Bahtinov mask and resulting diffraction pattern]

https://en.m.wikipedia.org/wiki/Bahtinov_mask
 
  • #105
Devin-M said:
Can we say the photons in the middle spike more likely than not went through the set of slits on the left half of the Bahtinov mask?

I know this is not intuitive but how can you say a photon ‘went through’ / may or may not have taken a path? Remember that scientists with greater ability than me or (with respect) you struggled with this business. It was agreed, nearly a hundred years ago, that this approach goes nowhere. You need to change the model in your head or you will continue to be confused.

(I can see I have written this in the wrong place. Forgive me. Blame it on the iPhone!)
 
  • #106
sophiecentaur said:
I know this is not intuitive but how can you say a photon ‘went through’ / may or may not have taken a path? Remember that scientists with greater ability than me or (with respect) you struggled with this business. It was agreed, nearly a hundred years ago, that this approach goes nowhere. You need to change the model in your head or you will continue to be confused.
I’ll do an experiment and cover the left side of the Bahtinov mask with a sheet of black plastic & see if the central spike disappears…
 
  • #107
Devin-M said:
I’ll do an experiment and cover the left side of the Bahtinov mask with a sheet of black plastic & see if the central spike disappears…
That would be fun. There are many web pages about the Bahtinov mask. You will notice that any good explanations are based on wave theory.
 
  • #108
I’ll compare these 2 options tonight. I predict that with the horizontal slits on the right uncovered we will see vertical central spikes, and with the horizontal slits covered up by masking tape we’ll see no central spike.
[attached images: the two mask configurations]


I’ll also put a hydrogen-alpha narrowband filter in front of the image sensor, which will only permit photons within a ~6 nm band around the 656.3 nm Hα wavelength to reach the sensor.

[attached images: the narrowband filter and imaging setup]
 
  • #109
I did a 5-minute exposure of Polaris at ISO 6400 through the full Bahtinov mask (right), then covered the half of the Bahtinov mask that has the horizontal slits with masking tape and took a second 5-minute exposure (left). Both exposures were at 600 mm focal length, f/9, through a hydrogen-alpha narrowband filter. The exposure with the horizontal slits covered was missing its vertical spike:
[attached image: the two diffraction patterns side by side]

Aperture during partially covered exposure looked like this:
[attached image: the partially covered aperture]

Uncovered:
[attached images: the uncovered aperture]


Here we have the full frame (uncovered):
[attached images: full frame and detail crops]
 
  • #110
sophiecentaur said:
I know this is not intuitive but how can you say a photon ‘went through’ / may or may not have taken a path?
Would it or would it not be reasonable to conclude the indicated photons "went through" the indicated slits? Why or why not? Also, to my eye it seems that approximately halving the number of photons reaching the sensor in this manner did nothing to change the length or brightness of the "remaining" diffraction spikes.
[attached image]
 
  • #111
Devin-M said:
Would it or would it not be reasonable to conclude the indicated photons "went through" the indicated slits? Why or why not?
It would not be reasonable to conclude that the photons that ended up in the vertical diffraction spikes went through the right half of the mask. As for why, I admit I don't know the answer well enough to explain it.

Devin-M said:
Also, to my eye it seems that approximately halving the number of photons reaching the sensor in this manner did nothing to change the length or brightness of the "remaining" diffraction spikes.
The entire image should have approximately half the intensity it had before. How that is apportioned between the diffraction spikes and the central spot, I don't know, but the eye is a very, very poor sensor for measuring brightness. I recommend looking at the raw image pixel by pixel and using the pixel values to get the ratios. If you want absolute numbers, then do the math to convert those values to electrons counted by the sensor during readout, and then convert those to incident photons. Keep in mind that the central spot could be so bright that it's saturating, and you're missing out on photons that should be there when you do your count.
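For concreteness, here is a minimal sketch of that kind of pixel-level measurement, assuming the Python rawpy library; the file names and crop coordinates are placeholders, not values taken from these images:

```python
import rawpy
import numpy as np

def region_stats(path, y0, y1, x0, x1):
    """Median and mean of the raw sensor counts in a rectangular region."""
    with rawpy.imread(path) as raw:
        data = raw.raw_image.astype(np.float64)  # undemosaiced sensor counts
    region = data[y0:y1, x0:x1]
    return np.median(region), region.mean()

# Compare the same patch of a diffraction spike in both exposures.
full = region_stats("full_aperture.CR2", 1200, 1260, 2000, 2060)
half = region_stats("half_covered.CR2", 1200, 1260, 2000, 2060)
print("full aperture (median, mean):", full)
print("half covered  (median, mean):", half)
# Ratios of these numbers are far more trustworthy than eyeballing JPEGs.
```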
 
  • #112
Drakkith said:
The entire image should have approximately half the intensity it had before. How that is apportioned between the diffraction spikes and the central spot, I don't know, but the eye is a very, very poor sensor for measuring brightness. I recommend looking at the raw image pixel by pixel and using the pixel values to get the ratios. If you want absolute numbers, then do the math to convert those values to electrons counted by the sensor during readout, and then convert those to incident photons. Keep in mind that the central spot could be so bright that it's saturating, and you're missing out on photons that should be there when you do your count.

I aligned both patterns on top of each other, then isolated a single pixel from the same point on each. Neither had a saturated red channel. One value was 220 out of 255 and the other was 214 out of 255. That's less than a 3% difference in brightness (even though approximately half the lens was covered in one of the exposures).

[attached image]


Here's a link to the RAW files:
https://u.pcloud.link/publink/show?code=kZG4l2VZPdxs8qih835rc40WKQDSfk0xAoSk
 
  • #113
With these 2 horizontal slits I expect vertical spikes only (one pointing up, one pointing down - 2 spikes total):
[attached image]

With these two I expect both vertical and diagonal spikes (one pointing up, one down, and two more diagonal, forming an “x” pattern: 4 spikes total, twice as many spikes, each roughly half as bright):
[attached image]

I expect roughly the same total photon count in both cases.
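As a rough check on these predictions, here is a toy Fraunhofer simulation (Python/NumPy): in the far-field limit the diffraction pattern is |FFT(aperture)|², so we can build a made-up mask of horizontal slits (not the real Bahtinov geometry) and see where the spike energy lands.

```python
import numpy as np

N = 512
aperture = np.zeros((N, N))
aperture[200:312:8, 100:412] = 1.0   # a grid of thin horizontal slits

field = np.fft.fftshift(np.fft.fft2(aperture))
intensity = np.abs(field) ** 2

# Compare the energy along the vertical vs horizontal axis through the
# pattern centre.
c = N // 2
print("vertical axis energy:  ", intensity[:, c].sum())
print("horizontal axis energy:", intensity[c, :].sum())
# The vertical cut dominates: horizontal slits throw their diffraction
# energy into vertical spikes, consistent with the prediction above.
```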
 
  • #114
Devin-M said:
I aligned both patterns on top of each other, then isolated a single pixel from the same point on each. Neither had a saturated red channel. One value was 220 out of 255 and the other was 214 out of 255. That's less than a 3% difference in brightness (even though approximately half the lens was covered in one of the exposures).
I had a hard time getting a good average value for each 'lobe' in each diffraction spike, so I went with the median, which was much more stable. Additionally, I'm not sure I trust this picture format completely when computing the average. There are many pixels that are exactly zero, suggesting that the camera has altered their values, since between stray light, background light, and thermal noise they should always be above zero. Using the median instead of the average helps filter out this effect, since the values of pixels below or above the median don't affect it in any way, only their number, whereas they greatly affect the average value.

Anyway, the median value is about 15-20% higher for each lobe I measured in the full-aperture picture compared to the half-aperture one. This suggests to me that the horizontal gaps in the mask do indeed contribute to the diagonal diffraction spikes.
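A tiny made-up example (not the actual image data) of why the median holds up against those zeroed pixels while the mean does not:

```python
import numpy as np

lobe = np.array([210, 215, 220, 212, 218, 214, 216, 213])  # plausible lobe values
zeroed = np.concatenate([lobe, np.zeros(3)])               # camera forced some pixels to 0

print("mean:  ", lobe.mean(), "->", zeroed.mean())          # drops from ~215 to ~156
print("median:", np.median(lobe), "->", np.median(zeroed))  # barely moves (214.5 -> 213)
```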
 
  • #115
Drakkith said:
I'm not sure I trust this picture format completely when computing the average. There are many pixels that are exactly zero, suggesting that the camera has altered their values, since between stray light, background light, and thermal noise they should always be above zero.
I used Photoshop to inspect the RAW files. I discovered that even though the RAW files themselves are 14 bits per color channel, giving values of 0-16383 for each of red, green, and blue, the "color picker" in Photoshop is limited to 8 bits (0-255 possible values). Presumably that means if you "pick" a color whose value is less than 16383/256 ≈ 64, it will read as black even though there is color information present, i.e. raw values between 0 and 63 are displayed as 0.
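A quick check of that quantization claim (Python; this assumes the naive 14-bit to 8-bit scaling, i.e. integer division by 64, which is the behavior described above):

```python
# Mapping 14-bit values (0-16383) down to 8 bits (0-255) divides by 64,
# so anything below 64 collapses to zero.
for raw_value in (0, 10, 63, 64, 500, 16383):
    display = raw_value // 64   # naive 14-bit -> 8-bit scaling
    print(f"raw {raw_value:5d} -> 8-bit {display}")
# raw 63 -> 0: faint but nonzero pixels can look pure black in an 8-bit
# colour picker even though the RAW file still has signal there.
```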
 
  • #116
Devin-M said:
I used Photoshop to inspect the RAW files. I discovered that even though the RAW files themselves are 14 bits per color channel, giving values of 0-16383 for each of red, green, and blue, the "color picker" in Photoshop is limited to 8 bits (0-255 possible values). Presumably that means if you "pick" a color whose value is less than 16383/256 ≈ 64, it will read as black even though there is color information present, i.e. raw values between 0 and 63 are displayed as 0.
I don't think that's a problem with Maxim DL (what I used). The picture opens as a greyscale image and I can immediately look at the actual pixel values for all pixels.

[attached screenshot: Maxim DL pixel readout]
 
  • #117
I found the opposite. I copied the same portion of each spike to a separate file, then applied a 3-pixel Gaussian blur. When I sampled the same central pixel from each, I found the fully uncovered exposure was dimmer. Fully uncovered had a central red value of 76 out of 255 and half covered had a red value of 81 out of 255:

[attached image]
 
  • #118
Devin-M said:
When I sampled the same central pixel from each, I found the fully uncovered exposure was dimmer. Fully uncovered had a central red value of 76 out of 255 and half covered had a red value of 81 out of 255:
Single pixel values are mostly irrelevant, as they are highly subject to noise. Better to use an average, median, or sum of a number of pixels.
 
  • #119
Devin-M said:
I copied the same portion of each spike to a separate file, then applied a 3-pixel Gaussian blur.
The Gaussian blur averaged the sampled pixel with the neighboring pixels.
 
  • #120
Drakkith said:
It would not be reasonable to conclude that the photons that ended up in the vertical diffraction spikes went through the right half of the mask. As for why, I admit I don't know the answer well enough to explain it.
The problem is that people abuse the word "photon" to mean localized (massless) particles when describing "light". That's an idea which goes back to the so-called "old quantum theory" and Einstein's very early ideas on wave-particle duality. This has all been outdated for about 100 years. The only correct quantum description of light is quantum electrodynamics, and you are always better off thinking about light in terms of fields and waves. According to QED, a photon is an asymptotically free one-quantum Fock state of the electromagnetic field and as such not localizable in the usual sense, i.e., you cannot even define a position operator with the full meaning of a position observable.

When it comes to the resolution of optical instruments like telescopes, it's all about diffraction, i.e., a wave phenomenon, and even if you handle very "dim light", i.e., really detect single photons, the wave nature of light still has to be taken into account. Although you'll detect any single photon as one spot (say, in a CCD camera), which in some sense is the "particle aspect" of the notion of a photon, the information about the observed object lies in collecting sufficiently many photons, and the distribution of those photons follows the wave picture, i.e., it's given by the energy-density distribution of the electromagnetic field.
Drakkith said:
The entire image should have approximately half the intensity it had before. How that is apportioned between the diffraction spikes and the central spot, I don't know, but the eye is a very, very poor sensor for measuring brightness. I recommend looking at the raw image pixel by pixel and using the pixel values to get the ratios. If you want absolute numbers, then do the math to convert those values to electrons counted by the sensor during readout, and then convert those to incident photons. Keep in mind that the central spot could be so bright that it's saturating, and you're missing out on photons that should be there when you do your count.
You can calculate all this using classical electrodynamics.
 
