Diffraction Effects and Artifacts in Telescopes like the JWST
[QUOTE="collinsmark, post: 6815865, member: 114325"] I may have been contributing to this thread trending in different directions because I like this thread and don't wish it to become a source of misinformation. So when there is a false or misleading claim, I can't in good conscience just sit by and let that claim go unchallenged. And there seem to be many in this thread, which is why, I assume, it's taking twists and turns. Since you bring up the history of these particular circles, I'll try to consolidate them. It may have started somewhere around the time of: This is subtly incorrect. Yes, maybe in that particular image the dots were not record[I][B]ed[/B][/I]. In that image. Sure. But the dots are not a byproduct of the camera or any part of the imaging hardware. The dots are the result of diffraction/interference in the optics, including the Bahtinov mask in this case. And those diffraction dots [I][B]are[/B][/I], in fact, record[I][B]able[/B][/I]. So to say that the dots are not record[I]able[/I] isn't true. Therefore I objected. --- Then immediately there was some misinformation about the properties of Full Width at Half Maximum (FWHM) by another poster, which has since been recanted, so I won't repeat it here. All of that got sorted out, though it did take a few posts to get through. --- Then there is another: This is untrue because the spikes themselves are linear all the way down to the quantum level (and the wavefunction of quantum mechanics [QM] is completely linear as far as anybody can tell for certain). And if you're not talking about the linearity of the spikes themselves, but limitations of the camera sensor, the sensor's response [I]can[/I] be made effectively linear to arbitrarily high precision, either by increasing the exposure time if saturation is not a concern, or by stacking. (And thus, at this point, the process of stacking entered into the discussion.)
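To make the stacking-vs-longer-exposure point concrete, here's a minimal toy simulation. All the numbers (signal rate, read noise, counts in ADU) are hypothetical illustration values, not JWST calibration data; the point is only that the accumulated signal is linear in exposure time, whether you take one long exposure or sum many short subframes, with the stack paying extra read noise:

```python
import random
import statistics

random.seed(1)

RATE = 12.5        # hypothetical signal rate, ADU per second
READ_NOISE = 1.5   # hypothetical per-readout noise, ADU
N_TRIALS = 5_000

def expose(seconds):
    """One exposure: signal accumulates linearly with time, plus one dose of read noise."""
    return RATE * seconds + random.gauss(0.0, READ_NOISE)

# One long 10 s exposure vs. ten stacked (summed) 1 s subframes.
long_exps = [expose(10.0) for _ in range(N_TRIALS)]
stacks = [sum(expose(1.0) for _ in range(10)) for _ in range(N_TRIALS)]

# Both have the same expected signal (RATE * 10 s = 125 ADU);
# the stack incurs sqrt(10) times the read noise of the single exposure.
print(f"mean long exposure: {statistics.fmean(long_exps):.2f}")
print(f"mean 10-frame stack: {statistics.fmean(stacks):.2f}")
print(f"std long: {statistics.stdev(long_exps):.2f}, std stack: {statistics.stdev(stacks):.2f}")
```

Same expected signal either way; the only price of stacking is the repeated read noise, exactly as described above.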
Also in the same post, we had this separate idea that's not correct: That claim is false. The 16 bit depth of the sensor is [I]not[/I] the limiting factor. In later posts I explained why, complete with examples and some practical mathematics showing how to achieve increased effective bit depth. But this 16 bit sensor seems to be brought up repeatedly as an insurmountable limitation, even though, in and of itself, it is not. Of course there are real-world, practical limitations on the dimmest objects it is possible for JWST to resolve. But the least significant step of the 16 bit ADC, by itself, [I]isn't one of them[/I]. --- I'll mention that at some point somewhere around there I had a miscommunication with another poster about FAPPs and fapping, but that ended up getting all sorted out, so I won't rehash the details here. --- With this next post I briefly thought we were in agreement about all of the stuff I just discussed above: So for a moment I thought that we had resolved the disagreements. But I guess not, because soon after we have: Which again repeats the false claim that the least significant step in the ADC is some sort of insurmountable fundamental limit. It's not. Furthermore, something new is mentioned about requiring a point source and overlapping patterns. Diffraction applies to [I]all[/I] sources, not just point sources. Even if the orientation of JWST is such that the image of a dimmer star lies within the diffraction spike of a brighter star, it is still theoretically possible to resolve the brightness of the dimmer star; it just might take more total integration time and knowledge of JWST's diffraction characteristics (which scientists are aware of). Will the brightness of the dimmer star necessarily be record[I]ed[/I] in any given image? No. But it is record[I]able[/I]. I personally don't take serious issue with any of this until it's brought back to the ADC.
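Here's a small toy demonstration of why the 16 bit ADC step isn't a hard floor. The numbers are hypothetical (a made-up fractional brightness and read noise in ADU, not anything measured from JWST): each simulated subframe quantizes to whole ADU, yet averaging many noisy subframes recovers the fractional part well below one ADC step, because the noise dithers the quantizer:

```python
import random
import statistics

random.seed(42)

TRUE_SIGNAL = 100.3   # hypothetical brightness in ADU; the 0.3 is below one ADC step
READ_NOISE = 2.0      # hypothetical read noise in ADU, enough to dither the quantizer
N_FRAMES = 40_000

def capture_frame():
    """Simulate one subframe: true signal plus Gaussian read noise, quantized to integer ADU."""
    return round(TRUE_SIGNAL + random.gauss(0.0, READ_NOISE))

stack = [capture_frame() for _ in range(N_FRAMES)]
estimate = statistics.fmean(stack)

# Any single frame can only report whole ADU, but the stacked mean
# resolves the fractional 0.3 ADU to well within the noise floor.
print(f"single frame reads: {stack[0]}")
print(f"stacked estimate:   {estimate:.3f}")
```

The effective bit depth of the stacked result keeps growing with the number of subframes, which is the sense in which the 16 bit step, by itself, isn't the limit.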
Yes, overlapping diffraction patterns complicate matters for sure, but let's keep the ADC out of it. Then this came out of the blue: Boldface mine. I never said there was no stacking! Of course there's stacking! There's always at least [I]some[/I] stacking. In the JWST image that was posted showing the stacking/overlap, I can see regions of at least 6, maybe 7 overlaps (could be more), and the majority have around 4 or more overlaps. And here again with the 16 bit thing: I'm repeating myself, but 16 bits is not where linearity fails. It may be the case for a single subframe, but there's always at least a little stacking, which increases the bit depth in the resulting image. And if the science dictates that more detail is necessary, JWST can always be told to increase its total integration time on the particular target in question and stack more subframes. And regarding the overlap of diffraction patterns (such as the diffraction spikes of brighter stars overlapping the central spot of the diffraction pattern of a dimmer star): yes, that makes things more complicated, but not insurmountable. And none of it is fundamentally limited by the "16 bit" aspect of the sensor. And then this post broke my heart: Good grief. Where to start. Not only does the tone of that post start off as snide and condescending, it's wrong. It's wrong in several ways: [LIST=1] [*]Of course there's more to it than just the basic tenets, theorems, and implementations of information theory, statistics, and mathematics, and I [I]never[/I] said there wasn't anything more to it. But even though there is more to it, those simple, basic, theoretical ideas are [I]essential[/I] considerations for this topic, and cannot be ignored if further understanding is to be achieved. [*]There we go again with the 16 bit sensor limitation. Presenting this as a fundamental, insurmountable limitation is wrong. [*]We can follow the maths all the way down to the quantum level with incredible precision.
And this is true even as applied to JWST images. As a matter of fact it's crucial, since in some cases we are talking about individual photons and their paths, which are subject to diffraction and self-interference patterns (not dissimilar to the double slit experiment). Implying that the maths and physics don't track reality at that level or above is wrong. [/LIST] And then, since my last post, there is a brand-new incorrect claim: Gah! The stacking process is [I]not[/I] nonlinear. With the exception of cosmic ray/hot pixel rejection and removal (which might be considered part of the stacking process), it's linear in [LIST] [*]the astrophotography sense: it's performed well before the "stretch" or "curves" are applied (stacking is done in the process flow immediately after calibration frames are applied to the raw data from the sensor), [*]the mathematical sense: [itex] \mathcal{O}(x + y) = \mathcal{O}(x) + \mathcal{O}(y)[/itex], [*]and any other sense that I can think of. [/LIST] Cosmic ray/hot pixel identification, rejection, and removal fit in nicely as part of the stacking process. But this isn't a necessary part of stacking; you could still do stacking without it. Doing just the stacking alone, the mathematical operation is linear. As a matter of fact, it's mathematically equivalent (statistically speaking) to taking a longer exposure with unlimited bit depth (and zero risk of causing saturation beyond that of the subframes), at the expense of additional read noise. And this addition of read noise does [I]not[/I] make the process nonlinear (any more than the addition operator itself is nonlinear, which it isn't). Nor does this read noise present an insurmountable limitation if it is uncorrelated and (statistically) stationary. The overall effect of the read noise can be reduced via the Central Limit Theorem, just like that of any other uncorrelated, (statistically) stationary noise source. You're right to be concerned about saturation during a single exposure.
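The additivity property [itex]\mathcal{O}(x + y) = \mathcal{O}(x) + \mathcal{O}(y)[/itex] can be checked numerically in a few lines. This is a toy sketch with made-up pixel values (stacking modeled as a plain per-pixel mean over subframes, ignoring rejection steps), just to show the operation satisfies the defining property of a linear map:

```python
import random
import statistics

random.seed(0)

def stack(frames):
    """Toy stacking operator: the mean over subframe values (a linear operation)."""
    return statistics.fmean(frames)

# Two hypothetical sequences of subframe values for the same pixel, from two sources.
x = [random.gauss(50.0, 3.0) for _ in range(1000)]
y = [random.gauss(20.0, 3.0) for _ in range(1000)]

lhs = stack([a + b for a, b in zip(x, y)])   # stack of the summed signals
rhs = stack(x) + stack(y)                    # sum of the individual stacks

# Additivity holds exactly (up to floating point rounding);
# homogeneity stack(c*x) = c*stack(x) holds for the same reason.
print(f"stack(x+y)          = {lhs:.6f}")
print(f"stack(x) + stack(y) = {rhs:.6f}")
```

The two quantities agree to floating point precision, which is all "linear" means here.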
Saturation is a primary consideration for determining the exposure length of subframes. After that, enough frames are stacked to bring out the necessary detail and signal to noise ratio (SNR) determined by the scientific needs of the object being imaged. If the science requires more effective bit depth or better SNR, more frames are stacked. The read noise penalty for stacking is small, although it's not negligible, so it's something that is considered and dealt with. But it's in no way an insurmountable obstacle. And to the point of the last claim, there's nothing inherently nonlinear about it. ------------------------------------------------ Summary: There [I]are[/I] real, true, insurmountable limitations to JWST's capabilities, both theoretical and practical. I've never said otherwise. But things like the finite bit depth of the hardware sensor and the residual thermal glow of background space [I]are not among them.[/I] Things such as the overlap between the diffraction pattern of one source and another: that makes things complicated (but can be dealt with). Scientists analyzing data must take diffraction into account all the time, with every image. But they do that. That's part of the process. And nothing about it is insurmountable. There are many, many things that make gathering scientific data from JWST or any other telescope complicated. But not all of them are insurmountable. Many obstacles brought up recently in this thread, while they may have their own complications and require careful consideration, are not true limitations. JWST, science, and math have ways of getting around them. So what are some of the true limitations? There are many. Here are a few biggies: [LIST] [*]Time. Time spent gathering data on any one given target means less time for gathering data on other targets. JWST has a finite lifespan. [*]Aperture. While JWST's primary mirror may be large, it still limits the angular resolution that can be achieved.
It also limits the light gathering rate (although this one points back to the previous bullet). [*]Stochastic, yet (statistically) nonstationary, dim sources. If a source is changing in a non-predictable way, and those changes are occurring fast enough relative to how bright it is, JWST may not be able to reliably detect those changes in detail. [/LIST] And while there could be more, nowhere on that list is the 16 bit thing. And stacking is [I]not[/I] inherently nonlinear. [/QUOTE]