Picture sharpness and air effects

  • #1
Borek
This is basically a question to Andy.

As I already wrote in the outdoor thread, I have a problem with the pictures I took in Croatia - in general, far objects are never sharp. Initially I thought it was a problem with one of the lenses I was using, but after some browsing I realized it is not just that one lens - the same happens with all the lenses I used (and, together with Marzena, we have a nice collection now :wink:). So there are two possibilities. One - it is a body and autofocus problem. That one I can test here, and I will do it - it is just a matter of time and weather.

However, I wonder... is it possible that the problem was related to the place? What I mean is - the temperature was high most of the time; I don't think it was ever below 30 °C when I was taking pictures. Could that mean air motion, density waves, refraction, aerosols - whatever - that makes taking sharp pictures over a distance simply impossible?

I know this is a problem in astrophotography; I wonder if it can't work the same way in normal photography. As the light was very bright, many pictures were taken with a very fast shutter speed, so in theory it shouldn't be a problem? Distant objects are always a little bit blurred in Croatia (distant meaning, in this case, 10 to 20 km), most likely because of some natural air pollution - could that be the problem?

This particular picture is probably not the best example, as the EXIF distance info is rather strange (0.64? no idea what that means), but this is how the situation looks in most pictures (this one is from yet another, wider lens). First, the general view:

IMG_3224.jpg


And 1:1 crop:

IMG_3224'.jpg


The background is way too soft, while the tree trunk looks much sharper (or does it? I am starting to distrust my own eyes). At 12 mm focal length and f/8, the DOF should extend beyond the known Universe, not end just a few hundred meters from me, shouldn't it?
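For reference, here is a quick sketch of the hyperfocal arithmetic behind that expectation. The circle of confusion (0.019 mm) is an assumed, commonly quoted APS-C value, not a number taken from this thread:

```python
# Thin-lens depth-of-field sketch.  The circle of confusion (0.019 mm) is an
# assumed, typical APS-C figure, not something read from the EXIF data.

def hyperfocal_mm(f_mm, N, c_mm=0.019):
    """Hyperfocal distance in mm: H = f^2 / (N * c) + f."""
    return f_mm ** 2 / (N * c_mm) + f_mm

def dof_limits_mm(f_mm, N, s_mm, c_mm=0.019):
    """Near/far limits of acceptable sharpness for a focus distance s (mm)."""
    H = hyperfocal_mm(f_mm, N, c_mm)
    near = H * s_mm / (H + (s_mm - f_mm))
    far = H * s_mm / (H - (s_mm - f_mm)) if s_mm < H else float("inf")
    return near, far

print(hyperfocal_mm(12, 8) / 1000)   # ~0.96 m hyperfocal at 12 mm, f/8
print(dof_limits_mm(12, 8, 3000))    # focused at 3 m: sharp from well under 1 m to infinity
```

So yes, at 12 mm and f/8 anything focused beyond about a metre should be acceptably sharp all the way out; geometric DOF is not the explanation.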
 
  • #2
Well let's blow up the area where the tree trunk is in front of the island/hills on the horizon.

2quj4oh.jpg


I would estimate that both the tree and the hills show a transition of about three pixels to the sky behind, which suggests that both are equally sharp. But I can imagine that a blur of three pixels is somewhat disappointing when you'd expect about one to two pixels.
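A rough way to repeat that estimate on any crop is to take a line of pixel values across the edge and count how many pixels the 10-90% transition spans. The file name and coordinates below are placeholders, not taken from the image above:

```python
# Estimate edge sharpness as the 10-90% transition width along one pixel row.
# Assumes the chosen row crosses a single, roughly monotonic edge.
import numpy as np
from PIL import Image

img = np.array(Image.open("crop.png").convert("L"), dtype=float)
row = img[120, 200:260]                  # placeholder row/column range crossing the edge
lo, hi = row.min(), row.max()
t10, t90 = lo + 0.1 * (hi - lo), lo + 0.9 * (hi - lo)
width = np.count_nonzero((row > t10) & (row < t90))
print("10-90% edge transition width:", width, "pixels")
```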
 
  • #3
Andre makes a good point- how big is the blur relative to the pixel size? Your 1:1 crop doesn't look obviously bad- the mast, for example, is nice and sharp.

Here's my experience (and lots of images) with atmospheric blur (as opposed to a fundamental limit of the optics)- first some disclaimers. I am going to compare three different lenses- a 15mm f/3.5, a 85mm f/1.4, and a 400mm f/2.8. The 15mm and 400mm are 30+ year old Nikkors, and I use them with an adapter that contains lens elements- weakly diverging, possibly coated, and they may degrade the theoretical performance of the lens. The 85mm is a Zeiss, new, and used without an adapter. The 400mm has a reputation of being 'tack sharp' at all apertures, the 85mm can be considered a current 'state-of-the-art' lens.

For all of these (except the last three), the shutter speed is < 1/30 s.

Here's a shot using the 400mm set to the hyperfocal distance, (IIRC) f/22:

[PLAIN]http://img13.imageshack.us/img13/2531/dsc92481.jpg [Broken]

Looking at crops at >1:1 (no interpolation, and no attempt to keep the magnification constant) shows the structure of the point spread function (the sun glints). Here's one close to the shore:

[PLAIN]http://img856.imageshack.us/img856/8361/dsc92483.jpg [Broken]

And near the horizon, which is about a mile away:

[PLAIN]http://img84.imageshack.us/img84/1154/dsc92484.jpg [Broken]

It's not obvious if the PSF near the horizon is 'blurrier' than close up, but the contrast overall is lower at the horizon.

Here's the 85mm, shot at f/8 or so:

[PLAIN]http://img683.imageshack.us/img683/720/dsc9846.jpg [Broken]

and a blowup of the bottom:

[PLAIN]http://img195.imageshack.us/img195/1060/dsc98461.jpg [Broken]

This is about 1/2 mile away, and has much better contrast than the 400mm. However, the 85mm has much more sophisticated lens coatings which may account for some of the difference.

Here's the 15mm, which I almost always use at f/11 or f/16- the sharpest setting:

[PLAIN]http://img836.imageshack.us/img836/2633/dsc0132hwb.jpg [Broken]

and a crop near the center- again, to minimize the aberrations:

[PLAIN]http://img689.imageshack.us/img689/6919/dsc01322ee.jpg [Broken]

and a crop from the bottom edge:

[PLAIN]http://img84.imageshack.us/img84/2396/dsc01321f.jpg [Broken]

These images all together (and especially the 15mm) tell me that the lenses are delivering consistent performance over the entire field of view. Distant objects have reduced contrast, but there may not be increased blurring as compared to nearby objects. However, the object distance varies between images.

To control this, I'll put up three images of the ocean horizon, one from each lens (in order: 15, 85, 400). The horizon line is about 6 km away.

[PLAIN]http://img593.imageshack.us/img593/3199/15mml.jpg [Broken]

[PLAIN]http://img684.imageshack.us/img684/2114/85mm.jpg [Broken]

[PLAIN]http://img534.imageshack.us/img534/900/400mm.jpg [Broken]

They do not appear equally blurry- longer focal lengths (greater magnification) increase the blur. This also tells me that atmospheric effects are limiting the resolution of the image. The blur is bad enough that the jpg compression is visible at 400mm.
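A quick sanity check of that scaling: a fixed angular blur maps onto the sensor as the angle times the focal length, so the same atmosphere costs far more pixels at 400mm than at 15mm. The 10 arcsec blur and 6 micron pixel pitch below are illustrative assumptions only:

```python
# A fixed angular blur theta becomes a spot of diameter theta * f on the sensor.
# Seeing value (10 arcsec) and pixel pitch (6 um) are assumptions for illustration.
import math

theta = 10.0 / 3600 * math.pi / 180      # 10 arcsec of angular blur, in radians
pixel_pitch_um = 6.0

for f_mm in (15, 85, 400):
    spot_um = theta * f_mm * 1000        # blur diameter on the sensor, microns
    print(f"{f_mm:3d} mm: {spot_um:5.1f} um = {spot_um / pixel_pitch_um:4.1f} px")
```

With these numbers the blur is a small fraction of a pixel at 15mm but several pixels at 400mm, which matches what the three horizon shots show.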

I know that atmospheric effects at low altitudes are much worse than at high altitudes, so here's a comparison of stars taken at high elevation angles- images of stars can be considered true point spread functions: (again, presented at 15 f/11, 85 f/1.4, and 400 f/2.8)

[PLAIN]http://img834.imageshack.us/img834/650/15mmpsf.jpg [Broken]

[PLAIN]http://img62.imageshack.us/img62/4197/85mmpsf.jpg [Broken]

[PLAIN]http://img13.imageshack.us/img13/7756/400mmpsf.jpg [Broken]

The blurs are all about the same size (call it 2.5 pixels), but comparing this to diffraction-limited performance is difficult: the three lenses have Airy disk diameters of about 5, 0.8, and 1.7 microns, all less than the size of a single pixel. I don't know how to take into account the Bayer filter and on-board processing that occurs.
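For what it's worth, here is the textbook Airy-disk arithmetic; the exact figures depend on the wavelength assumed and on whether the radius or the diameter to the first dark ring is quoted, so treat it as an order-of-magnitude check rather than a reproduction of the numbers above:

```python
# Diffraction-limited spot: the first dark ring of the Airy pattern is at radius
# 1.22 * lambda * N, i.e. a diameter of 2.44 * lambda * N.  Wavelength assumed.
wavelength_um = 0.55   # green light

for name, N in (("15mm", 11), ("85mm", 1.4), ("400mm", 2.8)):
    d_um = 2.44 * wavelength_um * N
    print(f"{name} at f/{N}: Airy disk diameter ~ {d_um:.1f} um")
```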

I know heat shimmer is awful at 400mm- I have been trying to take images of things 5+ miles away and the atmospheric distortion is rendered quite nicely.

Is this helpful? I guess if you suspect your lens is not performing properly, you should try taking photos of known objects- stars and an LCD screen work well as test objects. Take photos at different apertures, look over the image plane to identify the aberrations, and check whether they scale properly (other than distortion, all aberrations go down as the f-number increases). Try manual focus and compare with autofocus.

Or did I completely miss the point?
 
  • #4
Thank you both, I have some material to think about. I will do some field tests, no doubt about it.

As of now I am just browsing the pictures, trying to wrap my mind around what I see.

This is the full picture, as shown by DPP. The red rectangles are the places that the camera reported as being in focus:

IMG_2482.jpg


Perhaps I am wrong, but it seems to me that the close ones:

IMG_2482''.jpg


look sharper than the distant ones:

IMG_2482'.jpg


which are reported as being in focus as well.
 
  • #5
Sometimes you come across the term 'acutance' rather than 'sharpness'. For example, I would bet that the acutance of your distant objects (say, those isolated white blobs) is the same as the near objects (say, the stems). That would imply that the distant objects are indeed in focus, and that the apparent lack of sharpness is really due to a drop-off in contrast, due to atmospheric scattering.

Looking at photos astronauts took of the moon when they were wandering around is a bit disorienting, because there is *no* atmospheric blur:

http://cache.boston.com/universal/site_graphics/blogs/bigpicture/luna_07_02/luna12.jpg

http://smbhax.com/stuff/_apollo17boulder.jpg
 
  • #6
Andy Resnick said:
Sometimes you come across the term 'acutance' rather than 'sharpness'. For example, I would bet that the acutance of your distant objects (say, those isolated white blobs) is the same as the near objects (say, the stems).

These "isolated white blobs" are actually boats and sails :smile:

That would imply that the distant objects are indeed in focus, and that the apparent lack of sharpness is really due to a drop-off in contrast, due to atmospheric scattering.

It is past 1 a.m. so I am not going to post it now, but I took a picture of the sky - and at first sight it looks OK. I used autofocus on the Moon and opened the lens to 2.8. But I am too tired for a more detailed analysis.
 
  • #7
17_star_test.jpg


17 mm, 15 s, f/2.8, ISO 100, almost the exact center of the frame.
 
  • #8
I took the liberty of doing some analysis- here are some intensity profiles from your images: the stars (top two), two stems in the near field (3rd), and the isolated vertical blob near the shore in the distant field (bottom):

[PLAIN]http://img221.imageshack.us/img221/9977/moontb.jpg [Broken]

The top image is of a fainter star and has a FWHM of about 5 pixels, about the same as the brighter star (7 pixels) which is good, since you didn't saturate the image- all the stars should render the same. That's also close to what I get with my images.

The 3rd graph is of two stems, and each has a FWHM of about 5 pixels, the same as the stars. Similarly, the tower (?) has a FWHM of about 6 pixels.

The conclusion I come to is that your landscape image is correctly focused, and that the focus is indeed set to the hyperfocal distance (or close to it). Note that the *contrast* of near and far images varies- the stems have an intensity variation of close to 100 while the tower only has an intensity variation of 50. Compare this with the stars, which also have a very high variation in intensity.
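For anyone who wants to repeat the FWHM estimate, a minimal version of the measurement looks like this; the synthetic Gaussian is only there to show the function working, and would be replaced by a real profile:

```python
# Estimate the FWHM of a 1-D intensity profile (star, stem, edge of a tower).
import numpy as np

def fwhm_pixels(profile):
    profile = np.asarray(profile, dtype=float)
    profile = profile - profile.min()          # crude background removal
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    return above[-1] - above[0] + 1            # width at half maximum, in pixels

x = np.arange(31)
star = np.exp(-0.5 * ((x - 15) / 2.0) ** 2)    # synthetic Gaussian, sigma = 2 px
print(fwhm_pixels(star))                       # ~5 (FWHM of a sigma=2 Gaussian is ~4.7 px)
```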

This also indicates that atmospheric scattering is acting to reduce your contrast. I'm not seeing any evidence of thermal lensing, which looks like this (radio tower):

[PLAIN]http://img268.imageshack.us/img268/7458/dsc6952p.jpg [Broken]

This was taken at 800mm, from 5 miles away, and this (top of the Terminal Tower):

[PLAIN]http://img828.imageshack.us/img828/5601/dsc7046t.jpg [Broken]

from 8 miles away. These images are both in focus and the shutter speed was fast enough to limit the blur caused by atmospheric motion.

How do your off-axis stars look?
 
  • #9
Thanks. I wonder if the result (5 pixels) is not inflated - the stars are zoomed in 300% (no interpolation, each pixel was just repeated as a 3x3 block). And when I zoomed in further, these 3x3 squares show obvious compression artifacts.

Off-axis stars don't look much different, this is the upper left corner:

17_star_test_a.jpg


Perhaps not as sharp, but still reasonably good IMHO.

This is somewhere between the center and the edge, Vega is so bright there is a coma (or something similar) visible:

17_star_test_b.jpg


(I love how the epsilon Lyrae AKA double double shows its primary duplicity).

And the same part of the sky as yesterday, just no zooming and 100% quality:

17_star_test_c.jpg
 
  • #10
I set up an 'artificial star test' in the lab today to check a few things out. I sent light from a green laser (DPSS, 100mW) through a spatial filter with a 5-micron diameter pinhole to simulate a point source. The laser light had to be severely attenuated through crossed polarizers- one polarizer was insufficient, even though the beam is linearly polarized. I turned off as much in-camera processing as possible as well. I manually set the focus at maximum aperture and did not re-focus when changing the aperture.

The image below is a montage of the point spread function for two lenses at 4:1 magnification, varying two parameters- the numerical aperture and color vs. black and white. I wanted to probe the Bayer filter and image reconstruction algorithm, so comparison between the color and b&w images (especially when using monochromatic light) may be useful. Here's the composite image:

[PLAIN]http://img836.imageshack.us/img836/766/combinedgs.jpg [Broken]

The top two rows are the 85mm, and the bottom 2 rows are the 24mm. The middle row has two images using the 85mm and a monochrome CCD array (point grey research 'Flea') that has pixels 4.65 microns on a side. These serve as 'validated' images against which the sony images can be compared. Note- I don't know the size of the sony pixels (although I can estimate them as 6 microns on a side), so there could be an error introduced by presenting the pixels from the two sensors as equal size. Also, all the sony images were jpegs, and the Flea images were bmps.

The left-most column were taken at maximum aperture: f/1.4 for the 85mm, f/1.8 for the 24mm. The middle column was taken at f/5.6, and the right-most column taken at the minimum aperture, f/22.

The coma present in the left column can be ignored, since that could be an artifact of misalignment. There is evidence of blooming in the 85mm f/1.4 images, meaning I needed to even further attenuate the light (the shutter speed was 1/8000 s, the minimum time). Even so, the difference at full aperture between the sony and flea is striking. Some of this could be misfocus- I was able to focus the Flea camera much more accurately than the sony- and at f/5.6 the images are much sharper. I couldn't set f/5.6 using the flea camera since the lens is not attached to anything and the lens doesn't have a manual aperture ring.

Diffraction-limited performance means the spot at f/5.6 is about 7 microns in diameter, which means the central peak should be at most 2 pixels wide on the sony (if the peak fell midway between two pixels), and it seems that the spot is about 3 pixels wide which is reasonable. At f/22, a diffraction-limited spot is 26 microns in diameter: the flea sensor correctly renders the Airy disc (the dark ring is 6 pixels across = 27 microns), but the sony images don't appear much different than the f/5.6 images. This is interesting, and may indicate an artifact of interpolation between pixels. A possible check for this would be to use a red or blue source, since the pixel density of those colors is lower with a Bayer filter present.

Comparing the color and black and white images is interesting: there is clearly more spatial structure in the b&w images. I couldn't find any information about how the camera generates these, but it seems that there is considerably less smoothing. I don't have the time to routinely deal with RAW images, but the results here may motivate me to do a comparison.

I think the overall message here is that expecting sharp edges (or stars) to be a single pixel wide is unrealistic- at best an edge will be about 3 pixels wide.
 
  • #11
Just a quick follow-up, I re-tested the 85mm at f/1.4 with the sony. I spent a lot of time making sure the pinhole was in focus and decreasing the laser light intensity even more. Here's the result (color only), again at 4:1 magnification:

[PLAIN]http://img560.imageshack.us/img560/6585/dsc0617l.jpg [Broken]

The central peak is now 1 or 2 pixels across, as it should be. This means I have more confidence in whatever algorithm Sony uses to interpolate the Bayer filter. It also means I may be able to examine the effects of atmospheric turbulence by imaging a bright star with this lens (so I have a wide range of possible shutter speeds).

On a semi-related note, I just got my new tripod in- the old one barely made it to the end of vacation. I had always considered an expensive tripod to be an unreasonable extravagance since 90% of the time I don't use one (the one I had was a $150 "heavy duty" model and lasted a year, equivalent to about 1 month regular use), but the reality is that using a heavy lens (15-20 pounds) requires specific equipment to hold steady. This page says everything there is to say:

http://bythom.com/support.htm

I'm not sure it's worth adding information about tripods to the camera sticky, although a brief discussion about types (table-top, monopods, etc), trade-offs, and standards (Arca-Swiss, etc.) may be interesting?
 
  • #12
Discussing tripods there won't hurt, no doubt about it.
 
  • #13
The clouds parted for a few minutes this weekend, so I was able to try out a 'star test' on actual stars. An excellent reference for this is Roggemann and Welsh, "Imaging through turbulence" (CRC press). The important results can be summarized as:

1) air turbulence can usually be modeled as a fixed phase screen that moves across the field of view
2) short-time images of stars display speckle, while long-time images result in a uniform blur (time averaged speckle pattern).

The details (timescale, etc) depend on the specific weather conditions at the viewing location.

here's a montage of images of the same star, 4:1, taken at 800mm, and all but two at f/5.6.

[PLAIN]http://img706.imageshack.us/img706/1372/montagej.jpg [Broken]

Across the top are short-time exposures (f/5.6): 1/80s and 1/250s (ISO 100), 1/500s (ISO400), 1/1000s (ISO1600). Other than post-processing to equalize the brightness and contrast from one image to the next, the images are right off the camera. These images clearly show the effects of speckle. Going below 1/1000s exposure is possible, but the ISO setting would have to be increased beyond 1600, and I would need to compare the degradation due to noise.

The second row of images are longer exposures (1/10 second) taken at f/5.6 (left) and f/16 (right) (both ISO 100). It may appear surprising that the f/5.6 image is much worse, but recall that the larger aperture samples more of the aberrated wavefront. This comparison shows the trade-off between sensitivity (able to detect faint objects) and aberration due to atmospheric turbulence. This also applies to telescopes- I've often seen claims that if your primary is 12" or larger, there's not much reason to make it a precision optic because seeing conditions will limit the resolution for the majority of users.
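A rough way to see why a bigger aperture stops helping is to compare the diffraction limit, lambda/D, with the long-exposure seeing limit, roughly lambda/r0 where r0 is the Fried parameter. The r0 value below is an assumed, fairly typical poor-seeing figure, not a measurement:

```python
# Diffraction limit vs. long-exposure seeing limit.  r0 (Fried parameter) is an
# assumed value; once D exceeds r0, the atmosphere sets the resolution.
wavelength = 550e-9          # m, assumed
r0 = 0.05                    # m, assumed ~5 cm Fried parameter
rad_to_arcsec = 206265.0

seeing = wavelength / r0
for D in (0.05, 0.15, 0.30):                 # assumed entrance-pupil diameters, m
    diff = 1.22 * wavelength / D
    print(f"D = {D*100:.0f} cm: diffraction {diff*rad_to_arcsec:.2f} arcsec, "
          f"seeing {seeing*rad_to_arcsec:.2f} arcsec")
```

With these numbers, any aperture much larger than ~5 cm is already seeing-limited, which is the same argument made about large telescope primaries above.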

The image on the bottom is a defocused image taken at f/16, and I include it because it's a great example of longitudinal chromatic aberration (sometimes called spherochromatism).

It should also be noted that taking images of extended objects (moon, ISS, planets, sun) will not show this effect as much because the coherence length from these objects is much shorter than from distant stars.
 
  • #15
In a way nothing new - it confirms what I was always sure of: each lens is different.

https://www.physicsforums.com/showpost.php?p=3181526&postcount=1021

Lately, the more time I spend taking pictures, the less I understand of what I see. Either I am making some systematic error, or I am expecting too much. The trick is, it seems others are able to get results that are beyond me, which suggests I have no idea what I am doing :grumpy:

Andre: have you been shooting landscapes with Automatic AF Point Selection (default mode)? And if so, with what results?
 
  • #16
Andre said:
About sharpness and lenses, I think this is a rather interesting article.

http://www.lensrentals.com/blog/2011/10/notes-on-lens-and-camera-variation

Good article- thanks for the link. I've certainly noticed that I can have 'good days' and 'bad days' with the microscope- sometimes the images are stunningly perfect, other times there's a persistent blurring. I've always blamed it on the evil monkey that lives in the 'scope... :)
 
  • #17
Borek said:
Andre: have you been shooting landscapes with Automatic AF Point Selection (default mode)? And if so, with what results?

No not that I am aware of. For landscapes I use the Tokina 12-24mm and I prefer focussing manually, setting it to an estimated hyperfocal distance, usually 2-3 meters.

I only use the automatic AF point selection in combination with AI-servo to get the tracking feature in highly dynamic situations. But most of the time I use single point AF to have full control in static situations. I also became pretty agile with the 'joy stick' moving the spot to where I want to focus, rather than center focus - lock - recompose, which is a wrong technique.

For more dynamic, snapshot-type work I would use zone focus or AF point expansion, but then the keepers are not always 100%.

Occasionally I do have pix where I wonder why they are not tack sharp, but I also see that the rather high resolution invalidates the old rules of thumb about motion blur. 1/125 s no longer stops the motion blur of a slowly moving person, like it used to in the old days.
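To put a rough number on that: the blur in pixels is about (speed x shutter time x focal length) / (distance x pixel pitch). All of the values below are assumptions picked only to show the scaling, not data from this thread:

```python
# Rough motion-blur estimate for a slowly moving subject.
v = 0.5          # lateral subject speed, m/s (slow walk component)
t = 1.0 / 125    # shutter time, s
d = 10.0         # subject distance, m
f = 0.1          # focal length, m (100 mm)
p = 4.3e-6       # pixel pitch, m (roughly an 18 MP APS-C sensor)

blur_m = v * t * f / d                            # image displacement on the sensor
print(f"motion blur ~ {blur_m / p:.1f} pixels")   # ~9 px: clearly visible at 100% view
```

A blur of several pixels was invisible on coarse film grain or a 6 MP sensor, but shows up immediately when pixel-peeping an 18 MP file.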

One thing I intend to do soon is check the focus of all 19 points individually with the 100mm.
 
  • #18
Andre said:
About sharpness and lenses, I think this is a rather interesting article.

http://www.lensrentals.com/blog/2011/10/notes-on-lens-and-camera-variation

I was thinking more about this- I wonder what the results would be if instead of testing high-end lenses, they tested cell phone cameras. My suspicion is that microcameras would not show the spread in performance (for a variety of reasons).

I've been working on the "lens test lab", and it's been growing out of control- there are at least 3 classes of measurement, each very different in execution and interpretation. Other than imaging a test pattern (Seidel aberrations), there's interferometric wavefront analysis (Zernike polynomials) and linear response (PSF/MTF measurement). None of these strictly apply to sampled imaging systems, and while they can all be applied to color images it is with considerable effort- I don't mean the Bayer filter, which falls under 'sampled imagers', but chromatic aberration measurements.

Then there's the matter of understanding lens manufacturer data- MTF curves are often available, but they are usually presented in a nonstandard way.

There's another complication for images evaluated 'by eye' (as opposed to machine vision/automated tracking systems)- the perceived contrast of an image is somewhat decoupled from the resolution- a high contrast low resolution image will be perceived as being 'better' than a low contrast high resolution image.

Then there's lens aberrations like flare and bokeh which fall outside of aberration analysis, CCD effects like blooming, aliasing, dead space... in the end, it's no wonder many photographers recommend against pixel-peeping.
 
  • #19
Sounds complex indeed, but if you need an acceptable poster-format print for exhibition purposes, anything less than tack sharp gets exposed badly.

Anyway, as said, I wanted to see how my focus points perform individually. So I made http://dl.dropbox.com/u/22026080/DOF-checker.jpg [Broken], put it on the computer screen and shot it roughly at a 45 degree angle with the 100mm at f/2.8. Boy, is that soft. It is much better at f/8-f/11, which is the minimum for macro anyway.

Here is an example of the center focus point:

29g1naa.jpg


It looks like it is backfocussing slightly, but not too badly it seems.

The other focus points behaved similarly, except for the outermost right spot, which clearly backfocussed more than the others:

2myp5hi.jpg


So maybe I should not use that point anymore for single focus point selection. That's a bit of a pity, since I used these outer points for focussing on the eyes of bugs, keeping the rest of the body in the center.
 
  • #20
Andy Resnick said:
I've always blamed it on the evil monkey that lives in the 'scope... :)

Maybe it's a cousin of Maxwell's Demon. :uhh:
 
  • #21
Andre said:
Sounds complex indeed, but if you need an acceptable poster-format print for exhibition purposes, anything less than tack sharp gets exposed badly.
<snip>

That's a different problem, one of excessive magnification- even a tack-sharp 35mm image will look horrible if enlarged to a poster-sized print and viewed close-up. That's why medium format cameras are used even for 8" x 10" magazine prints- the magnification is less.
 
  • #22
I only just found this thread.
I like to apply Physics to what I produce with my camera.
There are a couple of issues that I am not sure have surfaced yet. Distant objects are more subject to the effects of haze (mostly blue scattering), which reduces the actual contrast. Then there is the JPEG coding that I think has been applied to all of the clips (?) in the thread. This is very non-linear and may well apply smart sharpening to high-contrast images more than to low-contrast ones. The Unsharp Mask in Photoshop does this very effectively on RAW images (no JPEG coding), because you have sliders to set the parameters for the best result.
Any lab work needs to take this into account if you want to be sure of your conclusions, I think.
On the subject of haze: I have looked all over for a filter that would sharply cut off the far end of the visible blue. This, I am sure, would reduce the worst of the haze - allowing more contrasty pictures - without knocking out all the blues. Distant mountains can be made to look a lot sharper if you reduce the gain in the blue channel, but this, of course, wrecks the colour balance. I'm convinced that some clever filter could help a lot.
Why isn't there anything like that about? (Or is there?) It could be that filters (other than polarisers) are not used much in digital photography because Photoshop can do most of it after the event.
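Short of finding such a filter, the effect can at least be previewed digitally by pulling down the blue channel. The file name and the 0.7 factor below are placeholders to experiment with:

```python
# Crude digital stand-in for a blue-cut haze filter: attenuate the blue channel
# and look at the change in apparent contrast of distant objects.
import numpy as np
from PIL import Image

img = np.array(Image.open("hazy.jpg").convert("RGB"), dtype=float)
img[..., 2] *= 0.7                                   # pull down the blue channel
out = np.clip(img, 0, 255).astype(np.uint8)
Image.fromarray(out).save("hazy_less_blue.jpg")
```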
 
  • #23
Also, lens quality is a lot more important than sheer pixel number. They sell cameras with more pixels than you can deal with but with lenses that are not much better than the proverbial 'sucked acid drop'.
That's marketing for you.
 
  • #24
sophiecentaur said:
Also, lens quality is a lot more important than sheer pixel number. They sell cameras with more pixels than you can deal with but with lenses that are not much better than the proverbial 'sucked acid drop'.
That's marketing for you.

I don't know, maybe that's an urban legend becoming true, because everybody says so.

The total system resolution depends on both, and it is demonstrated here with numbers that a high-pixel-count sensor (body) upgrade can be better than an upgrade to a superior lens:

Andre said:
So let's continue the test. This is what you'd have: if you scroll roughly halfway down you see that the 350D (8 MP) with the 18-55 averages about 1850 lines per picture height.

Now you buy the super 17-55 to get this. See that the center is clearly better, getting to 2100 lines per height. But the average does not increase very clearly.

Now we know from DPreview that the image quality of the 600D with 18MP is practically identical to the 60D and 7D, since it's the same sensor. The 50D here is slightly inferior to that, so we know that the 600D is certainly not going to give worse results here. But as shown here, the cheapo 18-55mm on the 50D gets us average results around 2200 lines per height.

Consequently, better glass is not dogmatically better than a better body - on the contrary, if you start comparing all the other nice improvements between the 350D and the 600D.
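One common rule-of-thumb model for the same point - not the method used in the linked tests - is to add the lens and sensor blur contributions in quadrature, so the weaker link dominates but both always matter:

```python
# Rule-of-thumb system resolution: 1/R_sys^2 = 1/R_lens^2 + 1/R_sensor^2,
# with R in lines per picture height.  The numbers are only illustrative.
def system_resolution(r_lens, r_sensor):
    return (r_lens ** -2 + r_sensor ** -2) ** -0.5

print(system_resolution(2400, 2200))   # decent lens on an 8 MP-class body  -> ~1620
print(system_resolution(2400, 3200))   # same lens on a higher-pixel body   -> ~1920
```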
 
  • #25
Yes, of course there will be examples of both. But I can't recall a popular camera advert that stated lens MTF but not pixel count. Many of the 'kit' lenses are hardly as good as the body they're sold with. Many people buy cheap additional lenses without appreciating how much they are losing out on IQ.
It's the same with hi-fi: they're very selective about the advertised spec, and always in their favour.
 
  • #26
sophiecentaur said:
Many people buy cheap additional lenses

You can safely assume we won't.

Unless the lens is not only cheap, but also good.
 
  • #27
It's just that I read mostly about lines per inch in this thread, and not about the other crucial aspects of lens performance - like flare, for instance - which can make a huge difference to the final product, which is supposed to be a cracking picture.
A 'soft' picture can still look stunning if the contrast is good and the other distortions are well controlled. The early 6 MP bodies still produced excellent pictures when used with decent lenses. Some of the 'classic' lenses from the past would give oodles of lines per inch, but you needed just the right lighting conditions to exploit this.
A great example of what I mean is TV camera lenses in the early days. They were designed with different criteria from film lenses: they needed to produce a video frequency response as flat as possible up to (in TV terms) 5.5 MHz, but nothing special above that. For the large-format camera tubes that were used, the lenses were hideously expensive.
Modern digital imaging has the same requirements, and DSLR lenses (in addition to the different sensor size, of course) have a whole lot of different design requirements from film lenses. Your response can usefully extend down to below the finest film grain size but, if it's a bit soggy by the time you get down to the digital pixel width, the extra HF information is of no use to you.

BTW, did my comments about the possible effects of JPEG coding fall on deaf ears? (Blind eyes? Or whatever.)
 
  • #28
sophiecentaur said:
BTW, did my comments about the possible effects of JPEG coding fall on deaf ears? (Blind eyes? Or whatever.)

No, but we shot mostly in RAW, and what we see on our monitors is RAW as well; we just convert to jpg to show pictures on the forum. At this stage - I can speak only for myself, but I doubt And2 do it much differently - I choose a compression level at which the effect I am referring to is still visible. As far as I know, at zero compression jpg is a lossless format; rest assured I will not hesitate to use it if I find it necessary.
 
  • #29
I was triggered by the same sentence. Okay, here is a shot with the economically priced 70-300mm lens, originally in RAW:

xeeefl.jpg


This 100% crop (PNG format, 563 kB) says something about long-distance distortion and sharpness:

311lxmf.png


The same crop, but now as a moderately high quality jpg, size 24 kB - notice the minor artifacts?

ipvdjl.jpg
 
  • #30
Borek said:
No, but we shot mostly in RAW, and what we see on our monitors is RAW as well; we just convert to jpg to show pictures on the forum. At this stage - I can speak only for myself, but I doubt And2 do it much differently - I choose a compression level at which the effect I am referring to is still visible. As far as I know, at zero compression jpg is a lossless format; rest assured I will not hesitate to use it if I find it necessary.

OK - just checkin'. Yes, it's sensible to post jpegs of already blown-up RAW images. Just calibrating myself to the level of discussion - 'as you do'.

Do many cameras actually use lossless JPEG? The 'best' setting on my (albeit ancient) Pentax K10D shows a lot of artifacts, FWIW. Wiki seems to say JPEG-LS is used in DNG. It's a minefield and, whilst storage is so cheap and engines are so fast, why not use RAW? It's all a lot better than going down to the chemist's after your holiday to collect your enprints.
 
  • #31
Andre said:
This 100% crop (PNG format, 563 kB) says something about long-distance distortion and sharpness:

311lxmf.png

Meanwhile, another point I forgot to make: it seems that the lines and sails get more blurred in the bottom part, where they get into the 'ground effect'. Agree?
 
  • #32
It's amazing that we see the rigging at all at that distance. It can't be bigger than 10 mm and could be 6 or 8 mm (less than 10^-5 radians at a km distance). For an aperture of 20 mm and a wavelength of 800 nm, the Rayleigh criterion gives a resolution of 2x10^-5 radians. No wonder it's all a bit fuzzy - right near the limit for discriminating two adjacent rigging lines, so when they cross, that blur is only to be expected.
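Writing that estimate out (the exact Rayleigh figure depends on the wavelength assumed, but it lands in the few-times-10^-5 radian range either way, the same ballpark as the angle the rigging subtends):

```python
# Rayleigh criterion theta = 1.22 * lambda / D vs. the angular size of the rigging.
rig_m = 0.008          # assume ~8 mm rigging line
dist_m = 1000.0        # ~1 km
aperture_m = 0.020     # 20 mm entrance pupil, as in the post

for wavelength_m in (550e-9, 800e-9):
    rayleigh = 1.22 * wavelength_m / aperture_m
    print(f"lambda = {wavelength_m * 1e9:.0f} nm: Rayleigh ~ {rayleigh:.1e} rad")

print(f"rigging subtends ~ {rig_m / dist_m:.1e} rad")
```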
Whatever our equipment can do for us, we always want better.
 
  • #33
sophiecentaur said:
Do many cameras actually use lossless JPEG?

I am not aware of any. Some cameras can save as jpg and tiff (that was the case with Marzena's Lumix) - but I never bothered to check the difference between those tiffs and jpgs, and I don't have access to the camera now. Canon uses either jpg or their own raw format (cr2).
 
  • #34
Andre said:
This 100% crop (PNG format, 563 kB) says something about long-distance distortion and sharpness:

311lxmf.png


The same crop, but now as a moderately high quality jpg, size 24 kB - notice the minor artifacts?

ipvdjl.jpg

A small trick - combine the images in Photoshop, set the blending mode to "Difference", and adjust the histogram to brighten the image (this one was adjusted automatically):

Andre_difference.jpg


Compression artifacts are always most visible close to edges.
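The same trick can be done outside Photoshop - subtract the two crops and stretch the result. The file names below are placeholders for the png and jpg crops above:

```python
# Difference image between the lossless and jpg crops, stretched so the
# compression artifacts become visible.  Assumes both crops have the same size.
import numpy as np
from PIL import Image

a = np.array(Image.open("crop.png").convert("L"), dtype=float)
b = np.array(Image.open("crop.jpg").convert("L"), dtype=float)
diff = np.abs(a - b)
stretched = np.clip(diff / max(diff.max(), 1.0) * 255, 0, 255).astype(np.uint8)
Image.fromarray(stretched).save("difference.png")
```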
 
  • #35
Nice. Note those identifiable 'ripples' on the right of the backstay - they extend way out in the difference picture.

It's even better when this is done with low noise pictures, made under really good conditions. That, of course, is a good way of assessing, objectively, how well a particular compression algorithm is working.
 
1. What is picture sharpness and how is it affected by air effects?

Picture sharpness refers to the clarity and detail of an image. Air effects, such as humidity and temperature, can affect picture sharpness by causing distortion or blurring. This is due to the way light travels through the air and interacts with particles in the atmosphere.

2. How does humidity impact picture sharpness?

High humidity can cause images to appear hazy or fuzzy, as water vapor in the air can scatter light and reduce contrast. This can result in a loss of detail and sharpness in the image. On the other hand, low humidity can cause images to appear sharper and more defined.

3. Does temperature affect picture sharpness?

Yes, temperature can also impact picture sharpness. What matters most is temperature variation: when air at different temperatures mixes, the refractive index fluctuates and light paths bend and shimmer, producing distortion and blur, especially over long distances. Stable, uniform air with minimal thermal gradients gives the sharpest results.

4. Can air pollution affect picture sharpness?

Air pollution, such as smog or smoke, can also impact picture sharpness. Particles in the air can scatter light and reduce contrast, resulting in a loss of detail and sharpness in the image. This is why images taken in heavily polluted areas may appear hazy or dull.

5. How can I improve picture sharpness in different air conditions?

To improve picture sharpness in varying air conditions, you can adjust the settings on your camera, such as aperture and ISO, to compensate for changes in light and air effects. Additionally, using a lens hood or polarizing filter can help reduce the impact of air effects on picture sharpness. In post-processing, you can also use software tools to enhance sharpness and clarity in your images.
