# Digital Camera Buyer’s Guide: DSLR


A: These cameras were originally designed for professionals.  Users of these cameras are expected to already understand basic digital photography techniques, and the in-camera electronic image processing should be viewed as enhancing good technique, not compensating for poor technique. One key distinction between a DSLR and a bridge camera is that the lens is removable on a DSLR. Lens performance is more critical now- a high-quality lens will continue to deliver excellent performance long after the camera body is obsolete.  A second key distinction is the sensor size- sensors in DSLRs approach or even exceed the 35mm format- there are some digital medium-format cameras on the market. A DSLR and well-corrected lens, based on over 100 years of continuously improved optical design, can approach the limit of what is physically possible. So in this section, we will present some additional details of imaging theory.  Just as with film cameras, each manufacturer has its own proprietary ‘lens mount’, which can make switching between manufacturers problematic if you have already spent money on a good lens. These include the Nikon F-mount, Sony E-mount, Pentax K-mount, Leica M-mount, and Canon EF mount. Lag times in DSLRs are generally nonexistent.

Classification of lenses: Lenses are classified based on their 35mm-equivalent focal lengths. The standard lens is a 50mm lens; on the 35mm format, a 50mm lens produces an image that nearly matches normal vision in both magnification and field of view.  Lenses with a shorter focal length are ‘wide angle’ lenses, until you get down to 10mm or less; these are ‘fisheye’ lenses and produce images with fields of view sometimes exceeding 180 degrees (a full hemisphere). Lenses between 70mm and 90mm are usually referred to as ‘portrait’ lenses, while longer lenses are ‘telephoto’ lenses that at the extreme end (1200mm and up) appear indistinguishable from telescopes.  Zoom lenses have become more common, and involve complex movement of lens elements to allow changes in the focal length while keeping the focus (largely) unchanged.  The standard ‘kit’ lens offered with a DSLR is usually a zoom lens; you do not *have* to get this lens with your camera.  Tilt-shift lenses (or ‘perspective control’ lenses) allow you to tilt and shift the lens with respect to the sensor plane- the effect is to remove perspective convergence from a tall building, for example.  This also allows imaging under the Scheimpflug condition, where the plane of best focus is not parallel to the sensor plane (useful for photographing hillside landscapes, for example). Macro lenses are designed for close-up focusing and reproduction ratios approaching or exceeding 1:1.

Just as the focal length of a lens is unrelated to the focus distance, the rear focal length is not the same as the distance between the rear of the lens and the sensor.  Wide-angle lenses, in particular ‘retrofocus type’ designs, set the distance between the rear element and sensor to be larger than the rear focal length- the rear principal plane is located in the space between the rear element and sensor.  This is to be compared with ‘telephoto’ designs, which place the rear principal plane out in front of the front element, making the lens physically shorter than its focal length.

Something else to consider is whether the camera can be operated with *no* lens attached.  This will give you additional flexibility in your choice of lenses and the ability to work with a bellows attachment- but then you will most likely have to work in ‘full manual’ mode.

Everyone has their own opinions about what lens (or lenses) are ‘the best’. Lens performance is critical here, and time spent doing research to get the best possible lens you can afford will result in a lens you will be very happy with for a long time.

Lens aberrations: This is also a large subject, so only a brief synopsis will be presented here.

Aberrations occur because the paraxial approximation fails- that is, sin(θ) ≠ θ. The paraxial approximation is very accurate for small angles: the error is about 0.16% at f/5, and only about 1.3% at f/1.8. The next term in the expansion (θ^3/3!) is the dominant error term, and represents ‘3rd order aberrations’, ‘Seidel aberrations’, or ‘primary aberrations’.   Each primary aberration (piston, tilt, defocus, distortion, coma, field curvature, astigmatism, and spherical) represents an independent deviation of the aberrated wavefront with respect to a reference sphere. These deviations are:
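
As a sanity check on those percentages, the size of the small-angle error can be computed directly. This is a minimal sketch, assuming the marginal-ray angle is θ = arctan(1/(2N)) for f-number N (conventions for the marginal-ray angle vary, so the figures land close to, not exactly on, the values quoted above):

```python
import math

def paraxial_error(f_number):
    """Relative error of the small-angle approximation sin(theta) ~ theta
    for a marginal ray at angle theta = arctan(1 / (2N))."""
    theta = math.atan(1.0 / (2.0 * f_number))
    return (theta - math.sin(theta)) / math.sin(theta)

for n in (5.0, 1.8):
    print(f"f/{n}: {100.0 * paraxial_error(n):.2f}% error")
```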

Piston- Piston is a constant shift in the wavefront phase.

Tilt- Tilt is a (spatial) linear shift in wavefront phase.  Neither piston nor tilt affects image quality, and both are usually neglected.  Similarly, we will pass over defocus, since camera lenses can adjust the focus.

Distortion- Distortion is defined as the variation of magnification with image height. Straight lines do not remain straight: with barrel (negative) distortion, lines bow outward; with pincushion (positive) distortion, lines bow inward. Distortion can vary with focus distance and is very noticeable: you can detect 0.5% distortion easily. This is often a dominant aberration in camera lenses because, in contrast to the other primary aberrations, the amount of distortion does not vary with aperture size. Fisheye lenses (intentionally) have distortions typically approaching 100%. Landscape and architectural photography in particular are very unforgiving of distortion.
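
Radial distortion is often modeled as a polynomial in the image height; the sketch below uses the simplest such model (the cubic term only), with an illustrative coefficient rather than one taken from any real lens:

```python
def distort(x, y, k):
    """Apply the simple radial distortion model r' = r * (1 + k * r^2)
    to a point (x, y) in coordinates normalized to the image half-diagonal.
    k < 0 produces barrel distortion, k > 0 pincushion."""
    scale = 1.0 + k * (x * x + y * y)
    return x * scale, y * scale

# 0.5% barrel distortion at the edge of the frame (r = 1):
x, y = distort(1.0, 0.0, -0.005)
print(x)  # 0.995: the edge point is pulled 0.5% toward the center
```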

Coma- Coma is defined as variation of magnification with aperture: rays crossing the aperture plane at different heights cross the image plane at different heights. Points appear as small ‘comet’ shaped blobs (hence the name). This is particularly distracting when looking at point sources e.g. stars or distant lights.

Field curvature/Petzval curvature- The image plane is not flat; it is instead a section of a sphere.  The center of the image is in focus while periphery is out of focus, or vice-versa. “Plan” lenses are corrected for field curvature over 95% of the image.

Astigmatism- This aberration, like coma, breaks the rotational symmetry of the optical system.  The two orthogonal directions are called ‘tangential’ and ‘sagittal’, and rays in these planes focus to two different image planes.   The effect is that defocus blur will be preferentially oriented, becoming rotationally symmetric at an intermediate plane of focus (the ‘medial’ plane). ‘Anastigmatic’ lenses are corrected for astigmatism.

Spherical aberration has become more familiar due to the quality of ‘bokeh’ (defined below).  Spherical aberration is defined as the variation of focus with aperture: rays that cross the aperture stop at different heights are focused at different planes. Spherical aberration is always present in lenses made of spherical surfaces. ‘Aplanatic’ lenses are corrected for both spherical aberration and coma. Use of aspherical surfaces to fully correct spherical aberration is becoming more common as manufacturing technology improves.

Chromatic aberrations: These are not related to the aberrations above, but refer instead to the dispersion of the lens. The two primary forms are lateral and axial chromatic aberration. Lateral chromatic aberration results in in-focus elements showing rainbow-like colored fringes, while axial chromatic aberration results in points appearing as colored bursts (and is associated with spherochromatism, below).  The effect is more pronounced in out-of-focus components of an image.  Use of different types of glass (e.g., the crown-flint achromatic doublet) reduces chromatic aberrations. Achromatic means 2 colors focus to the same plane; apochromatic means 3 colors focus to the same plane; superachromats are corrected for 4 colors. Achromats will be corrected for a blue and a red wavelength (generally the 486.13 nm hydrogen F-line and 656.28 nm hydrogen C-line); apochromats will also be corrected for an intermediate wavelength (generally the 587.56 nm helium d-line).

The variation of spherical aberration with wavelength is called ‘spherochromatism’, and this can be a dominant aberration in a well-corrected lens. Often the term ‘purple fringing’ is used to describe the effect, as objects will show dominant magenta or purple fringes.

As the aperture increases, aberrations grow in magnitude and, additionally, higher-order aberrations become non-negligible (the 5th-order term is θ^5/5!, then 7th order, etc.). Some high-end lenses are corrected all the way out to 9th order.

Falloff– Images will be brighter in the center than in the periphery.  The edges of the image correspond to large incident angles of illumination, and so the geometric obliquity factor cos^4 becomes important. The projected areas of both the object and sensor become smaller by factors of cos(θ), and the distance to an off-axis point grows by a further factor of 1/cos(θ), so the intensity falls off as cos^4 in total.  The effect is especially noticeable with wide-angle lenses at low f-numbers: the edges of the image are noticeably darker than the center.  This can be used to your advantage, by naturally drawing the viewer’s attention to the center of the image.  Digital cameras may incorporate ‘flat field correction’ to compensate for this.
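
The cos^4 law is easy to evaluate; this sketch (the field angle is illustrative) shows how quickly the corners darken:

```python
import math

def relative_illuminance(field_angle_deg):
    """Relative illuminance at a given field angle under the cos^4 obliquity law."""
    return math.cos(math.radians(field_angle_deg)) ** 4

# A hypothetical wide-angle lens with a 45-degree half field of view:
print(f"{relative_illuminance(45):.2f}")  # 0.25: the corners are two stops darker
```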

Flare– nonimaging light that does not pass normally through the lens, but instead enters the lens at an extreme angle and reflects off an interior surface before reaching the sensor.  The usual effect in the image is a row of small bright images of the aperture (aperture ghosting). Typically associated with sunlight, lens flare can be controlled by a variety of methods including use of a lens hood, coating the interior structure with diffusing black paint, and adding internal baffles.  Glare can also occur from reflections between a filter and the front element, or between the first few front elements in the lens itself.  Often the result is desaturation of color.

Bokeh is a Japanese term, added to the photographic vocabulary fairly recently. It refers to how out-of-focus objects are imaged to form a compositional element of the overall image.  Ideal bokeh in background objects is produced by undercorrected spherical aberration, with the result that out-of-focus bright objects gently blur into the background.  Overcorrected spherical aberration produces bokeh characterized by a bright halo around background objects, and is considered unattractive.

Image stabilization: lens/camera/tripod, mirror lockup: There are situations where camera motion becomes problematic: long shutter times (producing motion blur) and long telephoto lenses (high magnifications).  Also, these cameras and lenses are often *heavy*. The image can be stabilized using a variety of technologies: the most basic is a tripod/monopod.  With the advent of electronic sensors, manufacturers have been introducing motion-compensating mechanisms within the camera body and/or within the lens itself.  Different manufacturers use different motion-compensation technologies, and there are debates regarding the advantages of each. Regardless, image stabilization allows you to take sharp images at shutter speeds several stops slower (or at a correspondingly smaller aperture) than would otherwise be possible. ‘Mirror lockup’ is a technique that was developed to allow mechanical vibrations from the moving mirror to damp out prior to making the exposure- pressing the shutter once raises the mirror, and pressing it again exposes the sensor.  Lastly, a shutter release cable (or remote electronic triggering) is used to prevent camera motion during the press of the shutter release.

Filters: In addition to the use of color filters (either electronically or with a gel), other filters can be attached to the lens, usually via a screw thread at the front surface. A basic filter is a ‘UV blocker’, which reflects ultraviolet radiation and also places a protective glass surface in front of the lens.  Some people attach a UV filter to their lenses rather than carry lens covers.  There are also gradient filters, which present a gradient (either neutral density or colored) across the front element- this can be used to even out the illumination in a scene containing a very bright region (sun, bright sky, etc.) and a dark region (shadows, etc.). These filters can be rotated to obtain the optimal orientation.  Polarizing filters come in two varieties, linear and circular, and are used to control sunlight that has reflected off of a flat surface: water, cars, etc. Using a polarizer when photographing a clear sky will emphasize the natural polarization of the sky. Due to the properties of autofocus sensors, a circular polarizer is generally preferred to a linear polarizing filter- a circular polarizer is a linear polarizer in front with a quarter-wave retarder behind.  Use of either a polarizer or gradient filter on a zoom lens should only be performed on lenses that do not rotate the front barrel during zoom; otherwise the filter will rotate with the lens, preventing control over its orientation.

Rule of 16: The ‘rule of 16’ was developed during the film era, and it states: optimal exposure in bright sunlight is f/16, with the shutter speed set, in seconds, to 1/ISO.  That is, slow film (ISO 100) uses a shutter speed of 1/100 second, while fast film (ISO 1600) would use a shutter speed of 1/1600 second.  Since each full-stop change in f-number halves or doubles the amount of light, the rule of 16 provides a starting point for estimating aperture settings and shutter speeds in other conditions.
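
The scaling to other apertures is mechanical; this sketch assumes simple reciprocity (exposure time proportional to the square of the f-number):

```python
def sunny_16_shutter(iso, f_number=16.0):
    """Rule-of-16 starting exposure for bright sun: 1/ISO seconds at f/16,
    rescaled to other apertures (time scales as the f-number squared)."""
    return (1.0 / iso) * (f_number / 16.0) ** 2

print(sunny_16_shutter(100))       # 0.01   -> 1/100 s at f/16
print(sunny_16_shutter(100, 8.0))  # 0.0025 -> 1/400 s at f/8 (two stops wider)
```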

Hyperfocal distance: The hyperfocal distance is calculated by maximizing the depth of field: when a lens is focused at the hyperfocal distance, objects from infinity to half the hyperfocal distance are rendered in focus.   The analytic result is:
$$H = f\left(\frac{f}{Fc}+1\right),$$
where H is the hyperfocal distance, f the focal length, F the f-number, and c the diameter of the circle of confusion. The hyperfocal distance also forms a series solution: focusing the lens at 1/2 the hyperfocal distance renders objects from the hyperfocal distance to 1/3 the hyperfocal distance in focus; focusing at 1/3 the hyperfocal distance covers objects from 1/2 to 1/4 the hyperfocal distance, etc.  For example, the hyperfocal distance for a 28mm lens set to f/16 on a 35mm camera is about 1.6m. Everything from 0.8m to infinity will be sharp in a photograph taken with this lens focused at an object 1.6m away.

Telephoto lenses are rarely used for hyperfocal distance focusing, as the hyperfocal distance is quite distant with these lenses. For example, the hyperfocal distance for a 200mm lens set to f/16 on a 35mm camera is about 86 meters. Everything from about 45 m to infinity will be sharp in a photograph taken with this lens focused at this hyperfocal distance. This lens isn’t useful for taking a landscape photograph in which you want near objects to be sharp as well.
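
Both figures follow directly from the formula above. The sketch below assumes a 0.03mm circle of confusion (a common but not universal choice for the 35mm format, which is why it lands near, rather than exactly on, the distances quoted):

```python
def hyperfocal_m(focal_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance H = f * (f / (F * c) + 1), converted to meters."""
    return focal_mm * (focal_mm / (f_number * coc_mm) + 1.0) / 1000.0

print(f"28mm @ f/16:  {hyperfocal_m(28, 16):.1f} m")   # ~1.7 m
print(f"200mm @ f/16: {hyperfocal_m(200, 16):.1f} m")  # ~83.5 m
```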

Nodal, Pupil, and Focal planes– this section was added to clarify the large amount of confusing and conflicting information we encountered on many otherwise excellent websites while constructing this buyer’s guide. All optical systems can be analyzed using six (cardinal) points: the front and rear focal points, the front and rear principal points, and the front and rear nodal points.  In addition, the location of the aperture stop (equivalently, the entrance and exit pupils) and the field stop, if there is one, should be known. Although these concepts are used in geometrical optics, it can be helpful to describe the action in terms of physical optics.

Focal points: Geometrically, rays initially parallel to the optical axis are brought to focus at the rear focal point.  More generally, plane waves entering an optical system will focus to points (Airy disks) at the rear focal plane. When a lens is focused to infinity, the sensor plane lies at the rear focal plane.

Nodal points: The front and rear nodal points are conjugate points with unit angular magnification.  Rays passing through the front nodal point with a given angle exit the rear nodal point at the same angle.  For lenses in air, the nodal points are located at the principal points.

Principal points: The intersection of a principal plane and the optical axis is the principal point. Rays that intersect the front principal plane at some height exit the rear principal plane at the same height: principal planes are conjugate planes with unit transverse magnification.  The distance from the front (rear) principal point to the front (rear) focal point is the front (rear) focal length.

Entrance/Exit pupil: the aperture stop limits the cone of light from object points.  The projection of the aperture stop into object space is the entrance pupil, the projection into image space is the exit pupil. When you look into a lens and see the aperture stop, you are actually seeing the entrance (or exit) pupil.  All light that hits the sensor *must* pass through the entrance pupil, aperture stop, and exit pupil.

Confusion arises when discussing panoramic imaging: rotating the camera to capture a large field of view. Rotating a lens about the rear nodal point does not produce motion of the image- swing lens panoramic cameras rotate the lens about the rear nodal point and have a curved image plane.  More typical is ‘stitched panoramic’ images taken with a fixed lens and flat sensor.  In this case, the lens should rotate about the entrance pupil to eliminate parallax error: near and far objects will maintain their relative positions when the lens is rotated about the entrance pupil.

Miscellany: flash, lens adapters/converters: The flash that comes with your camera (if one does) may not meet your needs.  Flash units can attach to your camera, or be designed to work remotely.  Some cameras allow you to control an entire bank of flash units remotely.  Lens converters: because different manufacturers use different lens mounts, there are adapters/converters that can allow you to use lenses made by one manufacturer on a camera made by another.  You may lose some functionality: for example, Nikon ‘series G’ lenses do not have an aperture ring.  New lens mount standards regularly appear (the latest is the Four Thirds mount).  Generally, a camera with a lens mount that places the lens close to the sensor can be easily adapted to fit a lens using a mount that places the lens far from the sensor (the adapter is simply a spacer).

I want to get the best digital camera there is and I don’t care how much it costs.

A: PF does not endorse any camera manufacturer or lens manufacturer.  Our members are happy to answer any detailed questions, or discuss particular cameras and lenses with you.

14 replies
1. DrClaude says:

Nice Insight Andy.

One thing I wonder about is why there is still an "R" in DSLR, now that we don't use film.  Is there any advantage to not having the sensor exposed all the time?  I can't think of many disadvantages, like the mirror vibrations you discuss yourself.

• Andy Resnick says:

Good question.  If I understand you, the sensor can't be exposed all the time, because then there's no way to set an exposure time.  Mirrorless cameras have an electronic mechanism to 'wipe' the sensor prior to an exposure, and the main advantage to mirrorless systems is as you say- there's no mirror that moves, not only reducing vibrations but also decreasing the distance between lens mount and sensor, allowing for smaller lenses.  Mirrorless cameras can easily use lenses designed for rangefinder cameras. On the other hand, mirrorless cameras have a decreased optical throughput, because some of the light is permanently re-directed to the (digital) viewfinder.  I don't know exact numbers, but AFAIK, this represents about a half-stop of light lost.

2. olivermsun says:

Nice Insight Andy.

One thing I wonder about is why there is still an "R" in DSLR, now that we don't use film.  Is there any advantage to not having the sensor exposed all the time?  I can't think of many disadvantages, like the mirror vibrations you discuss yourself.

A few reasons (by no means exhaustive) why the reflex mirror/optical viewfinder still survives today:

1. Optical viewfinder still has fastest response and potentially best color/resolution
2. Off-sensor phase detect AF (receive their light through a semi-silvered reflex mirror and sub mirror) still best for moving/unpredictable subjects
3. Similarly, off-sensor exposure meters (including TTL flash meters) can be useful
4. Optical viewfinder doesn't eat batteries

Also, reasons not to have the sensor exposed all the time include heat/image noise, blooming, etc.

Obviously, EVFs have many unique advantages,  and most of their disadvantages relative to a reflex mirror/optical viewfinder are diminishing as EVFs continue to improve.

3. olivermsun says:

On the other hand, mirrorless cameras have a decreased optical throughput, because some of the light is permanently re-directed to the (digital) viewfinder.  I don't know exact numbers, but AFAIK, this represents about a half-stop of light lost.

Wait, doesn't the EVF usually get fed by the image sensor itself?

4. olivermsun says:

I expect different manufacturers have different approaches.  Often, there is a pellicle beamsplitter that directs some light to the autofocus sensor:

That depicts the so-called SLT (Translucent instead of Reflex) configuration, used by Sony and previously by Canon to remove the need for flipping the mirror at a small cost to the light transmitted to the sensor. In other respects it is essentially the same as a traditional AF SLR and provides an optical viewfinder.

5. Andy Resnick says:

That depicts the so-called SLT (Translucent instead of Reflex) configuration, used by Sony and previously by Canon to remove the need for flipping the mirror at a small cost to the light transmitted to the sensor. In other respects it is essentially the same as a traditional AF SLR and provides an optical viewfinder.

Exactly.

6. olivermsun says:

Exactly.

Hmm, I don't understand why you chose this example then. The SLT is actually an SLR with a Pellicle mirror and an optical viewfinder, so it doesn't explain why mirrorless cameras should lose a fraction of a stop to the EVF.

7. Andy Resnick says:

Hmm, I don't understand why you chose this example then. The SLT is actually an SLR with a Pellicle mirror and an optical viewfinder, so it doesn't explain why mirrorless cameras should lose a fraction of a stop to the EVF.

I think you missed my point- my point is that the throughput is lower on a mirrorless camera as compared to either a reflex or rangefinder camera, it's less relevant where the re-directed light goes.

8. olivermsun says:

I think you missed my point- my point is that the throughput is lower on a mirrorless camera as compared to either a reflex or rangefinder camera, it's less relevant where the re-directed light goes.

I think I got your point, but I am disagreeing that the "throughput" of a mirrorless camera has to be any different from that of a reflex or rangefinder camera.

I replied about your SLT example because, while it's true that SLTs have lower throughput than a traditional SLR, they are not mirrorless cameras, so I don't think they are a relevant example.

Finally, I do think it's relevant to explain where the re-directed light goes. If, as on a modern mirrorless camera, the light isn't re-directed anywhere but it goes straight to the image sensor, just as it does in a digital rangefinder or an SLR in "live view" mode with the mirror flipped up, then why would the mirrorless camera have any lower throughput than the others?

9. Andy Resnick says:

I think I got your point, but I am disagreeing that the "throughput" of a mirrorless camera has to be any different from that of a reflex or rangefinder camera.

Maybe it would be helpful to identify the specific camera you are thinking about?

10. olivermsun says:

The Sony A6xxx and A7 series are current mirrorless cameras with very similar sensors to several (e.g., Nikon) DSLRs, and they show low-light performance very comparable to same-format DSLRs. Olympus m4/3 cameras also perform just fine, with allowance for the smaller sensor size.

But, I think, for the purposes of understanding the DSLR-like choices on the market, it's more important to figure out what are the different fundamental constraints of various configurations and what creates those constraints. That's why I've been asking why you've said mirrorless cameras lose low-light sensitivity because they re-direct light somewhere other than the image sensor.

11. Andy Resnick says:

The Sony A6xxx and A7 series are current mirrorless cameras with very similar sensors to several (e.g., Nikon) DSLRs, and they show low-light performance very comparable to same-format DSLRs. Olympus m4/3 cameras also perform just fine, with allowance for the smaller sensor size.

Ah- this is helpful.  As I said, different manufacturers implement technologies in their own way.  The Sony A7 series embeds the AF sensor into the main image sensor, so there's no pickoff and no loss of light.  Similarly, use of an electronic front curtain shutter allows for elimination of one of the mechanical shutters (I think a back curtain shutter is still required).

https://www.mhohner.de/newsitem2/efcs

12. Andy Resnick says:

Update: Wide-angle/ultra-wide-angle/fisheye lenses

Note, unless otherwise specified, all focal lengths are in terms of the 35mm image format.

Recently, there has been a flurry of new high performance ultra-wide angle lenses introduced to the consumer market. The imaging properties of these lenses are very different from other photographic lenses; the technique used with these lenses is also very different. Because they have not been easily available for very long, many people have not had a chance to use one, so I thought an ‘update’ about this type of lens may be of value. There are many online guides explaining how to compose a photograph using an ultrawide, for example:

https://digital-photography-school.com/how-to-get-the-best-results-from-ultra-wide-lenses/

These lenses have fields of view significantly wider than your natural vision, which for each eye is about 55 degrees, roughly equivalent to a 45mm focal length lens. Until fairly recently (around 2005), high-performance camera lenses with focal lengths shorter than 18mm were generally difficult to find. Today, interchangeable-lens camera users have access to many ‘rectilinear’ lenses with fields of view exceeding 110 degrees.

The distinction between wide angle and ultrawide angle is fuzzy, but a good rule of thumb is that if the lens focal length is shorter than the short side of the sensor (24mm for 35mm format), the lens is called ‘ultrawide’. ‘Fisheye’ lenses have fields of view up to and exceeding a full hemisphere. The primary distinction between a fisheye and an ultrawide is that an ultrawide lens is designed to minimize distortion, while the fisheye (including design variants such as orthographic projection lenses and f-theta lenses) is designed to incorporate significant amounts of distortion. In images acquired with an ultrawide lens, straight lines (ideally) stay straight- ultrawides are ‘rectilinear’ lenses.

The shortest focal length currently available (2017) is a 10mm “hyperwide” lens sporting a field of view of 130 degrees. Several companies provide 11mm focal length lenses, others offer 12mm, and 14mm/15mm lenses are now fairly commonplace. There are even several ultrawide zoom lenses and an ultrawide tilt-shift lens. To give you a sense of how the field of view varies with focal length, see this image:

While not indicated, a 50mm lens would only see the central group of 2 bookshelves and staircase.
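
For a rectilinear lens, the diagonal field of view follows from simple geometry: FOV = 2·arctan(d/2f), where d is the sensor diagonal (about 43.3mm for the 35mm format). A quick sketch:

```python
import math

def diagonal_fov_deg(focal_mm, sensor_diag_mm=43.27):
    """Diagonal field of view of a rectilinear lens: 2 * atan(d / (2 f))."""
    return 2.0 * math.degrees(math.atan(sensor_diag_mm / (2.0 * focal_mm)))

for f in (10, 14, 24, 50):
    print(f"{f}mm: {diagonal_fov_deg(f):.0f} degrees")
# 10mm comes out to roughly 130 degrees, matching the hyperwide figure
# quoted above; 50mm comes out near 47 degrees.
```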

Historically, wide-angle lens design began in 1862 and (in my opinion) reached a zenith with the Hypergon and Metrogon/Topogon lenses.

Note, however, these lenses were designed for large format film. Ultrawide angle lenses have several design issues unique to short focal lengths including falloff, flare, fabrication, filters, and (f)chromatic (f)aberrations.

Many design constraints for ultrawides are principally driven by the relatively large distance between the lens mounting flange and the sensor. On most cameras, this distance is longer than the lens’ back focal length, and the general lens design is known as a ‘retrofocus’ design, sort of the opposite of a telephoto lens design. Lenses placed closer to the image plane in general are easier to design and fabricate. Because of this, wide-angle lenses were developed for rangefinder cameras (and can be used on mirrorless cameras) well before SLR cameras. Rangefinder camera users had several choices for many decades. However, to the best of my knowledge, the only viable SLR ultrawides for many years were Nikon’s 15mm and 13mm lenses introduced in the late 1970s and early 1980s and discontinued shortly thereafter.

While some development of APS-C-format ultrawide lenses occurred in the early 2000s, modern high-performance 35mm-format ultrawides didn’t really appear until around 2010, when Nikon, Canon, and Zeiss each introduced new high-performance fixed and zoom ultrawide lenses. More recently, additional companies have introduced high-performance ultrawide angle lenses for full-frame 35mm cameras, with focal lengths going all the way down to 10mm (the Voigtländer hyperwide). There are several technical reasons why ultrawide lens design has lagged behind telephoto lens design.

One reason is the size and shape of the front element. The shorter the focal length, the more strongly curved the front element must be- similar to fisheye lens design. As extreme examples, Sigma’s 14/1.8 lens has an 80mm aspherical front element, while the Nikkor 13/5.6 front element measures 110mm across. These types of front elements are incredibly difficult to manufacture and awe-inspiring to observe.

Strongly curved front elements are one reason why ultrawide lenses can be especially susceptible to flare/glare. Lenses designed for mirrorless/rangefinder mounts have smaller diameters, as the lens is placed physically closer to the sensor, which helps to reduce flare.

A second technical difficulty is correcting transverse chromatic aberration due to dispersion of those large, highly curved, lens elements. In general, the designs of all of these new lenses were enabled by the introduction of new types of optical glass (extra low dispersion and anomalous partial dispersion glass types) by the major foundries.

Falloff, the decrease of optical throughput with image height, occurs because the entrance pupil, the projection of the aperture stop into object space, does not remain constant as the object point moves off the optical axis: at more oblique incident angles, the round hole appears as an elliptical hole. Correcting for falloff becomes more difficult as the field of view increases. Two non-optical approaches are the rotating ‘star fan’ some Hypergons came supplied with, and center filters.

One potential problem unique to digital sensors is caused by the Bayer filter: ultrawide lenses generate rays that enter the sensor at extreme angles due to the large field of view. The Bayer filter may not perform the same way for these rays as it does for rays that enter at more moderate angles. As a result, there can be chromatic effects around the periphery of the image, independent of any uncorrected transverse color aberration.

Similarly, the huge front element generally makes the use of filters difficult or impossible. Aside from the increased size of the filter, there often is no place for a filter in front of the lens- some 3rd-party filter makers (Cokin, etc.) have developed contraptions allowing the use of large square filters, and the old Nikkors have a slot in the rear of the lens. Using polarizers will create a strong effect in the sky (whether this is good or bad depends on your point of view). Again, the smaller-diameter lenses associated with rangefinder/mirrorless mounts have more filter options.

What’s new for SLR users is also true for other camera mounts: for C-mount cameras, for example, there are now 1.3mm focal length lenses with 135 degree field of view which are equivalent to the 10mm hyperwide.

How low can the designers go? It’s unclear, because I’m not sure how the front element scales with the focal length. Seeing how large the Nikkor 6mm/2.8 and Coastal Optics (Jenoptik) 7.45mm/2.8 front elements are gives some idea- just note that those are fisheye lenses, so the front element is quite a bit smaller than what would be required for a rectilinear lens of similar focal length. For comparison, the large format 60mm Hypergon has a 135 degree field of view, scaling this to a 35mm format gives a 7mm equivalent focal length- the 10mm hyperwide approaches the 60mm Hypergon in field of view, but provides improved throughput.

In summary, the recent explosion of high performance ultrawide lenses provides photographers a hugely expanded range of choices at the ‘other’ end of focal length.