‘Mistakes’ in lenses are called lens aberrations. The most important thing to remember is this: no lens in the universe is completely free of aberrations. That includes our eyes.
This article is written for someone who wants to understand lens issues quickly and practically, without having to take a physics course. After all, if we discover a lens has aberrations, there’s very little we can do except send it back for inspection. All we really need to know is:
- Which aberrations are the most pronounced, and
- Will they affect our style of shooting?
What is a lens aberration?
Technically, the word aberration can be used in various ways:
- Anything that deviates from the norm.
- Strictly optical defects arising from the inability of light rays to focus on one point.
- Any optical defect in a lens.
These definitions are great for scientists and engineers, but for cinematographers and photographers, when I say ‘aberration’, I mean:
Any optical effect that deviates from the norm.
I didn’t say ‘defect’, but effect. Some ‘defects’ create character. If a lens distorts or vignettes (don’t worry if you don’t understand these terms, we’ll get to them) in a way we find aesthetically pleasing, then it really isn’t a defect, is it? Depending on how you see it, almost every lens defect can be used to create a work of art.
So, rather than think of lens aberrations as bad things, let us look at them as we look at human qualities. Aberrations produce character. All we should care about is: Do we like the characteristics of a lens or not?
A few things I won’t be covering
I won’t be covering the following topics in this article, as they are special cases not always relevant:
- Fisheye and other special-effects lenses
- Infrared and other forms of electromagnetic radiation
- Other scientific aberrations not relevant to filmmaking
I’ve written about how to select lenses in What lens to get?, and that might be a good place to start if you are absolutely new to camera lenses.
Let’s begin. Think of the following process as entering an automatic car wash. Check one thing at a time, and come out clean with the least amount of trouble, money and time spent.
The Airy Disk and Diffraction
Light isn’t fair. There is an aberration that is caused not by the lens, but by the nature of light and the universe itself.
When light hits a sensor, it is never the perfect point that optical diagrams make it out to be. In Airy Disk and Pixel Density of the Human Eye, I’d written about how light spreads into the shape of an Airy disk:
The circular rings around an already-fuzzy blob are diffraction patterns/rings. When you stop down a lens, you increase the ‘blobbiness’ of a ray of light, and this results in a less sharp image. For this reason, beyond a certain point, stopping down your aperture makes the effects of diffraction very noticeable.
However, the little-known fact about diffraction (as explained in the above link) is that it’s always there. The resolving power of a lens is directly tied to the effects of diffraction. To test your lens, fix your camera on a tripod and shoot a newspaper at different apertures, without changing the focal length or point of focus. You will discover that your lens is sharpest (we’ll cover sharpness later) at only one or two aperture values.
If resolution is not limited by imperfections in the lens but only by diffraction, the lens is said to be diffraction limited. Since you can’t escape diffraction, you ideally want diffraction limited lenses.
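How quickly the Airy disk grows as you stop down is easy to estimate with the standard approximation d ≈ 2.44 × wavelength × f-number. Here is a minimal sketch; the wavelength (green light) and the f-numbers are illustrative assumptions, not values from this article:

```python
import math  # not strictly needed here, but handy for related optics maths

def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Approximate diameter of the Airy disk out to the first dark ring,
    d ~ 2.44 * wavelength * f-number. Default is green light (550 nm).
    Result is in micrometres."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for n in (2.8, 5.6, 11, 22):
    print(f"f/{n}: Airy disk ~ {airy_disk_diameter_um(n):.1f} um")
```

Compare these diameters with your sensor’s pixel pitch (typically a few micrometres): once the disk grows larger than a pixel, stopping down further visibly softens the image, which is exactly what the newspaper test above reveals.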
The Image Circle
The first thing you need to look for in a camera lens, other than the price, is whether the damn thing will cover the sensor area or not.
Light that comes out the back end of a lens (did that come out right?) has a circular shape on a perpendicular plane surface.
At a particular flange focal distance, a lens is designed to cover a particular sensor, which should fall within the image circle. In the left image, the sensor is too big, and the lens cannot throw light on the whole area. Ideally, you want a lens that can fully cover any given sensor, as shown in the middle.
When you have the third situation, your brain gets fuzzy.
An image circle does not always have a sharp edge. Instead, the light falls off gradually, so there’s a difference in light levels between the corners of the image and the center region. This is called fall-off, and it gives rise to the next aberration.
It’s a hung jury whether vignetting is acceptable or not:
Vignetting is just dark sides and corners. With today’s technology, you can easily add a vignette in post production, in any size and shape you want. Still, one might feel it is faster to have a vignetting lens.
To test for vignetting or fall-off, shoot an evenly lit light-colored wall at different apertures. You will usually find that the widest apertures display the greatest vignetting. As you stop down, you will come to an aperture at which point vignetting is at its lowest.
To put things very generally, a lens performs best in a middle range of apertures: wide open, vignetting (and most other aberrations) are at their worst; stopped all the way down, diffraction takes over. The aperture range in between, where both effects are at their lowest, is the sweet spot of the lens. Finding the sweet spot is one of the first things you should do with a new lens.
Focal length, Angle of View and Depth of field
You should choose focal lengths based on your unique requirements, as explained here.
For a simple overview of the angle of view, read this. I prefer the horizontal angle of view of a lens, which, in simple terms is: How much area does the lens cover horizontally?
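The horizontal angle of view follows directly from the sensor width and the focal length: AOV = 2 × atan(sensor width / (2 × focal length)). A quick sketch, assuming a full-frame (36 mm wide) sensor; the focal lengths below are just examples:

```python
import math

def horizontal_aov_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view in degrees:
    2 * atan(sensor_width / (2 * focal_length)).
    Default sensor width assumes a full-frame (36 mm) sensor."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

for f in (24, 50, 85):
    print(f"{f} mm lens: {horizontal_aov_deg(f):.1f} degrees horizontal")
```

Swap in your own sensor width (e.g. roughly 23.5 mm for APS-C) to see why the same focal length covers a narrower slice of the scene on a smaller sensor.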
Understanding this will give you ideas on dealing with different spaces. Something that goes hand in hand with the angle of view is the depth of field (DOF):
At any given time, only a certain section of the scene can be in focus. When you focus on the horizon, you are said to have infinity focus. When you can focus from a certain distance all the way to the horizon, you are using the maximum DOF possible with your lens, and this focus distance is called the hyperfocal distance. A good lens must always allow you to have infinity focus.
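The hyperfocal distance can be computed with the standard formula H = f²/(N·c) + f, where f is the focal length, N the f-number and c the circle of confusion. A minimal sketch; the 0.03 mm circle of confusion is a common full-frame assumption, and the lens values are just examples:

```python
def hyperfocal_m(focal_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance H = f^2 / (N * c) + f, converted to metres.
    coc_mm is the circle of confusion; 0.03 mm is a common
    full-frame assumption (use a smaller value for smaller sensors)."""
    return (focal_mm ** 2 / (f_number * coc_mm) + focal_mm) / 1000.0

# Example: a 50 mm lens at f/8
print(f"Focus at {hyperfocal_m(50, 8):.1f} m and everything from roughly "
      "half that distance to infinity is acceptably sharp")
```

Note how stopping down shortens the hyperfocal distance, which is why landscape shooters at small apertures can hold both foreground and horizon in focus.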
The most important property of a lens that controls the ‘length’ of the depth of field is the size of the aperture. At wide apertures, the depth of field is smaller than when the lens is stopped down:
In the above image, the blurred background was caused by an aperture of f/5.6, while the sharp background was caused by an aperture of f/32.
Understanding depth of field is critically important. You should have a complete grasp of which portions of your scene are in focus, which aren’t, and how the first transitions into the second. In the above image of the flowers, there is hardly any transition. But if you’re trying to create drama or stick with continuity (in whatever form) you will need to have a keen eye for the way in which the out-of-focus elements ‘gel’ with those parts that are in focus:
Generally, when people think of the depth of field, they think of a box (or a wall). This is perfectly fine, because the sensor of the camera is usually aligned to the optical center of the lens, so whatever you point to will have a plane of focus that is parallel to the sensor:
See how the DOF region is parallel to the sensor? If you want the three tallest leaves to be in focus, you will either need to stop down, or tilt your camera so the plane of focus tilts with it and stays parallel to the leaves:
You can see how having an instinctive understanding of the plane of focus is helpful. However, there are some scenes where you can’t ‘get on top’. Landscapes are one good example. How do you stay on the ground yet apply the tilt principle?
Nobody said the sensor and lens have to be parallel!
The above image shows a view camera, but it isn’t any different from any other type of camera. The back part is where the sensor is, and the front part is the lens. You can see from the dotted lines that they are not parallel. By angling the sensor plane or the lens plane relative to each other (called tilting) we can actually control the plane of focus.
This famous ‘trick’ is called the Scheimpflug principle. Okay, modern cameras don’t have bellows, so how does one tilt anything? There are two solutions:
- Buy a lens that is designed with the tilt-shift function.
- Buy a bellows or similar device, and put it between the lens and the sensor.
Shifting is the art of moving the sensor up or down while keeping the lens centered on its optical axis. The reverse (moving the lens instead) works as well. Why would you want to do this? The coolest reason to shift is to keep straight lines straight:
When you stand at the foot of a tall building and shoot upwards, the top of the building recedes into a point. The shift feature allows you to keep the building parallel (within certain limitations of course) as shown in the above image (second vs third photograph).
Let’s continue with our idea that one should pay attention to the areas in focus, out of focus and anything in between. Bokeh strictly applies to the out-of-focus areas. I’ve already covered bokeh in What is Bokeh? so I won’t be going into detail here. Suffice to say I define it as the blurred regions of the out-of-focus areas of a scene.
The out-of-focus areas of a scene are part of the image, so the better their aesthetics are integrated with the overall message of the frame or scene, the better the result.
In Part Two we’ll look at more lens aberrations, and finish this car wash.