Focus is a fundamental factor in photography. Minor focus errors during shooting can force you to discard a photo, as they are often irreparable. Modern cameras typically use automatic focusing systems, but these systems differ in their underlying principles and settings. Let's start with the basics.

Active and passive systems

Autofocus systems in cameras are divided into two main types: active and passive. Active autofocus works by sending a signal outward: it reaches the target, is reflected, and returns, and from this round trip the exact distance to the subject is calculated. The signal can be an infrared beam, ultrasound, or even a laser. This is how the auxiliary LiDAR modules in smartphones work. ToF cameras operate on a similar principle and are designed for scanning the three-dimensional shapes of objects — they are used in face unlocking systems, in augmented reality, etc.

Active autofocus systems emit certain signals that bounce off the target and provide data on the distance to the subject. 3D ToF in smartphones works on the same principle.

Active systems do not depend on ambient lighting conditions and can provide excellent "vision" in total darkness. However, they have a major drawback: they often cannot focus accurately through transparent obstacles such as glass. For this reason, active autofocus is usually used only as an auxiliary tool, while the main work is done by passive autofocus systems.

Important. Many advanced cameras have a characteristic red lamp that emits a beam of light. It should not be confused with active autofocus — it is merely an assist light that illuminates the subject so the camera can focus in dark conditions.

Passive autofocus systems rely on contrast or phase detection. There is also a hybrid option that combines both methods. Each of these approaches deserves a closer look.

Contrast autofocus

Contrast autofocus is based on software algorithms and on the simplest of principles: an image in focus is always more contrasty than an out-of-focus one. During focusing, the camera steps the lens through a series of focus positions and monitors how the contrast changes. If the contrast decreases, the focus point shifts in the opposite direction; if it increases, the movement continues until the maximum value is reached. It is worth noting that the automation always overshoots the point of peak contrast before recognizing it as the maximum and stepping back to it.
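The search described above is essentially hill climbing. Here is a minimal sketch, with a hypothetical `sharpness(position)` helper standing in for measuring image contrast from the sensor at a given lens position:

```python
def contrast_autofocus(sharpness, position=0, step=4, min_step=1):
    """Hill-climb to the lens position with maximum contrast.

    `sharpness(pos)` is a stand-in for reading image contrast off
    the sensor at lens position `pos` (a hypothetical helper).
    """
    direction = 1
    best = sharpness(position)
    while step >= min_step:
        candidate = position + direction * step
        value = sharpness(candidate)
        if value > best:              # contrast grew: keep moving this way
            position, best = candidate, value
        else:                         # overshot the peak: reverse direction
            direction = -direction    # and search with a finer step
            step //= 2
    return position

# Toy "lens" whose contrast peaks at position 37.
peak = contrast_autofocus(lambda p: -abs(p - 37))   # → 37
```

Note how the routine only detects the peak by stepping past it and seeing contrast fall, which is exactly the overshoot behavior mentioned above.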

Using the contrast method, the camera continues to move the focusing lenses inside the lens until the maximum contrast level is reached (the middle sunflower in the picture).

The contrast autofocus method uses data taken directly from the sensor and requires no additional modules. And since it relies on purely software algorithms, these can be refined almost indefinitely — the performance of contrast autofocus is often radically improved by new firmware.

The downside of contrast autofocus is the hunting effect, when the focus characteristically slides back and forth while trying to find the ideal contrast value. This is not a decisive factor when photographing static scenes, but in dynamic shooting and video recording, focus hunting can be a real problem.

When shooting low-texture subjects like a cloudless blue sky or a white wall, contrast AF may be of no use at all, as the camera simply has nothing to evaluate the contrast against. Such autofocus is also often inaccurate in the dark and poorly suited to fast-moving subjects. If its algorithms are not well tuned, these systems can frequently miss in close-up shooting, especially when the background is much brighter and more contrasty than the subject in the foreground. Moreover, in such conditions the focus can drift even when the focus area is selected manually. Phase detection autofocus was designed to correct these shortcomings.

Phase detection autofocus

These autofocus systems are based on detecting the phase difference of the light streaming through the camera lens. Phase detection autofocus uses special sensors that capture the incoming light with the help of a microlens array. The light is split into two beams, each of which falls on an ultra-sensitive sensor. If the beams land exactly at the center of their sensors, the subject is in focus; if the focus lies in front of or behind the subject, the distance between the beams becomes smaller or larger, respectively.
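The separation between the two beam images tells the camera both how far off focus is and in which direction. A minimal sketch of that comparison, assuming the two half-pupil images arrive as hypothetical 1-D intensity profiles:

```python
def phase_shift(left, right, max_shift=5):
    """Estimate the offset (in pixels) between the two half-pupil images.

    The sign of the result distinguishes front- from back-focus;
    its magnitude says how far the lens must move. `left` and
    `right` are hypothetical 1-D intensity readouts.
    """
    def score(shift):
        # Mean squared difference over the overlapping region.
        n = len(left)
        pairs = [(left[i], right[i + shift])
                 for i in range(n) if 0 <= i + shift < n]
        return sum((a - b) ** 2 for a, b in pairs) / len(pairs)

    # The best alignment is the shift with the smallest mismatch.
    return min(range(-max_shift, max_shift + 1), key=score)

# The "right" image is the "left" image displaced by 3 pixels,
# i.e. the subject is out of focus by a known amount.
left  = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0]
right = [0, 0, 0, 0, 0, 1, 5, 9, 5, 1, 0, 0]
shift = phase_shift(left, right)   # → 3
```

Unlike the contrast method's trial-and-error search, one such measurement yields the required focus correction in a single step, which is why phase detection is so much faster.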

In phase detection autofocus systems, special light-sensitive sensors are responsible for focusing.

Based on the information from the autofocus sensors, the camera immediately calculates how far off the focus is and promptly adjusts it in the right direction. Phase detection autofocus systems are extremely fast and tenacious, especially when tracking moving subjects; however, they are not without flaws.

Phase detection autofocus is the predominant system in SLR cameras. The dedicated phase detection module takes up considerable space in the camera body, and the accuracy of such systems depends directly on the number and type of focus sensors (linear points respond best to detail perpendicular to their orientation — horizontal points work best with vertical detail — while cross-type sensors respond to both and are usually placed in the center of the frame). Entry-level models are usually equipped with simple systems with a few focus points, while professional cameras have complex autofocus systems with an abundance of focus points of various types.

The type of points directly affects the focus performance of autofocus systems. Cross-type and dual cross-type AF points, placed primarily in the center of the frame, are best at capturing the subject.

An important moment when using phase detection autofocus in a DSLR is the half-press of the shutter button: the camera focuses at that instant, and in-focus subjects are marked with frames or dots in the viewfinder. For phase detection to function properly, the mechanism must be precisely calibrated, otherwise the DSLR may suffer from back-focus and front-focus problems.

In mirrorless cameras and smartphones, phase detection autofocus works differently. Special PDAF (Phase-Detection Autofocus) pixels are scattered among the regular pixels that make up the camera sensor. They are not directly involved in constructing the image; they serve solely to assist focusing, capturing light and analyzing the light beam to calculate the distance to the subject. One of the most advanced types of such autofocus is Canon's proprietary Dual Pixel CMOS AF. It is found in the Japanese brand's cameras and in many smartphones (in particular, Samsung models).

In Dual Pixel systems, the pixels on the sensor consist of two halves — they are responsible for both autofocus and image formation at the same time.

In Dual Pixel systems, about 80-90% of all pixels on the sensor are two-component, each consisting of two photodiodes. During focusing, the camera reads each photodiode separately, and when taking a photo, the signals from the two photodiodes are combined to form the final image. As a result, focus is effectively tracked over almost the entire frame. Dual Pixel's signature feature is the cinematic effect of smooth focus transfer as the subject moves toward the camera, keeping it sharp against a blurred background.
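The dual readout can be pictured as follows — a toy sketch (not Canon's actual pipeline) where each pixel site is a hypothetical pair of left/right photodiode values:

```python
# Each Dual Pixel site is a (left, right) pair of photodiode readings.
row = [(4, 4), (6, 2), (9, 1), (6, 2), (4, 4)]

# For focusing: read the two halves as separate images and compare them.
left_image  = [l for l, r in row]
right_image = [r for l, r in row]
mismatch = sum(abs(l - r) for l, r in row)   # 0 would mean "in focus" here

# For the photograph: sum both photodiodes to form the final pixel value.
image_row = [l + r for l, r in row]          # → [8, 8, 10, 8, 8]
```

Because every such site contributes both a focus measurement and a full image pixel, nothing is sacrificed to gain near-full-frame phase coverage.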

Other modern phase detection autofocus systems follow a similar principle. But in high-resolution sensors (108 MP or 200 MP, for example), not all pixels have the two-component structure — the technology has simply not been miniaturized that far yet. Instead, the two-component elements are built into the pixel grid at a certain pitch to ensure the widest possible coverage of the frame area.

Hybrid autofocus

Hybrid autofocus systems combine the advantages of contrast and phase detection under one roof. In simple terms, they work as follows: phase sensors located directly on the image sensor provide the initial focus (determining its approximate position), after which contrast autofocus brings the image to full sharpness.

Hybrid systems thus inherit the strengths of both approaches and have no serious drawbacks. The only notable one is that in focus tracking mode they are still slower than conventional phase detection autofocus. Hybrid autofocus is common in modern cameras and smartphone cameras.

In hybrid autofocus systems, phase detection is used to determine the approximate distance to the subject, and focus is finalized using a contrast method.
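The two-stage scheme can be sketched as follows, with hypothetical helpers: `phase_estimate()` stands in for the on-sensor phase pixels returning an approximate in-focus lens position in one shot, and `sharpness(pos)` stands in for a contrast measurement:

```python
def hybrid_autofocus(phase_estimate, sharpness, fine_step=1, span=4):
    """Two-stage hybrid AF sketch: coarse phase, then fine contrast."""
    # Stage 1: phase detection — fast, approximate, one measurement.
    coarse = phase_estimate()
    # Stage 2: contrast detection — a small local search around the
    # phase estimate finalizes the focus.
    candidates = range(coarse - span, coarse + span + 1, fine_step)
    return max(candidates, key=sharpness)

# Phase detection lands near the target; contrast nails it exactly.
focus = hybrid_autofocus(lambda: 35, lambda p: -abs(p - 37))   # → 37
```

Confining the slow contrast search to a narrow window around the phase estimate is what lets hybrid systems avoid the full-range hunting of pure contrast AF.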

A notable related feature is Post Focus, found in numerous Panasonic mirrorless cameras with a Micro 4/3 image sensor: the camera captures a series of images focused on various areas of the frame, and the user then selects the desired focus point from the resulting photos.

Autofocus modes

From a practical point of view, it is important to understand not only how autofocus systems work, but also the differences between the various focusing modes, which directly affect the sharpness of photographs and recorded video. Let's look at them using cameras as an example:

  • AF-S (AutoFocus Single) — single-frame AF mode for shooting static subjects (landscapes, architecture, etc.). As soon as the camera finds the focus, it immediately locks it on the specified point, completely ignoring any movement in the frame.
  • AF-C (AutoFocus Continuous) — continuous autofocus mode for shooting moving subjects (sports events, cars and people in motion, nimble birds and animals). Focus in this mode automatically adjusts if you or the subject in the frame moves.
  • AF-A (AutoFocus Automatic) — automatic autofocus mode. In this case, the camera independently determines whether the subject is in motion or remains motionless. Accordingly, the automation selects the appropriate focusing mode: AF-S or AF-C.

For both static and dynamic subjects, it is important to choose the appropriate focus mode.
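The AF-A decision described above can be pictured as a simple motion check that dispatches to one of the other two modes. A toy sketch, where `subject_positions` is a hypothetical list of the subject's position over recent frames (real cameras estimate motion from the AF sensors themselves):

```python
def af_a(subject_positions, threshold=2.0):
    """Pick a focus mode the way AF-A does: watch the subject over a
    few frames and switch between single-shot and continuous AF.
    """
    movement = max(subject_positions) - min(subject_positions)
    return "AF-C" if movement > threshold else "AF-S"

af_a([100.0, 100.2, 99.9])     # static scene   → "AF-S"
af_a([100.0, 104.5, 109.3])    # moving subject → "AF-C"
```

The `threshold` value here is arbitrary; the point is only that AF-A is a dispatcher over the two base modes, not a third focusing mechanism.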

The three AF modes above are the basic set. In some camera models the list of modes may be longer, in which case it is worth consulting the manual and/or the camera's on-screen tips.

Artificial intelligence in autofocus systems

Now let's talk about the most interesting feature of modern autofocus systems. Machine learning and artificial intelligence algorithms are being introduced into cameras en masse; they can recognize a subject in the frame and even predict its movements. The best developments in this area are found in top-end mirrorless cameras, which can automatically focus on the faces and eyes of people and animals and identify birds, other living creatures, and various vehicles (cars, trains, and even planes) in the frame.

Smart autofocus constantly analyzes the frame and keeps the subject in focus both in photo mode and when recording video. This improves focusing accuracy and makes the camera easier to use. Moreover, in some advanced models a separate coprocessor handles the operations related to artificial intelligence algorithms. A striking example is the Sony A7R V: its autofocus system can evaluate human poses for smooth focusing and does not lose focus even when the subject is briefly blocked by something in the frame.

Smart autofocus systems use artificial intelligence algorithms and can focus in tracking mode even on the eyes of animals.

Machine learning in autofocus systems is definitely the future. But even today, autofocus requires practically no human assistance and copes quite successfully with keeping the frame properly focused in a wide variety of shooting conditions.

Autofocus in lenses

In the context of cameras, the speed of focusing response is affected not only by the in-camera autofocus system, but also by the type of drive in the lens. There are the following types of motors in autofocus optics:

  • Conventional motor — autofocus drives based on DC micromotors. They work slower than other types and produce audible noise, which can be critical when recording video; however, lenses with this kind of motor are significantly cheaper than the alternatives.
  • Stepper motor — schematically, such drives pass direct current through several groups of coils. Applying current to the groups in the correct sequence causes the motor to rotate one step at a time; the more groups there are, the finer the drive's movement can be made. Stepper drives come with both gear and lead-screw transmissions. In Nikon markings they are often designated AF-P, in Canon lenses — STM. Such motors provide smooth and quiet focusing, but they are slower than ultrasonic motors. Lenses with stepper motors are optimally suited for video recording, which is why videographers often prefer them.
  • Ultrasonic motor — the response speed of such drives is measured in tenths or even hundredths of a second, and they are also practically silent, which can be very important when recording video. In the Nikon optics range, lenses with an ultrasonic drive are marked AF-S, in Sony — SSM, in Canon — USM (in variants such as Micro USM, Nano USM, etc.).
  • Screw drive — with rare exceptions, lenses with this drive have no autofocus motor of their own: they carry only a coupling to the camera, and the motor sits in the camera body itself. The main advantages of screw-drive lenses are their light weight and compact size, but in speed they are inferior to ultrasonic drives, though quite comparable to conventional motors (and sometimes even faster). Nowadays this type of autofocus drive is being abandoned in favor of stepper and ultrasonic motors.
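The coil sequencing behind the stepper motor bullet above can be illustrated with a toy sketch. The coil-group names and the four-step full-step sequence are illustrative assumptions for a simple two-phase motor, not any specific lens drive:

```python
# Full-step sequence for a hypothetical two-phase stepper: energizing
# the coil groups in this order rotates the rotor one step at a time;
# more coil groups (or finer sequencing) would mean smaller steps.
SEQUENCE = ["A+", "B+", "A-", "B-"]

def step_plan(steps):
    """List which coil group to energize for each of `steps` steps.

    Negative `steps` reverses the sequence, i.e. the focusing
    element travels in the opposite direction.
    """
    order = SEQUENCE if steps >= 0 else SEQUENCE[::-1]
    return [order[i % len(order)] for i in range(abs(steps))]

step_plan(6)    # → ['A+', 'B+', 'A-', 'B-', 'A+', 'B+']
step_plan(-3)   # → ['B-', 'A-', 'B+']
```

Because each energizing event moves the rotor by exactly one known step, the drive's position is inherently precise and repeatable, which is what makes stepper focusing so smooth for video.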

Factors affecting autofocus speed

The type of focusing actuator inside the lens is not the only factor affecting autofocus speed. All passive autofocus systems depend entirely on the light transmitted through the lens. In low light conditions, it is much more difficult for autofocus to correctly catch the subject and keep it in focus.

The more light the lens transmits, the better the autofocus performs.

The focus limiter also plays an important role — restricting the lens to the correct range of focusing distances shortens the search and increases the responsiveness of the focusing system, especially in poor ambient light.

There is a direct correlation between autofocus performance and the aperture of the optics: the more light passes through the lens, the better the autofocus sensors operate. Stopping the aperture down to "dark" values (roughly f/5.6 and beyond) lets less light through, which makes the autofocus's job harder.



Automation in focusing systems is a real godsend for photographers and videographers, and with the right approach it can be appreciated to the full. But even automation sometimes fails, so it is important to get into the habit of monitoring focus and correcting its errors in a timely manner. Then the percentage of defective photos and videos will steadily approach zero.