What Every Matchmove Artist Needs to Know About Lenses

…but is afraid to ask

Thomas Angarano
7 min read · Jun 20, 2018
A Cooke lens on a RED Scarlet camera

In this article, we focus purely on the lens, a component many consider the most influential factor in the look of a film. But how does it influence the way we matchmove?

What is lens distortion and how does it affect me?

When taking a picture or filming, the job of the lens is to direct beams of light onto the film or image sensor. In reality, lenses are not perfect at performing this job, and light from a straight line in the scene often ends up projected as a curved line, which results in a distorted image. This is called lens distortion. The most straightforward types of lens distortion are barrel distortion, where straight lines curve outwards, and pincushion distortion, where straight lines curve inwards. Lens distortion is usually more pronounced towards the edges of the frame.

Image showing the differences between barrel and pincushion distortion.
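For a concrete (if simplified) picture of what that means numerically, here is a minimal sketch of a one-parameter radial distortion model in Python, in the spirit of the Brown-Conrady family of models many tools use. The function and the k1 values are illustrative assumptions, not any particular application's model; real lenses typically need several coefficients, and sign conventions vary between packages.

```python
def distort(x_u, y_u, k1):
    """Apply one-parameter radial distortion to normalised,
    centred image coordinates ((0, 0) = optical centre).

    In this common convention, k1 < 0 pulls points towards the
    centre (barrel distortion) and k1 > 0 pushes them outwards
    (pincushion distortion).
    """
    r2 = x_u * x_u + y_u * y_u     # squared distance from centre
    scale = 1.0 + k1 * r2          # distortion grows with radius
    return x_u * scale, y_u * scale

# A point near the edge of frame moves noticeably; a point near
# the centre barely moves at all.
print(distort(0.9, 0.0, k1=-0.1))   # barrel:      ~(0.8271, 0.0)
print(distort(0.9, 0.0, k1=+0.1))   # pincushion:  ~(0.9729, 0.0)
print(distort(0.1, 0.0, k1=-0.1))   # near centre: ~(0.0999, 0.0)
```

Note how the edge point shifts far more than the centre point: distortion is strongest at the edges of frame, exactly as described above.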

As a result of lens distortion, features in the captured image no longer appear where an ideal, distortion-free camera would place them. Matchmoving applications, however, often assume exactly such an ideal camera as their underlying model when reengineering the camera and the movement of a shot.

Where image features deviate from the assumed position in a perfect camera, their corresponding reengineered 3D positions will not match their real world locations. In the worst cases, this could cause your camera track to fail.

Result of a camera track in PFTrack with and without lens distortion. In the corrected version, the highlighted areas match the straight wall better.

But that’s not where lens distortion’s influence in visual effects ends. For example, the mathematically perfect cameras in 3D animation packages do not exhibit any lens distortion either. Undistorted CG images, however, would not fit the distorted live action plate. Even where 3D packages can artificially distort the renders, the distortion will have to exactly match the real lens’s distortion for the composite to work.

In practice, the effects of lens distortion on the plate (the live action image) will be removed during camera tracking, which makes the matchmoving artist responsible for dealing with lens distortion. As a result, you will get a mathematically perfect virtual camera and undistorted plates. The resulting virtual camera will be used to render the CG elements, which are then composited into the undistorted plates. At this point, we have perfectly matched CG integrated in the undistorted live action plate.

However, with other (non-VFX) parts of the footage still exhibiting lens distortion, your undistorted VFX shots may stand out, even if the CG is perfectly matched. That’s why, at the end of this process, (the original) lens distortion is re-applied to the composited frames. As a consequence, a matchmoving application not only needs the ability to remove lens distortion and export undistorted plates, but must also provide a means to re-apply the same lens distortion to the composited result.
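To make that round trip tangible, here is a hedged continuation of the earlier one-parameter sketch. Removing radial distortion has no closed-form inverse, so one common approach is a short fixed-point iteration; re-applying it is simply the forward model again. This illustrates the principle, not how any particular tracker implements it.

```python
def distort(x_u, y_u, k1):
    """Forward model: ideal (undistorted) -> distorted coordinates."""
    r2 = x_u * x_u + y_u * y_u
    s = 1.0 + k1 * r2
    return x_u * s, y_u * s

def undistort(x_d, y_d, k1, iterations=10):
    """Inverse model via fixed-point iteration: distorted -> ideal.
    There is no closed-form inverse, so we refine an estimate."""
    x_u, y_u = x_d, y_d                      # initial guess
    for _ in range(iterations):
        r2 = x_u * x_u + y_u * y_u
        s = 1.0 + k1 * r2
        x_u, y_u = x_d / s, y_d / s          # invert the radial scaling
    return x_u, y_u

# Round trip: undistort the "plate", then re-apply the same
# distortion -- we should recover the original coordinates.
k1 = -0.1
x_u, y_u = undistort(0.8271, 0.0, k1)        # -> roughly (0.9, 0.0)
print(distort(x_u, y_u, k1))                 # -> roughly (0.8271, 0.0)
```

Production lens models add more radial terms and decentring parameters, but the remove/re-apply symmetry stays the same.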

Types of lenses

There are (at least) two ways of classifying lenses: prime (or fixed focal length) versus zoom, and spherical versus anamorphic.

Prime lenses cannot change their focal length (more on focal length below), whereas zoom lenses can do so within their zoom range. Not being able to change the focal length comes with some advantages for prime lenses. Their simpler design and fewer optical elements normally result in a higher quality image than comparable zoom lenses produce, for example one exhibiting less distortion.

A rule of thumb for matchmoving is that the more information about the real live camera you have, the easier it is to get a good solution. When it comes to collecting this camera information to assist camera tracking, prime lenses have the additional advantage that if you know which lens was used for a shot, you automatically also know its focal length. This is much harder with zoom lenses. Even if you know which lens was used for a shot, you still don’t know the actual focal length the lens was set to. And it is a lot harder to keep a record, ideally frame-accurate, of any focal length changes. The good news, though, is that knowing the type of zoom lens can still help in matchmoving. If nothing more, knowing the range of a zoom lens can provide boundaries when calculating the actual focal length for a frame during matchmoving.

Anamorphic lenses’ breakthrough in filmmaking began with the adoption of widescreen formats. In order to utilise as much of the film surface area as possible, the scene was squeezed horizontally.

With digital sensors, the need for anamorphic lenses is reduced to aesthetic considerations. Common anamorphic lenses squeeze the image horizontally by a factor of 2, which means that, in the digitised image, a single pixel effectively is twice as wide as it is high, compared to the square pixels of spherical lenses.

Comparison between a spherical and anamorphic lens

When matchmoving anamorphic footage, make sure to account for the correct pixel aspect ratio. In the above example, this ratio would be the common 2:1, but there are also lenses with different ratios.
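As a hedged sketch of the bookkeeping involved, the snippet below desqueezes a hypothetical 2x anamorphic recording; the 2048 x 1716 resolution is an illustrative example, not a reference to any specific camera.

```python
def desqueezed(width, height, squeeze):
    """Display resolution and aspect ratio for anamorphic footage:
    the recorded frame is stretched horizontally by the squeeze
    factor, i.e. the pixel aspect ratio is squeeze : 1."""
    display_width = width * squeeze
    return display_width, height, display_width / height

# A hypothetical 2x anamorphic recording at 2048 x 1716:
w, h, aspect = desqueezed(2048, 1716, 2.0)
print(w, h, round(aspect, 2))   # 4096.0 1716 2.39 -- the classic "scope" look
```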

Anamorphic lenses are available as both prime and zoom lenses.

Focal length in matchmoving

Focal length is the most prominent property of a lens. It is often the first thing mentioned in any listing of lenses to distinguish them, and also what differentiates prime and zoom lenses. The focal length, usually denoted in millimetres (mm), defines, for a given camera, the extent of the scene that is captured through the lens. This is also referred to as the (angular) field of view (FOV).

It comes as no surprise that focal length plays its part in matchmoving as well. On the other hand, it may surprise you to hear that focal length is only half the story when it comes to camera tracking. Matchmoving applications are really interested in the field of view rather than any focal length value in mm, and in order to calculate this field of view, they need to know not only the focal length but also the size of the camera’s sensor, or film back.

The relationship between sensor size, focal length (f) and angular field of view (FOV). Note how for the same focal length f, the field of view (FOV) differs for both sensor sizes.
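The relationship in the illustration is easy to compute for a simple pinhole model. A minimal sketch follows (horizontal field of view only; the sensor widths are common published values, used here purely for illustration):

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm):
    """Angular horizontal field of view for a simple pinhole camera."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# The same 35 mm lens on two differently sized sensors:
print(round(horizontal_fov(35, 36.0), 1))   # full frame (36 mm wide): 54.4 degrees
print(round(horizontal_fov(35, 22.3), 1))   # Canon APS-C (22.3 mm):   35.3 degrees
```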

You may have come across this relationship under the term 35mm equivalent focal length. For example, the iPhone 5S’s main camera has a sensor size of 4.89 x 3.67 mm, and its lens has a focal length of 4.22 mm. Its 35mm equivalent focal length, however, is 29mm, which means that in order to get the same FOV with a full frame 36 x 24 mm sensor you would need a 29mm lens, rather than the 4.22mm lens for the iPhone’s smaller sensor. This relationship is sometimes also referred to as crop factor, as explained by RED Digital Camera in Understanding Sensor Crop Factors.
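As a back-of-the-envelope check (using the diagonal crop factor, the usual convention), the iPhone numbers above come out close to the quoted equivalent; published figures are typically rounded:

```python
import math

def equivalent_focal(focal_mm, sensor_w_mm, sensor_h_mm):
    """35mm-equivalent focal length via the diagonal crop factor."""
    full_frame_diagonal = math.hypot(36.0, 24.0)   # ~43.27 mm
    crop_factor = full_frame_diagonal / math.hypot(sensor_w_mm, sensor_h_mm)
    return focal_mm * crop_factor

# iPhone 5S main camera: 4.89 x 3.67 mm sensor, 4.22 mm lens.
print(round(equivalent_focal(4.22, 4.89, 3.67), 1))   # ~29.9 mm
```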

Luckily, the sensor sizes for most digital cameras can be found easily online, for example in the VFX Camera Database, so make sure to always note the camera model as well as the lens when collecting information on the set.

The matter gets a bit more complicated through today’s plethora of different sensor sizes, and the fact that, depending on the format, not all of the sensor is being used to capture images. In the above illustration, it doesn’t matter whether the sensor in the bottom camera is actually smaller than in the top camera, or if it’s just a smaller part of the sensor that has been used due to the chosen format. For example, your camera’s resolution may be 4500 x 3000, which is a 3:2 aspect ratio. If you now plan to shoot HD video, which has an aspect ratio of 16:9, some parts of the sensor will not be recorded in the video. For a full frame sensor, this would reduce the effective sensor size for HD video from 36 x 24 mm to 36 x 20.25 mm, as is illustrated below.

Cropped imaging area (36 x 20.25mm) vs full sensor size (36 x 24mm)

Depending on the sensor size and format, cropping may occur at the top and bottom, as in the example above, or at the sides of the sensor.
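Here is a hedged sketch of that calculation; the function name and the 4:3 example are illustrative, but the 16:9 case reproduces the 36 x 20.25 mm figure from the illustration:

```python
def effective_sensor(sensor_w, sensor_h, aspect_w, aspect_h):
    """Imaging area actually used when recording a format whose
    aspect ratio differs from the sensor's: crop top and bottom if
    the format is wider than the sensor, otherwise crop the sides."""
    target = aspect_w / aspect_h
    if target > sensor_w / sensor_h:
        return sensor_w, sensor_w / target      # crop top and bottom
    return sensor_h * target, sensor_h          # crop the sides

# Full frame (3:2) sensor recording 16:9 HD video, as illustrated above:
print(effective_sensor(36.0, 24.0, 16, 9))   # -> (36.0, 20.25)
# The same sensor recording a 4:3 format would crop the sides instead:
print(effective_sensor(36.0, 24.0, 4, 3))    # -> (32.0, 24.0)
```

It is this effective imaging area, not the full sensor size, that belongs in the field of view calculation.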

Conclusion

The camera’s lens has a huge impact throughout the VFX pipeline, and it is the matchmove artist’s job to mitigate most of this impact. The Pixel Farm’s matchmoving application PFTrack has a wide range of tools to use available information about the lens and camera, as well as to handle situations where no such information is at hand. It also provides the necessary tools to manage all aspects of lens distortion.

References

VFX Camera Database

RED 101: Understanding Anamorphic Lenses

RED 101: Understanding Sensor Crop Factors
