Art and Science vs. Marketing and Commerce

Christopher Daniel Walker
Published in CineNation · 8 min read · Aug 25, 2017
Christopher Nolan’s Dunkirk, shot on 65mm film by Hoyte van Hoytema, ASC, FSF, NSC. Released in 35mm, 70mm, IMAX, and digital formats.

A week after the release of Christopher Nolan’s World War Two epic Dunkirk I was looking at the listings for my local cinema and saw that they had special 35mm screenings alongside standard digital projection. I told my mum about how Nolan is among the strongest advocates for capturing and projecting films on celluloid in an industry dominated by digital technology, and how having a 35mm print at our local cinema was something of a big deal. I looked at her nonplussed reaction and said, “This means nothing to you, does it?” She simply nodded, confirming that my enthusiasm for the subject was not mutual.

Because of my love for the craft I neglect to think about how much viewers, who may be less knowledgeable or simply uninterested, understand or care about the technical aspects and creative choices involved in filmmaking. What should people know about the capture and presentation of films and TV series? Is there a commercial incentive to use image acquisition and exhibition as a marketing tool more than a creative one? Are people knowingly being misinformed about what is broadly held to be a ‘better’ photographic image in order to sell the latest technological innovation?

I grew up witnessing the digital revolution in cinema. During the 90s I saw digital visual effects supplant long-established methods and techniques for realizing the extraordinary and the impossible. With movies like Star Wars: Episode II — Attack of the Clones and Collateral, I saw digital cameras beginning to replace the mechanical wonders used for over a century. I remember the projectors changing from noisy behemoths loaded with film platters to unmanned units loaded with hard drives, built to accommodate the latest era in stereoscopic cinema.

From capture to consumption the film and television landscape of today is being shaped by advancements in digital technology; but in many instances the industries’ manufacturers and distributors seem at odds with the creative and aesthetic choices of filmmakers. The visions of directors and their creative partners are increasingly being overshadowed by limiting and arbitrary demands about how a film or TV series should be imaged and displayed. These demands are made less for the benefit of consumers than for the revenue of the electronics and entertainment industries.

Acquisition

In the age of digital cinematography new cameras are being announced and released at a breakneck pace, with each one promising to deliver better specifications and greater capabilities than its predecessors. Statistics related to resolution, form factor, and dynamic range are improving every year, with companies such as Red, Panasonic, and Sony competing for industry share. The problem, however, is that camera systems are being forced upon filmmakers due to their on-paper technical performance rather than their aesthetic and ergonomic merits.

Bucking the trend, Darren Aronofsky’s The Wrestler (2008), Black Swan (2010), and Mother! (2017) were all captured on Super 16 film stock

Resolution is spoken of as the greatest determining factor in a photographic image, with cameras labelled as 4K, 6K, or 8K. Cameras that are sub-4K or ‘only’ HD are being dismissed by certain studios for projects, even when cinematographers and their crews prefer them for their fidelity and functionality. The industry-standard Arri Alexa, depending on its configuration, is considered to be either a 2K or 3K camera — according to Netflix’s specifications for original content, the Alexa does not qualify.

Even more contentious for producers and studios is the desire of filmmakers and showrunners to shoot on film stock of any kind — regarded by many as an obsolete format. Claims about the high cost of shooting, developing, and scanning film make 16mm, 35mm, and 65mm cameras a harder battle to win. The decision by directors and their cinematographers to choose film can contribute greatly to the texture and aesthetic of a feature or series. Todd Haynes’ Carol, AMC’s The Walking Dead, and select films from Darren Aronofsky all benefit from being shot on Super 16. The 35mm and 65mm cinematography in Christopher Nolan’s, Quentin Tarantino’s, and Paul Thomas Anderson’s films cannot be derided for originating on film instead of the latest large format digital cinema camera.

(In contrast to Netflix, Amazon Studios has shown an openness for filmmakers and showrunners to use a variety of camera systems for its original films and series intended for UHD (Ultra High Definition) and HDR (High Dynamic Range) viewing. The Arri Alexa has been used for the second season of The Man in the High Castle, while 16mm and 35mm film has been used to capture The Wall and The Lost City of Z, respectively.)

Mastering and Exhibition

For the past decade the film industry has utilized the digital intermediate process for mastering features, regardless of their originating medium. Even today the vast majority of films are graded and output at 2K resolution because of the reduced costs and man-hours (for instance in visual effects) compared to an end-to-end 4K workflow. With IMAX and other large format venues, 4K mastering has gathered momentum in Hollywood, although many productions instead opt to take 2K sources and upscale them as an efficient compromise. With the exception of a handful of films (Son of Saul, The Love Witch) and acclaimed directors (Christopher Nolan, Paul Thomas Anderson), photochemical timing and printing has all but disappeared as a viable way to master features.

Marvel/Netflix’s superhero team-up series The Defenders, shot and mastered at 4K

Film projection has been deposed by 2K and 4K digital replacements, with traditional 35mm and large format 70mm prints enduring as special presentations consigned to a limited number of theaters. Despite protestations and accusations of short-changing audiences, IMAX has pushed forward dual digital projection and, more recently, laser projection as replacements for the massive and expensive 15/70mm prints that made it the premium film viewing experience.

The push for UHD/4K in the home by electronics manufacturers has led several television broadcasters and online streaming services to adopt and master their original programming at higher resolutions. Netflix, Amazon Studios, and Hulu provide 4K and HDR versions of their content to subscribers who own the necessary hardware to view them. Despite these moves to ‘future-proof’ material for when, or if, higher resolution TVs become more commonplace, many studios and networks continue to master their series at HD/2K. HBO’s Game of Thrones (shot on the Arri Alexa) and Westworld (shot on 35mm film) each cost $100 million per season to produce and are finished at 2K.

Where feature films have gradually shifted to 4K masters and projection, television has embraced the technology at a more rapid pace. But as home viewers, how many of us are aware of the science that contests manufacturers’ claims and the buzz phrasing of 4K content providers?

The Viewer

There is a contrasting narrative between what purveyors of 4K technology tell us and what science reveals about the limits of our perception. Repeated studies and academic research have shown that the average viewer, under normal viewing conditions, cannot distinguish a substantial difference between HD and 4K sources on cinema screens and home televisions. The human eye has a finite sense of resolution relative to the size and distance of a viewing screen — at a certain point we physically can’t see a difference. With much larger screens, such as Dolby Cinema and IMAX, higher resolution matters, but with standard-sized cinema screens and home televisions you would need to sit uncomfortably close to discern any noticeable degradation in image fidelity.
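To put rough numbers on that limit, here is a minimal back-of-the-envelope sketch in Python. It is my own illustration rather than something drawn from the studies above, and it assumes the commonly cited figure of about one arc minute of angular resolution for 20/20 vision and a 16:9 screen:

```python
import math

# ~0.00029 radians: a common estimate of the finest detail 20/20 vision resolves
ARC_MINUTE = math.radians(1 / 60)

def max_useful_distance(diagonal_inches, horizontal_pixels, aspect=16 / 9):
    """Farthest viewing distance (in metres) at which neighbouring pixels
    still subtend one arc minute, i.e. remain separable to the eye."""
    # Screen width from the diagonal, converted from inches to metres
    width_m = diagonal_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_pitch = width_m / horizontal_pixels
    return pixel_pitch / ARC_MINUTE

for label, pixels in [("HD (1920 px)", 1920), ("4K (3840 px)", 3840)]:
    print(f'{label}: pixels blend together beyond ~{max_useful_distance(55, pixels):.1f} m on a 55" set')
```

For a 55-inch television the sketch gives roughly 2.2 metres for HD and 1.1 metres for 4K: sit much beyond a metre away and, by this estimate, the extra sharpness of 4K is already invisible.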

Under test conditions, what matters more to our perception of image quality is brightness, tonality, and colour. The HDR aspect of newer projection and TV technologies is ignored in favour of telling consumers that 4K is four times sharper than regular HD; 4K does carry four times the pixels of HD, but that works out to only double the resolution along each axis, and there are real world considerations that salesmen neglect to mention.

Steve Yedlin, ASC — Rian Johnson’s cinematographer for Star Wars: The Last Jedi — has extensively researched modern camera technology and our perception of resolution. Some of his findings have been published in an article courtesy of American Cinematographer.

Electronics manufacturers are constantly looking for new and novel ways to get consumers to buy their latest products. Over the last several years TV manufacturers have tried to market new product lines they claim will improve our home viewing experience, often with the help of content vendors. After the new wave of stereoscopic film projection came 3D TV, which failed to achieve the same success. Then came curved and 21:9 television screens, both meant to provide a more immersive viewing experience, and both failures at garnering commercial interest. Today the electronics industry is looking to repeat the successful transition from standard definition to high definition, marketing UHD/4K television as an equally significant leap forward. The truth is that it isn’t.

The digital technology companies that create the cameras, the infrastructure, and the devices through which we watch our media perpetuate a false notion about what constitutes a good-looking image. The entertainment providers who promote the latest innovations are giving us a misleading sales pitch. What does it mean to have the ‘best’ picture? What does it mean to have a fuller, more immersive experience?

Filmmaking and television should not be homogenized, with every feature and series dictated the same medium, equipment, method of post-production, and final exhibition. The latest and greatest digital cinema camera may not be the correct or viable choice for an artist. Because of its aesthetic, shooting on film may be the desirable option for a director and their cinematographer — perhaps shooting on a smartphone (Tangerine) or an 80s-era video camera (No, Computer Chess) is the most appropriate fit. Creative choices do not always align with the expectations of the commercial and entertainment industries.

As viewers, we should place more importance on the voice and integrity of the artist than on marketing tactics and slogans.

Coming soon: I Wish I Could Forget The Hobbit Trilogy

Want more from CineNation?

Subscribe, Like, and Follow us on iTunes, Facebook, Twitter, and Flipboard
