Mobile Journalism’s new focus: the cinematic

A report from MojoFest 2022 by Rob Layton, Assistant Professor of Mobile Journalism at Bond University.

Walkley Foundation
The Walkley Magazine
6 min read · Dec 12, 2022

--

It was apparent from MojoFest 2022, the world’s premier gathering of mobile journalists, that mojo had come of age.

For the first time in its eight-year history, the formerly standalone event was incorporated into the much larger Media Production and Technology Show, held in a cavernous hall in West London that attracted 8,000 attendees over two days in May.

MojoFest took a side stage alongside the most advanced broadcast technology on the planet. Many mobile pioneers were there — the BBC’s Marc Settle, RTE’s Philip Bromwell, 28-time Emmy Award-winner Mike Castellucci — and Sony used the Mojo stage to launch their latest flagship phone.

It prompted several mobile journalism veterans to ask: has mojo peaked?

Gone were the days, it seemed, that this niche but dynamic field of journalism was compelled to defend its validity. No longer was this merely an enthusiastic sub-culture of journalists obsessing over their phones.

Today, every major news organisation incorporates mobile into its news strategy to some degree. Public broadcasters such as the BBC — and the ABC here in Australia — have embraced mojo, rolling it out nationally to their regional journalists.

Christopher Cohen, chief technology officer for the video camera app Filmic Pro, put it this way on Twitter: “The era of novelty is over. The tools of mobile videography must now transcend the systems they formerly emulated.”

Achieving traditional broadcast parity was a significant milestone. But an even greater challenge has been to achieve parity in the minds of legacy practitioners.

Now that mobile is gaining widespread adoption, the mojo journey has begun its next phase: to realise, harness and exploit the most rapidly developing camera technology in history.

A few days after MojoFest, in a separate announcement, Terushi Shimizu, CEO of Sony’s Semiconductor division, claimed that smartphone cameras will exceed the quality of single-lens reflex cameras by 2024. This echoed similar sentiments from engineer and ARRI general manager Franz Kraus, who told cinematographer Roger Deakins on his podcast that computational photography in smartphones was outpacing advances in traditional cameras.

Artificial intelligence and machine learning are central to how smartphone cameras work. That was the topic of my MojoFest presentation, and it’s what sets smartphones apart from legacy cameras.

The objective of artificial intelligence and machine learning in smartphones is to enhance and enrich the user experience. AI in smartphones is most evident in predictable functions such as facial recognition, voice-to-text, and augmented and mixed realities like Google Maps. ML in smartphones can be seen in unpredictable functions such as streaming platforms, in which algorithms offer suggestions based on the user’s history.

But for journalists, the most compelling and profound applications of AI and ML may be smartphone camera pipelines.

Unlike traditional camera technology, in which the image-making process is largely predictable and reliant on human input, smartphone camera technology is unpredictable, autonomous and content aware. That is, these are cameras that know what they are looking at and react accordingly. Apple calls it “scene-understanding technology”.
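To make that idea a little more concrete, here is a minimal Swift sketch of what content-aware analysis looks like from a developer’s point of view. Apple’s own camera pipeline is proprietary, so this is not how its scene understanding works internally; it simply uses the public Vision framework to ask on-device models where the faces and the most salient region of a frame are, the kind of signal a computational camera can use to drive focus, exposure and artificial bokeh.

```swift
import Vision
import CoreGraphics

// Illustrative sketch only: Apple's internal "scene understanding" is proprietary.
// The public Vision framework exposes comparable on-device analysis to developers.
func analyse(frame: CGImage) throws {
    // Ask for face bounding boxes and an attention-based saliency estimate.
    let faceRequest = VNDetectFaceRectanglesRequest()
    let saliencyRequest = VNGenerateAttentionBasedSaliencyImageRequest()

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([faceRequest, saliencyRequest])

    // Bounding boxes are normalised (0-1) relative to the frame.
    for face in faceRequest.results ?? [] {
        print("Face at \(face.boundingBox)")
    }
    if let salientObject = saliencyRequest.results?.first?.salientObjects?.first {
        print("Most salient region at \(salientObject.boundingBox)")
    }
}
```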

Since MojoFest ’22 (my other presentation was on mobile workflows), my practice-based and academic research has focused on how to merge cinematic storytelling techniques with established mojo practices. For academic purposes, I’ve been calling it cinematic mobile journalism — cinemojo, for short.

The idea is that packages shot (and even edited) with mobile move beyond the standard run-and-gun or talking heads of everyday news gathering to embrace more considered and composed imagery that enhances and propels the story. Like academic David Dunkley Gyimah’s Cinema Journalism concept, this practice requires more time in the telling, an artist’s eye and — perhaps most importantly — emerging and developing technologies.

I started with this short documentary, Beneath the Lonesome Skye, shot with an accessory anamorphic lens and underwater housing to demonstrate the hardware and compositional cinematic potential of mojo for my PhD project (the Isle of Skye’s dramatic landscape being the perfect backdrop). It has since screened in cinemas and at festivals (mobile and traditional) around the world, including at the UN Ocean Conference in Portugal.

My workflow now leans more heavily on the computational artificial bokeh (blur) of Apple’s Cinematic Mode, which received a substantial boost with the release of the iPhone 14 Pro in September. Cinematic Mode’s debut on the iPhone 13 was restricted to 1080p, and its bokeh could look obviously artificial, particularly around the edges, so it wasn’t always a good look. Thanks to faster processors and neural networks (so-called because they are inspired by organic neural systems, such as the brain), those problems are largely overcome, as seen in this quick 4K Cinematic experiment I shot the week the iPhone 14 launched, although the tech is still not quite there yet (there is no way to lock exposure, for example).

Master storytellers like Philip Bromwell, Eleanor Mannion, Leonor Suarez and Mike Castellucci have been doing their own forms of cinematic mojo for a long time (every aspiring mojo must watch Mike’s Phoning It In series), so the concept is nothing new. And I know the BBC’s Nick Garnett and Dougal Shaw, among others, also explore cinematic techniques (I doubt they share my conceit of giving it a lofty label, though).

Those journalists are visual thinkers who look outside the square, and common hallmarks of their work are well-composed scenes and inventive angles that give viewers a sense of depth in the pictures, rather than the flat, soap-opera look smartphones are renowned for.

They are masters not just of storytelling but of working around smartphone limitations.

Arguably the greatest limitation of smartphone filmmaking has been the lack of bokeh. While we can use third-party apps such as Filmic Pro to adjust frame rates and shutter speeds for motion blur and therefore achieve a film look, cinematic storytelling requires the ability to blur the background or foreground of a scene to isolate a subject for narrative emphasis.
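The film look mentioned above usually comes down to the 180-degree shutter rule: a shutter duration of roughly 1 / (2 x frame rate), so about 1/48 of a second at 24 fps. As a rough sketch (not Filmic Pro’s actual code, and assuming a recent iPhone), a third-party app can pin that shutter speed through AVFoundation’s custom exposure API:

```swift
import AVFoundation

// A minimal sketch of the 180-degree shutter rule via AVFoundation's public
// custom exposure API. Illustrative only, not Filmic Pro's implementation.
func applyFilmLookShutter(frameRate: Int32 = 24) throws {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back),
          device.isExposureModeSupported(.custom) else { return }

    // 180-degree rule: shutter duration = 1 / (2 x frame rate), e.g. 1/48 s at 24 fps.
    let shutter = CMTime(value: 1, timescale: frameRate * 2)

    try device.lockForConfiguration()
    // Pin the shutter duration; keep whatever ISO the camera is currently using.
    device.setExposureModeCustom(duration: shutter,
                                 iso: AVCaptureDevice.currentISO,
                                 completionHandler: nil)
    device.unlockForConfiguration()
}
```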

This is where Cinematic Mode has enormous potential for mobile journalism. It adds a much-needed tool to the storytelling toolbox and further closes the gap on legacy workflows.

I presented my ideas at the annual Women in Media national conference here at Bond University a few months ago. Feedback from 60 attendees across two workshops was positive and showed that there’s an appetite for this style of mobile journalism.

The executive producer of a national current affairs program later engaged me to train her producers and journalists in the technique, to supplement traditionally shot pieces when camera operators were not available.

Could they adopt a mobile-only strategy? Yes, if they made some concessions to yet-to-be-overcome limitations. Will smartphones ever fully replace traditional cameras? Maybe not just yet, but that may be only a matter of time.

When you compare the limitations of today’s smartphone cameras with those of just five years ago, you realise how rapidly this technology is advancing.

What mojo will look like in the next five years, I don’t know, but I’m confident its future will be bright. And, perhaps, cinematic.

Rob Layton is an Assistant Professor of Mobile Journalism at Bond University on the Gold Coast, Australia, where he is undertaking a PhD based on the creation of a video documentary about his local surf community shot entirely with mobile devices.
