The Real Reason Why Motion Smoothing Is a Thing
Vulture recently published a piece titled “Motion Smoothing Is Ruining Cinema”. The latter part of its subtitle asks: “So why is it the default setting on almost every new TV in America?” To my surprise, the article didn’t do much to dispel the mythical status of motion smoothing on TVs, focusing instead on re-amplifying the familiar concerns audiences and filmmakers have with the technology. Hollywood has a well-documented beef with its use on TVs, but its critics often end up erroneously asserting that TV manufacturers enable it in a semi-arbitrary fashion, and to put it frankly, that’s not really the case.
To give a brief overview of what motion smoothing does: it is TV manufacturers’ way of filling in the blank spaces within a typical 60 Hz frame delivery window with intermediary quasi-duplicate or computationally-devised frames, to reduce the judder one gets when content’s frame rate doesn’t divide evenly into the TV’s refresh rate. If you’re watching 30 Hz content, everything is fine, since every frame can simply be shown twice — same with a theoretical 20 Hz, 15 Hz, 12 Hz, 10 Hz and so on. But anomalies like 24 Hz cause a huge problem, precisely because 24 doesn’t divide evenly into a 60 Hz delivery window — each frame would need to be shown for 2.5 refreshes — so frames have to be duplicated unevenly (the well-known 3:2 pulldown), causing judder.
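The cadence math above can be sketched in a few lines of Python. This is a minimal illustration, not anyone’s actual TV firmware; the function name is mine. It computes how many display refreshes each content frame occupies when a given frame rate is mapped onto a fixed refresh rate:

```python
from fractions import Fraction

def pulldown_schedule(content_hz: int, display_hz: int, frames: int = 8):
    """How many display refreshes each of the first `frames` content
    frames occupies on a fixed display_hz panel."""
    step = Fraction(display_hz, content_hz)  # refreshes per content frame
    schedule, elapsed = [], Fraction(0)
    for _ in range(frames):
        nxt = elapsed + step
        # each frame must be held for a whole number of refreshes; the
        # fractional remainder is what forces the uneven cadence
        schedule.append(int(nxt) - int(elapsed))
        elapsed = nxt
    return schedule

print(pulldown_schedule(30, 60))  # [2, 2, 2, 2, 2, 2, 2, 2] — even, no judder
print(pulldown_schedule(24, 60))  # [2, 3, 2, 3, 2, 3, 2, 3] — 3:2 pulldown
```

The alternating 2s and 3s in the 24 Hz case are exactly the uneven duplication described above: some film frames linger on screen 50% longer than their neighbors, which the eye reads as judder.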
The reason this is such a problem is that, unlike cinema projectors, TVs are designed to fulfill wildly different display profiles depending on a given piece of content’s specific traits. Games? Knock down all post-processing and focus on low latency. Movies? Crank all post-processing up to 11, and dull blacks and whites to all hell. TV broadcasts? Give precedence to enhancing the feel of motion, and kick the color-enhancing profile to the latter parts of the post-processing pipeline. One of those differences is the framerate at which content is shot — and subsequently delivered — and the huge disparity between broadcast television on one side, and prim-and-proper cinema and prestige TV on the other.
The real reason motion smoothing is such a desirable default setting on new TVs isn’t some industry trend that just hasn’t yet been scrutinized into oblivion. It has more to do with the way TV manufacturers have historically marketed their products, and with how reticent they’ve been to adopt new technologies even when standards bodies have long finalized their implementation details.
The TV manufacturing industry has a history of lacing its lineups with technologies designed to artificially enhance the look of content through post-processing trickery, introduced via proprietary hardware that is, in theory, supposed to rid broadcast television of its many artefacts and relatively uninspiring look. There was no uniform implementation across TV manufacturers, though most of them settled on some version of improving contrast and increasing brightness — artificially replicating what would later become standard through HDR and higher-quality panels, both luxuries back when HDTVs were still in fierce competition with legacy technologies, carving out their portion of the market.
Those artificial enhancements to images aren’t implemented solely to satisfy consumers — they’re also weaponized as marketing jargon to yield an advantage in an intensely crowded industry where the bulk of the profit is achieved through content-promotion deals (the included apps on your TV) and network effects (buying, say, an LG smartphone after you’ve had a good experience with an LG TV). At the price points where most people buy their TVs, there isn’t much money to be made. Profit margins are excruciatingly slim, and the business incentive to do anything but prioritize sales numbers over consumer satisfaction is nigh-on nonexistent.
If you’re old enough to remember TV sets’ promotional material from the early 2010s, you’ll notice there is something TV manufacturers used to talk about all the time that is nowhere to be seen now — 3D.
It is hard to overstate how much TV manufacturers spent on VFX shots of content beaming out of your TV. The typical advertisement pitch was practically fraudulent — if you bought a 3DTV, set it up, and wore your 3D glasses, there was little chance of pitch grass growing next to you and a ball heading straight for your face. It is straight-up comical that the FTC never pursued a multi-billion dollar fine from TV manufacturers who all but solidified their status as masters of deception: not at all delivering on their promise of amplified immersion, and often subjecting consumers to copiously high prices of admission — 3D Blu-rays and useless sports cable packages — just to enjoy content specifically mastered for the format.
That was the TV advertisement cycle at play. There’s always one feature or gimmick TV manufacturers all flock to and push their weight behind in advertising spots and consumer showcases, but if all there is behind it is a huge, unnecessarily extravagant scam, consumers learn to adjust very quickly, and word of mouth lays the final crushing blow on the latest fad — as it did to 3D.
But over the last few years, TV manufacturers have been handed an ace they’re still hesitant to play — one that would prove landscape-shifting if implemented at scale. That card is Variable Refresh Rate, or VRR — a technology that graphics giant NVIDIA helped set the ground rules for back in late 2013 when it introduced G-Sync. It is now becoming a standard addition to PC monitors, which need to depart from a fixed 60 Hz output to allow for consistent frame times (the pace at which frames are delivered) in games — an experience that has historically suffered under V-Sync’s harsh framerate caps (often halving the monitor’s refresh rate), which introduce further latency and make for an overall worse experience in anything requiring instant feedback. What VRR does is circumvent the need to switch your display’s refresh rate setting every time the pace of the content’s frame delivery changes. Instead of accepting judder through triple-buffered V-Sync, stutter through double-buffered V-Sync, or image tearing with V-Sync off below the framerate target, the display scaler — a component NVIDIA was among the earliest to rework for this purpose — adjusts the refresh rate to match the framerate, eliminating the downsides of a fixed refresh rate altogether.
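The difference VRR makes can be shown with a toy model. The sketch below — my own simplification, not any real display pipeline — computes when frames actually appear on screen if each frame must wait for the next vsync boundary of a fixed-refresh panel, versus a VRR panel whose refresh simply tracks the content:

```python
from fractions import Fraction
import math

def present_times(content_hz: int, display_hz: int, n: int = 6):
    """Presentation times (ms) of the first n frames: frame i becomes
    ready at i/content_hz seconds and is shown at the next vsync."""
    refresh = Fraction(1000, display_hz)  # ms between vsync boundaries
    times = []
    for i in range(n):
        ready = Fraction(1000 * i, content_hz)
        # snap the ready time up to the next refresh boundary
        times.append(math.ceil(ready / refresh) * refresh)
    return times

fixed = present_times(24, 60)  # 24 fps film on a fixed 60 Hz panel
vrr = present_times(24, 24)    # VRR: the panel refreshes at the content's pace

fixed_gaps = [float(b - a) for a, b in zip(fixed, fixed[1:])]
vrr_gaps = [float(b - a) for a, b in zip(vrr, vrr[1:])]
print(fixed_gaps)  # alternating 50 ms and ~33.3 ms gaps — judder
print(vrr_gaps)    # a steady ~41.7 ms every frame — even pacing
```

The fixed-refresh case reproduces the uneven 3:2 cadence; with the refresh rate matched to the content, every gap is identical, which is the whole point of VRR.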
If the conversation primarily pertains to traditional, non-interactive content, why mention gaming at all, you might ask? Well, it comes down to the nature of interactivity versus passivity in content consumption, and how that informs consumer attitudes.
It is thoroughly documented that consumers are distinctly reluctant to properly calibrate their screens and set up profiles appropriate for their specific use — and why would they, really? Modern consumer electronics are designed with simplicity and intuitiveness in mind, and anything short of that is deemed a failure by TV manufacturers to keep up with the pace at which smartphones and other computing devices have evolved to favor convenience over customizability.
Gamers, though, are a completely different story — the modern construct of a typical gaming experience encompasses all sorts of dials, buttons, modifiers, stats, and gauges that pertain very specifically to your player character’s performance, be it offline or online. Players are well-attuned to what gives them a competitive edge, so when the increasing graphical budgets of modern game engines made judder and framerate inconsistencies a reality, NVIDIA — which at the time faced price-sensitive competition from AMD — came up with an industry initiative to set the course towards the then-hypothetical, now-real future where displays come standard with ways to account for constant framerate change, agnostic of software-dependent implementation and reliant only on the presence of a DisplayPort connection between the host machine and the display.
G-Sync — which later came to embrace the much more inclusive VRR standard — was not only a response to a common grievance of PC monitor customers; it was also an expression of frustration, on their behalf, that PC monitor manufacturers (who often have their toes dipped in TV manufacturing as well) had forgone the good practice of installing a potent image processor, instead choosing the economical option to protect their already-faint profit margins.
The NVIDIA-built technology is, understandably, very poorly supported at TV sizes — currently boasting a whopping single inclusion, in HP’s flagship Omen X Emperium 65 — but its VRR cousin is starting to gain traction among the big dogs, most notably Samsung and LG. VRR, in conjunction with Quick Media Switching — a technology enabled by the advancements made for VRR — has been finalized as part of the HDMI 2.1 standard since November 2017, and it’s been up to TV manufacturers to go for the more expensive option of installing a display scaler capable of swapping refresh rates on the fly with minimal judder, even if it means slimming their margins on the more affordable TVs where the bulk of their sales are.
TV manufacturers understandably — but still somewhat disappointingly — chose instead to reinforce the status quo, still building their hot sellers with the sub-par image processors they’ve always gone for in that market segment. So in the absence of any industry consensus on improving the average quality of a TV’s display scaler, VRR and QMS just aren’t given the same importance by streaming box manufacturers as, say, a technology as prevalent and widely adopted as HDR. The latter isn’t free of implementation quirks either, but widespread adoption of VRR and QMS could prove far more revolutionary to the individual content consumer’s experience than any gimmick we’ve been sold in the past.
Even more than pioneering new panels, VRR and QMS have the potential to turn the world of movies and TV on the smaller screen completely upside down. That oft-cited limitation — the fact that you inherently can’t watch 24 Hz-mastered content on a 60 Hz display without judder — exists only as long as TV manufacturers are willing to let it. But as long as software and hardware support remains scarce on the display manufacturing side, streaming platforms will see no upside in tailoring their experiences to what is, at this stage, ultimately a niche market. As if the incredibly slow pace at which HDMI standards change weren’t enough of a deterrent, TV manufacturers’ decision to forgo the Variable Refresh Rate revolution for their displays might be the crushing blow to the technology’s mass-market potential.
However, there are two new players that might completely change this, and whose contribution and influence cannot be overstated — the PlayStation 5, and the upcoming new Xbox.
Recent market analysis showed that console owners — who make up half of US households — have started to use consoles more for streaming, acknowledging their duality of purpose: consoles are no longer just gaming machines, since internet connectivity allows them to stream content just as well as a Roku or an Apple TV does. Microsoft went as far as including an HDMI IN port to act as a passthrough for cable boxes, in a plan that ultimately proved shortsighted and, to put it bluntly, a failure. But as Disney, WarnerMedia, and NBCUniversal all gear up to share a streaming space already dominated by Netflix, Hulu, and Amazon, they’ll have to sell access to their content on game consoles as well. Working off that perfect collision of intent and happenstance, it is not entirely out of the realm of possibility that the HDMI 2.1-equipped consoles will not only use VRR to enhance the fluidity of games, but also give streaming apps privileged access to a 24 Hz output mode — letting TVs deliver a judder-free experience while eliminating the long-lasting issue of motion smoothing by removing the mismatch that makes it tempting in the first place.
As the upcoming streaming wars rage on, it’ll be up to individual services to differentiate themselves, and the wider adoption of VRR and QMS might be one front where rifts start to open. Even stronger than brand deals and content-promotion buttons on a TV remote, though, is the prospect that Disney+, HBO Max, and the upcoming NBCUniversal streaming service will all converge on one implementation and force streaming hardware manufacturers — whether integrated into TVs or sold separately — to set the course for a uniform implementation that, in ten years or so, could make motion smoothing a thing of the past.
The biggest wrench that could be thrown into the cogs of this plan is the notoriously slow upgrade cycle of TVs. Even if the HDMI standard to drive that change already exists, and even if widespread hardware adoption arrives as soon as next-gen consoles hit store shelves, it is no sure thing that consumers will drop whatever 4K TV they just bought and move on to the next thing because of a setting they could otherwise just turn off when it bothers them. Nonetheless, as TVs of old start to populate junkyards and the technology of now achieves ubiquity, motion smoothing will be relegated to a historical anecdote, rather than the filmmakers’ nemesis it currently is.