Non-Linear Editing Systems — a timeline.
BDES 2412 — Session 7: Human Computer Interaction
2001: A Space Odyssey (1968) is not only a great film, but also a great example of linear editing in practice: the art of cutting, gluing, and combining physical strips of film. This was before the advent of digital editing software, when every film was edited by hand, laboriously and meticulously, until a master reel was created.
Compare that to today, where one can record, combine, and edit in minutes, all on the same handheld device, at a level of quality that would have seemed professional-grade only ten years ago. These applications are what we know as non-linear editing systems (NLEs): designed interfaces that facilitate a workflow that is both non-destructive and organic.
Two ideas define an NLE: the ability to access any specific part of the source material, regardless of sequence, and a non-destructive workflow. A familiar example is Adobe Premiere, Adobe's flagship video editing program. In Premiere, the timeline is the main interface that facilitates this, but it is the program as a whole that makes it a system. One can jump around between any length or number of clips, and can trim without fear of destroying some kind of master.
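These two ideas can be made concrete with a small sketch. The `Clip` and timeline structures below are hypothetical, invented for illustration, and do not reflect any real NLE's internals: each clip stores only in/out points into an untouched source file, so trimming rewrites numbers, never media.

```python
# A minimal sketch of non-destructive editing (hypothetical structures,
# not any real NLE's API). Clips reference source files by in/out point;
# the source media is never modified.
from dataclasses import dataclass

@dataclass
class Clip:
    source: str       # path to the source media, never touched by edits
    in_point: float   # seconds into the source where the clip starts
    out_point: float  # seconds into the source where the clip ends

    def trim(self, new_in: float, new_out: float) -> "Clip":
        # Trimming produces a new clip; the source file (and the old
        # clip) survive intact, so no "master" is ever at risk.
        return Clip(self.source, new_in, new_out)

timeline = [Clip("interview.mov", 12.0, 30.0), Clip("broll.mov", 0.0, 5.0)]
# Random access: any clip can be trimmed or reordered regardless of the
# order the footage was shot in.
timeline[0] = timeline[0].trim(15.0, 25.0)
print(sum(c.out_point - c.in_point for c in timeline))  # total duration: 15.0
```

Because every edit is just a change to these lightweight references, undoing or re-trimming later costs nothing, which is exactly the freedom the physical cut-and-glue workflow lacked.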
It is important to note that NLEs are not reserved for video editing alone. Any type of editable media (audio, photo) can make use of non-linear systems.
A brief history
The first NLE was created by CMX Systems (a joint venture of CBS and Memorex) in 1971: a machine named the CMX 600. It could record and play back black-and-white analog footage, captured at half the frame rate of the source material to reduce storage demands. The footage was stored on modified “disk pack” drives, which were commonly used to store digital data on the mainframe computers of the time. To interface with the machine, the user had a tool known as a “light pen”, which detected changes in the brightness of nearby screen pixels and communicated them to the computer; it was, in essence, an early ancestor of the stylus. The user applied the light pen to one of two monitors to make cuts and edits to the preview footage; the other monitor would then play back the edited result.
Although its output quality was severely lacking, its creation highlighted the efficiency and depth that a non-linear editor could provide.
Some years later, in 1984, the EditDroid came into frame: a LaserDisc-based analog NLE built by Lucasfilm. Although it suffered many of the same quality issues as the CMX 600, it was the first system to feature the “timeline” that is ubiquitous today.
https://youtu.be/z99wO2utddo?t=722 (watch until 13:02)
The first NLE built for PCs arrived in 1989 from Editing Machines Corp.: the EMC2 editor, a PC-based system of both software and hardware. At around the same time, Avid released the Avid 1 for Macintosh computers, with a system very similar to the EMC2. Due to the limitations of personal computers and storage at the time, both systems were criticized for their poor video quality (roughly VHS quality) and low frame rate (15 fps). The resources they needed to operate also made both very expensive to own.
Modern editors
There are countless modern NLEs to choose from today: Avid, Adobe Premiere, Sony Vegas Pro, Pro Tools, DaVinci Resolve, iMovie, Windows Movie Maker; the list goes on. They all share essentially the same interface and interactions, descended from the original NLEs that came before them. Some are more complex and offer more features than others, but all of them are built around “the timeline”, which facilitates non-linear editing.
So what’s the problem?
As we’ve seen with records vs. digital audio, there is always a proponent of the old way of doing things, and in some cases they make good points. The art of physical film editing reinforces one important part of editing: planning. Given how precarious cutting and gluing film together is, mistakes can be very costly, so in this form of editing every cut, edit, and combination is completely planned out beforehand. With the simplicity and non-destructiveness that NLEs provide, by contrast, the importance of planning falls by the wayside. In one sense this is a source of creative freedom through which an editor can thrive, but it often becomes a detriment to the overall flow of an edit instead.
The claim is that linear editing forces the editor to become much more intimate with the full body of footage at their disposal. As stated on filmreference.com, “what is vanishing from American film are all of the ways that an individual shot can function as a unit of meaning, through composition, production design, lighting, and the actor’s performance as it unfolds in the real time of a shot that is held.”
What’s your opinion?
An HCI perspective
HCI, or human-computer interaction, concerns the interactions a human has with any computer interface, whether digital or physical.
On the physical side, the most notable HCI tool we’ve covered today is the light pen used with the CMX 600. Its creation is a good illustration of the task-artifact cycle: the notion that a task influences an artifact’s design, which in turn creates or influences the next task. In other words, a given task sets guidelines for the design of an artifact, which then helps the user perform that task; the artifact itself, however, can create new tasks or put new, unexpected constraints on the original one. What this means in practice is that HCI is reliant on the innovation of technology. The light pen was a tool designed for a new technology of its time: cathode-ray tube displays. The trouble is that we cannot know where the next technological advancement will take us, and it may quickly render the new tools we build obsolete, as it did the light pen.
If we compare the EditDroid interface to modern interfaces, one movement of HCI comes into play. “Moving beyond the desktop” is a theme from the earlier days of HCI’s development, and one of the ways interfaces “moved” was from manual finding to search. In Adobe Premiere, search bars appear throughout the interface, for example to filter effects. The EditDroid interface, by contrast, reflects the stage HCI had reached at the time: still manual finding, with a large number of buttons crowding the interface.
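The shift from manual finding to search can be illustrated in a few lines. The effect names and the `search` helper below are invented for the example (this is not Premiere's actual filtering logic): instead of scanning rows of buttons, the user types a query and the interface narrows the options.

```python
# A toy illustration of the "manual finding -> search" shift.
# Effect names are invented; real effect panels work similarly in spirit.
effects = ["Gaussian Blur", "Directional Blur", "Lumetri Color", "Crop"]

def search(query: str, items: list[str]) -> list[str]:
    # Case-insensitive substring match, like a typical effects-panel filter.
    q = query.lower()
    return [item for item in items if q in item.lower()]

print(search("blur", effects))  # ['Gaussian Blur', 'Directional Blur']
```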
The timeline
The timeline has become a ubiquitous tool across visual and auditory editing systems. The same task-artifact cycle applies here as well, but with a different outcome. As the first NLE to feature a timeline, the EditDroid was the main driver of its success. The task at hand was to visualize different sources of footage and audio in a way that makes sense to humans. The choice of artifact, the timeline, was well considered: timelines are something humans have (and have long had) a firm grasp of, both in how they organize information and in how we navigate them. Designing digital tools around concepts that humans physically understand tends to result in long-lasting success, as we see with the timeline, whether or not the team that designed the interface fully appreciated this at the time.
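Part of why the timeline maps so well to human intuition is that it reduces editing to intervals on a time axis: left-to-right position simply is playback order, and "what plays at time t" becomes a lookup. The structures below are hypothetical, invented for illustration, not any real NLE's internals.

```python
# A sketch of the timeline as intervals on a time axis (hypothetical
# data, not a real NLE's internal model).
placements = [
    ("titles.mov", 0.0, 4.0),       # (source, start, end) on the timeline
    ("interview.mov", 4.0, 14.0),
    ("credits.mov", 14.0, 18.0),
]

def clip_at(t: float):
    # Left-to-right position on the axis *is* playback order, so finding
    # what plays at time t is a simple interval lookup.
    for source, start, end in placements:
        if start <= t < end:
            return source
    return None

print(clip_at(5.0))  # interview.mov
```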
The takeaway
As mentioned above, designing digital tools around concepts that humans already understand is the key to strong interactions.
And in the grander scheme of HCI, understand that iterative ideation and design are necessary when designing for the cutting edge, so as to avoid creating something that is obsolete on arrival.
“Many people struggle every day with operating systems and core productivity applications whose designs were evolutionary reactions to mis-analyses from two or more decades ago. Of course, it is impossible to always be right with respect to values and criteria that will emerge and coalesce in the future, but we should at least be mindful that very consequential missteps are possible.”
Keep an eye on the now and an eye on the future, and design there with intent and due diligence.
Sources
Images:
[4] https://www.youtube.com/watch?v=psJIp4e27HM
Bibliography:
“Non-Linear Editing System.” Wikipedia, Wikimedia Foundation, 8 June 2020, en.wikipedia.org/wiki/Non-linear_editing_system.
“Linear Video Editing.” Wikipedia, Wikimedia Foundation, 9 Dec. 2019, en.wikipedia.org/wiki/Linear_video_editing.
“Nonlinear Editing.” Film Reference, www.filmreference.com/encyclopedia/Criticism-Ideology/Editing-NONLINEAR-EDITING.html.
“Light Pen.” Wikipedia, Wikimedia Foundation, 27 Feb. 2020, en.wikipedia.org/wiki/Light_pen.
Carroll, John M. “2. Human Computer Interaction — Brief Intro.” The Encyclopedia of Human-Computer Interaction, 2nd Ed., 2014.