
Lumia: a Real-Time
Motion Graphics Environment Built
on Unreal Engine for Performance
Visuals, Animation & VFX

a software concept & personal portfolio piece

Michael Filimowicz, PhD
25 min read · Jun 7, 2024


Once upon a time I designed a cool software concept:

MIDI interaction demo (my version of the Wizard of Oz method)

All the mid fidelity user interface designs you will see below showcase my totally ok Adobe XD skills :) I think any designer can create nice rectangles — what really matters are the concepts behind the wireframes!

The Lumia software concept provides a convenient virtual environment for digital artists to rapidly construct geometry, lights, animations, visual effects, and movement pathways through simple drag and drop operations. It then adds yet another layer of complexity by way of generative programming and a complete set of interactive controllers. As well, there is the option to render out linear videos of one’s composition.

There are a number of inspirations behind this software concept, the main
one being ‘a Twinmotion for motion graphics.’ This refers to the idea of
providing a simplified user interface on top of Unreal Engine for a more
limited use case. Just as Twinmotion provides a UI tailored to the exact needs of architectural visualization, Lumia’s UI abstracts a more limited feature set from Unreal Engine to suit the requirements of real-time motion graphics.

Lumia can also be thought of as ‘a real-time After Effects,’ providing robust
motion graphics capabilities while eliminating rendering times typical of
VFX software. Its UI inspiration comes from apps like DaVinci Resolve and
Lumen, which distribute functionality across a series of horizontal tabs,
each of which brings up a new screen with a toolset dedicated to that function.

Gallery, Stage, Control, Program, and Sequence are the five main sections
that make up this desktop/laptop tool, and together they offer a full feature
set for real-time motion graphics. The app provides visual media creators in the music industry with simple, intuitive access to Unreal’s basic capabilities, without the need for a large development team or extensive knowledge of game engines, thereby eliminating their steep learning curve.

Gallery Screen

Gallery Screen

The Projects, Performances and Builds tabs are all in one place. There is
some new terminology to learn while working with Lumia, so let’s start here. Lumia’s output is known as Compositions. Because a Composition is the end result of a Project, the two terms are often used interchangeably. Of course, the first Composition in a new Project won’t exist yet, so the distinction between the two terms remains relevant.

Assuming the concept of a Project file is thoroughly understood, next let’s
discuss Performances. Lumia’s Composition feature allows users to put
together a moving picture using various digital assets and then play it back
in real time. A Composition can be used to generate an unlimited number of Performances.

Performances come in two forms: Lumia-native and rendered video. Native Performances can be accessed and played again directly within the app. A Performance is a time-stamped record of individual Composition data values. Depending on the Composition, a Performance can be recorded on either the Stage or Control screen, or in the Program screen.

Note that the ‘wallet link’ seen at the top left of the UIs, and the Mint panel on the Gallery screen, date this design to the height of the NFT craze, before the complete crash. Please forgive the few crypto-art references to the hypothetical user’s Ethereum wallet :)

Stage Screen

Stage Screen

You’ll be putting together your Composition on the Stage screen.
Compositions are built using both external resources (such as 3D geometry and texture websites) and internal resources (such as geometry, lights,
animation components, and visual effects) from Unreal Engine.

Media in many different formats, including 2D artwork (to be used as
textures), 3D models (often of the .fbx file type), digital humans (produced
by photogrammetry), rigged characters, motion files, video clips, and audio
files can all be imported and used in the Stage screen.

The Composition Elements panel displays a tree view of all Composition
objects, and the Properties panel allows you to modify the attributes of a
selected object. Tools for modeling (such as Boolean functions, alignment
tools, and perspective views) and viewing performances (such as showing
them in multiple aspect ratios and recording and playing them) are also
available in the central Staging Area, where the Composition is built.

Any of the three main windows (Stage, Control, and Program) can be used to capture live performances of Lumia Compositions. Compositions can get
more intricate by working across these three screen sections. The Paths tool (the top left icon) allows you to create movement paths through the
composition for the main viewing camera and other elements, and the Stage screen itself includes a set of animation objects, so you can create a complex and refined motion graphics Composition without leaving the Stage screen.

A row of 16 Snapshot buttons is available on the Stage screen, as it is on the
Control and Program screens as well. There are two basic ways they can be
utilized. In the standard snapshot mode, each snapshot merely saves an
internal preset of the current camera position and the characteristics of all
Composition objects.

In addition to their primary use for storing presets, Snapshots may also be
used as a visual sequencer in time with incoming MIDI data. Since a
whole note is typically divided into sixteen 1/16th-note steps, this
follows the standard model for beat sequencing. The Snapshot function of
the Lumia visual sequencer can be explored without the need for an external MIDI sync signal thanks to the app’s ability to produce its own internal tempo information.
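The timing math behind such a step sequencer is simple; here is a minimal sketch in Python (all names are hypothetical illustrations, not Lumia’s actual internals):

```python
# Sketch of a 16-step snapshot sequencer driven by an internal tempo.
# Names (step_interval_seconds, run_sequencer) are illustrative only.

def step_interval_seconds(bpm: float) -> float:
    """Duration of one 1/16th-note step: a beat (1/4 note) holds four 1/16ths."""
    return 60.0 / bpm / 4.0

def run_sequencer(snapshots: list, bpm: float, bars: int = 1):
    """Yield (time, snapshot) pairs for `bars` bars of 16 steps each."""
    interval = step_interval_seconds(bpm)
    for bar in range(bars):
        for step in range(16):
            t = (bar * 16 + step) * interval
            yield t, snapshots[step % len(snapshots)]
```

At 120 BPM a 1/16th-note step lasts 0.125 s, so one 16-step bar spans exactly two seconds.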

Control Screen

Control Screen

The Composition’s interactivity can be adjusted on the Control screen. It is
possible to create a fully personalized control interface, either using GUI
components or simulating external control equipment that is plugged in. An extensive library of MIDI and Open Sound Control (OSC) compatible virtual controller components can be assembled through drag and drop to form custom control setups.

Envelope drawing tools and MIDI keyboards’ pitch bend and modulation
wheels are also included, along with knobs, sliders, buttons, pad triggers, and an XY matrix. Audio signals, MIDI files, and QWERTY
keyboards are other kinds of controllers that are available.

There are also a variety of different ways to view your Composition. You can watch any Performance in a variety of ways, including a fixed video window, a resizable floating window on other screens, virtual reality (VR) with a head-mounted display, or even live-streamed to a frame-sharing program like Syphon (Mac) or Spout (PC) to capture as a video file.

Lumia content can also be transmitted live to projection mapping, VJ, and
other live streaming programs with the help of frame sharing. The motion
graphics can also be displayed on a video projector or other real-time media display via live streaming. In addition, MIDI/OSC connectivity and digital sticky notes for comments are included.

Since any item’s properties can be assigned a controller object, the Control
screen gives additional control over recording Performances. Tempo data,
whether generated internally or from the outside, can be synchronized with the necessary controllers.

As the recorded output of Lumia will frequently include video media that should loop without interruption, a loop toolset is available to ensure that the beginning and ending points of recorded Performances can have their data parameters aligned so that the first and last frames are essentially the same.
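The underlying idea of that loop toolset can be illustrated with a small sketch that blends the tail of a recorded Performance back toward its first frame’s values (the frame-dictionary data model here is purely hypothetical):

```python
# Sketch: ease recorded parameter values back toward the first frame over the
# final `blend` fraction of a Performance so the loop point doesn't jump.
# A Performance is modeled as a list of per-frame {property: value} dicts.

def make_loopable(frames, blend=0.1):
    """Return a copy of frames with the tail blended toward frame 0's values."""
    n = len(frames)
    start = frames[0]
    blend_from = int(n * (1.0 - blend))
    out = [dict(f) for f in frames]
    for i in range(blend_from, n):
        # t runs from 0.0 at the blend start to 1.0 at the final frame
        t = (i - blend_from) / max(1, (n - 1) - blend_from)
        for key, v in out[i].items():
            out[i][key] = v * (1.0 - t) + start[key] * t
    return out
```

With this kind of blend the last frame’s parameters equal the first frame’s, so playback can wrap around without a visible jump.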

Program Screen

Program Screen

The Program screen incorporates a visual programming paradigm inspired
by Unreal’s Blueprint system, which adds a generative layer. The standard
visual programming paradigm of nodes and virtual cables will be
immediately familiar to users.

Nodes in the Program screen can be either complete objects from the
Composition Elements panel or specific properties from that panel.
Additional programming nodes make it possible to access several of Unreal’s most useful capabilities for creating interactive and generative motion graphics.

If the user wants to export an interactive or generative work, the Program
screen also offers tools for testing Builds and exporting the final executables. Unreal is a platform that can export executables on its own because it was made to ship huge applications such as games. In addition to rendering linear media files of recorded Performances, users will be able to create media that retains interactive or generative elements.

Sequence Screen

Sequence Screen

The final screen expands on Unreal’s cinematics system, letting users
arrange their Performances into Sequences using a clip-based editor. These
new Sequences comprising numerous Performances can then be
transformed into either new Performances unique to Lumia (basically,
composite Performances) or rendered as a video clip.

You can make a lengthier clip of a performance or a music video with the
Sequence tool. It has the usual tools for editing like splicing, cross fading, wiping, fading in and out, and so on, but it also has a new function for
making seamless loops called Interpolated Transitions.

A user can simply align the data values of a clip’s in and out points in this
way, allowing for smooth adjustments to be made through data interpolation of all object property values. This can also be applied to the assembled sequence as a whole to get it ready for looping without any unwanted jump cuts at the loop points.
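As a rough illustration of what an Interpolated Transition might compute, the sketch below blends the last parameter values of one clip into the first values of the next (the clip data model and function name are hypothetical):

```python
# Sketch of an "Interpolated Transition": generate intermediate frames whose
# property values move linearly from clip_a's out-point to clip_b's in-point.
# Clips are modeled as lists of {property: value} dicts with matching keys.

def interpolated_transition(clip_a, clip_b, frames=24):
    """Yield `frames` transition frames between the two clips."""
    a, b = clip_a[-1], clip_b[0]
    for i in range(frames):
        t = (i + 1) / frames  # 1/frames .. 1.0
        yield {k: a[k] * (1.0 - t) + b[k] * t for k in a}
```

Applying the same idea to a whole assembled Sequence (blending its end back into its start) is what prepares it for gapless looping.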

Users accustomed to video editing software will recognize the dual-viewer
interface, with one window displaying the currently selected Performance
clip and the other displaying the entire Sequence. The recorded
Performance clips can be sequenced with the help of audio added as a
reference on the sequencing timeline, and small audio sting parts can be
inserted at the edit transition points.

Similar to After Effects and DaVinci Resolve’s Fusion, the timeline also allows for keyframe-based manipulation of any given object properties, allowing for real-time automation of these features.

Interactive and Generative Motion Graphics

Lumia fully supports MIDI and Open Sound Control, allowing you to
manipulate visuals in real time just like a musical instrument. Any
Composition object property can be mapped to an external controller
(knobs, sliders, keys, buttons, pads, etc.) or played as a graphical user
interface element within the program itself.
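A minimal sketch of such a mapping, assuming 7-bit MIDI CC values and a hypothetical dictionary-based Composition model (none of these names are Lumia’s actual API):

```python
# Sketch of mapping an incoming MIDI CC (0-127) onto a Composition property.
# The mapping table and property names are hypothetical illustrations.

def cc_to_property(cc_value: int, lo: float = 0.0, hi: float = 1.0) -> float:
    """Scale a 7-bit MIDI controller value into a property's [lo, hi] range."""
    return lo + (cc_value / 127.0) * (hi - lo)

# CC number 74 -> the glow brightness of a hypothetical object
mappings = {74: ("Cylinder 3", "glow_brightness")}

def handle_cc(composition: dict, cc_number: int, cc_value: int):
    """Apply an incoming CC message to the mapped object property, if any."""
    if cc_number in mappings:
        obj, prop = mappings[cc_number]
        composition[obj][prop] = cc_to_property(cc_value)
```

The 0.0 to 1.0 target range matches the normalized property values used throughout the Properties panel.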

The Program screen can also be used for visual programming by turning
properties and objects into nodes. There’s also the possibility of including a script editor in the Tools palette so that you can write C++ code scripts if
that’s how you like to program.

Feature Tour Deep Dive

Here, I’ll give you an in-depth look at Lumia’s feature set and design process, and show you how you can use the program to create linear media for use in music videos and VJ loops at resolutions up to 4K.

Lumia can be used as a music visualizer by reacting in real time to sounds,
but its capabilities extend well beyond those of generative apps like the
iTunes visualizer. Custom compositions for interactive and generative
motion design for live performance graphics are supported by a rich feature set. The software’s user interfaces stand out because they were carefully planned to use common software design principles that digital artists from a variety of disciplines will already be familiar with.

The Sequence screen contains tools common to digital audio workstations
(DAWs) and non-linear editors (NLEs), making it accessible to musicians and video artists. Artists who have experience with 3D modeling will recognize the Stage screen, which features a user-friendly drag-and-drop workflow.

The Control screen allows artists who are accustomed to dealing with
hardware interfaces, such as DJs and VJs, to hook up their preferred
performance tools and create a unique user interface for them. Generative
artists and other coders who are used to working in visual settings will feel
right at home on the Program screen.

Lumia’s hypothetically associated digital asset marketplace would allow all
other digital artists to create media for Lumia projects, in addition to the UI familiarity that makes Lumia accessible to the above art areas. Assets for
texturing virtual objects could be produced by many kinds of 2D artists
including illustrators, photographers, and creative coders creating
generative abstractions.

A project’s asset set can be augmented with avatars created by 3D modelers, character designers, animators, and riggers. Motion capture allows choreographers and dancers to create files as importable assets. Sound and video files created by audiovisual artists can be included into production processes.

Therefore, Lumia is an app for all digital artists, whether they utilize the app itself, integrate the app’s material into a media-rich performance
environment, or take advantage of the options in an associated asset store.

The Five Work Areas

Lumia’s UI is based on five tabs which access five main working areas. At the top middle of all screens are these five main tabs:

Gallery — for an album view of all Projects and Performances

Stage — where all elements are assembled for a Composition

Control — basic to intermediate interactive control UI elements

Program — a space for more advanced generative programming with visual
programming nodes

Sequence — a space to ‘stitch together’ multiple Performances for the
purpose of making a rendered video.

Features in More Detail

You always start with a Project. In most cases, there
will be just a single Composition in a Project, though it may be used in a
variety of ways.

What a user creates with geometry, lighting, texturing, animation, and
effects in a Project’s Stage screen is called a Composition. Since the Stage
screen can include moving objects and a camera, it can be used to record
and play a Composition, but the intent is for users to build Performances by using the Controllers and Program screens.

A Performance is a recording of a Composition being performed. Initially, Performances are Lumia-native, which is to say they are
made up of recorded Composition parameter-value changes over time.
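A Lumia-native Performance, understood as a time-stamped record of parameter changes, could be sketched like this (an illustrative data model, not Lumia’s actual file format):

```python
# Sketch of a native Performance as a time-ordered list of
# (time, object, property, value) events, recorded and then replayed.

def record(events, t, obj, prop, value):
    """Append a time-stamped parameter change (recording happens in time order)."""
    events.append((t, obj, prop, value))

def state_at(events, t):
    """Replay all events up to time t and return the resulting parameter state."""
    state = {}
    for et, obj, prop, value in events:
        if et > t:
            break  # events are sorted by time, so we can stop here
        state[(obj, prop)] = value
    return state
```

Because only parameter data is stored, a native Performance stays fully editable; rendering to video happens later, as a separate step.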

Templates (not displayed in the UI mockups, accessible via the File menu)
are starting points for new compositions that have already had a basic set of objects created in advance for the user.

Settings for each object in the app, such as geometry, lighting, animation, or visual effects, can be saved as presets (not displayed in the UI mockups,
accessible via each object’s properties in the Properties panel). Collections of controllers in the Controllers screen or individualized nodes in the Program screen can also be saved as presets and accessed from the Objects menu.

Gallery Screen Elements

Gallery Screen

The Gallery screen is divided into four primary sections:

Projects: these are the primary project files for making Lumia Compositions.

Performances: these are not live performances, but rather recordings of past shows. Lumia ‘native’ Performances and those converted to video files are the two main categories of this content type.

Builds: executable apps (programs) that may be created, and which can
be either interactive or generative.

Files can be kept singly or in subfolders within each gallery space. There is a “new folder” button and any existing folders in each. Gallery items can be
sorted in a number of different ways, including by creation/modification
date, name, and type (such as video or native), and users can choose to
display folders first or last in the returned list.

A search box is also provided for inputting the names of the desired files.

Lumia native animations are the only format in which performances can be saved, so they can only be seen and edited within the program itself.
Right-clicking on a saved native Performance initiates a rendering process
that creates a video file.

As noted above, please forgive the reference to ‘mints’ in the right column, as that is dated from the pre-NFT crash era : )

Stage Screen Elements

Stage Screen

The primary goal of this user interface is to allow the user to easily drag and drop the primary components of a Composition, including geometry, lights, animations, and visual effects, onto the Staging Area. The top row contains the Unreal Engine-specific components.

Composition construction takes place in the Staging Area. Its window bar
displays a selection of icons. There are tools for making more accurate
object geometries on the left, and tools for making and viewing
performances on the right.

UI Features

These controls let you easily switch between six orthographic viewpoints of
your composition: front (F), back (B), left (L), right (R), top (T), and under (U). When you
switch off the perspective view, the view reverts to how it was before.

UI Features

Both icons function as smart object alignment tools, displaying a thin 1 pixel line when edges or vertices of objects are aligned with each other (to aid in exact alignment and placement of geometries), and a snap-to-grid overlay when in F, B, L, R, T, or U perspective modes (the only viewing modes where a grid overlay could be useful).
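The two aids amount to simple numeric checks; here is a sketch (the grid spacing and tolerance values are illustrative):

```python
# Sketch of the two alignment aids: snap-to-grid rounding and edge-alignment
# detection within a small tolerance. Values are illustrative defaults.

def snap_to_grid(value: float, spacing: float = 10.0) -> float:
    """Round a coordinate to the nearest grid line."""
    return round(value / spacing) * spacing

def edges_aligned(a: float, b: float, tolerance: float = 0.5) -> bool:
    """True when two object edges are close enough to draw the thin guide line."""
    return abs(a - b) <= tolerance
```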

UI Features

These are the primary Boolean operations for constructing elaborate
geometrical structures. Because of this, the complexity of the 3D objects created by the app’s users can increase (without the app becoming a high-
powered modeling tool, which is outside the scope of this project).

UI Features

A Composition can be performed alongside a variety of recorded
Performances. The performance name field is a pull-down menu containing all of the available Performances for a given project. The space bar can also be used to activate the play button. The current state of the Composition can also be captured in the Staging Area.

UI Features

These icons represent the available aspect ratio options (square, 3:4, and 16:9). To see how their creation fits within the
parameters of these common video aspect ratios, the user can preview it in
either the main Staging Area window or a secondary floating window.

When a new project is created that does not yet have a stored Performance,
the Project name will be displayed in this middle top field. For example,
each performance will be located in the directory structure of the project’s
name (Project 1/Performance 1).

The user has the option of working with the same geometric primitives in
either a 2D or 3D form. You can change the shape of a 3D primitive in the
Properties panel to make it look like a cone, a pyramid, etc.

These are the present building blocks of geometry:

UI Features


This is the Paths tool. A single path-drawing icon implies one dimension, though there is no dedicated “1D” section. It features a down arrow since it’s possible to add in extra line drawing tools; a line in only one
dimension is also possible.


— rectangular plane
— rectangular plane with an opening (like a frame or window)
— circular plane
— an elliptical plane
— circular plane with an opening (like a porthole)
— curved plane (on an arc)
— triangular plane
— rhombus plane (diamond)
— parallelogram plane


UI Features

— cube
— cube with an opening (square tunnel-like)
— sphere
— ellipsoid
— a torus
— a pipe
— cylinder
— pyramid
— wedge
— cone
— truncated cone

The lights are located to the right of the 3D and 2D geometric primitives.

UI Features

— omnidirectional light
— spotlight (cone-shaped light pattern)
— directional light (analogous to a laser beam or a wider
tractor/teleportation beam)
— area light (a spatial ambient light that is shapeable just like a cube)
— and the reflection probes in dotted outline (these are for helping make
better reflections in Unreal). This is switchable via a dropdown menu
between the two kinds of reflection probes: cubes and spheres.

To the right, you’ll find your primary options for 3D transformation and
animation: Rotate, Translate, and Scale. There is also a distinct symbol (a
dancing person) for “fancy” animation, such as mocap files derived from
recorded human motions or any other type of custom animation data.

UI Features

The final row of icons allows you to choose between different types of visual effects, such as particle or fluid effects. The third icon opens the palette for visual effects in a new window.

UI Features
UI Features

Imported 2D or 3D artwork can be found in the panel on the left. New art
tabs can be added in future versions to handle mocap files, rigged
characters, and video, among other file types.

All items placed in the Staging Area are displayed in the Composition items
panel on the right, where they can be renamed, rearranged, and seen in a
hierarchical structure (similar to Unreal’s world outliner).

Like the Inspector and Property panels in other media apps, Properties
allows you to change the look and feel of the selected item in the Staging Area.

Composition and Performance-related settings, such as camera position,
light qualities, animation parameters, etc., can be captured as snapshots (top right).

To the right of the ‘Help’ button is the S-shaped Path tool, which may be used to sketch out bespoke movement paths for objects in the Composition, applied via drag and drop.

For processing tempo-related MIDI data, both the Control and Program
screens feature a 16-step sequencer array that can trigger visual events in the same way as a sequencer triggers audio events in a digital audio workstation.

UI Features

Composition Elements Panel

In this view, the Staging Area’s primary composition’s assembled pieces are
laid out for fast viewing and access. Everything that can be found in the
Staging Area may be found here as well. There is always a Ground Plane, a
Sky Sphere, and one (empty) Construct Group present in a new composition. In the panel’s header, users can search for specific objects or ‘filter by object type’.

All the objects that make up the composition in the Staging Area are laid out
hierarchically in the Composition Elements panel. There are also compound objects, which consist of multiple levels of parents and children. The children of a compound object can be revealed or hidden using the open/close arrows at the top of the item.

The convention of indenting nested objects according to their location in the tree serves as a visual representation of the hierarchy present. All objects in a sibling group have the same level in the hierarchy.

A distinct symbol represents each category of objects. The “eyeball”
icon can show or hide any object. The node icon can be found on the far
right of each parameter. This button generates a node object representing
the object and all of its properties in the Program screen, where it can be
further edited using generative visual programming techniques.

One such item, dubbed “Cylinder 3,” is depicted below in a particular configuration. The Properties panel, which will be detailed in the following section, displays the attributes of the currently chosen object.

UI Features

Properties Panel

All of an item’s attributes are displayed in this panel. In this section, I shall
explain concepts whose meanings aren’t always clear.

Properties, like objects in the Composition Panel, can be represented on the Program panel as nodes by clicking on the corresponding icon. On the
Control panel, each property has a Dial symbol that links it to a Controller.

Glow and Mesh are the two color modes available. The user can choose to
alter the hue of the object’s glow or its primary mesh texture. The brightness of the illumination is also adjustable.

UI Features

Textures in a virtual world aren’t always two-sided by default because doing
so consumes CPU/GPU resources; instead, the decision to make a texture
one- or two-sided is usually made on purpose. The user interface makes it
possible to switch between different states (one-sided or two-sided, for
instance). Albedo, Bump, Normal, Roughness, and other similar maps (jpeg files) are frequently used to define a texture. Each of these may be loaded independently.

UI Features. All values are in the range of 0.0 to 1.0

Media Asset Types in the Stage Screen

The left panel contains six tabs labeled “2D,” “3D,” “Character,” “Motion Capture,” “Video,” and “Audio files,” containing the various media files that are importable and usable. Here are some design variations of a few different user interface elements for these types of assets.

UI Features

Controllers Screen Elements

The Controllers screen, like the Stage screen, contains a group of objects that may be moved around by dragging their handles. The objects on this screen serve as controls for intuitive interaction. The primary function of this interface is to facilitate the creation of unique control schemes.

The top row of the upper portion has features that are nearly identical to
those in the Stage screen, with the exception of the Paths tool, which has
been replaced by icons for wi-fi, bluetooth, and network connections. These are helpful if the performer has to access networked peripherals, such as specialized hardware or wireless controllers, that are not part of the standard set.

The Controllers screen is similar to the Stage screen in that it allows users to design their own unique control panel for playing a Composition by combining a stock set of basic features that provide a great degree of flexibility.

The items in the bottom row of the top part (reading left to right) are as follows (not all are drag-and-drop elements, but each serves a purpose):

UI Features

Container icon: The mockup displays four resizable containers, each of
which represents a container panel icon, a visual organizing tool for dividing up controls into easily understood chunks.

Dial icon: The minimum value is located at the far left of the dial, while the
maximum value is located at the far right.

Slider icon: This slider icon represents a linear control element with a
minimum value at the very bottom (or very left, in the case of a horizontal
slider) and a maximum value at the very top (or very right, in the case of a
vertical slider).

Button icon: an on/off control.

XY Pad: Two-Parameter X-Y Grid Pad interaction.

Pads: Like the square pads on drum machines, a tap on this icon will cause
an accompanying percussive sound to play.

Pitch Bend wheel icon: a circle with a P inside, representing the pitch bend wheel on a MIDI keyboard controller.

Modulation wheel icon: a circle with an M inside, representing the modulation wheel on a MIDI keyboard.

Envelope icon: Drop-down arrow with envelope icon. Users can make their
own envelopes by adjusting the vertices of a line segment whose slopes
depict the values’ evolution over time. ‘Function generator’ is another name
for this feature. The user interface prototype displays five personalized
envelopes in the upper right corner of the main body. The second controller is a line-drawing tool without breakpoint or slope parameters, and it can be accessed via the dropdown menu.

QWERTY keyboard icon: a graphical representation of a standard computer keyboard used for input. MIDI keyboard icon: a graphical representation of a MIDI keyboard used as a control element.

Audio icon: Select the type of audio input to be utilized as controller data
from the audio icon’s drop-down menu: microphone, line level, or audio file. In order to manipulate visual aspects, it is possible to analyze audio for
feature extraction of various qualities (such as beats, frequency regions,
duration, and timbre). The Program interface can also provide audio nodes.
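As a rough illustration of audio feature extraction, the sketch below computes the energy of a frequency band in a sample buffer using a naive DFT (a real analyzer would use an FFT; all names here are hypothetical):

```python
# Sketch: naive DFT-based energy of a frequency band in a mono buffer,
# e.g. bass-band energy driving a visual property such as glow brightness.
import math

def band_energy(samples, sample_rate, lo_hz, hi_hz):
    """Summed spectral energy of the band [lo_hz, hi_hz) in `samples`."""
    n = len(samples)
    energy = 0.0
    for k in range(1, n // 2):
        freq = k * sample_rate / n
        if lo_hz <= freq < hi_hz:
            # Real and imaginary parts of DFT bin k
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            energy += (re * re + im * im) / n
    return energy
```

A 100 Hz tone, for example, yields far more energy in a 50–150 Hz band than in a 300–400 Hz band, which is the kind of signal a beat-reactive visual mapping would key off.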

The MIDI file icon represents a MIDI text file that may be imported and
utilized as control data.

The large gray rectangle in the middle of the UI mockup represents a video
window that can be dragged and dropped anywhere on the Controllers
display. The down arrow allows the user to switch between the embedded video
screen and a virtual reality headset. The user should be able to tell whether the
virtual reality headset is active by looking at a different version of the icon.

To view a larger image, you can click the “Floating Video Screen” icon, which will open a new window that can be dragged to a different screen and resized independently of the Stage window.

Frame-sharing icon: toggles live frame sharing with another app on the same device, like a
video mapping app, via a virtual server. It can be toggled on and off, but it cannot be dragged into the control layout.

The sticky notes icon activates a note-taking feature from the controller settings menu. Two large sticky
notes are depicted in the UI mockup.

To link a software control element to an external hardware MIDI controller,
click the MIDI icon. This is not a drag-and-drop element, but rather the MIDI Learn function.

Like the MIDI icon’s MIDI Learn feature, the OSC (open sound control) icon isn’t a drag-and-drop element, but rather the means by which an OSC
controller is associated with a software controller.

One of many potential applications for the time display — if you’re
performing live, you can check the time that has passed by clicking the Time button with your left mouse button. By right-clicking on Time, you can limit the length of the recording if you plan on using the Envelope controllers as your primary means of scoring a piece.

When you press the record button, a video of your performance will be
captured and added to the Gallery’s Performances section. Each video screen (embedded or floating) has its own player controls, so performances can be played back in a variety of ways. The Stage area is also suitable for playing performances. A Play option will also appear if you right-click on a
Performance (native or rendered video).

To create a continuous video loop of your performance, tap the Loop symbol after pressing the Record button. The Loop option ensures that the
parameters of the first and last frames recorded are the same, allowing the
Performance to loop without interruption. In the Options section of the
Settings panel, you’ll find a number of options for creating seamless loops.

To enable or disable MIDI tempo information, such as that from an external source, press the Sync button. If you turn on Sync and then right-click on a control element such as a Pad, it will begin to activate in time with the incoming tempo pulses.

Tempo is quite similar to Sync, only it allows the user to choose their own
internal Tempo rather than sync to an external MIDI sync signal. It can be
toggled on and off with a left click, and its tempo value can be entered
numerically with a right click. Similarly, activating a Trigger Pad control
object with the right mouse button makes it responsive to the tempo setting.

The primary (lower) component of the interface mockup depicts a fictitious arrangement of a user’s individualized control elements structured (mainly) within Panel objects surrounding an embedded video window in the center.

Program Screen Elements

Program Screen

Many of the same controls seen on the Controllers screen are also present on the Program screen. The composition can be played and paused by clicking the play button (or pressing the space bar) at the top left. This control is identical to the one on the Stage screen at the top of the Staging Area. Similarly, if no Performances have been saved, this field will take on the name of the Project.

The node editor is a visual programming environment that features a search bar for finding object-nodes to insert. The primary goal of the Program screen is to extend the Controllers screen’s interactive controls with a deeper generative layer. The most common scenario for employing stored Performances is expanding upon a simpler version in the Program section.

Previously recorded performances may be viewed in the Program screen,
where several alterations can be made. Settings for native (non-rendered) performances can be modified; rendered video files of performances cannot be changed in this section of the software.

UI Features

The user can also make builds by clicking the export symbol, and test them
by clicking the tool icon.

Sequence Screen Elements

Sequence Screen

Unreal’s real-time rendering of sequences sets the Lumia sequencer apart
from other video editors, allowing users to make real-time changes to
Lumia-native content while working with a Performance clip in the editor.
But in general, the process is very similar to that of traditional video editing.

Because one is typically editing pre-rendered Performance data sequences,
one doesn’t require elaborate editing tools. Since most users would be
content to toggle between two video tracks when arranging a Performance,
two video tracks and a single audio track suffice.

A third track type, Keyframes, supports keyframe editing so that the
sequencer remains a powerful one. Keyframe tracks sit between the video and
audio tracks and let you automate the values of property parameters over time.
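As an illustration of what a keyframe track automates, the sketch below linearly interpolates a property value between keyframes. The function name and the `(time, value)` representation are my own assumptions for the example, not Lumia's stated data model:

```python
def evaluate(keyframes, t):
    """Linearly interpolate a property value at time `t` from a keyframe
    track, given as a sorted list of (time, value) pairs."""
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)
    # Past the last keyframe, hold its value
    return keyframes[-1][1]
```

A track like `[(0.0, 0.0), (2.0, 10.0)]` would, for instance, yield a value of 5.0 at the one-second mark, then hold at 10.0 after two seconds.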

Similarly, a single audio lane to edit against should suffice, since a user will typically cut to one music track. In the space containing the three stacked timelines, the play bar (the current time position) appears as a thin vertical gray line.

When editing video, the standard layout calls for one viewer to display the
complete sequence, while the other displays the clip currently selected in the left panel or the timeline. Each viewer features standard video controls and time readouts, in addition to a slider that can be used to swiftly navigate to a specific point in the timeline.

There are two separate sections in the left panel area:

1. Recorded performance snippets.
2. Short pieces of music or other audio.

The panel’s lower section serves as a sequence organizer. Edited
performances are what make up a Sequence. As a result, a user can make
their own Sequences and new Folders to put them in.

While the majority of this user interface is self-explanatory, there is a
collection of icons that may look out of place, so I will briefly go through
them. These are the symbols for several tools that sit above the three
primary timelines (two for performances and one for audio):

UI Features

These icons provide fast changes to the Sequence’s data. From left to right,
they are:

Cut (splice), Fade In, Fade Out, Crossfade (sometimes termed Cross
Dissolve), and Interpolate. The Interpolate emblem is the atypical one: its
goal is to fill in the data between adjacent clips. Because the sequences are made up of real-time data values of the primary Composition, this transition is designed to interpolate those data values between two
Performance clips in order to create a more fluid transition.
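A minimal sketch of that idea, assuming each frame of a Performance is a set of named parameter values (the function and data shape are my own illustration, not the described implementation):

```python
def interpolate_transition(out_frame, in_frame, progress):
    """Blend the parameter values of the outgoing clip into those of the
    incoming clip; `progress` runs from 0.0 (outgoing clip only) to 1.0
    (incoming clip only) across the transition window. Both frames are
    dicts mapping parameter names to float values."""
    return {key: out_frame[key] * (1 - progress) + in_frame[key] * progress
            for key in out_frame}
```

Halfway through the transition (`progress=0.5`), every shared parameter sits exactly between its values in the two clips, which is what produces the fluid hand-off rather than a hard cut.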

This is a nonstandard icon for an audio transition. The goal here is to
provide a sonic transition with a few brief ’stings’ (whooshes, impacts,
stingers, etc.). When the transition is placed on a cut, a popup would emerge from which the user could choose an audio file from the audio elements on the left panel, or they could simply drag and drop an audio file onto the transition.

The appearance of this transitional element deviates from the editing norm. The atypical arrow shape is supposed to suggest that the user can set the wipe direction to be in any of several possible orientations (left to right, top to bottom, bottom corner to top corner, etc.).

These two symbols appear on the left side of the timeline:

UI Features

These buttons will allow you to 1) render a video of the Sequence and 2)
create a Performance from the Sequence, the latter of which will appear as
a Lumia-native Performance in the Gallery screen’s performance area.

These two components are located on the left side of the timeline:

UI Features

The left icon can be used to reduce the height of existing clips. The slider on the right allows you to skip ahead or back in the timeline (it moves the play head, the thin vertical gray line). The timeline tracks can be navigated vertically using the straightforward left-side scroll bar.


This has admittedly been a very long article, thanks for reading all the way
down to the bottom!