VJ-ing with Unity

Takayosi Amagi
6 min read · Sep 4, 2018


How I built my own VJ system with Unity and performed with it

Hi, I’m Amagi, a visual art freak in Kyoto, Japan.

On Aug 27, I VJ’d at ADIRECTOR CHANNEL, held in Tokyo. This time I performed with my own VJ app built with Unity⚡

ADIRECTOR CHANNEL was an exhibition of technology and art that used an entire four-floor building. I VJ’d as a member of BRDG, a visual-art team in Tokyo, alongside DJs from Radical Hardcore Clique, surrounded by six semi-transparent screens.

It was so damn cool… I was really happy to VJ on such a great stage😍

BTW, this time I built my own VJ app with Unity. Here are some examples:

In this article I’ll explain why I chose Unity, how I built the app, and what I learned from it.

Why I chose Unity

Usually I use TouchDesigner or VEDA for VJing, but this time I needed a system that could output to two FHD screens (three screens for the front/rear of the DJ).

VEDA is a VJ system for GLSL livecoding that I built for the Atom editor.

It’s especially good for the generative-art scene and parties for geeks. I’ve VJ’d with VEDA at events such as Algorave Tokyo and NodeFest Tokyo.

Sadly, VEDA doesn’t support multi-display output. It seemed quite easy to implement, but I was afraid to rely on brand-new features at such a big event. In addition, I felt like creating more vivid animations, not ones with a livecoding-like, academic taste.

With TouchDesigner, we need to buy a commercial license to output to screens larger than 1280x1280.

Another option is Unity. I see many artists using Unity for VJing and generative-art exhibitions; Keijiro Takahashi is one of the most famous artists using it.

I changed jobs this June and started using Unity. I figured that building a VJ system with Unity would be good practice for my skills, so I chose Unity.

Architecture

So complicated… 😇 Here are the features:

  • 2 outputs + 1 preview output
  • Loading multiple scenes with ADDITIVE mode
  • Using video input with WebCamTexture
  • Control scenes/FXs with Keyboard and MIDI controller
  • Raymarching!!!

3-display output

As I mentioned above, I had to output to two FHD screens. In addition, I wanted to see which scene was selected and how the screens would look… so I created a preview screen.

Unity apps can output to multiple displays by creating a camera for each display. This time I wanted cameras to switch scenes and to add FXs to the front/rear screens, so I had to create seven cameras in the preview scene.
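The multi-display setup works roughly like this (a minimal sketch; the camera-to-display mapping here is my assumption, not the exact seven-camera rig from the performance):

```csharp
using UnityEngine;

// Sketch: activate all connected displays and route one camera to each.
public class MultiDisplaySetup : MonoBehaviour
{
    // Assumed layout: cameras[0] = preview, cameras[1] = front, cameras[2] = rear.
    [SerializeField] Camera[] cameras;

    void Start()
    {
        // Display.displays[0] is the primary display and is always active;
        // the others must be activated explicitly (standalone builds only).
        for (int i = 1; i < Display.displays.Length; i++)
        {
            Display.displays[i].Activate();
        }

        // Each camera renders to its own display (0-based index).
        for (int i = 0; i < cameras.Length && i < Display.displays.Length; i++)
        {
            cameras[i].targetDisplay = i;
        }
    }
}
```

Note that extra displays only become active in a standalone build, not in the editor, which makes a preview camera all the more useful during development.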

I didn’t have time for a rehearsal, so I practiced a lot at my office:

Loading multiple scenes

In Unity, a scene generally represents one world, but we can run multiple scenes simultaneously by loading them with SceneManager.LoadScene(scene, LoadSceneMode.Additive). For a better development experience, I created a SubSceneController and placed it in each scene so that the scenes automatically switch their render target, texture, or display.
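Additive loading looks like this (a minimal sketch; the scene names are placeholders, not the actual project’s scenes):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: toggle a sub-scene on top of the running scenes with the 1 key.
public class SceneSwitcher : MonoBehaviour
{
    bool loaded;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Alpha1))
        {
            if (!loaded)
            {
                // Additive mode loads the scene WITHOUT unloading the current ones.
                SceneManager.LoadScene("SubScene1", LoadSceneMode.Additive);
            }
            else
            {
                SceneManager.UnloadSceneAsync("SubScene1");
            }
            loaded = !loaded;
        }
    }
}
```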

Video input using WebCamTexture

I love VEDA and want to use it whenever I can, so I ran VEDA on another PC and sent its output to the main PC as a video input.

We can use video inputs as textures using WebCamTexture.
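A minimal sketch (assuming the capture device carrying VEDA’s output shows up as the default webcam):

```csharp
using UnityEngine;

// Sketch: use the default video input device as a texture on this object.
public class VideoInput : MonoBehaviour
{
    void Start()
    {
        // Request an FHD feed; Unity falls back to what the device supports.
        var webcam = new WebCamTexture(1920, 1080);
        GetComponent<Renderer>().material.mainTexture = webcam;
        webcam.Play();
    }
}
```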

Controlling FX with Keyboards and MIDI controller

I used the keyboard to control scenes:

  • Number keys: scene ON/OFF
  • Character keys: FX ON/OFF
  • Left Shift / Right Shift: switch between front/rear screens
  • SPACE: toggle while pressed
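The key mapping above can be sketched like this (ToggleScene/ToggleFx are placeholders for the real scene/FX controller logic, not actual names from my project):

```csharp
using UnityEngine;

// Hypothetical sketch of the keyboard mapping.
public class VJKeyboard : MonoBehaviour
{
    bool targetFront = true;  // which screen the next key press affects

    void Update()
    {
        // Left/Right Shift switch between the front and rear screens.
        if (Input.GetKeyDown(KeyCode.LeftShift))  targetFront = true;
        if (Input.GetKeyDown(KeyCode.RightShift)) targetFront = false;

        // Number keys toggle scenes; character keys toggle FXs.
        if (Input.GetKeyDown(KeyCode.Alpha1)) ToggleScene(1, targetFront);
        if (Input.GetKeyDown(KeyCode.Q))      ToggleFx("glitch", targetFront);
    }

    void ToggleScene(int index, bool front) { /* switch scene ON/OFF */ }
    void ToggleFx(string name, bool front)  { /* switch FX ON/OFF */ }
}
```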

I also used a nanoKONTROL2 to control parameters:

  • nanoKONTROL2 faders: levels of the scenes
  • nanoKONTROL2 knobs: FX strengths

I know it’s complicated… but I thought this was the best way to achieve my goal without being bothered by machine trouble.

I used MidiJack to get MIDI signals:
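Reading the controller with MidiJack looks something like this (the CC numbers 0–7 for faders and 16–23 for knobs are the nanoKONTROL2 defaults as I understand them; the exact mapping depends on the device configuration):

```csharp
using UnityEngine;
using MidiJack;

// Sketch: poll nanoKONTROL2 faders/knobs via MidiJack's MidiMaster.
public class MidiControl : MonoBehaviour
{
    void Update()
    {
        // Fader 1 (CC 0) → level of the first scene, normalized to 0.0–1.0.
        float sceneLevel = MidiMaster.GetKnob(0, 0f);

        // Knob 1 (CC 16) → strength of the first FX.
        float fxStrength = MidiMaster.GetKnob(16, 0f);

        // …apply the values to scene alpha / FX parameters here.
    }
}
```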

Raytracing

VJ Raytracing! Here’s the scene I made with raytracing:

Actually, this is a bit different from the raytracing technologies used in movies or high-quality games; this technique is called raymarching. Raymarching is used a lot in the demoscene and generative-art scenes, as it enables rendering complicated objects such as fractals, metaballs, etc.

I used hecomi/uRaymarching to create the raymarching scene. With uRaymarching, we can create raymarched objects by writing a distance function in the material’s inspector. Super easy!!

Distance Function Editor in uRaymarching

I copy/pasted the distance function from my Shadertoy:
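A distance function for uRaymarching looks something like this (a generic repeated-sphere example for illustration, not the actual function from my Shadertoy):

```hlsl
// Signed distance function: returns the distance from pos to the
// nearest surface (negative inside). This one repeats a sphere of
// radius 0.5 on a grid with period 2.
inline float DistanceFunction(float3 pos)
{
    float3 p = fmod(abs(pos), 2.0) - 1.0;  // fold space into a repeating cell
    return length(p) - 0.5;                // sphere SDF inside each cell
}
```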

What I learned

Performance matters

When I started implementing the multi-scene project it ran at 60 fps, but as the scenes became more beautiful, they got slower and slower… 😇 The raymarching scene was especially heavy in multi-scene mode, even though it ran at 60 fps in single-scene mode.

In GPU-heavy scenes, Unity’s Stats window doesn’t show the actual FPS, so I had to calculate it myself using Time.deltaTime. I used the script below:
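The idea is simple (a minimal sketch of a Time.deltaTime-based counter, not the exact script from the gist linked below):

```csharp
using UnityEngine;

// Sketch: smoothed FPS counter based on Time.deltaTime. Works in builds.
public class FpsCounter : MonoBehaviour
{
    float smoothedDelta;

    void Update()
    {
        // Exponential moving average of the frame time to avoid jitter.
        smoothedDelta += (Time.deltaTime - smoothedDelta) * 0.1f;
    }

    void OnGUI()
    {
        float fps = 1f / smoothedDelta;
        GUI.Label(new Rect(10, 10, 120, 24), $"{fps:0.} FPS");
    }
}
```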

An accurate FPS counter for Unity. Works in builds. · GitHub

The main bottlenecks were the Post-processing Stack and uRaymarching. I adjusted the post-processing parameters, for example removing trivial depth of field and motion blur. I also struggled with uRaymarching: I reduced the raymarching loop count, and finally the scene ran at almost 30 FPS.

I thought that was smooth enough, but I was shocked when I saw other VJs playing at 60 FPS… I can’t quite explain it, but a scene feels more real when it runs at 60 FPS. I’ll take a closer look at performance before my next VJ set.

Controller must be easy

While I was on stage, I lost track of which scenes/FXs were ON several times. I wanted to show the state of the scenes/FXs on the preview screen, but I didn’t have enough time to implement it. It’s not difficult, so I’ll do it by next time!

Don’t overdose on bloom

Bloom is useful for making scenes beautiful, but when we use too much of it, the screen gets too bright. It’s a typical mistake… I should’ve kept my scenes simpler and clearer.

Being ambitious is good, but not too much

so the code became a complete mess😇

So, this was my first VJ set with Unity. I feel Unity is good for motion graphics, as we can build beautiful scenes quickly and use Asset Store assets.

I really enjoyed it!!! I hope this article helps you create great visuals with Unity😎😎😎

I’m Amagi, a visual art freak in Kyoto, Japan. I create animations and VJ using Unity, WebGL, GLSL, etc.

Follow me on Twitter and Instagram!!


Takayosi Amagi

Visual art freak in Kyoto. Loves GLSL, WebGL, Unity. Developer of VEDA: https://veda.gl/