Unite Tokyo 2019 Conference Notes #UniteTokyo

Chuy
10 min read · Sep 26, 2019


Unite Tokyo 2019 was a developer conference held by Unity Technologies Japan from 2019/9/25 to 2019/9/26. It was a very cool event, full of new inspiration for the game and multimedia industries. I would like to share the summary notes I took there!

Main Visual

You can find the slides of each session here:

Note: I am not a Unity engineer (but I actively seek to be!), so I apologize if there are any mistakes.

Overview Photos

The event was held at Grand Nikko Odaiba, a luxurious hotel. The ticket price was about 26,000 yen, so it was quite an exclusive event.

The Reception
Hallway
Buffet Party (1st day)
Made with Unity games

Understanding C# Struct All Things

  • Structs can perform better than classes (no heap allocation, no GC pressure)
  • "Everything is copied": every assignment and method call copies the whole value, so do not create a big struct (over 16 bytes). See the sketch after this list.
  • Tip: change your IDE colors so classes and structs are highlighted differently
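
To make the "everything is copied" point concrete, here is a minimal sketch (the type and field names are made up for illustration) contrasting struct copy semantics with class reference semantics:

```csharp
using UnityEngine;

public struct PointStruct { public float X; }
public class PointClass { public float X; }

public class StructVsClassDemo : MonoBehaviour
{
    void Start()
    {
        var s1 = new PointStruct { X = 1f };
        var s2 = s1;          // the whole value is copied
        s2.X = 99f;
        Debug.Log(s1.X);      // prints 1: s1 is untouched

        var c1 = new PointClass { X = 1f };
        var c2 = c1;          // only the reference is copied
        c2.X = 99f;
        Debug.Log(c1.X);      // prints 99: c1 and c2 are the same object
    }
}
```

The same copying happens when a struct is passed to a method, which is why large structs get expensive.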

Unity in Live Entertainment

“Twice Dome Tour 2019 #DreamDay” case study

This case study is about how to synchronize real moving lights with virtual moving lights in real time. In short, they capture the DMX signals emitted by the lighting console and interpret them inside the virtual space to create a seamless lighting effect.

Art-Net

Tools

  • Art-Net (DMX over Ethernet). The ArtNet.Net library is available; a rough receive sketch follows after this list.
  • UDP Recorder & Player, used to record the UDP signal and replay it for debugging
  • Hx Volumetric Light — available on Asset Store
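
To illustrate the idea of catching DMX over the network, here is a rough sketch of a receiver, assuming the standard Art-Net packet layout (UDP port 6454, DMX data starting at byte 18 of an ArtDmx packet). The production setup used the ArtNet.Net library; this is only a simplified illustration.

```csharp
using System.Net;
using System.Net.Sockets;
using UnityEngine;

// Simplified Art-Net listener: collects the latest 512 DMX channel values.
public class ArtNetReceiver : MonoBehaviour
{
    const int ArtNetPort = 6454;              // standard Art-Net UDP port
    public byte[] Channels = new byte[512];   // latest DMX channel values
    UdpClient client;

    void Start() => client = new UdpClient(ArtNetPort);

    void Update()
    {
        // Drain every packet that arrived since the last frame.
        while (client.Available > 0)
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            byte[] packet = client.Receive(ref remote);
            // ArtDmx packets carry OpCode 0x5000 at bytes 8-9 and DMX data from byte 18.
            if (packet.Length > 18 && packet[8] == 0x00 && packet[9] == 0x50)
            {
                int count = Mathf.Min(packet.Length - 18, Channels.Length);
                System.Array.Copy(packet, 18, Channels, 0, count);
            }
        }
    }

    void OnDestroy() => client?.Close();
}
```

A lighting script can then map individual channels to a virtual fixture, for example one channel to a Light's intensity and two channels to the pan and tilt rotation of a moving head.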

Know-how

  • Rent real moving lights during program development
  • Get the light mapping blueprint from the lighting engineer and position the virtual lights accordingly
  • Adjust the light shader so it looks more intense near the light source
  • Place the virtual camera at the same position as the real camera to create a seamless camera path between the real and virtual shots

Merits

  • Colors can be adjusted right up until the performance
  • Lighting operators can control the virtual lights the same way they control the real ones
  • Stage objects can be changed instantly
  • It’s thrilling to do it LIVE, not pre-recorded!

Virtual Live — Kaguya Luna case study

KAGUYA LUNA VR LIVE case study

This case study is about how they designed and held KAGUYA LUNA VR LIVE, billed as the world's first virtual concert, first aired on 2018/8/31 on the cluster virtual event platform. It was special because they also considered non-VR users, preparing two channels for viewers: VR and live viewing in theaters.

How to prepare the live

  • A limitation of the cluster platform is that you can only have one scene, so they used the Timeline feature to build one long scene. However, that was not efficient for group work, so to split the work they changed the workflow: develop in separate scenes and merge them additively ("SceneAdditive"). See the sketch after this list.
  • The core team was only six people, all freelancers, and the project lasted two months.
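
A minimal sketch of that additive workflow, with hypothetical scene names: each member builds their own scene, and a loader merges everything into the single running scene with LoadSceneMode.Additive.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: merge separately developed scenes into one running scene.
// The scene names here are hypothetical examples.
public class LiveSceneLoader : MonoBehaviour
{
    [SerializeField] string[] partScenes = { "Stage", "Lighting", "Effects" };

    void Start()
    {
        foreach (var sceneName in partScenes)
            SceneManager.LoadScene(sceneName, LoadSceneMode.Additive);
    }
}
```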

Live design

  • Viewers can get close to the performer and move together with her, creating a more immersive experience.

Live story gimmicks

  • Q Line: a reference to Disney Resort queueing, which effectively builds up the audience's anticipation. Design the waiting space with the same care.
  • Ochi (the payoff): a climax event, for example transporting viewers to a different place, to create a strong emotional impact

Modeling Tips

  • Roughness map: a must-have for believable textures
  • Shadows: cannot be baked
  • VR scale: build the space and check it from all 360-degree angles

Sound

  • They cooperated with the songwriters to create sounds made specifically for VR

Camera: three types only

  • Rail camera
  • LookAt camera (see the sketch after this list)
  • Long-distance Fixed camera
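
For reference, a LookAt camera can be as simple as a script that keeps the camera aimed at the performer every frame. This is a generic sketch, not their actual implementation.

```csharp
using UnityEngine;

// Generic LookAt camera: keep the camera pointed at a target (e.g. the performer).
public class LookAtCamera : MonoBehaviour
{
    [SerializeField] Transform target;   // the performer's transform

    void LateUpdate()
    {
        if (target != null)
            transform.LookAt(target);
    }
}
```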

VR Live merits

  • Infinite effects
  • Live viewing and VR each have their own strengths
  • Viewers can interact with the performer
  • No access limit. Can join from anywhere.

Why do a publishing company and a game company misunderstand each other? A Dragon Ball game case study

This presentation was a panel discussion among four speakers:

  • Torishima-san: chairman of Hakusensha, previously chief editor of Shukan Shounen Jump.
  • Uchiyama-san: Bandai Namco game producer
  • Unozawa-san: Bandai Namco Holdings IP adviser
  • Taira-san: Denfami Nico Gamer chief editor

They talked about many troubles that happened in the past when Bandai Namco was creating games using Shounen Jump's IP (intellectual property), specifically the Dragon Ball series.

Dragon Ball Z PS2 Game Project (2003)

  • Bandai was developing a Dragon Ball game without confirming the content with the Jump editorial department
  • When Uchiyama-san brought the alpha version of the game to present to Torishima-san, Torishima-san said, "Sorry, but please throw this away"
  • The reason was not given at the time, but today he explained that it was not Dragon Ball, just an imitation. Without the editorial department's approval, the game was no different from a pirated version
  • Bandai had to fully redo the 3D models and animations. Of course, the development cost skyrocketed.

Why did Bandai proceed without confirmation?

  • The Dragon Ball anime had finished airing in 1997, while the PS2 game in question was released in 2003, leaving quite a long blank period.
  • At that time, a ‘Hokuto no Ken’ revival game was a big hit. Bandai thought a ‘Dragon Ball’ revival would be a hit too, so they would just make one for Jump.
  • As a result, there was no emotional attachment to the characters, and that attitude was reflected in the finished product.

Reflection of the team: we need to create a ‘character’, not just 3D polygons.

What you are creating is a pirated version of the work!

Takeaways

  • Writing is very hard work, and the brand is very important. The editorial department must protect the brand's quality to keep its value for as long as possible. Even after the story is finished, they will not let it be sold cheaply. An IP is not just a thing, it is a collection of living characters, so game developers must understand the characters' feelings thoroughly before creating an IP game.
  • Torishima-san said: children are a difficult audience. They have very limited money to pay for your work, so think of them, not of adult circumstances.

Recreating a scene from the cel-look CG animation film “HELLO WORLD” using Unity

Note: no slides were shared for this session

This was a highly anticipated session, since we could hear the making-of story of the recently released animated film HELLO WORLD, which opened in theaters on 2019/9/20. I had just watched the film and it is very well made, entirely in 3DCG. The story is… well, the last scene is not quite understandable, but it is still very impressive, haha.

Speakers
HELLO WORLD movie trailer

This session is about how they recreated a cinematic scene from the film using Unity, in order to explore the possibility of using Unity for full animation production. It was a collaboration project between Graphinica and Unity Technologies Japan.

The HELLO WORLD production concept was to create a 3DCG animated film with the same quality as 2D animation. The movie itself was produced with a traditional workflow using 3ds Max and Adobe After Effects.

Verification theme: can we create a realtime animation in Unity while keeping the same quality as an animation created with the traditional 3DCG workflow?

Real composition example: over 20 layers
Scene recreated with Unity

Recreating the scene is much more difficult than it looks on screen. The team brought over 20 layers of AE compositions to render in real time in Unity, and 2D-animation-like shadows require special attention.

Issues

1. Converting assets from the DCC tool to Unity is difficult

Models from 3ds Max are far more complex and cannot be played back in real time, so the Unity Japan team used the MeshSync & Scene Cache features to convert the data. The result is a timeline that is fast and scrubbable in real time: they can drag frames back and forth while seeing a final-quality preview. In comparison, 3ds Max takes about one second to render one frame.
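
As an illustration of what "scrubbable in real time" means, a Timeline driven by a PlayableDirector can be jumped to an arbitrary time from script and evaluated immediately. This is a generic sketch, not the team's actual tooling.

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Generic sketch of scrubbing a Timeline from script: set the time and evaluate,
// so the frame updates immediately without playing through the sequence.
public class TimelineScrubber : MonoBehaviour
{
    [SerializeField] PlayableDirector director;
    [Range(0f, 1f)] public float normalizedTime;   // e.g. bound to a UI slider

    void Update()
    {
        if (director == null) return;
        director.time = normalizedTime * director.duration;
        director.Evaluate();   // render the current frame right away
    }
}
```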

2. Toon-shaded texture rendering is difficult

Sample of UTS2 + DXR ray-traced hard shadow
  • Use UTS2 plus DXR ray-traced hard shadows to create the shadows for the animation.

3. The final-look effects need to be rendered in real time

  • Pencil+ 4 Line for Unity creates the line expression
  • Normally, flares and effects are added on top of the pictures in AE. This time they wanted to do everything in Unity, so they used post-processing and various other tools (a rough sketch follows below).
  • Nodes are used (normally not part of a Unity workflow)
Reproduction level: 85%. You can still see subtle shadow differences when comparing the Unity rendering (left) to the film (right)
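
For the flare and glow part, one way to stay entirely inside Unity (not necessarily what the team used) is the Post Processing Stack v2. A rough sketch, assuming that package is installed and the camera has a PostProcessLayer whose volume mask includes this object's layer:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Rough sketch: build a global post-process volume with Bloom at runtime.
// Assumes Post Processing Stack v2 and a PostProcessLayer on the camera.
public class GlowSetup : MonoBehaviour
{
    void Start()
    {
        var profile = ScriptableObject.CreateInstance<PostProcessProfile>();
        var bloom = profile.AddSettings<Bloom>();
        bloom.enabled.Override(true);
        bloom.intensity.Override(4f);     // strength of the glow
        bloom.threshold.Override(1.1f);   // only bright pixels bloom

        var volume = gameObject.AddComponent<PostProcessVolume>();
        volume.isGlobal = true;
        volume.priority = 1f;
        volume.profile = profile;
    }
}
```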

Future of Unity for animation

  • Reduce rendering cost and data cost
  • Compositing can be done in parallel with animation work, which means faster production
  • The director can check the finished quality in real time, so the revision cycle becomes much faster

Final words

  • With Unity, we will be able to create anime-based games in the near future.
  • Although work still remains, Unity has great potential to become a tool that changes the animation creation workflow
New workflow using Unity

From automobiles to cinema movies: the power of Unity anime

This was the last session I attended, and personally the one I liked the most, since it presented how Unity is being applied to real-world applications beyond entertainment. It made me excited about the future we will reach soon.

Introduction (CRAFTAR, Inc.)

Animation covering more and more fields
  • Animation has been extending into many fields and plays an important role as a user interface.
  • Unity connects the animation, VR, and automobile industries together

DENSO case study

  • DENSO is an automobile parts supplier
  • The automobile industry revolution is a major trend, summed up as CASE: Connected, Autonomous, Shared, Electric
  • Question: How to make travel a pleasant experience?
Future Cabin Concept anime
  • Answer: they made the “Future Cabin Concept” animated movie and a VR version at the same time, so there is no border between anime and games anymore.
  • In theory, you can jump into the anime whenever you want.
PR video version for business partner
Same animation but as VR application
  • Users can dive into the same anime in VR. DENSO provides the technology to enhance the experience (seat and temperature adjustment)

Why is a VR animation better than a photoreal one?

  • Users can see what they cannot see in reality: temperature, air currents, scents
  • “Reality” rather than “real”: anime can convey a particular experience better than a photoreal expression

Cinema rendering: the “Ashita Sekai ga Owaru to Shitemo” case

Ashita Sekai ga Owaru to Shitemo movie
  • The mob (crowd) scenes of this movie were created in Unity with UnityChanShader & Pencil+ for Unity
Shader
  • Many things were not possible in Unity, so they asked Unity for help. For example, they requested a feature to export RGBA from the FrameCapture component (normally only RGB can be obtained), and Unity developed it for them in just one month. A generic sketch of the RGBA idea follows below.
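
The FrameCapture feature itself is internal to their pipeline, but as a generic illustration of what exporting RGBA (rather than RGB) means, here is a sketch that reads a RenderTexture back to the CPU with its alpha channel and saves it as a PNG:

```csharp
using System.IO;
using UnityEngine;

// Generic sketch (not the FrameCapture component): read a RenderTexture back
// with its alpha channel intact and write it out as a PNG.
public static class RgbaReadback
{
    public static void SaveWithAlpha(RenderTexture source, string path)
    {
        var previous = RenderTexture.active;
        RenderTexture.active = source;

        // RGBA32 keeps the alpha channel that a plain RGB capture would drop.
        var tex = new Texture2D(source.width, source.height, TextureFormat.RGBA32, false);
        tex.ReadPixels(new Rect(0, 0, source.width, source.height), 0, 0);
        tex.Apply();

        File.WriteAllBytes(path, tex.EncodeToPNG());

        RenderTexture.active = previous;
        Object.Destroy(tex);
    }
}
```

The alpha channel matters here because composited anime layers need transparency to be stacked the same way AE layers are.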

Issues

  • There is no rendering time, but preparing the materials takes too much time
  • Converting 3ds Max data to and from Unity is difficult and costly
  • Shadow map glitches are very obvious. The ray-traced hard shadow (newly developed by Unity) alleviated this problem.
  • Currently, the FPS is below 30. If cloud services can improve performance, we will be able to render film-quality animation in real time.

Takeaways

  • If you want to create anime using Unity, use Unity from start to finish!
  • Use Marvelous Designer for clothing setup and clothing simulation

VR Director talk

How creation will change with Unity

Agile process in animation making: discuss, change, and preview in real time
  • The animation creation process has been waterfall; it will change to an agile process.
  • We will be able to change the animation even after the storyboard (絵コンテ) is decided. As in the DENSO movie case, they discussed changes a lot even after production had started; realtime rendering is the key to this.
  • All industries will be digitalized, so everything will be “Unity-ed”

How consumption will change with Unity

  • We can change the output for each customer while using the same work assets
  • “One Anime Multi-Use” — One asset for animation, application, and VR
  • New interaction inside a car, as seen in DENSO video

Future Project

Eye-tracking virtual PoC
  • QuickCabin VR Virtual PoC
  • You can create virtual mock-ups of many variations in VR, at far less cost than real mock-ups.
  • Test eye-tracking heat maps to create a system that prevents accidents

“Unity has moved from Game Engine to Game Changer”

Finally

Conference pass

Thank you for reading, or even skimming, this long article. I had a lot of fun and gained new inspiration from Unite Tokyo 2019. It was my first time participating in a Unite Tokyo event. From my perspective, as seen in the last session, Unity usage is expanding into many application fields, and among them, the anime creation features have been upgraded dramatically. It is my dream to create an original anime, so I am excited to try them. Hopefully, next year I will be able to attend the event as a Unity engineer or a Unity creator!


Chuy

iOS software engineer @ REALITY Inc., Japan. Comes from Thailand.