Unity AR VR — Code Quality and Integrity

Panayot Cankov
Published in Telerik AR VR · Jul 16, 2018 · 9 min read

Software and hardware are fragile, but over the years we have learned a few things that help ensure code quality and integrity.

My mom and dad are engineers. When I was a kid, our house was full of batteries, diodes, magnets, wires, and weird circuits, and we had an 8-bit Pravetz PC. Newspapers used to pay the authors of new crossword puzzles, so my dad wrote a crossword generator in BASIC. He spent two days writing the algorithm and about a month feeding in the word database. There was no internet back then, nor hard disks; programs were saved on 5.25" magnetic floppy disks. It turned out that a kid with a magnet can wipe out a month's worth of work in a second.

Image source: electronicstechnician.tpub.com

Working on Unity games, or AR VR, is no excuse to be sloppy. Here is how we maintain the Unity assets we have at Progress Telerik.

Source Control

Git

Git is extremely flexible, and it has some handy features when it comes to Unity too.

It is easy to revert to a known good state, and with Unity scenes you will use this often. You can quickly switch contexts using local branches, letting you rapidly experiment with different approaches without losing work in the process.

Pushing a local branch to the server allows you to save your temporary progress when working on large tasks, and you can share it with colleagues to collaborate. Finally, submitting a pull request to your master branch will add your changes to the official version of your source code. More importantly, you can attach policies to pull requests that require CI builds to pass.

And don’t forget the peer pull request reviews.

Microsoft has state-of-the-art integration between Git, Visual Studio, VSTS, and TFS. Here is a good place to get started.

When it comes to Unity, the only pain point is merging serialized content. Scenes, prefabs, etc. are saved in a YAML format. Unity YAML, although text-based, is not as easily readable as XAML, and if you have a lot of changes in a scene, following the various GUIDs around can be tedious. However, Unity provides a tool that does semantic merging: Unity Smart Merge (UnityYAMLMerge). In our experience we hit some limitations when using Smart Merge for all files, so we prefer to use it for Unity files only. Here is my .gitconfig and the commands I use to merge .unity scenes from the command line.
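In broad strokes, the setup registers UnityYAMLMerge as a Git merge tool. A typical configuration looks roughly like the sketch below; the Unity install path is machine-specific, and the scene path in the usage example is only an illustration, not necessarily our exact setup.

    [merge]
        tool = unityyamlmerge

    [mergetool "unityyamlmerge"]
        trustExitCode = false
        cmd = 'C:\Program Files\Unity\Editor\Data\Tools\UnityYAMLMerge.exe' merge -p "$BASE" "$REMOTE" "$LOCAL" "$MERGED"

With that in place, a conflicted scene can be resolved with something like:

    git mergetool --tool=unityyamlmerge Assets/Scenes/Main.unity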

TFS

TFS has splendid integration with Git.

We have an in-house computer with a separate Unity user and seat dedicated to builds. So how does it work?

Unity has a test runner, and that test runner can be driven from the command line. The runner produces NUnit results in .xml files that our build is set up to collect and attach to the build status.
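On the build machine the invocation looks roughly like this; the paths are placeholders, and the exact switches depend on your Unity version (older versions use -runEditorTests and -editorTestsResultFile instead of -runTests, -testPlatform and -testResults):

    REM Run the edit mode tests; do not pass -quit, the test runner exits on its own.
    "C:\Program Files\Unity\Editor\Unity.exe" -batchmode ^
        -projectPath "C:\src\OurUnityProject" ^
        -runTests -testPlatform editmode ^
        -testResults "C:\builds\results-editmode.xml" ^
        -logFile "C:\builds\editmode.log"

The play mode run is the same command with -testPlatform playmode and a different results file.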

Running the Unity tests generates a .sln that you can actually build with MSBuild.exe. While Unity uses its own compiler for your code, building with Visual Studio allows you to run additional checks such as code and source analysis. Humanity survived the tabs vs. spaces war, and the result was: it doesn't matter whether you are using tabs or spaces, as long as the team is consistent and you enforce this automatically.

Unity, CA&SA, all green!

Here are some of the scripts we use to build and run the tests.

Our CI runs the Unity edit mode and play mode tests, and also rebuilds the .sln with warnings-as-errors to enforce clean code and source analysis.
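The rebuild step is plain MSBuild over the Unity-generated solution, along these lines (the solution name is a placeholder):

    msbuild OurUnityProject.sln /t:Rebuild /p:TreatWarningsAsErrors=true /p:RunCodeAnalysis=true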

Organizing the Visual Studio Solution

Unity Generated .sln and .csproj Files

This is what we have used so far.

Unity generates a .sln file with several .csproj references, should you choose to use Visual Studio. The .cs scripts from the Unity project's Assets folder are added to that Visual Studio solution. You can organize the code into separate projects by using the assembly definition assets introduced here. This is what we are using at the moment.
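An assembly definition asset is just a small JSON file dropped into a folder under Assets; everything in that folder then compiles into its own assembly and .csproj. The name below is hypothetical:

    {
        "name": "MyCompany.Widgets",
        "references": []
    }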

Code and Source Analysis

Using Microsoft Visual Studio Professional 2017 with Visual Studio Tools for Unity gives you Unity-style callbacks when the .sln and .csproj files are written. This is the path we took, but we may as well pivot. This topic has a good example of how to set up a ProjectFilesGenerator.ProjectFileGeneration callback that adds StyleCop: it receives the content of each .csproj as Unity generates it and adds the property groups needed to wire in things such as CodeAnalysisRuleSet and Analyzers. We also had to add some NoWarn rules to suppress warnings for unassigned fields, since the analysis is not aware that Unity assigns them during deserialization.
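Boiled down, the hook is an editor script along these lines; the ruleset path and the NoWarn code are illustrative rather than our exact settings, and the string replacement stands in for proper XML manipulation:

    // Assets/Editor/ProjectFileHook.cs -- requires Visual Studio Tools for Unity.
    using SyntaxTree.VisualStudio.Unity.Bridge;
    using UnityEditor;

    [InitializeOnLoad]
    public static class ProjectFileHook
    {
        static ProjectFileHook()
        {
            // Called for every .csproj Unity generates; we receive the file name and
            // its XML content and must return the (possibly modified) content.
            ProjectFilesGenerator.ProjectFileGeneration += (string name, string content) =>
            {
                string propertyGroup =
                    "  <PropertyGroup>\r\n" +
                    "    <CodeAnalysisRuleSet>..\\Analysis.ruleset</CodeAnalysisRuleSet>\r\n" +
                    "    <NoWarn>0649</NoWarn>\r\n" + // CS0649: fields Unity assigns during deserialization
                    "  </PropertyGroup>\r\n";

                // Naive injection right before the closing tag; a real hook would
                // rather manipulate the XML with System.Xml.Linq.
                return content.Replace("</Project>", propertyGroup + "</Project>");
            };
        }
    }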

Following this approach, we were happy that Unity automatically detects changes in .cs files and reflects them in the Unity editor. The downside was that Unity regenerates the .sln and .csproj files on occasion, forcing Visual Studio to block for a while to reload the projects.

Class Libraries for Managed Plugins

This is what we will experiment with in the near future.

Unity in its current version (2018.1.6f1) can work to some extent with external projects. You can check Unity's Managed Plugins manual. You can create a solution, reference the Unity assemblies, and configure a post-build task to copy the .dlls to your Unity project's Assets folder. Adding the .dll and .pdb files will trigger Unity to generate .mdb files automatically, so you can also use Visual Studio's "Debug > Attach Unity Debugger" to debug your code.

However, Unity's platform-dependent compilation won't work out of the box. You will have to set up project configurations by hand that output different .dll files, and then configure in Unity which platforms should use which .dlls.
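That last step can itself be scripted with Unity's PluginImporter API from an editor script. A rough sketch, in which the .dll names and target platforms are assumptions:

    // Assets/Editor/ConfigureManagedPlugins.cs
    using UnityEditor;

    public static class ConfigureManagedPlugins
    {
        [MenuItem("Tools/Configure Managed Plugins")]
        public static void Configure()
        {
            // The Android build of our hypothetical plugin.
            var android = (PluginImporter)AssetImporter.GetAtPath("Assets/Plugins/Widgets.Android.dll");
            android.SetCompatibleWithAnyPlatform(false);
            android.SetCompatibleWithEditor(false);
            android.SetCompatibleWithPlatform(BuildTarget.Android, true);
            android.SaveAndReimport();

            // The desktop/editor build of the same plugin.
            var desktop = (PluginImporter)AssetImporter.GetAtPath("Assets/Plugins/Widgets.Desktop.dll");
            desktop.SetCompatibleWithAnyPlatform(false);
            desktop.SetCompatibleWithEditor(true);
            desktop.SetCompatibleWithPlatform(BuildTarget.StandaloneWindows64, true);
            desktop.SaveAndReimport();
        }
    }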

Unity doesn't know about the project, so there won't be any Visual Studio pauses. But you also have to rebuild the Visual Studio project manually when you change the .cs files.

Testing

Build scripts and assets are source code, and as such they should stay in source control. Developers should be able to execute as much of the build as possible locally; for our workflow the build PC doesn't do anything more than run a ./Build.bat. The only tests that run on a test PC but cannot run on a developer's machine should be those with extreme dependencies, such as specific hardware or truly expensive software.

Tests must be predictable: if they pass on a developer's machine, they must pass on the build machine, and if they fail on the build machine, they should also fail on the developer's machine. This is easier said than done.

The testing pyramid for Unity projects as we envision it for the moment:

  • Usability Testing
  • Manual Testing
  • Automated Player/Real Device Testing
  • Automated Editor/Unit Testing

Starting from the bottom up.

Automated Editor/Unit Testing

Unity has a built-in test runner, and we are using it. The interesting thing about Unity is that it supports two modes in its Editor: Edit mode and Play mode.

We test both.

Developing controls for the standard 2D suites taught us how important design time is. Automating design-time tests for Microsoft's Blend is not a walk in the park, but automating design-time tests for Unity is. Edit mode's behaviour lifecycle differs from the runtime one. One of our deliverables will include reusable widgets, and to ensure high developer productivity we are addressing certain things in edit mode, such as edit mode rendering and property validation. So we will be adding tests that ensure our widgets do not generate errors while being added to your scenes and that setting properties in edit mode is reflected in the control's appearance.
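A minimal edit mode test in that spirit might look like the sketch below; ChartWidget and its Title property are stand-ins for our actual widgets, stubbed here only to keep the example self-contained.

    // Assets/Tests/Editor/ChartEditModeTests.cs
    using NUnit.Framework;
    using UnityEngine;

    // A stand-in for one of our widgets; in a real project it lives in its own script file.
    public class ChartWidget : MonoBehaviour
    {
        public string Title;
    }

    public class ChartEditModeTests
    {
        [Test]
        public void AddingTheWidgetAndSettingPropertiesWorksInEditMode()
        {
            var host = new GameObject("ChartHost");
            try
            {
                // Adding the widget to a GameObject should not log errors or throw.
                var chart = host.AddComponent<ChartWidget>();
                Assert.IsNotNull(chart);

                // Setting a property in edit mode should be reflected without entering play mode.
                chart.Title = "Revenue";
                Assert.AreEqual("Revenue", chart.Title);
            }
            finally
            {
                Object.DestroyImmediate(host);
            }
        }
    }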

For play mode, the Unity editor emulates the Unity players. Testing here is a must. What is interesting about the Unity Test Runner is that you can go beyond unit tests.

The Unity test runner generates a scene where you can add GameObjects and behaviours from code, and the test results will be displayed in the scene. Unity is built for game development, and as such there are built-in APIs to work with cameras and images:
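For example, a test helper can point a camera at a RenderTexture and read the rendered pixels back into a Texture2D. This is a sketch rather than the exact helper from our framework:

    // A sketch of capturing what a camera sees, so a test can inspect the pixels.
    using UnityEngine;

    public static class CameraCapture
    {
        public static Texture2D Capture(Camera camera, int width, int height)
        {
            var renderTexture = new RenderTexture(width, height, 24);
            var previousTarget = camera.targetTexture;
            var previousActive = RenderTexture.active;
            try
            {
                camera.targetTexture = renderTexture;
                camera.Render(); // force a render into our texture

                RenderTexture.active = renderTexture;
                var texture = new Texture2D(width, height, TextureFormat.RGB24, false);
                texture.ReadPixels(new Rect(0, 0, width, height), 0, 0);
                texture.Apply();
                return texture;
            }
            finally
            {
                camera.targetTexture = previousTarget;
                RenderTexture.active = previousActive;
                Object.Destroy(renderTexture);
            }
        }
    }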

So we were able to scaffold a minimalistic testing framework that can open scenes, find GameObjects in the visual tree, interact with them, and then assert the proper appearance of our widgets:
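A play mode test built on the CameraCapture helper sketched above might look roughly like this; the scene name, object name, baseline asset, and diff threshold are made up for the sketch, and the baseline texture must be imported with read/write enabled:

    // Assets/Tests/PlayMode/ChartAppearanceTests.cs -- illustrative, not our real suite.
    using System.Collections;
    using NUnit.Framework;
    using UnityEngine;
    using UnityEngine.SceneManagement;
    using UnityEngine.TestTools;

    public class ChartAppearanceTests
    {
        [UnityTest]
        public IEnumerator ChartSceneRendersLikeTheBaseline()
        {
            // Load the scene under test and give it a frame to initialize.
            SceneManager.LoadScene("ChartTestScene");
            yield return null;

            // Find the widget and the camera looking at it.
            var chart = GameObject.Find("Chart");
            Assert.IsNotNull(chart, "The test scene should contain a 'Chart' object.");
            var camera = Camera.main;
            Assert.IsNotNull(camera, "The test scene should contain a main camera.");

            // Capture the rendered output and compare it against a stored baseline image.
            var actual = CameraCapture.Capture(camera, 512, 512);
            var baseline = Resources.Load<Texture2D>("Baselines/ChartTestScene");

            var actualPixels = actual.GetPixels32();
            var baselinePixels = baseline.GetPixels32();
            Assert.AreEqual(baselinePixels.Length, actualPixels.Length);

            int different = 0;
            for (int i = 0; i < actualPixels.Length; i++)
            {
                if (!actualPixels[i].Equals(baselinePixels[i]))
                {
                    different++;
                }
            }

            float diffPercent = 100f * different / actualPixels.Length;
            Assert.Less(diffPercent, 2f, "Rendered chart differs too much from the baseline.");
        }
    }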

These tests catch errors in the integration of the behaviours used to render a chart, errors in the custom shaders used to render various chart elements, and so on.

19.37% diff when we change the seed in UnityEngine.Random.InitState

So the Unity Test Runner can be used for:

  • Automated unit tests
  • Automated integration tests
  • Graphics tests

Automated Player/Real Device Testing

We do not yet have tests automated on real devices.

Graphics tests in the editor execute on your computer's GPU, but if you are working on an AR or VR app, it is almost certain the app itself will run on mobile.

AR VR for Business Adoption Goes Through Mobile

With mobile come certain limitations in GPU and CPU. Unity's QAs have a portal where you can find information about the Unity Graphics Tests Runner. This solution seems close to what an AR VR testing framework would need, but the latest released bits are from 2016.

When our codebase grows, we will be looking for a solution similar to our Progress Test Studio for mobile, but for Unity's mobile platforms. We will need something that can build and deploy to devices, simulate user input, and collect test results.

Unity supports cloud builds, which automate building and deploying a Unity project onto your devices. We will have to investigate further whether it integrates with a device cloud to actually run automated tests.

Manual Testing

Through the years, my experience at Progress Telerik has been in building UI components that can be extensively covered by automated tests. However, with the state of AR VR development for line-of-business applications today, our story will be slightly different. Our vision includes building actual solutions, actual apps, so we will lean a little more on manual testing here.

Usability Testing

Here is a story from building the HoloStock application for the Microsoft Build 2018 developer conference.

HoloStock — app and source code

We cannot stress enough how important usability testing is for AR VR. Progress is a relatively large company with lots of employees, so we had the luxury of inviting fellow colleagues to give our demo a try. We ran the demo with more than 30 people over a month. When we started we had a still early version of the app, but we also had time to address the feedback.

The AR VR app space has some usability guides out there, but the general public is not used to this new medium. At certain points little things can render your application nearly useless for the end user, little things that can easily be addressed should you open yourself up to the feedback.

While manually testing the app, you train your brain to perform all the gestures flawlessly and to find all the happy paths. Manual testing is in no way a substitute for the usability testing process.

Conclusion

I know this was a long read, but I hope it gives you some insight into how to set up your source control, continuous integration, and automated testing when working with Unity.
