eXtending Reality by Spatial Collaboration

Best Practices for Building Spatial Solutions — Part 3

Software development practices for spatial solutions are still maturing. In this series we share our experience and best practices for building spatial solutions with eXtended Reality (XR).

Kuldeep Singh
Published in XRPractices · Jan 5, 2021 · 8 min read


< Part 1: XR Product Definition < Part 2: XR User Experience Design <

This article covers best practices of XR development and testing.

Best Practices of XR Development

XR development introduces a new set of technologies and tools. Standard development practices such as Test Driven Development, eXtreme Programming and Agile development still help build a better product. However, XR is heavily influenced by game development, so many standard practices from enterprise software development may not apply in a straightforward way.

Plan for spikes

XR is still fairly new, so it is better to plan for spikes to uncover the unknowns early.

CI/CD first

A CI/CD setup makes the development cycle smoother and faster. Investing some time at the beginning to set up CI, so that it can run the test suite and deploy to a given environment with minimal effort, saves a lot of time later on.
We used CircleCI to build both the Unity app and the server-side app. For the server side we integrated with GCP: we maintained different environments on GCP itself, and CircleCI handled the deployment to those environments.

Quick feedback cycles

We followed a two-week iteration model with regular showcases at the end of each iteration. The showcases involved a diverse user group (it is important to have a user group to get better inputs; see the previous articles of this series), which helped us cover all aspects of the app: the features, the learnings, and the process. All the feedback helped in making our app better. Regular catch-ups and IPMs made sure the stories were detailed and ready to be picked up for development.

Art of refactoring

Refactoring often takes a back seat while delivering POCs or working under tight deadlines. We did not let that happen here: the team kept refactoring the code as we moved along. Refactoring needs a test suite to support it, so we made sure we had good code coverage, with scenarios covered through unit and integration tests. There will always be scope for more refactoring; the trick is to keep doing it continuously and in small bits and pieces, so we do not end up with a huge amount of tech debt.

Decision analysis and resolution

Collect facts and figures and maintain decision records. For example, these are some of the decisions we had to take for the ThoughtArena app.

  • Unity for the mobile apps — With Unity and its AR Foundation API, we were able to develop our app for both Android and iOS devices with minimal extra effort. The built-in support for handling audio and video also helped speed up the development process.
  • Java and Micronaut for the server — We decided to work in the Java ecosystem because of the tight schedule we had and the team's familiarity with it. We chose Micronaut for its faster startup times and smaller footprint; since we were focusing mainly on APIs deployed to the cloud, it felt like the right framework for us.
  • Websockets for real-time communication — There are multiple ways to handle the real-time communication aspect. The first thing that comes to mind is a websocket: an open TCP connection between server and client for exchanging messages and events quickly. There are other options for achieving real time, such as queues or certain databases. We went with websockets because they need no additional infrastructure to get up and running, and they fit our use case, where both client and server send messages to each other in real time (a client-side sketch follows after this list).
  • Relational database for transactional data — All our data is structured and straightforward, so it was easy for us to choose a relational database for transactional data.
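
To make the websocket decision a bit more concrete, here is a minimal client-side sketch using .NET's built-in ClientWebSocket, which is available in Unity with the .NET 4.x scripting runtime. The class name, buffer size and message handling are illustrative assumptions, not the actual ThoughtArena code; the server side would expose a matching Micronaut @ServerWebSocket endpoint.

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Minimal websocket client wrapper. The endpoint and message format are
// placeholders, not the actual ThoughtArena protocol.
public class CollaborationSocket : IDisposable
{
    private readonly ClientWebSocket _socket = new ClientWebSocket();

    public async Task ConnectAsync(Uri endpoint, CancellationToken token)
    {
        await _socket.ConnectAsync(endpoint, token);
    }

    public async Task SendAsync(string message, CancellationToken token)
    {
        var bytes = Encoding.UTF8.GetBytes(message);
        await _socket.SendAsync(new ArraySegment<byte>(bytes),
            WebSocketMessageType.Text, true, token);
    }

    // Receive loop: hands incoming messages to the caller so scene objects
    // can react to remote updates (e.g. a sticky moved by another user).
    // Assumes each message fits in a single 4 KB buffer, for brevity.
    public async Task ReceiveLoopAsync(Action<string> onMessage, CancellationToken token)
    {
        var buffer = new byte[4096];
        while (_socket.State == WebSocketState.Open && !token.IsCancellationRequested)
        {
            var result = await _socket.ReceiveAsync(new ArraySegment<byte>(buffer), token);
            if (result.MessageType == WebSocketMessageType.Close) break;
            onMessage(Encoding.UTF8.GetString(buffer, 0, result.Count));
        }
    }

    public void Dispose() => _socket.Dispose();
}
```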

Tech evaluation

Make sure there is a plan for tech-choice evaluations, to check whether the tech is ready for prime time. We evaluated multiple tech choices, for example cloud anchors, environment scanning, video playback, real-time sync, session management, and maintaining user profile state locally.

Extendable design — Separate the input system from the business logic, so that multiple input mechanisms can be integrated, such as click, touch, tap, double tap and gaze. We used Unity’s event system as a single source of event triggers (a sketch follows below).
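
As a rough sketch of this separation (class and event names are illustrative, not taken from the app): the interaction component only detects a selection through Unity's EventSystem and raises an event, while the business logic subscribes to that event, so a touch, click or gaze-based input module can drive the same behaviour.

```csharp
using System;
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.EventSystems;

[Serializable]
public class NoteSelectedEvent : UnityEvent<string> { }

// Input layer: only detects a selection via Unity's EventSystem and raises an
// event. Any input module routed through the EventSystem (touch, mouse click,
// or a gaze pointer) can trigger it; the object just needs a collider or a
// Graphic for the corresponding raycaster to hit.
public class SelectableNote : MonoBehaviour, IPointerClickHandler
{
    [SerializeField] private string noteId;               // hypothetical identifier
    public NoteSelectedEvent OnSelected = new NoteSelectedEvent();

    public void OnPointerClick(PointerEventData eventData)
    {
        OnSelected.Invoke(noteId);
    }
}

// Business logic layer: reacts to the event without knowing which input
// mechanism produced it.
public class NoteEditor : MonoBehaviour
{
    [SerializeField] private SelectableNote note;

    private void OnEnable()  => note.OnSelected.AddListener(OpenForEditing);
    private void OnDisable() => note.OnSelected.RemoveListener(OpenForEditing);

    private void OpenForEditing(string noteId)
    {
        Debug.Log($"Editing note {noteId}");
        // open the editing UI, sync the change to the server, etc.
    }
}
```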

XR tech considerations

  • AR takes a bit of time to initialize and learn the environment. The speed of this initialization depends on the hardware and OS capabilities.
  • Tracking of the environment goes haywire when the device shakes a lot.
  • The AR system is hard to control, as it is a platform-specific capability.
  • Longer use of AR can make the hardware hot and drains the battery, as AR algorithms are very CPU intensive.
  • AR tech is getting more mature year over year, but good hardware is still a dependency for this technology.
  • AR cloud anchors are at an early stage but maturing rapidly. They need proper calibration; in simple terms, the user needs to do a little more scanning to achieve good results (to locate the points). We had to de-prioritize this feature due to its instability.

Real time collaboration considerations

  • Define how long the session should be.
  • Handle disconnections gracefully (a reconnection sketch follows after this list).
  • Define an approach for scaling sessions; we had to introduce a distributed cache for managing sessions.
  • Cloud considerations — We used Google App Engine (Standard environment) for our app because of its quick setup, Java 11 support and auto-scaling features; GCP also handles the infrastructure for GAE apps. However, the Standard environment does not support websockets, so we had to switch to the Flexible environment, which does not support Java 11 out of the box. To keep things on Java 11, we had to provide a custom Docker configuration in the pipeline.
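
The article does not spell out the reconnection policy; as one way to handle disconnections gracefully on the client, the websocket can be wrapped in a retry loop with capped exponential backoff. This is a sketch only, reusing the hypothetical CollaborationSocket wrapper from the earlier example.

```csharp
using System;
using System.Net.WebSockets;
using System.Threading;
using System.Threading.Tasks;
using UnityEngine;

// Sketch of graceful reconnection with capped exponential backoff.
public class ReconnectingSession
{
    private readonly Uri _endpoint;
    private readonly Action<string> _onMessage;

    public ReconnectingSession(Uri endpoint, Action<string> onMessage)
    {
        _endpoint = endpoint;
        _onMessage = onMessage;
    }

    public async Task RunAsync(CancellationToken token)
    {
        var delaySeconds = 1;
        while (!token.IsCancellationRequested)
        {
            try
            {
                // ClientWebSocket instances cannot be reused, so create a
                // fresh wrapper for every connection attempt.
                using (var socket = new CollaborationSocket())
                {
                    await socket.ConnectAsync(_endpoint, token);
                    delaySeconds = 1; // reset backoff after a successful connect
                    await socket.ReceiveLoopAsync(_onMessage, token);
                }
            }
            catch (WebSocketException e)
            {
                Debug.LogWarning($"Connection lost: {e.Message}");
            }

            // Surface a "reconnecting" state to the user rather than failing
            // silently, then retry with capped exponential backoff.
            await Task.Delay(TimeSpan.FromSeconds(delaySeconds), token);
            delaySeconds = Math.Min(delaySeconds * 2, 30);
        }
    }
}
```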

Best Practices of XR Testing

Our learnings from developer level testing for XR applications

  • Most devices have software emulators that can be integrated with XR development tools such as Unity, which lets developers test logic without a physical device. We realized that about 70% of the ThoughtArena app's functionality did not require a physical environment or movement: adding and moving stickies, playing videos, making API calls to fetch data, displaying the list of boards or members in a space, and adding or deleting boards. All of these could be tested in the editor's instant preview.
  • Cases that required physical movement, such as pinning and unpinning boards, could not be tested directly in the editor without manually rotating the user's POV from the inspector (a play-mode test sketch follows after this list).
  • Features that relied on platform-specific plugins, such as uploading an image or saving a snapshot of a board to local storage, could not be tested in the editor.
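
For those movement-dependent cases, a play-mode test can stand in for manual POV rotation by driving the camera transform from code. The scene setup and assertion below are hypothetical, not the actual ThoughtArena tests; they only illustrate the approach.

```csharp
using System.Collections;
using NUnit.Framework;
using UnityEngine;
using UnityEngine.TestTools;

// Play-mode test sketch: simulates the user turning towards a board by
// rotating the camera, then checks that the board lands inside the view.
public class BoardVisibilityTests
{
    [UnityTest]
    public IEnumerator Board_IsInView_AfterUserTurnsTowardsIt()
    {
        var camera = new GameObject("TestCamera").AddComponent<Camera>();
        var board = GameObject.CreatePrimitive(PrimitiveType.Quad);
        board.transform.position = Quaternion.Euler(0, 45, 0) * (Vector3.forward * 2f);

        // Drive the user's point of view from code instead of a physical device.
        camera.transform.rotation = Quaternion.Euler(0, 45, 0);
        yield return null; // let a frame pass with the new pose

        Vector3 vp = camera.WorldToViewportPoint(board.transform.position);
        Assert.IsTrue(vp.z > 0 && vp.x >= 0 && vp.x <= 1 && vp.y >= 0 && vp.y <= 1,
            "Board should be inside the camera frustum after the turn.");

        Object.Destroy(camera.gameObject);
        Object.Destroy(board);
    }
}
```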

Unit testing

The Unity engine comes with a great unit testing framework, and by plugging in a mocking framework we can test units very well. We covered:

  • Features dependent on API calls — We had an internal mock server set up early in development to reduce dependency on the APIs. Once the API contracts were settled, we could write tests against them and continue development even while the APIs were still being built (a test sketch follows after this list).
  • Board/notes interactions, by mocking user inputs such as click/touch.
  • Movement-based interactions, such as pinning/unpinning boards and moving stickies, simulated by mocking them.
  • 2D UI interactions and user flows.
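
As a rough illustration of the shape of these tests (the IBoardService interface and the NSubstitute mocking library are assumptions made for the sketch, not necessarily what the team used): the business logic talks to the API through an interface, which can be substituted in a unit test.

```csharp
using NSubstitute;
using NUnit.Framework;

// Hypothetical service boundary: the business logic only sees this interface,
// so the real HTTP client can be replaced with a mock in tests.
public interface IBoardService
{
    string[] GetBoardNames(string spaceId);
}

public class BoardListPresenter
{
    private readonly IBoardService _service;
    public BoardListPresenter(IBoardService service) => _service = service;

    public string[] LoadBoards(string spaceId) => _service.GetBoardNames(spaceId);
}

public class BoardListPresenterTests
{
    [Test]
    public void LoadBoards_ReturnsNamesFromTheService()
    {
        var service = Substitute.For<IBoardService>();
        service.GetBoardNames("space-1").Returns(new[] { "Retro", "Ideas" });

        var presenter = new BoardListPresenter(service);

        Assert.AreEqual(new[] { "Retro", "Ideas" }, presenter.LoadBoards("space-1"));
    }
}
```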

Automation testing

We extended Unity Editor's unit testing framework and built a functional test suite that can run integration tests and end-to-end tests for some scenarios on the plugged-in device. We have open-sourced this automation testing framework as Arium — have a look here.

Integration Testing

We have an integration testing suite for the backend and for testing the APIs. Integration testing also helped us verify that the websockets work in the intended way. These tests run on CI after every push to the backend repo. Since our code also uses GCP storage, we needed to make sure we could replicate that behaviour. We found that the GCP storage APIs provide a test storage implementation that mimics the storage locally, and we used that in our integration tests.

Functional Testing

  • Since XR depends on the environment, testers need to exercise the functionality in different environments: different lighting conditions, noise levels, indoors, outdoors, and while moving through the environment, to test the stability of the features.
  • Acceptance criteria for XR stories are quite different from those we see in general software development stories; many more factors need to be considered.

Keep the user immersion intact while working.

Acceptable performance under different environmental conditions.

A feature that works very well in one environment may not work as well when the conditions change; the user may then lose immersion, and the XR app no longer meets expectations.

Testers must consider spikes to define the correct acceptance criteria. The factors that have an impact may not be known beforehand; for example, what would be the impact of rain on a business story that requires outdoor testing?

Acceptance criteria also need to evolve as the product evolves.

  • Make sure the test team has supported devices with the required resolutions, and plan for this from the start.

User Experience Testing

Plan for user experience testing to get user insights and validate the hypothesis with users. It helped us better understand the impact, benefits, limitations and flexibility of our spatial solution. The user experience testing also focused on the collaboration aspect, assessing the effectiveness of the developed concept in terms of usefulness and usability.

It can be divided into two parts: user research and usability testing.

  • User research covers an as-is analysis of the user, the user's interactions, and the tools the user currently uses that are relevant to the problem statement of the solution; it also notes down user expectations from the changing environment.
  • Usability testing covers how well the solution engages the user, focusing on the usability and usefulness of the solution.

The next section describes the result of user experience testing.

User experience testing should cover quantitative and qualitative analysis. Here are our key learnings:

  • Recruit users for dedicated testing sessions with the UX analysis team; recruit a minimum of 9 users.
  • Plan user interviews before and after the testing sessions, and observe user behaviour during the testing.
  • Recruit independent test users, and ask them to share their observations in a survey for quantitative analysis.
  • Recruit users with diverse backgrounds, experiences and familiarity with XR.
  • Define a test execution plan with a set of scenarios for dedicated user testing sessions, and try out mock sessions to see if the plan is working and we are getting the expected inputs.
  • Define detailed questionnaires for independent testers.
  • Collect observations only after a few trials of the app; otherwise, if the app onboarding/user training is not that great, we might not get accurate feedback on the app's features.

< Part 1: XR Product Definition < Part 2: XR User Experience Design <

Conclusion

XR tech is evolving at such a rate that the untapped potential boggles the mind. XR business use cases are growing with the convergence of AI and ML. XR devices can now scan and detect environments and adjust to them. The standards and practices are growing from the learnings and experiments the industry is doing. We have shared our experience from building XR applications.

Kuldeep Singh
XRPractices

Engineering Director and Head of XR Practice @ ThoughtWorks India.