Week 4 (Part 5): Prototyping and workflow

Nissie Bungbrakearti
Published in code3100
Apr 2, 2017

--

Prototyping and workflow: creating communication diagrams

As the HoloLens team, we divided up the work so that the important tasks would be completed first and no two people were doing the same job. I modelled some of the prototype holograms in 3DS Max to import into Unity and played with some particle systems (this can be seen in earlier Week 4 blogs). I then went on to create diagrams and a detailed workflow system for the HoloLens team.

The workflow we created is linear; however, it has many pathways from each major stage. For example, in the first stage, working in 3DS Max would allow us to use a range of modifiers, including editable meshes and noise animations, and then to create more complex animations through bone rigging. The models would then be exported and imported into Unity 3D, where particle systems can be used to create a dynamic visualisation. Spatial sound can be achieved in Unity, as can spatial mapping with the use of the HoloLens. Once built, the app is launched in Visual Studio, where the C# script can be revised and dynamically updated, and the app deployed either to an emulator or directly onto the HoloLens.
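As a rough illustration of the Unity stage of this workflow, a particle system on an imported model can be driven from a C# script. This is a minimal sketch using the standard Unity ParticleSystem API; the specific colour and rate values are placeholders, not our final settings:

```csharp
using UnityEngine;

// Attach to an imported hologram that has a ParticleSystem component.
// A minimal sketch: tint the particles and slow them down so they
// drift around the model.
public class OrbParticles : MonoBehaviour
{
    void Start()
    {
        ParticleSystem ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startColor = new Color(0.4f, 0.7f, 1f); // placeholder colour
        main.startSpeed = 0.2f;                      // slow drift

        var emission = ps.emission;
        emission.rateOverTime = 30f;                 // placeholder emission rate
    }
}
```

A script like this can then be tweaked live from Visual Studio while the app is running, which is what makes the last stage of the workflow useful for iteration.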

Diagrams were also created to communicate how the prototype could work (as seen below).

LAYOUT: We wanted the orb to be in the centre of the space, with two secondary orbs representing fight and flight. We also wanted to spatially map the pavilion walls and project onto them some kind of pattern that would represent the orb and the neural connections. However, these are all proof-of-concept ideas and can be developed further.

INTERACTION: Obviously, interaction is a big part of the HoloLens, and we want to incorporate it into our scene. We want the user to be able to interact with the fight and flight secondary orbs through gesture control. We would also like to incorporate voice control; however, this is yet to be finalised.
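A sketch of how the air-tap gesture could be picked up in Unity, using the GestureRecognizer API from Unity's Windows Holographic support (the namespace is `UnityEngine.VR.WSA.Input` in 2017-era Unity, `UnityEngine.XR.WSA.Input` in later versions). The orb-selection message name is our own placeholder, not a built-in:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

// A minimal sketch: recognise air-taps and forward them to whatever
// hologram the user is gazing at (e.g. a fight or flight orb).
public class TapManager : MonoBehaviour
{
    GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.TappedEvent += (source, tapCount, headRay) =>
        {
            // Raycast along the user's gaze to find the tapped hologram.
            RaycastHit hit;
            if (Physics.Raycast(headRay.origin, headRay.direction, out hit))
            {
                // "OnSelect" is a hypothetical message name of our own.
                hit.collider.SendMessage("OnSelect",
                    SendMessageOptions.DontRequireReceiver);
            }
        };
        recognizer.StartCapturingGestures();
    }
}

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```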

INFORMATION DISPLAY: To give context to what is happening around the user, we want to integrate some text that provides useful information. We don’t want the text to be there at the start, or simply mapped to a wall, as that would take away from the immersion and possibly distract from the orbs and the main scene. We decided that one way to integrate text into the scene was to let the user tap on the main brain orb and have text appear above it that rotates to always face you, no matter where you stand.
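The always-face-the-user behaviour could be achieved with a simple billboard script on the text object. A sketch, assuming the text sits above the brain orb and `Camera.main` is the HoloLens head camera:

```csharp
using UnityEngine;

// A minimal billboard sketch: keep the info text turned towards the
// user's head no matter where they stand.
public class FaceUser : MonoBehaviour
{
    void Update()
    {
        // Point the text's forward axis away from the camera; world-space
        // text reads correctly when it faces away from the viewer.
        transform.rotation = Quaternion.LookRotation(
            transform.position - Camera.main.transform.position);
    }
}
```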
