ROB|ARCH 2018 | Robo-Stim: Mixed Reality Robotic Stimulation for Collaborative Design Exploration

Mitch Page
BravoVictorNovember
6 min read · Sep 20, 2018


Ryan Luke Johns (GREYSHED), Axel Kilian (Princeton University), Jeffrey Anderson (Pratt Institute)

The advent of robotics in the creative and construction industries has led to an amazing revolution, changing not just how things are designed and made, but also transforming the knowledge cultures, politics and economics that surround them. As such, the ROB|ARCH 2018 conference — hosted by the NCCR Digital Fabrication and ETH Zurich — will continue this path, developing and revealing novel insights, applications and impacts of this transformation within the scientific, creative, and entrepreneurial domains, including, for example, architecture, structural design, civil and process engineering, art and design, and robotics.

Robo-Stim Workshop

The Robo-Stim workshop, along with the majority of the other conference workshops, took place in the newly constructed Arch_Tec_Lab at the Hönggerberg campus of ETH Zurich, home to the Master of Advanced Studies ETH in Architecture and Digital Fabrication and to many of the NCCR Digital Fabrication Program researchers.

Having recently developed an interest in applying augmented-reality (AR) interfaces to various aspects of the architecture and construction industry, I saw the Robo-Stim workshop at ROB|ARCH 2018 as a good opportunity to get a general introduction to the software workflows, the opportunities and limitations, and the possible collaborative uses of AR on iOS and Android smartphones and tablets.

To quote the workshop intro,

“This workshop will engage a combination of virtual reality, augmented reality and robotics to produce immersive environments for collaborative design and fabrication”.

Attribution: Peter Graham, VRFocus.com
Workshops spread across the Arch_Tec_Lab, ETH Zurich.

Robo-Stim Workshop (Day 1)

The morning session began with Ryan Johns leading us through a general introduction to ARCore, Google’s freely available platform for building AR experiences for both Android and iOS. He explained that it is through ARCore that your phone or tablet is able to sense its environment, understand the world, and dynamically interact with information. It does this by tracking a number of key concepts, including but not limited to:

  • Motion Tracking
  • Environmental Understanding
  • Light Estimation
  • User Interaction
Fundamental Concepts, developers.google.com
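
To give a feel for how these concepts surface in code, below is a minimal Unity C# sketch of my own (not workshop material) that checks the motion-tracking state and reads the light estimate. It assumes the 2018-era Google ARCore SDK for Unity and its GoogleARCore namespace:

```csharp
using GoogleARCore;
using UnityEngine;

// My own illustrative sketch, assuming the Google ARCore SDK for Unity.
public class ArCoreStatusProbe : MonoBehaviour
{
    void Update()
    {
        // Motion tracking: is the device pose currently tracked against the world?
        if (Session.Status != SessionStatus.Tracking)
        {
            return; // nothing reliable to read this frame
        }

        // Light estimation: approximate scene brightness, often used to tint
        // virtual materials so they sit more naturally in the camera image.
        LightEstimate estimate = Frame.LightEstimate;
        if (estimate.State == LightEstimateState.Valid)
        {
            Debug.Log("Estimated pixel intensity: " + estimate.PixelIntensity);
        }
    }
}
```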

Setting out to build our first generic AR application for mobile, we were divided into those who wished to develop for iOS devices (iPhones), those who wished to develop for Android devices (Samsung, HTC, Google Pixel, etc.), and those who lost their device at the previous evening’s drinks (read: me).

Working in Unity, a 3D and 2D game development software platform, we were able to create and/or load custom 3D model files into our scenes, and compile those scenes into applications for mobile via various ARCore SDK packages. Despite a bit of installation drama, the workflow turned out to be quite simple.

Compiling for iOS required the use of Xcode, whilst Android compiled straight to an .apk file.

If this all sounds a little confusing, well, it was. So, as much to clarify this for myself as to help anyone reading this, I have put together a little diagram below to explain the hierarchy of software dependencies that result in a successful AR application.

Workflow Diagram for iOS and Android

On opening the newly installed application on our devices and joining the group session, the phone’s camera would scan the visible surfaces of the environment, construct a surface plane, and allow a 3D model to be placed in AR through a touch gesture on the phone screen. Even cooler was the fact that multiple people (via their devices) could log in to this same scene and see the same geometry in the same place!
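
The core of that touch-to-place interaction is surprisingly compact. The sketch below follows the spirit of the HelloAR sample that ships with the ARCore SDK for Unity; the class name and the modelPrefab field are my own placeholders rather than the actual workshop template:

```csharp
using GoogleARCore;
using UnityEngine;

public class TapToPlace : MonoBehaviour
{
    public GameObject modelPrefab; // the 3D model loaded into the Unity scene

    void Update()
    {
        if (Input.touchCount < 1) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Hit-test the touch against the surface planes ARCore has detected.
        TrackableHit hit;
        TrackableHitFlags filter = TrackableHitFlags.PlaneWithinPolygon;
        if (Frame.Raycast(touch.position.x, touch.position.y, filter, out hit))
        {
            // Parent the model to an anchor so it stays put in the room
            // as ARCore refines its understanding of the environment.
            Anchor anchor = hit.Trackable.CreateAnchor(hit.Pose);
            GameObject model = Instantiate(modelPrefab, hit.Pose.position, hit.Pose.rotation);
            model.transform.parent = anchor.transform;
        }
    }
}
```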

Attribution: Ryan Luke Johns (GREYSHED), Axel Kilian (Princeton University School of Architecture), Jeffrey Anderson (Pratt Institute)

Sufficiently saturated with new content, we left the campus with a particularly sassy Uber driver and headed to Frau Gerolds Garten for dinner, an urban garden space of reused shipping containers in the heart of Zurich-West.

Yes. That’s a roof-top artificial wave pool.

Robo-Stim Workshop (Day 2)

The morning of the second day began with Jeff Anderson taking us through a more in-depth look at Unity, the 3D and 2D game development platform we had been using to control the interactivity of three-dimensional geometry and two-dimensional user interface elements (like buttons and text inputs) in our scene.

If this reads a little heavy, fair enough. However, if you have played (or watched someone play) the app Pokemon Go, then you should know what I’m talking about!

Attribution: Giphy — Pokemon Go

For someone relatively new to Unity, understanding the implementation of, and relationships between, three-dimensional geometry and two-dimensional user interface elements was challenging but very interesting. A number of other topics were covered, including realtime collaboration via PhotonEngine, ARCore Cloud Anchors, and some matrix operation math; however, for the sake of brevity I’m not going to detail those here.
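
To make the 2D/3D relationship a little more concrete, here is a small sketch of the kind of wiring involved: a standard Unity UI button that toggles a piece of placed 3D geometry on and off. The class and field names are mine, not the workshop's:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ModelVisibilityToggle : MonoBehaviour
{
    public Button toggleButton;        // 2D UI element on a screen-space canvas
    public GameObject wayfindingModel; // 3D geometry placed in the AR scene

    void Start()
    {
        // The 2D interface drives the 3D scene: each click flips visibility.
        toggleButton.onClick.AddListener(() =>
            wayfindingModel.SetActive(!wayfindingModel.activeSelf));
    }
}
```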

Prototype screen recording of template AR application on iPhone 8 Plus

As we moved into the afternoon, Jeff encouraged us to consider the possible applications of what we had learnt so far, and to begin designing and developing a prototype application. Choosing to head down the path of AR and interior way-finding, I defined my project as follows:

To overlay 1:1 augmented-reality way-finding geometry and information over a corresponding physical space by utilising the detailed digital models we create in architectural practice.

With the help of Axel, Jeff and Ryan, we each prototyped a couple of example mobile apps based on a given template file. Each prototype covered particular technical concepts specific to our personal projects. For me, these concepts included:

  • Creating 2D User Interface Buttons
  • Importing Model Assets from Rhino and Revit
  • Model Animation in Unity
  • Audio in Unity
  • Local Cartesian Coordinate System Setup via Device Touch Interface (sketched below)
  • Multi-user Participation through Cloud-Anchors.
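
The local coordinate-system setup was the piece I found most useful for way-finding: tap two known points in the physical space (say, two corners of a doorway), then position and orient the 1:1 digital model from them. Below is a rough sketch of that alignment step, assuming the two taps have already been resolved to world-space positions via an AR hit-test; the names and the specific logic are my own illustration, not the workshop template:

```csharp
using UnityEngine;

public class ModelAligner : MonoBehaviour
{
    public Transform buildingModel; // the 1:1 model imported from Rhino/Revit

    // Place the model's origin at the first tapped point and aim its forward
    // axis toward the second, keeping the model upright.
    public void Align(Vector3 firstTap, Vector3 secondTap)
    {
        Vector3 forward = secondTap - firstTap;
        forward.y = 0f; // ignore any height difference between the two taps
        if (forward.sqrMagnitude < 1e-6f) return; // taps too close to define a direction

        buildingModel.position = firstTap;
        buildingModel.rotation = Quaternion.LookRotation(forward.normalized, Vector3.up);
    }
}
```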
BVN Office AR Application Sketch Example
RoboStim Workshop Participants

Robo-Stim Workshop (Day 3)

Throughout the third day of the workshop, much of the tutorial focused on the ability of Unity and the template RoboStim application to live-stream the positions and base locations of the Universal Robots UR-5 and UR-10 six-axis arms to our phone AR application, or to a virtual reality environment using the HTC Vive. This clever implementation of PhotonEngine and Cloud Anchors allowed us to preview a physical robot’s toolpath movements in AR and VR: a virtual simulation overlaid on its physical position!
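
I did not keep the template code, but the underlying mechanism is roughly this: each networked object carries a PhotonView, and a serialization callback pushes the robot's joint values to every connected client on each network tick. Below is a rough sketch assuming Photon's PUN 2 API; the class name, field names and six-float payload are my own guesses at the shape of the data, not the actual RoboStim implementation:

```csharp
using Photon.Pun;
using UnityEngine;

public class RobotJointStreamer : MonoBehaviourPun, IPunObservable
{
    public float[] jointAngles = new float[6]; // one value per axis of the UR-5/UR-10

    // Called by the attached PhotonView each serialization tick.
    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            // The client connected to the physical robot writes the live angles...
            for (int i = 0; i < jointAngles.Length; i++)
                stream.SendNext(jointAngles[i]);
        }
        else
        {
            // ...and every AR/VR viewer reads them to pose its virtual robot.
            for (int i = 0; i < jointAngles.Length; i++)
                jointAngles[i] = (float)stream.ReceiveNext();
        }
    }
}
```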

Whilst this aspect of the workshop was certainly the most technical and advanced, I will confess that it may have been slightly beyond my abilities to follow along.

Ryan, Jeff and Axel’s Robo-Stim Workshop certainly dispelled some of my preconceptions about app development for mobile devices and showed how accessible it can be to an average user like myself. Whilst my way-finding AR application is still very much ‘in progress’, the workshop provided the conceptual and technical foundations to finish the job myself.
