Introduction
Night at the Museum is a simple virtual reality mobile guide that walks you through current Augmented Reality and Virtual Reality applications in art and culture.
Conceptualization
I had the chance to attend the LiveWorx 18 conference in Boston. The host of the event was PTC Inc., a computer software company that owns Vuforia AR along with PLM and CAD software. I was able to experience how AR technology integrates into manufacturing and assembly, while VR integrates better into 3D CAD drawing and prototyping. The conference covered industries far broader than art. With that in mind, I decided to feature how XR (VR and AR) is being used in my area of specialty: art.
I started with Marcel Duchamp, who historically made it possible for artists to claim that even feces on the street could be an art piece (and therefore that a VR experience could be a form of art). Featuring Chris Marker's all-still-image, black-and-white film La Jetée (1962) was also important, to address the human desire to be in another world through the lens of a camera.
Using Duchamp's The Large Glass as the core inspiration for the main 3D object at the center, I planned to populate five viewing booths, each dedicated to a different VR/AR practice, research project, or experience in art and culture.
The Crystal Palace, built for the Great Exhibition of 1851, became the foundation for the building structure where all the viewing booths are placed. It was a symbol of evolving new technology, built with cast-iron frames and large panes of glass.
The target user is an art student who is interested in VR/AR technology and wants to learn it for their own artistic expression.
The app is built for the Google Cardboard environment; a viewer can be bought for $2 from Flying Tiger Copenhagen in NYC, a stationery store that sells cute, small, trendy objects and is very popular among students. To start, the app is interactive through a gaze pointer that activates and deactivates engaging objects. The rendering resolution should be as high as possible: if the graphics are pixelated or grainy, people lose interest immediately. Whether to use high-poly 3D models is really up to the artist's aesthetics; I purposely use low-poly models to accentuate the edges of the glass dome. To make the app a more engaging learning experience, I wanted to integrate active elements, so I decided to use GroundRayCasting to maximize that point.
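As a rough illustration of the gaze interaction described above, here is a minimal Unity C# sketch of an object that highlights when the reticle rests on it and toggles its content on a click. It assumes a GvrEventSystem and a Gvr physics raycaster are already in the scene so that standard Unity pointer events fire on gazed objects; the class, field, and panel names are hypothetical, not the project's actual scripts.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical sketch: an "engaging object" that wakes up when the gaze
// reticle enters it and toggles its content when the user clicks
// (Cardboard button). Assumes a GvrEventSystem and GvrPointerPhysicsRaycaster
// are in the scene so Unity's standard pointer events fire on gazed objects.
public class GazeActivatedExhibit : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler
{
    [SerializeField] private GameObject infoPanel;        // pop-up shown on click
    [SerializeField] private Renderer highlightRenderer;  // visual cue that the object is actionable
    [SerializeField] private Color idleColor = Color.white;
    [SerializeField] private Color gazedColor = Color.yellow;

    public void OnPointerEnter(PointerEventData eventData)
    {
        // Highlight so the user knows the object is actionable.
        if (highlightRenderer != null)
            highlightRenderer.material.color = gazedColor;
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        if (highlightRenderer != null)
            highlightRenderer.material.color = idleColor;
    }

    public void OnPointerClick(PointerEventData eventData)
    {
        // Toggle the informational pop-up for this exhibit.
        if (infoPanel != null)
            infoPanel.SetActive(!infoPanel.activeSelf);
    }
}
```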
Building
The Bride shape, which I modeled in Blender, is the centerpiece of this app. I also modeled the fog with three windows that the Bride pumps out. The crystal dome was built with ProBuilder, sliced in half in Blender, and exported back to Unity.
Putting all the assets together and arranging them in Unity took the least time. For the UI design, I initially used a Singleton pattern that I learned from a local meetup group. After three days of struggling to integrate the GvrPointerGraphicRaycaster script on a world-space Canvas, I found that the Singleton pattern was not suitable with GvrEventSystem, where I needed an individual UI per viewing station.
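For comparison, here is a minimal sketch of the per-station approach that replaced the Singleton: each viewing booth owns its own world-space Canvas (with a GvrPointerGraphicRaycaster attached in the editor) and a small controller that only shows or hides that booth's panel. The names are illustrative assumptions rather than the project's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch of the per-station UI that replaced the Singleton:
// each viewing booth owns its own world-space Canvas (a
// GvrPointerGraphicRaycaster is added to it in the editor), and this
// controller only shows or hides that one station's panel.
public class ViewingStationUI : MonoBehaviour
{
    [SerializeField] private Canvas stationCanvas;   // world-space Canvas for this booth

    private void Start()
    {
        // Each booth starts with its panel hidden until the user triggers it.
        stationCanvas.gameObject.SetActive(false);
    }

    // Hooked up to the station's trigger (e.g. an Event Trigger on the booth object).
    public void TogglePanel()
    {
        stationCanvas.gameObject.SetActive(!stationCanvas.gameObject.activeSelf);
    }
}
```

Because each booth is self-contained, the single GvrEventSystem in the scene can route pointer events to whichever canvas the user is gazing at, with no central manager to keep in sync.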
User Experience Testing
- User testing came at a later stage of the build. I had intuitively learned appropriate scale from the Puzzler project, where I used the Game View to constantly adjust each object's size.
- Users were given a list of open questions to answer in complete sentences.
- How is the mood of the space?
- Do you feel comfortable moving around?
- Is any button or interaction awkward or confusing?
- Is there anything you like or that does not make sense to you? Can you explain?
- Does any object feel too small or too large in comparison to your virtual size?
- Can you share anything from the experience?
Answers
The first thing I noticed is the vivid imagery of the environment and a lot of objects. I see trees, some things that look like barrels, and some additional panels. It feels open and airy, so it feels very comfortable. The movement was fairly smooth. It was fast, in that as soon as I clicked it felt like I teleported, but it didn’t make me sick when viewing it with the headset. All of the buttons I expected to click worked fine and I didn’t experience any glitches.
Clicking the barreled items triggered some sort of informational pop-up that was large and in my face. The text was clear and the scale was natural enough that I didn’t have any issues reading the text. When I clicked the panels on the side, different actions would happen. Sometimes a video would play, sometimes I’d hear noises, etc. It was pretty engaging and as soon as I figured out they were actionable (it felt very intuitive), I was excited to see what the other ones did.
One of the items, the one talking about VR and violence by an artist, did not have an actionable item in the panel and that was a little confusing at first. I wasn’t sure if something was happening and I was missing it, or that that one panel just didn’t have an action unlike the others.
There’s a large floating object in the center which I wasn’t sure how it meshed with the rest of the environment. It had three large photos but they didn’t do anything. There was a floating object next to the photos that triggered a pop-up just like the barrels, even though it wasn’t a barrel. I wasn’t sure if there was something significant in that object being different or not, but it was very intuitive to click into and I’m glad that something happened when I did so.
Overall I found the environment very enjoyable and interactive. There was a lot going on, but because the space was so open, I didn’t feel overwhelmed. At the same time, there was enough going on within my line of sight that I never got bored. I almost wish the world was even bigger with more things to click on, which to me signals that it was a very fun experience that I wish there was more of. I feel like even if everything doubled (number of activities, space, etc) that it would be pretty cool to walk through.
From Feedback to Finale
The app serves as a guide connecting historical artists and their pieces to current VR/AR practices by 21st-century artists and art researchers. I fixed some UI errors that were identified in the user experience testing. For example, I added Event Trigger calls to Play() and Pause() so a user can control each video at their own viewing pace. This also solves the slow responsiveness caused by running multiple videos all at once. Now each video element is interactable, so the app runs smoothly without overheating the phone or playing multiple sounds at the same time.
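A minimal sketch of that fix, assuming Unity's VideoPlayer component and an Event Trigger with a PointerClick entry wired up in the Inspector; the class and method names here are illustrative.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch of the Play()/Pause() fix: instead of every video
// running at once, each video panel toggles its own VideoPlayer when the
// user clicks it (wired through an Event Trigger's PointerClick entry).
public class VideoToggle : MonoBehaviour
{
    [SerializeField] private VideoPlayer videoPlayer;

    private void Start()
    {
        videoPlayer.playOnAwake = false;   // nothing plays until the user asks
    }

    // Assigned to the Event Trigger's PointerClick event in the Inspector.
    public void TogglePlayback()
    {
        if (videoPlayer.isPlaying)
            videoPlayer.Pause();
        else
            videoPlayer.Play();
    }
}
```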
I also made the scene much brighter than before and switched the video material from Metallic to Albedo to make the images clearer. Most of the changes after user testing were cosmetic rather than experience related. The biggest change that came from experience feedback was the movement speed between waypoints: at first it was too fast for some users, so slowing it down made it feel more natural, and additional users confirmed this result.
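Here is one possible shape of that tunable movement, a sketch rather than the project's actual script: a glide toward a target point with a serialized speed field, so the value can be lowered in the Inspector after feedback. The target position could come from the GroundRayCasting hit point mentioned in the Conceptualization section; all names and values are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the glide between waypoints whose speed was tuned
// after user feedback: a serialized speed field makes it easy to slow the
// motion down in the Inspector without touching other code.
public class WaypointGlide : MonoBehaviour
{
    [SerializeField] private float moveSpeed = 2f;   // lower value chosen after testing (assumed)

    private Vector3 target;
    private bool moving;

    // Called with the point where the ground raycast hit the floor.
    public void GlideTo(Vector3 destination)
    {
        target = new Vector3(destination.x, transform.position.y, destination.z);
        moving = true;
    }

    private void Update()
    {
        if (!moving) return;
        transform.position = Vector3.MoveTowards(transform.position, target, moveSpeed * Time.deltaTime);
        if (Vector3.Distance(transform.position, target) < 0.01f)
            moving = false;
    }
}
```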
Conclusion
The Bride is the icon of the app. As users walk through the space, they interact with different viewing areas. Each one presents a VR or AR case study of how the art and cultural sector uses the technology. Jordan Wolfson's violence piece addresses philosophical questions of violence. The Rubin Museum integrates an iPad AR guide to its collection. The film industry expands visual storytelling into VR space. With projects like ACTIVATAR, a physical gallery is no longer the limit: that app treats every upgrade as a new round of exhibitions, carefully curated with the Museum of the Moving Image, NYC. There is also a study on the educational benefits of integrating VR into a drawing lesson. A user will be able to learn about VR/AR in art and culture by interacting with each image and video through the pointer trigger while wandering around the crystal dome. I would love to make this app much bigger, with different scenes where an art student could learn about the anthropological art history of VR/AR as an art medium.