Immersive Shopping Case Study

Make online shopping decisions with confidence by engaging virtually with the apparel you are shopping for.

Saara Kamppari-Miller
Designer Geeking
11 min read · Apr 26, 2018


Intel and Zappos Immersive Shopping at CES 2016

Virtually Try Before You Buy

See your fit on an Intel® RealSense™ Model based on you.

An example of the type of new usages that will be possible is an immersive shopping experience from Intel and Zappos, unveiled for the first time at CES 2016. Initially a beta release, the experience will be offered at pop-up shops and events hosted by Intel and Zappos, where invited beta testers will be able to capture their body shape and measurements using a device with a world-facing Intel RealSense Camera R200; that capture is transformed into an Intel RealSense Model. For a limited set of denim products, beta testers can virtually try multiple sizes and styles on their model and understand how each would fit them.

Sanjay Vora, Intel Client Compute Group VP

The Problem

Intel does not make consumer products. Intel makes the processor platforms that enable consumer experiences. To decide what our processors will excel at, we needed to create a vision and a roadmap to new consumer experiences that evolve the relationship between humans and their compute devices. One of the opportunity areas identified was immersive shopping.

Shopping is core to how humans live. We shop for everyday necessities and once-in-a-lifetime purchases. We shop to pursue our hobbies. We shop for gifts to celebrate occasions. We shop to express ourselves and our style.

We identified online clothing shopping as a space where we could make a large impact for both end consumers and retailers. Up to 40% of clothes bought online are returned, primarily due to size and fit.

Revolutionizing the relationship between people and technology.

It’s Not a Shopping Project, it’s a Human Project

What really struck me early in our user research was that this is a deeply human project. This is about people’s bodies and their self-confidence. We dove deep into challenges of self-image and identity, and we fiercely protected and respected our users even when that made the technical challenges harder.

Vision to Pilot & Roadmap

This project started with a goal of creating and validating a “virtual mirror” vision for 4–5 years out. It expanded in scope to include a short-term pilot with Zappos and a strategy for creating the content ecosystem required for new immersive shopping experiences. A large part of my role was crafting a year-over-year (platform-to-platform) roadmap that defined the stepping stones to the future vision, always driven by the human-centered goal of helping people shop online with confidence by engaging virtually with what they are shopping for.

For purposes of confidentiality, I am generalizing the vision and roadmap and focusing on the pilot that we publicly announced at CES 2016.

Censored Summary Slide From Strategy Deck for How Online Shopping Will Be Transformed

Team and Role

I was the lead user experience designer on a multi-disciplinary team. The core team included a business planner, UX researcher, technical architect, and a visual designer who also became our expert on 3D body models. The extended team grew as we executed the pilot, working across groups within Intel and using an agency for web development.

I was responsible for keeping the end user central to every discussion where we made decisions about what our solution would be. I went deep into the interaction design when executing the pilot.

To execute on the pilot, we engaged with multiple external companies for body modeling, clothing digitization, and fit recommendation. I created pitch decks and pitched our experience story to retailers and brands to bring them on board. Zappos was our first retail partner, and I led the design collaboration to define what our Zappos VIP customer pilot would be.

Users and Audience

We started focused on the consumer who shops online. We grew to address the needs of online retailers, and then of the entire clothing design and manufacturing process, in order to create the ecosystem needed to support the experience for the end consumer.

Understanding the Fashion Design and Manufacturing Pipeline

Design Process Part 1: Vision

During the 4–5 years out stage, I was working on multiple opportunity areas, vetting the concepts and expanding them just enough for the technical architects to perform tech decomposition (analysis of the tech capabilities needed to enable these new experiences).

Facilitate Expansive Thinking

For the immersive shopping opportunity area, I ran a design session to synthesize insights from guerrilla user research and expand upon some initial concepts that were previously vetted through a survey.

Immersive Shopping Design Jam Goals and Structure

Story Maps and Minimum Viable Prototypes

The next big step was to build a minimum viable prototype to test with users and get further buy-in from stakeholders. To determine the mix of hard problems and proof points to focus on, I built a story map with the team.

This sticky-note exercise enabled us to quickly agree on what was in and out of scope. I sketched out the core narrative on the top row, and then we wrote down below how each step might be achieved. From this we were able to pick out how we would showcase the concept with a few key use cases, including the hard problem of body scanning. We used an agency to illustrate higher-fidelity storyboards and build out a smoke-and-mirrors prototype to convey the experience enabled by future technology.

Story map made of sticky notes for a virtual mirror minimum viable prototype

Our UX researcher ran a study using the minimum viable prototype, for which I requested a Kano model approach so we could understand what was table stakes versus a delighter. This was important because we were envisioning future tech, where the “that’s cool!” response can bias people towards solving technical challenges for shiny nice-to-have features before solving the big hairy challenges for the core user needs.
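For context on the method: a Kano study asks about each feature two ways, once as present (functional) and once as absent (dysfunctional), and the pair of answers maps to a category such as must-be (table stakes) or attractive (delighter). Below is a minimal sketch of that standard mapping, following the common Kano evaluation table rather than our actual study instrument.

```typescript
// Sketch of the standard Kano evaluation table (illustrative; not
// our actual study instrument). Each feature is asked two ways:
// "How would you feel if the feature were present?" (functional)
// and "...if it were absent?" (dysfunctional).
type KanoAnswer = "like" | "expect" | "neutral" | "tolerate" | "dislike";
type KanoCategory =
  | "Attractive"    // delighter
  | "Must-be"       // table stakes
  | "Performance"   // more is better
  | "Indifferent"
  | "Reverse"
  | "Questionable"; // contradictory answers

function classify(f: KanoAnswer, d: KanoAnswer): KanoCategory {
  if (f === "like") {
    if (d === "like") return "Questionable";
    if (d === "dislike") return "Performance";
    return "Attractive";
  }
  if (f === "dislike") {
    return d === "dislike" ? "Questionable" : "Reverse";
  }
  // f is expect / neutral / tolerate
  if (d === "like") return "Reverse";
  if (d === "dislike") return "Must-be";
  return "Indifferent";
}

console.log(classify("like", "dislike")); // "Performance": liked present, disliked absent
console.log(classify("like", "neutral")); // "Attractive": a delighter, not table stakes
```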

Project Investment Approved

With the results from the minimum viable prototype, and other business inputs, our stakeholder VP directed the project team to build something real that could ship in the short term, while simultaneously building out the capabilities and ecosystem for the long-term vision.

Design Process Part 2: Pilot

Before we could engage with ecosystem partners to build a pilot, we needed to redesign the user experience journey with short-term technology constraints. Even though we set aside the long-term vision, we kept the user insights we had learned, creating a technologically feasible experience that still met user needs and delivered the magic moment for online shoppers.

The Resulting Short-Term Product High Level User Journey

Human-Centered Re-Design

We ran additional user studies and did a competitive analysis of the current state of shopping online with fit recommendation providers. With the additional insights and analysis, I put together our short-term vision, outlining the key experience elements that we had to address and could not compromise on.

Edited Storyboards & Pushing Back on Technology

I sketched new storyboards, leveraging the long-term vision storyboards as a baseline to show the side-by-side comparison of what’s new or different in the short term versus the long term.

One of the technical challenges we had at this point was how to conduct the body scan. The existing standard was to walk 360° around a person to create their scan. Our user research showed that (A) people’s homes do not have enough space for a scan like that, and (B) it feels creepy to have someone walk around you while you try to stay still.

Working with our technology architect, we defined a new scanning technology approach that would feel more like having a friend take your photo. Now, instead of your friend walking around you, you pose four times (front, side, back, side).
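As a sketch of how this differs from a walk-around scan, the flow below keeps the capture device fixed and advances pose by pose. The names and the capture callback are illustrative assumptions, not the actual RealSense pipeline.

```typescript
// Illustrative sketch of the four-pose capture flow; the types and
// capture callback are hypothetical, not the actual RealSense pipeline.
type Pose = "front" | "left side" | "back" | "right side";

const POSES: Pose[] = ["front", "left side", "back", "right side"];

async function captureBodyModel<Frame>(
  capturePose: (pose: Pose) => Promise<Frame>,
): Promise<Frame[]> {
  const frames: Frame[] = [];
  for (const pose of POSES) {
    // The camera stays put, like a friend taking a photo; the subject
    // turns to the next pose between shots instead of being circled.
    frames.push(await capturePose(pose));
  }
  return frames; // fused downstream into a single 3D body model
}
```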

Edited Storyboard Sketches to Show New Short Term Experience Changes

Interactive Axure Prototypes

I built rapid interactive prototypes using Axure, where I had to fake the new technology. As a designer at Intel, part of the prototyping process is figuring out ways to prototype technology that isn’t available yet. In this case, I hacked some HTML to do a camera video passthrough, faking the experience of helping capture a friend’s body model. This enabled us to iterate on and test the interaction design for body model capture well before the technology was ready to develop with.
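The original hack isn’t shown here, but the general technique is small. Below is a minimal sketch of a browser camera passthrough of the kind you can drop behind prototype screens, written against today’s getUserMedia API; the element id and fallback handling are my assumptions.

```typescript
// Minimal sketch of a camera passthrough behind prototype screens.
// The original hack predates this exact API surface; the element id
// and error handling here are assumptions.
async function startPassthrough(video: HTMLVideoElement): Promise<void> {
  // Ask for the rear ("environment") camera, matching a world-facing
  // capture scenario; the browser falls back if it is unavailable.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "environment" },
    audio: false,
  });
  video.srcObject = stream;
  await video.play();
}

const viewfinder = document.querySelector<HTMLVideoElement>("#viewfinder");
if (viewfinder) {
  startPassthrough(viewfinder).catch((err) =>
    console.error("Camera unavailable:", err),
  );
}
```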

Digital sticky notes to mark up screenshots from the interactive Axure prototype.

Service Blueprints

I built service blueprints to understand and explain how all the backstage pieces across multiple companies would need to interconnect in order to deliver on the user experience. The service blueprints were very useful at partner meetings when we needed to go deep into how we would work together.

Service blueprints, from whiteboard to digital (blurred for confidentiality).

Testing the Real Content

As we got key ecosystem partners on board for body modeling, clothing digitization, and rendering, we were able to start testing the experience components with users and see whether our technology could deliver on the promise we were making.

Denim Rendering Samples: You can see how the same jean size on different bodies results in different fits.

Side Note: Our visual designer did an amazing job working with our body model and clothing digitization partners to create the natural pose and default clothing (no naked models!) that would work across body types.

Becoming a Domain Expert

I had to become an expert in understanding and describing the fit of jeans. I co-authored the baseline language for how we explain fit details, because we knew from our research that we couldn’t rely on visuals alone. Never have butts been talked about so much, and so seriously, at work.
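To make the idea of written fit details concrete: fit language for a region of a garment is commonly driven by ease, the difference between the garment measurement and the body measurement at that point. Here is a purely hypothetical sketch of mapping ease to fit words; the thresholds and labels are invented, not our actual baseline language.

```typescript
// Hypothetical mapping from ease (garment minus body measurement, in cm)
// to a fit word; thresholds and labels are invented for illustration.
function fitDescriptor(garmentCm: number, bodyCm: number): string {
  const ease = garmentCm - bodyCm;
  if (ease < 0) return "very tight"; // negative ease: stretch-fabric territory
  if (ease < 2) return "tight";
  if (ease < 5) return "fitted";
  if (ease < 8) return "relaxed";
  return "loose";
}

// e.g. a 100 cm waistband on a 96 cm waist would read as "fitted"
console.log(fitDescriptor(100, 96));
```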

Leading Design Collaboration

Finally we were ready to pitch to retailers to find a partner to go to pilot with. Once Zappos was on board, I led the co-design process to decide what our joint pilot would be. If our long-term vision was a cake, and our short-term product was a cupcake, then the very first pilot was a mini-cupcake. I used this analogy to keep everyone aligned on scope and focus, and to rally the team to make the most amazing mini-cupcake possible.

Side Note: Our entire team got little cupcake charms to hang proudly off our badges after we successfully executed the pilot.

Partner Design Collaboration: 6-up Sketching Exercise and White Boarding Session

Interaction Design

Intel® RealSense™ Model

At CES 2016, we gave a demo of how people would be able to capture an accurate body model using Intel technology, virtually try on clothes to understand how different styles would fit them, and confidently choose the right size to order based on their fit preview.

The video below is a screen capture of the prototype, built with Axure and HTML, that served as the demo at CES. The body models are from real scans we did, and the jeans are real jeans we had digitized with our partners.

Intel® RealSense™ Model Account

Our goal was that people could create their model once and then use it to shop online across any supported retailer. For this paradigm, I designed a simple Intel-branded micro-site. The expectation was that people would primarily interact through white-label solutions embedded on retailers’ websites, and only come back here for account management.
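As an illustration of that white-label paradigm, a retailer page might embed the try-on experience and pass only product context, while the shopper’s model stays with their account. Everything in this sketch (host, element, query parameters, sizes) is invented.

```typescript
// Hypothetical sketch of a white-label try-on embed on a retailer
// page; the host and query parameters are invented for illustration.
function embedTryOn(container: HTMLElement, retailer: string, sku: string): void {
  const frame = document.createElement("iframe");
  // The shopper's model account lives with the model provider; the
  // retailer page only says which product to render on it.
  const params = new URLSearchParams({ retailer, sku });
  frame.src = `https://model.example.com/tryon?${params}`;
  frame.width = "420";
  frame.height = "640";
  frame.style.border = "0";
  container.appendChild(frame);
}
```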

Zappos Virtually Try Before You Buy Micro-Site

Many interaction details were obsessed over to create this experience:

  • Your model is front and center, showing both the visual and the written fit details side by side, so there’s never a need to scroll away.
  • Simple 3D interaction: just swipe to rotate (see the sketch after this list). No getting lost or showing disrespectful angles.
  • Just the right level of zoom such that you are focused on the jeans, and in a way that works for all body heights.
  • Changes between sizes and styles are instant and preserve context, so there’s no change blindness.
  • Flats and heels poses for women’s models. We had to fight for this one, as it meant doubling the number of images rendered. It was important to do, because many women’s jeans dragged on the ground and did not accurately reflect how someone might wear the product.
  • Tap to zoom: the zoomed-in fit detail updates as you pan around the zoomed-in model. You can still rotate while zoomed in using the rotate buttons (no need to flip back and forth to see zoomed-in details from all angles).
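As referenced in the list above, here is a sketch of how swipe-to-rotate can work over pre-rendered views rather than a live 3D scene (consistent with the per-image rendering cost mentioned in the flats-and-heels item). The frame count, drag sensitivity, and asset paths are invented for illustration.

```typescript
// Sketch of swipe-to-rotate over pre-rendered views instead of a live
// 3D scene; frame count, drag sensitivity, and paths are invented.
const FRAME_COUNT = 36; // e.g. one rendered view per 10 degrees

function wireRotation(view: HTMLImageElement): void {
  let frame = 0;
  let dragStartX = 0;
  let dragStartFrame = 0;

  const show = (f: number): void => {
    // Wrap so rotation is continuous in both directions.
    frame = ((f % FRAME_COUNT) + FRAME_COUNT) % FRAME_COUNT;
    view.src = `/renders/model_${frame}.jpg`; // hypothetical asset path
  };

  view.addEventListener("pointerdown", (e) => {
    dragStartX = e.clientX;
    dragStartFrame = frame;
    view.setPointerCapture(e.pointerId);
  });

  view.addEventListener("pointermove", (e) => {
    if (!view.hasPointerCapture(e.pointerId)) return;
    // ~10 px of horizontal drag per frame keeps rotation feeling direct;
    // a swipe only changes the view angle, so you cannot get lost.
    show(dragStartFrame + Math.round((e.clientX - dragStartX) / 10));
  });
}
```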

Outcome

Even though the project was ramped down due to organizational changes at Intel, we still achieved the following outcomes:

Zappos Pilot

In early 2016, we ran a pilot with Zappos VIP customers to create their body models and let them pick out clothes using our web interface. For the pilot, we wanted to learn how well we were matching their digital selves to the real fit, so we had the VIP customers try on the clothes and compare the digital and the real. The results of the pilot were positive and promising.

IEEE 3D Body Processing Standards Body

Because our project identified body modeling as a crucial area where the ecosystem and the technology needed to advance together, Intel invested in the creation of an IEEE 3D Body Processing standards body, together with the partners we were already engaged with and other industry partners.

Epilogue

I believe it’s a matter of “when,” not “if,” immersive shopping experiences like the one we piloted will become a reality. The latest news is that Amazon has bought Body Labs, the body modeling partner we used in our pilot with Zappos (which Amazon also owns).

Saara Kamppari-Miller

Design Strategy, User Experience Design, Interaction Design
