Hands-on in a Hands-free World

Part two in our series Reimagining Retail

Alan Waldock
Make Associates
5 min read · Jul 2, 2020


Photo by Jacek Dylag on Unsplash

From Leap Motion to the Microsoft Kinect, gesture-based technologies have long been touted to transform how we interact with both devices and the physical world. Recently, they’ve once again been thrust into the spotlight as a possible solution for consumers who have, understandably, become increasingly conscious of the risks of touching potentially infected objects and surfaces. And it’s an inconvenience that works both ways: retailers, even those with firmly established e-commerce platforms, are having to experiment with new sanitisation technologies, quarantining and sterilising products that have been returned or tried out before they find their way back to shelves.

A handful of problems

Whether test-driving a car or upgrading to a new TV, bricks-and-mortar stores have always been places for consumers to get hands-on with products before committing to a purchase. Even when that purchase was later made online (a practice known as ‘showrooming’), retailers have leaned into the trend over the last decade, designing their flagships as the point in the customer journey where proof points are demonstrated and brand loyalties are cemented.

Even retailers who’ve been able to sweat their e-commerce operations during lockdown are grappling with a host of new challenges, such as how to resell returned items or find enough staff to process those returns efficiently. In the US, roughly 20% of apparel is sent back, with 41% of consumers buying multiple colours and sizes of the same product with a view to keeping one and returning the rest, making apparel one of the most-returned categories of goods in the world. With UK government advice currently recommending that fitting rooms remain closed for the foreseeable future, it raises the question: how could we use contactless technologies to put products into people’s hands while still adhering to evolving safety guidelines?

The Digital Catwalk

From 24–27 October 2019, the Kühlhaus Berlin played host to Destination Denim, an experiential event for Amazon Fashion fusing technology, music and culture.

Our brief was to conceive and create a stand-out experience that captured the breadth of choice available on Amazon Fashion, and to explore new and innovative ways for shoppers to engage with an online catalogue without the need for physical product in every available size.

Our response: the Digital Catwalk, a 12-metre LED wall animated with a never-ending flow of virtual runway models, choreographed to a bespoke DJ mix. Working with Amazon stylists, we created 36 unique looks drawn from a diversity of fashion labels, all shoppable on Amazon Fashion. Built in the Unity game engine, the Catwalk was a fully real-time interactive experience: a product showcase that used AI-driven camera tracking to make shoppers part of the show, placing them in full control of the outfits as if standing in front of a digital mirror, with everything shoppable by scanning the screen through the Amazon app.

The devil’s in the detail

Digital runway avatars may sound like something lifted from the pages of science fiction, but the niche world of virtual fashion is quickly becoming a very real discipline.

Surprisingly, creating digital fashion isn’t all that dissimilar to the real thing. Clothing patterns are ‘sewn’ around virtual mannequins, before applying the physics that simulate the properties of the real fabric. For the Digital Catwalk we developed a robust production pipeline to transform each real world item into an accurate digital replica, all the while leaning on stylists’ keen eyes to ensure both the fit and fabric dynamics were true-to-life.
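As a rough illustration of the ‘sewn around a mannequin, then simulated’ idea, here is a toy mass-spring cloth model. It is purely hypothetical (the actual pipeline used dedicated garment-simulation and styling tools): point masses are joined by springs at their rest spacing, the top edge is pinned like shoulders on a mannequin, and Verlet integration under gravity makes the grid drape.

```python
GRAVITY = -9.81          # m/s^2, pulling the cloth down the y axis
DT = 1.0 / 60.0          # 60 fps time step
STIFFNESS = 0.5          # fraction of spring error corrected per step

def make_cloth(cols, rows, spacing=0.1):
    """Build a grid of particles plus the springs that 'sew' them together."""
    points = [[x * spacing, 0.0, y * spacing]
              for y in range(rows) for x in range(cols)]
    prev = [p[:] for p in points]      # previous positions, for Verlet integration
    pinned = set(range(cols))          # pin the top row, like shoulders on a mannequin
    springs = []
    for y in range(rows):
        for x in range(cols):
            i = y * cols + x
            if x + 1 < cols:
                springs.append((i, i + 1, spacing))     # horizontal 'thread'
            if y + 1 < rows:
                springs.append((i, i + cols, spacing))  # vertical 'thread'
    return points, prev, pinned, springs

def step(points, prev, pinned, springs):
    """One frame: integrate gravity, then relax each spring toward its rest length."""
    for i, p in enumerate(points):
        if i in pinned:
            continue
        vel = [p[k] - prev[i][k] for k in range(3)]     # implicit velocity (Verlet)
        prev[i] = p[:]
        p[0] += vel[0]
        p[1] += vel[1] + GRAVITY * DT * DT
        p[2] += vel[2]
    for a, b, rest in springs:
        pa, pb = points[a], points[b]
        d = [pb[k] - pa[k] for k in range(3)]
        dist = sum(c * c for c in d) ** 0.5 or 1e-9
        corr = STIFFNESS * (dist - rest) / dist
        for k in range(3):
            if a not in pinned:
                pa[k] += d[k] * corr * 0.5
            if b not in pinned:
                pb[k] -= d[k] * corr * 0.5
```

A production fabric model is, of course, far richer (shear and bend springs, collision with the body, per-fabric stiffness tuned with the stylists), but true-to-life drape comes from the same constraint-relaxation loop.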

That attention to detail would be meaningless without the ability to fully explore the clothing in 360º, which is where the Catwalk really came to life. Creating a 1-to-1 relationship with a shopper’s digital reflection meant there was no margin for tracking error as the virtual models mirrored their movements. Using the latest skeletal-tracking camera, we developed a system whereby shoppers could explore outfits from every angle. Because the algorithms kept up with, and anticipated, complex movements, shoppers had total freedom to view outfits from the front, back and side. Combined with realistic fabric physics, it created an amazingly playful and shareable experience that gave consumers a totally new take on the fitting room.
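To give a flavour of the mirroring step, here is a hedged sketch; the joint names and coordinate convention are assumptions, as the real installation used a commercial skeletal-tracking camera feeding Unity. A mirror reflection flips the horizontal axis and swaps left/right limbs, so the avatar faces the shopper like a reflection rather than copying them like a shadow, and a little smoothing suppresses per-frame tracking jitter:

```python
def mirror_pose(joints):
    """Reflect a tracked pose so the avatar behaves like a mirror image.

    joints: dict mapping a joint name (e.g. 'left_hand') to an (x, y, z)
    position in camera space, with x pointing to the camera's right.
    """
    mirrored = {}
    for name, (x, y, z) in joints.items():
        # A mirror swaps handedness: the shopper's left hand drives the
        # avatar's right hand, and vice versa.
        if name.startswith("left_"):
            target = "right_" + name[len("left_"):]
        elif name.startswith("right_"):
            target = "left_" + name[len("right_"):]
        else:
            target = name
        mirrored[target] = (-x, y, z)   # reflect across the vertical plane
    return mirrored

def smooth(prev_pos, new_pos, alpha=0.6):
    """Exponential smoothing: blend the new sample with the previous estimate."""
    return tuple(alpha * n + (1 - alpha) * p for n, p in zip(new_pos, prev_pos))
```

A real system would go further, predicting joint positions a frame or two ahead to hide camera latency, but the core mapping from tracked skeleton to mirrored avatar is this simple.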

So what’s next?

With this year’s Moscow and London fashion weeks pivoting to digital-only events, it’s only a matter of time before fully rendered clothing becomes a mainstream (virtual) reality.

Advances in real-time rendering make it possible to create ‘digital twins’ of consumer products that can be interacted with as you would the real thing, an advance with wide-ranging implications not just for consumer confidence but also for sustainability and the design of the shop floor itself. Layering on technologies like augmented and mixed reality unlocks new possibilities both in and out of store, fundamentally altering our relationship with retail spaces.

A recent study found that 80% of 2,000 UK consumers surveyed have changed, or expect to change, how they interact with technology in the public sphere in light of COVID-19. While certainly a challenge, this push toward contactless retail has the potential to be more than an inconvenient new reality; it could usher in a boom in touchless interfaces rooted in voice, gesture, haptic feedback and mobile, delivering on long-made promises to transform our interactions with the world around us.



Co-founder of Make Associates, an innovation studio built around a team of expert makers. Digital designer and data geek. Email: alan@makeassociates.co.uk