A Design Journey — from Prototype to Dreamforce (Part 3)

Amy Lee
Salesforce Designer
7 min read · Mar 31, 2015

--

This is the third in a series of posts about how we created a fun, interactive Design Shop shopping experience.

At Dreamforce 2014 last year, I had the unique opportunity to apply my software engineering skill set in a fun and challenging way. As Rick wrote in his post about our project, there was an Ikea table, sensors strapped under it, pasta containers with RFID cards, USB cables going everywhere, sound effects…

The crazy system diagram of our Design Shop components tied together by RFID, WiFi, USB, and more.

It was software that tied all this together, and this is the journey we took in bringing a joyful shopping experience to thousands of people.

Putting the User in User Experience

I’m a rapid prototyper on Salesforce’s UX Engineering team, lending coding chops to the larger User Experience organization. In our little dev team we tend to geek out about nerdy stuff like the latest coding frameworks, efficient algorithms, and build systems. However, we always remember there is a customer who uses the product of our work.

For Dreamforce we created our Internet of Things shop, and we wanted the attendees-turned-customers to have:

  • Simple instructions. A glance should be all it takes.
  • No keyboard or mouse input. Just physical gestures or interactions.
  • Immediate visual feedback. Animations and audio would be nice too.
  • Quick sessions. The whole experience should be under a minute or so.
  • Fun!
Perspective diagram of a customer in the shop. Floor diagram below.

There Was an App for That

We had this great idea to put an app on the shoppers’ smartphones, and they would go shopping in a mock retail store. The app would passively gather data from iBeacons in the products, in “smart shelves”, and in various zones. On their way out, they would see a wall of analytics dashboards.

We tried a few iOS prototypes with Estimote beacons and found that in small spaces there was too much BLE signal noise. (That’s a whole other blog post!) We couldn’t get the accuracy to know exactly who was looking at exactly what.

Everything is Awesome: the Lego of Microcontrollers

Really cool things happen when you leave yourself open to new ideas.

We were talking with Jon McKay of Technical Machine, who introduced the Tessel microcontroller to us. We told him about our iBeacon woes, and he mentioned the Tessel could be extended with an RFID module. This turned out to be the solution we were looking for!

Tessel v.1: 4 module ports on the side, GPIO pins at bottom, USB up top.

The flavor of RFID we selected was highly accurate for our case because it requires cards to be in close proximity to the sensor (about 3" away), and it reads right through particle board tabletops. We put a Node.js program on the Tessel to read the card IDs and send them back across USB to the main PC.
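For the curious, here’s a minimal sketch of that Tessel-side program using the rfid-pn532 module. The port, poll timing, packet fields, and the timeout trick for spotting a removed card are illustrative rather than our exact production code:

    // Runs on a Tessel v1 with the RFID module in port A.
    var tessel = require('tessel');
    var rfidlib = require('rfid-pn532');

    var rfid = rfidlib.use(tessel.port['A']);

    var present = null; // UID of the card currently on this shelf, if any
    var lastSeen = 0;

    rfid.on('ready', function () {
      // Poll quickly so pickups register without a noticeable lag.
      rfid.setPollPeriod(200, function (err) {
        if (err) { throw err; }
      });

      // 'data' fires while a card is in range of the sensor.
      rfid.on('data', function (card) {
        var uid = card.uid.toString('hex');
        lastSeen = Date.now();
        if (present !== uid) {
          present = uid;
          // console.log travels over USB back to the host PC.
          console.log('card:' + uid + ' event:down');
        }
      });

      // No reads for a couple of poll periods means the container left.
      setInterval(function () {
        if (present && Date.now() - lastSeen > 500) {
          console.log('card:' + present + ' event:up');
          present = null;
        }
      }, 100);
    });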

It dawned on us: we could build “magic shelves” with hidden Tessels that knew when a product was picked up. This would require getting a half dozen Tessels and enough RFID cards/tags for the products. All we needed then was a way to identify our shoppers, and it turned out all the Dreamforce 2014 badges would include RFID cards. Jackpot!

JavaScript/Node is Eating the World

Running JS in the browser and Node.js on devices and servers saved our sanity. We were able to develop in a common language on almost every device:

A mapping of the programming languages, frameworks, and data formats. (Compare to the system diagram above.)
  • Tessels ran a Node client that listened for RFID card connects and disconnects. Outputs were short key-value pair packets over USB.
  • The PC ran a host server that aggregated all the Tessels’ packets and could connect to Heroku. The PC also ran a Node-based Express web server. (A sketch of this host side follows the list.)
  • The main monitor showed the main store UI that polled the PC via AJAX.
  • Heroku ran another Express server that handled the PC’s AJAX requests, and updated a Postgres database.
  • The Android tablet used a Java library to access its NFC reader and passed that to a web view that used AJAX to send to Heroku.
  • The laptop ran a custom Mac app with 4 web views that used AJAX and JavaScript.
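
As a taste of that host side, here’s a stripped-down sketch combining the aggregator and the Express server in one process. The serial transport, device paths, baud rate, and endpoint name are all assumptions for illustration:

    // Aggregates packets from the shelf Tessels and serves state to the UI.
    const express = require('express');
    const { SerialPort } = require('serialport');
    const { ReadlineParser } = require('@serialport/parser-readline');

    const state = {}; // shelf index -> latest packet from that Tessel

    // One serial connection per shelf Tessel.
    ['/dev/ttyACM0', '/dev/ttyACM1'].forEach((path, shelf) => {
      const port = new SerialPort({ path, baudRate: 115200 });
      port.pipe(new ReadlineParser({ delimiter: '\n' })).on('data', (line) => {
        // Parse compact key:value packets, e.g. "card:04a2f1c9 event:up".
        const packet = {};
        line.trim().split(' ').forEach((pair) => {
          const parts = pair.split(':');
          packet[parts[0]] = parts[1];
        });
        state[shelf] = packet;
      });
    });

    // The main store UI polls this endpoint via AJAX.
    const app = express();
    app.get('/state', (req, res) => res.json(state));
    app.listen(3000);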

Let’s Play Tag

As we got closer to Dreamforce we had to adjust the whole shop concept from being a self-directed shopping experience to being more of a one-at-a-time session. Customers would register in line so we could collect their name, T-shirt size, and T-shirt color. Then they would enter our “shop” (it became more of a booth) by tagging their badge on a podium. They would browse design choices by picking up containers, then tag their badge to check out. Finally, the analytics wall turned into a laptop.

Tagging an RFID card to a Tessel or an Android tablet to get the ID string was a piece of cake: Rick and I used a Node.js PN-532 library, and teammates Ivan and Adam worked with Android’s NFC NDEF tag dispatch system.

We had 5 designs to choose from. Instead of going with 5 individual shelves, we went with one common Ikea table. We secured 5 Tessels under it and put 5 plastic coasters on top to indicate where the containers should go:

Rick uses some hardware store know-how to mount the Tessel microcontrollers in breathable housings under the table.
Pasta containers concealing RFID cards stuffed into the bottoms, sitting atop 5 square coasters. This provided a visual cue where to put the containers — plus we could take away the coasters and prove the table was just an ordinary table.

The real trick here was the insight that the microcontrollers weren’t there to detect when a container was put down; they were there to detect when it was picked up. The disconnection of the RFID card from the Tessel is what mattered.

And with some of Jon’s help we got reliable polling on the Tessels down to 200ms. We created a compact key:value string because it was quick to parse, easy to read, and easy to grep on a per-line basis. We were going for brevity in the connect/disconnect packets, so they ended up looking something like the lines below.
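
Hypothetically (the exact field names here are illustrative), a container set down and then picked up would produce a pair of packets like:

    card:04a2f1c9 event:down
    card:04a2f1c9 event:up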

Zippity Doo Dahs

Software can be dry and boring, but adding a little meaningful motion and audio can make all the difference.

We wanted the screen to reflect the actions of the user. When the container was picked up they would see their product preview rise up onto the screen. If they picked up 2 containers we showed a side-by-side comparison:

Customers got a preview of their selected T-shirt design. Animations and sound effects confirmed their actions.

Fun audio cues let the user know when they successfully tagged on, when they picked up a product, when they put it down, and when they confirmed their purchase. It added weight to their interactions.

(It’s like how a scary movie’s sound effects make the movie resonate more with you emotionally. Without audio cues, the most intense scenes would fall flat.)
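
To give a flavor of that feedback loop, here’s a simplified browser-side sketch. The endpoint, sound file, element IDs, and CSS class are invented for illustration, and it assumes jQuery for the AJAX polling:

    // Poll the PC's Express server and react to shelf events.
    var pickupSound = new Audio('pickup.mp3');
    var lastEvents = {};

    setInterval(function () {
      $.getJSON('/state', function (state) {
        Object.keys(state).forEach(function (shelf) {
          var event = state[shelf].event;
          if (event === lastEvents[shelf]) { return; } // react to changes only
          lastEvents[shelf] = event;

          var preview = document.getElementById('preview-' + shelf);
          if (event === 'up') {
            preview.classList.add('rise'); // CSS animation: preview rises up
            pickupSound.play();
          } else {
            preview.classList.remove('rise');
          }
        });
      });
    }, 250);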

Same Data Through Different Lenses

One of the most distinctive things about the experience we put together is the connection between 4 different Salesforce properties:

Heroku: the fast and easy way to deploy a web app in the cloud

Sales Cloud: the secure cloud database solution

Early version of Journey Builder

Marketing Cloud: targeted 1:1 marketing journeys

Custom Analytics build for the Design Shop

Analytics Cloud: making data visualization a no-brainer for everyone

I was still new to Salesforce when DF’14 happened, so it was a pleasure to see how fluidly all these things connected. We wrote our basic cloud server app in Node.js and published it to Heroku. We provisioned a Postgres database with ease. We set up a development organization in Sales Cloud. We created custom database objects and synced the data with Heroku Connect. And with a little guidance from our other teams we had simple integration with Marketing and Analytics Clouds.
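
To make that concrete, here’s a stripped-down sketch of a Heroku-style Express endpoint backed by Postgres via the pg module. The route, table, and column names are invented for illustration; Heroku Connect then keeps such a table in sync with Sales Cloud objects:

    const express = require('express');
    const { Pool } = require('pg');

    // Heroku supplies the connection string in DATABASE_URL.
    const pool = new Pool({ connectionString: process.env.DATABASE_URL });

    const app = express();
    app.use(express.json());

    // The shop PC posts checkout events here.
    app.post('/checkout', async (req, res) => {
      const { badgeId, design, size, color } = req.body;
      await pool.query(
        'INSERT INTO purchases (badge_id, design, size, color) VALUES ($1, $2, $3, $4)',
        [badgeId, design, size, color]
      );
      res.sendStatus(201);
    });

    app.listen(process.env.PORT || 3000);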

We felt it was important to emphasize that everything the attendees experienced was real and available from Salesforce, buyable on the Internet, or obtainable from a hardware store. This personalized shopping setup could be refined into a polished product.

Inspiration Is Part of Our Vision

This project was typical of the work we do in Salesforce User Experience. We love inspiring innovation and demonstrating what’s possible with a little out-of-the-box thinking and some working prototypes. We’re a band of coders, designers, and researchers who want to show you how to create engaging, data-driven experiences.

OK, we’ve covered the workflow, hardware, and software angles. The final bit that makes our Design Shop enticing is solid visual design. Definitely read Eli’s upcoming post on the design aspect of this journey.

Special thanks to fellow UXE + UX peeps: John Agan, Rick Boardman, Ivan Bogdanov, Eli Brumbaugh, Adam Doti, Bruno Fonzi, Mark Geyer, Jia Huang, Cordelia McGee-Tubb, Jon McKay, Jenton Lee, Brian New, Alan O’Connor, Adam Putinski, Stef. Sullivan Rewis, Sönke Rohde, Ryan Scott.

This is part 3 of a 4 part series:

  1. Turning passion + technology into an experience, Bruno Fonzi
  2. Physical build-out of the shop, Rick Boardman
  3. Coding apps for surprise and delight, Amy Lee
  4. Designing for interactive experiences, Eli Brumbaugh — Coming soon!

Follow us at @SalesforceUX.
Want to work with us? Contact uxcareers@salesforce.com.
Check out the Salesforce Lightning Design System.
