How We Used IoT to Improve Our E‑Commerce Experience

Ville Lahdenvuo
Published in Grano
5 min read · Nov 27, 2017

I admit it, the title is kind of clickbait-y. I’m going to tell you how we used the AWS IoT service to improve our e-commerce experience. Earlier I wrote about a nasty bug we had to hunt down before releasing our new service.

Our storefront with Christmas decorations

We’re currently working on a new e-commerce solution for Grano, aiming to provide a better customer experience by modernizing our platform, improving automation, and integrating with our production services. Our tech stack is reasonably cutting-edge considering that we’re working with an enterprise-level product and environment: Node.js, Angular, Redux, and AWS, accompanied by high code quality and modern development standards.

A little bit of background

Grano’s business is mainly print, and the webstore is no exception. We sell everything from business cards to big roll-ups for exhibitions and trade shows. The one thing almost all print products have in common is that you need something to print.

In our store the customer uploads a PDF file as the print material. A simple file upload to S3, you might think. We want to do more, though: in order to reduce manual work in the order flow, we check the PDF for compatibility with the selected product before allowing the customer to place the order.
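As a rough sketch, issuing such an upload could look something like this on the backend. The bucket name, expiry, and helper names here are my own placeholders, not our actual setup:

```typescript
// A sketch of issuing a presigned S3 upload URL for the print PDF.
// Bucket name and expiry are placeholders.
import * as AWS from 'aws-sdk';
import { v4 as uuid } from 'uuid';

const s3 = new AWS.S3();

export function createUploadUrl(filename: string) {
  // A random uuid makes the file key unguessable.
  const key = `${uuid()}/${filename}`;
  const url = s3.getSignedUrl('putObject', {
    Bucket: 'print-uploads-quarantine', // placeholder bucket
    Key: key,
    ContentType: 'application/pdf',
    Expires: 300, // URL valid for five minutes
  });
  return { key, url };
}
```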

Waiting for the PDF report…

This checking, or “preflight”, step is done using third-party software called Enfocus PitStop Server. It runs on a Windows instance, which is a good reason if any to also run a virus check on the uploaded PDF files before allowing further processing and downloading. We do this with an AWS Lambda function that checks the file and moves it to the right S3 bucket.
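A sketch of what such a Lambda can look like. The scanner call and bucket names are placeholders for whatever you actually run:

```typescript
// Quarantine Lambda sketch: triggered by an S3 "ObjectCreated" event,
// scan the file and promote it if it's clean.
import * as AWS from 'aws-sdk';
import { S3Event } from 'aws-lambda';

const s3 = new AWS.S3();

// Stand-in for the real virus scanner.
async function scanForViruses(bucket: string, key: string): Promise<boolean> {
  return true; // placeholder: assume the file is clean
}

export async function handler(event: S3Event): Promise<void> {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

    if (await scanForViruses(bucket, key)) {
      // Promote the clean file so PitStop can pick it up for preflight.
      await s3.copyObject({
        Bucket: 'print-uploads-verified', // placeholder bucket
        CopySource: `${bucket}/${encodeURIComponent(key)}`,
        Key: key,
      }).promise();
    }
    // Clean or not, the file doesn't stay in quarantine.
    await s3.deleteObject({ Bucket: bucket, Key: key }).promise();
  }
}
```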

How it looks if you uploaded a file with the wrong size. (Localization improvement is on the backlog.)

What does IoT have to do with all this?

All that checking and processing takes time; PitStop can take minutes on a complicated PDF. So we needed a way to tell the customer when processing is done and the checkout can be finished, without asking them to refresh or resorting to polling. I mean, come on, it’s 2017.

Our first idea used a nifty little library called Gun.js. It’s basically a Firebase-style real-time database, but one you can integrate easily into your existing Node.js application and host yourself. It worked during our tests, but we ran into a couple of problems with it.

First of all, we obviously expect our project to be a great success, so we want to design it for high availability, performance, and fault tolerance. To achieve this we run our API instances behind an AWS Application Load Balancer and scale them out automatically when load increases. This is a problem because Gun.js keeps the database in memory (with an optional storage adapter such as S3), but to keep the real-time part working, the API instances would need to peer with each other dynamically as instances are created and terminated.

Secondly, we didn’t find a smart way to limit read and write access to the Gun.js database, so anyone could have modified the PDF reports, and the changes would have been pushed to every connected client. On top of that, we took a look at the source code and, albeit well tested, it made us feel uncomfortable.

The solution

We set out to find a better solution that we could easily deploy to AWS and that wouldn’t increase our maintenance work. After some searching online we ended up reading about AWS IoT.

AWS IoT is a managed cloud platform that lets connected devices easily and securely interact with cloud applications and other devices. AWS IoT can support billions of devices and trillions of messages, and can process and route those messages to AWS endpoints and to other devices reliably and securely. With AWS IoT, your applications can keep track of and communicate with all your devices, all the time, even when they aren’t connected.

We found out that AWS IoT provides a nice JavaScript SDK that supports real-time pub/sub messaging over WebSockets, and the cost is $5 per million messages. Not bad for a hosted solution, especially since we only need to send one message per file upload.

The required code changes to the backend were quite minimal.
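Roughly speaking, publishing the finished report with the AWS SDK’s IotData client can look like this. The endpoint and topic layout here are illustrative, not our exact code:

```typescript
// Publish the finished preflight report on a topic scoped by the file key.
import * as AWS from 'aws-sdk';

// The data-plane endpoint is account-specific;
// `aws iot describe-endpoint` tells you yours.
const iot = new AWS.IotData({ endpoint: 'example.iot.eu-west-1.amazonaws.com' });

export async function publishReport(fileKey: string, report: object) {
  await iot.publish({
    topic: `pdf-reports/${fileKey}`, // placeholder topic layout
    qos: 1,
    payload: JSON.stringify(report),
  }).promise();
}
```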

We use the generated file key, which consists of a random uuid plus the filename, to scope the report messages. That should be sufficiently secure, and in any case the reports don’t include any personally identifiable information.

The changes to the frontend code were a bit more involved, since we now had to fetch the report status from the API and subscribe to the IoT messages. This is because the customer can close or reload the tab and return later to finish the checkout, having missed the IoT message.
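Connecting from the browser looks roughly like this, assuming Cognito federated identities. The region, identity pool id, and endpoint are placeholders:

```typescript
// Browser-side connection sketch: sign the MQTT-over-WebSockets
// connection with temporary Cognito credentials.
import { device } from 'aws-iot-device-sdk';
import * as AWS from 'aws-sdk';

AWS.config.region = 'eu-west-1'; // placeholder region
const credentials = new AWS.CognitoIdentityCredentials({
  IdentityPoolId: 'eu-west-1:00000000-0000-0000-0000-000000000000', // placeholder
});

// refresh() fetches the temporary keys used to sign the connection.
credentials.refresh((err) => {
  if (err) { throw err; }
  const client = new device({
    protocol: 'wss',
    host: 'example.iot.eu-west-1.amazonaws.com', // data-plane endpoint
    accessKeyId: credentials.accessKeyId,
    secretKey: credentials.secretAccessKey,
    sessionToken: credentials.sessionToken,
  });
  client.on('connect', () => console.log('connected to AWS IoT'));
});
```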

We made a nifty service that takes a file key and returns an observable that emits the initial status followed by any updates from the IoT service. We link it to our Redux store to update the view.
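A sketch of that service with RxJS; the topic layout and API route are placeholders:

```typescript
// Emit the current status from the API first, then live updates from IoT.
import { defer, fromEvent, merge, Observable } from 'rxjs';
import { filter, map } from 'rxjs/operators';
import { device } from 'aws-iot-device-sdk';

export interface Report {
  status: 'pending' | 'ok' | 'failed'; // placeholder shape
}

export function reportUpdates(client: device, fileKey: string): Observable<Report> {
  const topic = `pdf-reports/${fileKey}`;
  client.subscribe(topic);

  // Initial status from the API, so a reloaded tab doesn't miss anything.
  const initial = defer(() =>
    fetch(`/api/reports/${encodeURIComponent(fileKey)}`)
      .then((res) => res.json() as Promise<Report>)
  );

  // Live updates pushed over the WebSocket connection. The SDK calls
  // listeners with (topic, payload); RxJS packs them into an array.
  const live = fromEvent<[string, Buffer]>(client, 'message').pipe(
    filter(([t]) => t === topic),
    map(([, payload]) => JSON.parse(payload.toString()) as Report)
  );

  return merge(initial, live);
}
```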

The biggest problem we had was authorization in AWS. We had to link AWS Cognito and our API across our production and development AWS accounts, so that API instances in both environments could publish messages and clients in both environments could subscribe to them. After some policy wrangling we got it working and the messages were flowing.
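The core of that wrangling can be sketched like this: before a client may subscribe, an IoT policy has to be attached to its Cognito identity. The policy name below is a placeholder for a policy that allows iot:Connect, iot:Subscribe, and iot:Receive on the report topics only:

```typescript
// Runs on our API, which has permission to attach IoT policies.
import * as AWS from 'aws-sdk';

const iot = new AWS.Iot();

export async function authorizeSubscriber(identityId: string) {
  await iot.attachPrincipalPolicy({
    policyName: 'pdf-report-subscriber', // placeholder policy
    principal: identityId, // the Cognito identity id
  }).promise();
}
```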

The final file upload flow

The Takeaway

The lesson of the story is that there is always a better solution; you just have to find it. You also have to avoid getting attached to your code and keep thinking about how to improve it. To summarize:

  1. Don’t be afraid to throw away code, even if it works!
  2. Prototype ALL THE THINGS
  3. The solution might seem strange at first — IoT, seriously?

Don’t forget to clap and follow this publication if you enjoyed this piece and want to read more.

P.S. We’re hiring!

Ville works as a Lead Developer at Grano, the most versatile content service provider in Finland and the leading graphic industry company in the Nordics. Currently he is working on Grano’s E-Commerce solutions.
