Building a PWA in Argentina

Leonardo Pittelli
6 min read · Jul 24, 2017


Early this year, the retail company where I work (https://www.garbarino.com) decided to create a PWA.

It was a great moment for a web technologies fan like me to be part of it. Argentina doesn't have reliable mobile connections, our clients might not want to install the native app and, fortunately, 80% of our mobile traffic comes from Chrome. So, let's do it! 💪

The key objective

To improve the experience on mobile.

Project setup

Before writing any line of code, we had to define some important aspects.

This was the perfect time to face a long-expected redesign of our mobile user experience.

We asked ourselves some questions like the following (and I expect to answer all of them before the end of the article):

  • Which metrics will we measure in order to know if we are performing better?
  • Is filling the home page with product carousels the best approach to sell more? I don't think so, but the first version has the same components as the mobile web and we couldn't do anything different. A business decision. 🤷
  • Should we build progressively on top of our existing web applications, or create a new app? As we wanted a new design and an SPA approach, we opted for the second option. That implies more work but gives us more flexibility.
  • Which trendy framework will we use? Vue.js? React? Preact? We opted for React because more people in the company have previous experience with it. We love some Vue features, but we are happy with React either way. 👍

Technical approach

I started the project based on the marvellous post series written by Addy Osmani: https://medium.com/@addyosmani/progressive-web-apps-with-react-js-part-i-introduction-50679aef2b12

As the project evolved, the team became bigger, we added more features, the stack grew, and we ended up with a first version that has:

  • Basic offline support (a custom screen instead of the downasaur)
  • Add to home screen support
  • Server side rendering
  • SPA approach
  • Static files caching
  • Route based code splitting
  • Same URLs as the desktop site
  • And much more is coming…

The chosen stack was:

  • Node.js for the server
  • React for the frontend
  • And many useful libraries like material-ui

Project challenges

We are a retail company. SEO is our god, so the SPA approach made server-side rendering mandatory.

Doing SSR is more than saying: we use React and Node, both are JS, what could possibly go wrong? We named this requirement "the dark side of server side", and it should be the title of a talk in the future. Meanwhile, you can read this excellent post to get an idea: https://reactjsnews.com/isomorphic-react-in-real-life

If you have tried SSR with React, surely you have seen its warning about mismatched markup more than once.

Basically, it says that the markup you rendered on the server differs from the one rendered on the client and that, because of this, you are heavily hurting the performance of the site. You might say it will never happen if you use the same components on both sides. Well, check the previous link or imagine these simple situations:

  • We have daily offers, with a countdown of the time left for each offer. If you render the countdown on the server, by the time it reaches the client the remaining time will be different (because of the network), so the markup will be different. The solution: calculate the time only on the client side, after the component is mounted.
  • We use a dependency that generates incremental ids for its components; the counter will be completely different on the server and the client, so the ids and the markup will differ too. The solution: fortunately that dependency accepts an instanceId prop, and we can generate a unique id with the same algorithm on both sides.
  • We are using Redux, and all the API calls are async. But the server doesn't know that. So, how do we tell the server to wait until the responses arrive before rendering the markup and only then send it to the client? Dan Zajdband had the answer! Dispatch a "serverReady" action, update your state when you have the data, and wait for it before rendering the app on the server. Something like this:
const unsubscribe = store.subscribe(() => {
  if (store.getState().serverSide.serverReady) {
    unsubscribe();
    response.render('index', { data: yourData });
  }
});
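The countdown mismatch from the first bullet can be avoided by keeping the time calculation out of the server render entirely. A minimal sketch in plain JavaScript, assuming a pure helper plus a client-only update (`formatRemaining` and `OfferCountdown` are illustrative names, not from our codebase):

```javascript
// Pure and easily testable: human-readable time left until the deadline.
function formatRemaining(deadlineMs, nowMs) {
  const left = Math.max(0, deadlineMs - nowMs);
  const h = Math.floor(left / 3600000);
  const m = Math.floor((left % 3600000) / 60000);
  const s = Math.floor((left % 60000) / 1000);
  return `${h}h ${m}m ${s}s`;
}

// In the React component, render a neutral placeholder on the server
// and only compute the real value in componentDidMount, which runs
// exclusively in the browser, so server and client markup always match:
//
// class OfferCountdown extends React.Component {
//   state = { label: '--' };            // identical on server and client
//   componentDidMount() {               // client only
//     this.tick = setInterval(() => {
//       this.setState({ label: formatRemaining(this.props.deadline, Date.now()) });
//     }, 1000);
//   }
//   componentWillUnmount() { clearInterval(this.tick); }
//   render() { return <span>{this.state.label}</span>; }
// }
```

The brief flash from `--` to the real value is the price paid for markup that always validates against the server render.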

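For the incremental-id problem, the key insight is that any deterministic function of a stable key works as an instanceId, as long as server and client run exactly the same one. A hypothetical sketch (`stableId` is an illustration, not the dependency's actual API):

```javascript
// Derive an id from a stable key instead of a process-local counter,
// which would drift between the server and the client.
function stableId(key) {
  // Tiny djb2-style string hash; any deterministic function works
  // as long as both sides run the exact same one.
  let hash = 5381;
  for (let i = 0; i < key.length; i++) {
    hash = ((hash * 33) ^ key.charCodeAt(i)) >>> 0;
  }
  return `cmp-${hash.toString(36)}`;
}

// Passed to the component so SSR and client markup match, e.g.:
// <Tooltip instanceId={stableId('home/offer-42/tooltip')} />
```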
Other challenges

One beautiful challenge we haven't resolved (and never will) is the constant evolution of the JS ecosystem. For example, we started the project using webpack 1, then we moved to webpack 2 and now we are using webpack 3 (only 6 months later). The same happens with many other dependencies. You never finish migrating major releases!

AB Testing

So, one day we had the MVP ready and we needed to A/B test it against the mobile web.

Well, it’s not so easy...

  • We want to separate Google Analytics measurements into a new view, but without losing the whole picture on mobile.
  • We want to measure conversion rate but cart and checkout are outside the PWA.
  • We have the same URLs for both sides of the experiment.
  • It'd be good not to mess up the URL with weird query parameters.
  • People who see A, should continue seeing A and people who see B, should continue seeing B.

What did we do?

For the first two points, we configured a new view on Analytics based on a custom dimension which is populated directly on the code for PWA and based on a cookie for the cart and checkout.
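A minimal sketch of how that custom dimension could be populated, assuming analytics.js, dimension slot 1, and a cookie named abVariant (all three names are assumptions, not our actual configuration):

```javascript
// Derive the Analytics custom-dimension value. In the PWA the value is
// hardcoded; in cart/checkout (shared with the old site) it is read
// from the A/B cookie. Cookie name `abVariant` is illustrative.
function dimensionFromCookie(cookieHeader) {
  const match = /(?:^|;\s*)abVariant=([^;]+)/.exec(cookieHeader || '');
  return match ? match[1] : 'mobile'; // default to the old site
}

// In the PWA bundle:
//   ga('set', 'dimension1', 'pwa');
// In cart and checkout:
//   ga('set', 'dimension1', dimensionFromCookie(document.cookie));
```

With the dimension set on every hit, one Analytics view filters on it for PWA-only numbers while the main view keeps the whole mobile picture.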

For the last three points, we sent all the traffic to the PWA. There, a random draw decided whether you stayed there or were redirected to the mobile site.

We stored that choice in a cookie: "forever" if you saw the PWA and for a day if you saw the old mobile site. (This cookie also let us know which version each user came to the cart page from.)
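The assignment logic just described can be sketched as a small pure function (names and the ten-year "forever" are illustrative; in practice you would pass Math.random() as the draw and set the cookie from the result):

```javascript
const DAY = 24 * 60 * 60;        // cookie max-age, in seconds
const FOREVER = 10 * 365 * DAY;  // ~10 years, effectively "forever"

// Returns the variant to serve and how long to remember it.
// Sticky: an existing cookie always wins, so A users keep seeing A
// and B users keep seeing B. Mobile cookies expire daily, so B users
// are re-drawn as the PWA share ramps up (10% → 20% → 50% → 100%).
function assignVariant(existingCookie, pwaShare, draw) {
  if (existingCookie === 'pwa' || existingCookie === 'mobile') {
    return { variant: existingCookie, maxAge: null }; // keep current cookie
  }
  const variant = draw < pwaShare ? 'pwa' : 'mobile';
  return { variant, maxAge: variant === 'pwa' ? FOREVER : DAY };
}
```

Because the draw happens on the server inside the PWA and ends in either a render or a redirect, both sides of the experiment keep the exact same URLs and no query parameters are needed.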

And we gradually increased the PWA percentage of the A/B test, starting from 10%, then 20%, 50% and finally 100% (at which point we removed the A/B test).

Why did we decide that?

Basically, we saw improvements on almost all the metrics we were monitoring: Conversion Rate, Average ticket amount, Bounce Rate, Page Views per Session, Returning visitor, Session Duration.

Statistics

I've seen a lot of case studies talking about huge new conversion rates and the wonderful world of PWAs, and I always wondered about the reliability of the numbers. You may wonder the same about the following numbers but, at least, I've cleared my own doubts.

The following are average percentages over two weeks of A/B testing.

Conversion Rate

We saw a conversion rate 27% higher.

Average ticket amount

It didn't change significantly, which makes sense because the users, and the way they add products to the cart, are the same.

Bounce Rate

Decreased by 9%.

Because it loads faster? Because it's prettier? Because it's easier to find what you are looking for? Who knows! But the number is great!

Returning visitor

Increased by 13%, and the same questions we asked about the bounce rate apply here.

Page Views per Session

I think this is one of our best achievements. It increased by 35%, and it's important to analyze it together with the next metric.

Session Duration

It's almost the same as before, which combined with the +35% in page views could mean: better performance, lower load times and people navigating the site more.

In other words, in the same amount of time, people see 35% more pages. So, as the flow is the same, they are now more engaged with our site.

Conclusion

It wasn't a simple job. The SPA approach was, in my opinion, one of the best improvements.

We have a lot more to do: test push notifications, add more sections of the site to the PWA, and a long backlog of improvements and new features.

It’s a great time for web development. More APIs and capabilities are coming and the results we can obtain are amazing.

For any question or message you can find me at: https://twitter.com/leopittelli


Leonardo Pittelli

Software Engineer. Fan of web technologies. Google Developer Expert for Web and Cloud. FullStack JS Dev / Chapter Lead Web @olxtecharg. Prev: PWA TL @garbarino