How we have been breaking patterns with adidas GLITCH

Around one and a half years ago, our team started working on adidas GLITCH as frontend developers. The project was to be built in React Native, a relatively new framework at the time, so we didn't have a vast amount of experience developing with it. Even though we had faced several challenges together before, we felt that this time we would need to leave our comfort zone and try something completely new.

The idea and the concept were really promising and we knew that the delivery of this project would be a milestone for us and the whole company.

adidas concept

Project roadmap

  • May 2016 — Project start
  • November 2016 — UK rollout, iOS only
  • December 2016 — UK rollout, Android
  • June 2017 — Germany rollout
  • October 2017 — France rollout

The Concept

GLITCH is a new approach to commerce and customer experience — a new way to sell an exclusive product.

A pair of boots which is distributed only via an application

Deep dive into React Native

With limited React Native experience, we had to learn quickly. The first sprint was dedicated to getting used to this technology, so we spent our days and nights trying to understand React, React Native, Redux, Flow.

Fortunately, one of the partners in the project, NewStore, helped us take these first steps. They organised several workshops in Germany to give us the chance to get closer to React, as well as to learn the best practices in React Native.

The learning process took much longer than three weeks, of course, but React's approachable learning curve made it possible for us to start development right at the beginning of the second sprint.

The key to success: GLITCH effect

First, we had to deliver the alpha version of the product selector, which is the key point of the GLITCH app. This is where users can choose the boots that they want to buy.

It took some time for us to realise that there was no way around OpenGL, since the first experiments were really far from the designed experience.

inspiration for GLITCH effect

After spending most of our time programming in JavaScript, we found it really challenging to understand OpenGL. We had to learn new GLSL concepts like uniform, sampler2D, vec2, vec4 and float to implement our fragment shaders, and we had to manipulate images pixel by pixel.

The final GLITCH effect in the configurator

The task was even more challenging because adidas wanted a perfect skin/sock separation effect, in order to demonstrate how users should pull up the inner skin of the boots in real life. We did not have any 3D models, so we spent a lot of time with OpenGL shaders altering the 2D assets, trying to make the outcome more realistic.

Shader with magic numbers

We used several tricks with the assets to deliver the user experience. In the default view of the Product Selector, we used four layers per boot.

Layers used

At an early stage, our team shared our progress with the client. They were impressed by what they saw and that was really motivating.

Teaser video of Product Selector

Asset caching

After creating the Product Selector, we were loading a great number of retina images, which could easily eat into users' mobile data limits, even though we had integrated the Cloudinary API to load images at the necessary sizes. So we had to think about a reliable offline caching method, and that's why we designed a Redux-based automatic image caching system. The method lazy-loads the images for the configurator while downloading other assets in the background. We used react-native-fs during the implementation, which perfectly met our needs.

Here is an example of how the cache works:
import RNFS from 'react-native-fs';

const download = ({ url, fileUrl, background }) => {
  const downloadJob = RNFS.downloadFile({
    fromUrl: url,
    toFile: fileUrl,
    background,
  });
  // Resolve with the same shape whether the download succeeds or fails,
  // so callers can always inspect the job for details.
  return downloadJob.promise
    .then(() => ({ url, fileUrl, downloadJob }))
    .catch(() => ({ url, fileUrl, downloadJob }));
};
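On top of the download helper, the Redux side of such a cache can be sketched roughly like this — the action and state shapes here are our illustration, not the production code:

```javascript
// Illustrative sketch of a Redux-based image cache: the state maps
// remote URLs to local file paths. Shapes are assumptions.
const cacheReducer = (state = {}, action) => {
  switch (action.type) {
    case 'IMAGE_CACHED':
      return { ...state, [action.url]: action.fileUrl };
    default:
      return state;
  }
};

// Selector: fall back to the remote URL until the file is cached locally.
const getImageSource = (state, url) => state[url] || url;
```

A component can then keep rendering the remote URL and transparently switch to the local file once the background download finishes.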

Behind the boots

Another challenge was the growing number of third-party services required by the app. We tried many approaches before finding the right way of handling API requests and responses in React and Redux. After testing several best practices and modules, we realised that we needed our own solution for the API — Service — Redux level integration.

The first step was to create the API layer. For this layer, we agreed to use only pure functions. This part is responsible for managing service calls, which might sound like an easy task, but with more than nine third parties integrated into the app, it is actually quite challenging. We ended up creating our own request handler, since fetch was not flexible enough to serve our needs. Each API has its own way of working and we can't influence them, so our solution had to be unified, as well as flexible enough to parse JSON, XML or plain text if needed.
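The parsing step of such a unified handler might look like this — the function name and branching are our own illustration, not the actual request handler:

```javascript
// Illustrative sketch: dispatch on the Content-Type header so one
// handler can serve JSON, XML and plain-text APIs alike.
const parseBody = (contentType = '', body) => {
  if (contentType.includes('application/json')) {
    return JSON.parse(body);
  }
  if (contentType.includes('xml')) {
    // Hand the raw string over to an XML parser of choice.
    return body;
  }
  return body; // plain text falls through untouched
};
```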

The Service layer is built to manage and observe multiple API calls and keep the Redux Store updated. To unify the service methods, we created a Redux Promise Action Handler.

Here is an example of how a promise action is triggered:
dispatch(promiseAction({
  type: [
    'START_ACTION',
    'SUCCESS_ACTION',
    'ERROR_ACTION',
  ],
  promise: somePromise, // any kind of promise (or array of promises) goes here
}));

The goal behind the creation of the Promise Handler was to be able to merge four or five service calls into a single action with a generic output.
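A minimal sketch of how such a promise action handler could work, assuming a redux-thunk style dispatch — the implementation details are our guess, not the production code:

```javascript
// Illustrative promise action handler: fires the start action immediately,
// then success or error once every promise has settled.
const promiseAction = ({ type: [start, success, error], promise }) => (dispatch) => {
  dispatch({ type: start });
  // Accept a single promise or an array of promises and merge the results.
  return Promise.all([].concat(promise))
    .then((payload) => dispatch({ type: success, payload }))
    .catch((err) => dispatch({ type: error, error: err }));
};
```

This is what makes it possible to drive several service calls from one dispatched action with a uniform start/success/error shape.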

To implement the Redux Reducers, we used redux-create-reducer because it helps to keep things simple. To retrieve data from the Redux Stores, we chose Selector Functions. I believe this is a really important part of Redux and it plays a key role when a project has to be maintained over a long time. We performed many calculations in these selector functions to return the necessary data.

How does data calculation take place in the selectors?

Following this logic, we can make selections across multiple stores without messing up our Reducers. In Redux, we tried to follow a simple rule: Reducers should be pure functions, without magic logic. It can become very confusing when an Action's payload is heavily transformed before being stored in a reducer.

Selectors are mostly used in the mapStateToProps function of the Containers, so they are called every time a property changes in the Store. To improve the performance of these functions we used Reselect, which helps us execute costly calculations only when necessary.
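Conceptually, Reselect memoizes a selector so the expensive computation reruns only when its inputs change. A hand-rolled sketch of the idea (the selector names are made up):

```javascript
// Simplified version of what Reselect's createSelector does for us:
// recompute only when one of the input selectors returns a new value.
const createSelector = (inputSelectors, compute) => {
  let lastInputs = null;
  let lastResult;
  return (state) => {
    const inputs = inputSelectors.map((select) => select(state));
    if (!lastInputs || inputs.some((value, i) => value !== lastInputs[i])) {
      lastInputs = inputs;
      lastResult = compute(...inputs);
    }
    return lastResult;
  };
};

// Hypothetical usage: total price of the items in the store.
const selectItems = (state) => state.items;
const selectTotal = createSelector([selectItems], (items) =>
  items.reduce((sum, item) => sum + item.price, 0)
);
```

Because mapStateToProps runs on every store change, this memoization is what keeps the per-change cost low.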

Integrated 3rd parties

  • Commerce API — Product handling, stock levels, checkout and order history
  • Content API — A CMS developed by POSSIBLE, responsible for storing events, FAQ contents and other app settings / data
  • Customer support chat API
  • Ratings & Reviews API
  • 2x Push Notifications Services
  • Order tracking in Germany
  • Order tracking in the UK
  • Order tracking in France
  • CDN Image API

Debugging

Compared to a web page, it's pretty hard to get useful information out of a mobile app in release mode. For development builds, the React ecosystem offers good debugging tools — like redux-logger — but these are not available in release mode. Unfortunately, most crash logs don't contain enough information about the exact issue. Sometimes random, non-reproducible things happen during testing, so we needed to build a reliable tool to improve the effectiveness of QA and dev cooperation.

It took a while, but we managed to build a unified tool that can be used by both testers and developers. The basic concept was a simple interface and API that could be integrated everywhere in the code.

Besides logging useful Redux events, we log analytics, network requests and other system events to help with further debugging. When designing the Debugger's log API, we tried to keep it as simple as possible.

Here is an example of a logging call:
Debugger.log({
  label: 'LOG TITLE',
  type: Debugger.ACTION_TYPES.SYSTEM,
  logType: Debugger.EVENT_TYPES.NONE,
  data: {}, // any custom data here
});
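Internally, such a Debugger can be as simple as a capped in-memory buffer of entries — this sketch is a guess at one possible design, not the actual implementation:

```javascript
// Illustrative Debugger core: a capped buffer of timestamped log entries
// that testers can inspect or export. Names mirror the log call above.
const MAX_ENTRIES = 500;

const Debugger = {
  ACTION_TYPES: { SYSTEM: 'SYSTEM', REDUX: 'REDUX', NETWORK: 'NETWORK' },
  EVENT_TYPES: { NONE: 'NONE', ERROR: 'ERROR' },
  entries: [],
  log(entry) {
    this.entries.push({ ...entry, timestamp: Date.now() });
    if (this.entries.length > MAX_ENTRIES) this.entries.shift(); // drop oldest
  },
};
```

Capping the buffer keeps memory bounded on long QA sessions while still preserving the most recent context around a crash.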

With this tool, we were able to reduce the number of invalid issues reported to the development team and thus improve the velocity of the whole team.

Distribution

We did not have a fully set up development and distribution environment in the beginning. We used HockeyApp to distribute the test builds, but the number of devices was limited by our Provisioning Profile. In the early stages, the lead developer compiled the project on his computer and uploaded it to HockeyApp for distribution. This process was very time-consuming, so we couldn't wait to have CI in place, since we planned regular releases after each sprint.

After a certain point, we started to invest time in automation to prevent human errors. In our CI setup, we use two branches to trigger builds: Staging and Production.

We created a script that manages updates automatically on both branches and we have other scripts for configuring the app environment or changing the app version.

Automated tests

We used Mocha as our unit testing framework to cover the business logic in the app, such as the APIs, Services, Redux Stores and Selectors. Instead of unit testing complex components, we built them to support UI testing. This means that the developers added a testID attribute to the key elements of each screen in order to make them accessible to the UI test framework. For this purpose, we used Appium and some custom shell scripts, which do a great job and ease the release process by taking some pressure off QA.

Big up for React Native

To be honest, we mostly used iPhones during the development of this project, simply because it's easier to develop with React Native on an iOS device than on Android. This means that we did not really take Android issues and performance into account until the iOS rollout in the UK.

Actually, it was a nice surprise to see the app looking so “good” when we started the Android development. Despite a few layout issues and a need for improvement in performance, it was working quite well!

React Native offers a few ways to write platform specific code:

// Mostly for components:
filename.android.js
filename.ios.js
// Mostly for styles:
Platform.select({ ios: {}, android: {} })

In the end, it turned out that our Android app had mostly the same issues in every component, so we barely needed this kind of platform-specific customisation. On iOS, we got used to nested views being displayed correctly; however, this is not always true on Android. When a view has no width or height parameter, Android clips its children instead of expanding to fit them, and images without explicit width and height values are hidden as well.

// This image is not visible on Android (RN 0.35)
imageStyle: {
  position: 'absolute',
  top: 0,
  bottom: 0,
  left: 0,
  right: 0,
},
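One way to make such an image visible on Android is to give it explicit dimensions instead of relying on the absolute edge insets alone — a hedged sketch with illustrative values:

```javascript
// Illustrative fix: explicit width and height instead of edge insets.
// In the app, the width would come from the layout, e.g.
// Dimensions.get('window').width in React Native.
const windowWidth = 375; // stand-in for Dimensions.get('window').width
const imageStyle = {
  position: 'absolute',
  top: 0,
  left: 0,
  width: windowWidth, // explicit width...
  height: 200,        // ...and height, so Android renders the image
};
```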

After fixing these layout issues and doing some performance fine-tuning, we achieved a successful Android release in December 2016 in the UK.

Big up for the team and for POSSIBLE

While some developers might not like working for an agency — some say there are tight deadlines, greater stress and “lower” work quality — I believe POSSIBLE gave a good example of how an agency can deliver a project of high quality and amazing user experience.

I believe everyone's hard work paid off when GLITCH won several awards in 2017.

Cannes Lions — Digital Craft: User Experience Design (Bronze)
Cannes Lions — Media: Use of Co-Creation & User Generated Content (Silver)
Cannes Lions — Media: Excellence in Media Insights & Strategy (Bronze)
Cannes Lions — Direct: Durable Goods (Bronze)
Cannes Lions — Promo and Activation: Durable Goods (Shortlist)
Cannes Lions — PR: Social Community Building/Management (Bronze)
Clio Sports Awards for Product Innovation (Gold)
Clio Sports Award for Apps (Silver)
Campaign Creative Tech Awards — Best Customer Experience (Gold)
Campaign Creative Tech Awards — Best Audience Engagement (Silver)
Masters of Marketing Awards — Retail & Ecommerce (Gold)
Masters of Marketing Awards — Mobile & Apps Marketing (Gold)

I hope this unbelievable ride continues!

Thanks for reading this article. I hope you enjoyed it!

If you are on iOS or Android ;)
