Realtime Big Data Ingestion with Meterial

Learn how we gain realtime insights on rapidly growing data at Myntra using our new processing and reporting framework.

Background

I got an opportunity to work extensively with big data and analytics at Myntra. Data Driven Intelligence is one of the core values at Myntra, so crunching and processing data and reporting meaningful insights to the company is of utmost importance.

Every day, millions of users visit Myntra on the app or website, generating billions of clickstream events. This makes it very important for the data platform team to scale to such a huge number of incoming events, ingest them in realtime with minimal or no loss, and process the unstructured/semi-structured data to generate insights.

We use a varied set of technologies and in-house products to achieve this, including but not limited to Go, Kafka, Secor, Spark, Scala, Java, S3, Presto and Redshift.

Motivation

As more and more business decisions came to be based on data and insights, batch and offline reporting was simply not enough. We required realtime user behaviour analysis, realtime traffic, and realtime notification performance, among others, to be available with minimal latency so that business users could base their decisions on them. We needed to ingest as well as filter/process data in realtime, and to persist it in a fast, write-optimized data store for dashboarding and reporting on top of it.

Meterial is one such pipeline. It does exactly this and more, with a feedback loop that lets other teams take action on the data in realtime.

Architecture

Meterial is powered by:
1. Apache Kafka
2. Data transformer based on Apache Spark
3. MemSQL realtime database
4. React.js based UI

Deep Dive

Our event collectors, written in Go, sit behind an Amazon ELB and receive events from the app/website. They add a timestamp to each incoming clickstream event and push it into Kafka.
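The production collectors are Go services; to keep all the sketches in this post in one language, the following is a rough Scala equivalent of that step. The broker address, topic name and event envelope are assumptions, not the actual collector contract.

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// Hypothetical sketch of the collector's publish step: the real collectors
// are Go services behind an ELB; topic and field names are assumed.
object CollectorSketch {
  private val props = new Properties()
  props.put("bootstrap.servers", "kafka-broker:9092")
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  private val producer = new KafkaProducer[String, String](props)

  // Stamp the event with the server-side receive time, then push it to Kafka.
  def publish(rawEvent: String): Unit = {
    val stamped = s"""{"server_ts":${System.currentTimeMillis()},"event":$rawEvent}"""
    producer.send(new ProducerRecord[String, String]("clickstream", stamped))
  }
}
```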
From Kafka, the Meterial ingestion layer, based on Apache Spark Streaming, ingests around 4 million events/minute, filters and transforms the incoming events based on a configuration file, and persists them to the MemSQL row-store every minute. MemSQL returns results for queries spanning millions of rows with sub-second latency.
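A minimal sketch of what that ingestion step could look like, assuming a one-minute micro-batch, a hypothetical clickstream topic, a hard-coded stand-in for the filter configuration, and an events_rowstore table. MemSQL speaks the MySQL wire protocol, so a plain JDBC/MySQL driver handles the writes:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}
import org.apache.kafka.common.serialization.StringDeserializer
import java.sql.DriverManager

object MeterialIngestSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("meterial-ingest")
    // One micro-batch per minute, matching the persist cadence described above.
    val ssc = new StreamingContext(conf, Seconds(60))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "kafka-broker:9092",
      "key.deserializer"  -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "meterial")

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("clickstream"), kafkaParams))

    // Stand-in for the configuration file: keep only whitelisted event types.
    val wanted = Set("app-view", "notification-click")

    stream.map(_.value)
      .filter(json => wanted.exists(json.contains))
      .foreachRDD { rdd =>
        rdd.foreachPartition { events =>
          // One JDBC connection per partition; MemSQL is MySQL wire compatible.
          val conn = DriverManager.getConnection("jdbc:mysql://memsql:3306/meterial", "user", "pass")
          val stmt = conn.prepareStatement("INSERT INTO events_rowstore (payload) VALUES (?)")
          events.foreach { e => stmt.setString(1, e); stmt.addBatch() }
          stmt.executeBatch()
          conn.close()
        }
      }

    ssc.start()
    ssc.awaitTermination()
  }
}
```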

Our in-house dashboarding and reporting framework (UDP: Universal Dashboarding Platform) has services which query MemSQL every minute and store the results in the UDP query cache, from where they are served to all connected clients over socket-based connections.
Results are displayed in the form of graphs, charts, tables and numerous other widgets supported by UDP.
The same UDP APIs are also used by slackbots to post data into Slack channels in realtime using Slack incoming webhooks.
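A hedged sketch of that refresh loop follows; the query, table and cache key are purely illustrative, not UDP's actual API:

```scala
import java.sql.DriverManager
import java.util.concurrent.{Executors, TimeUnit}
import scala.collection.concurrent.TrieMap

// Illustrative sketch of the UDP refresh loop: query MemSQL once a minute
// and keep the latest result in a cache that socket handlers read from.
object UdpRefreshSketch {
  // Latest result per dashboard query; served to connected clients.
  val queryCache = TrieMap.empty[String, Seq[(String, Long)]]

  def refreshTraffic(): Unit = {
    val conn = DriverManager.getConnection("jdbc:mysql://memsql:3306/meterial", "user", "pass")
    val rs = conn.createStatement().executeQuery(
      """SELECT event_type, COUNT(*) AS cnt
        |FROM events_rowstore
        |WHERE ts >= NOW() - INTERVAL 1 MINUTE
        |GROUP BY event_type""".stripMargin)
    val rows = Iterator.continually(rs)
      .takeWhile(_.next())
      .map(r => (r.getString("event_type"), r.getLong("cnt")))
      .toSeq
    queryCache.put("traffic-last-minute", rows)
    conn.close()
  }

  def main(args: Array[String]): Unit = {
    val scheduler = Executors.newSingleThreadScheduledExecutor()
    scheduler.scheduleAtFixedRate(() => refreshTraffic(), 0, 1, TimeUnit.MINUTES)
  }
}
```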

As all transactional data currently lives in Redshift, and there are requirements to report commerce data joined with user data every 15 minutes, Meterial also enables this ad-hoc analysis for our team of data analysts. Every fifteen minutes, the data in MemSQL for that interval is dumped into S3, from where it is loaded into Redshift using our S3-to-Redshift ETLs.
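A sketch of that interval dump under the same assumptions: read the last 15-minute window from MemSQL over JDBC with Spark and land it in S3 for the downstream Redshift ETLs. The bucket, path and schema are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

// Illustrative 15-minute MemSQL-to-S3 dump, run once per interval.
object IntervalDumpSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("meterial-interval-dump").getOrCreate()

    val interval = spark.read.format("jdbc")
      .option("url", "jdbc:mysql://memsql:3306/meterial")
      .option("user", "user").option("password", "pass")
      // Push the 15-minute window down to MemSQL as a subquery.
      .option("dbtable", "(SELECT * FROM events_rowstore WHERE ts >= NOW() - INTERVAL 15 MINUTE) t")
      .load()

    // 900000 ms = 15 minutes; one S3 prefix per interval for the Redshift ETLs.
    interval.write
      .mode("overwrite")
      .parquet(s"s3a://meterial-dumps/events/${System.currentTimeMillis() / 900000}")

    spark.stop()
  }
}
```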

We selected Spark as our streaming engine because of its proven scale, strong community support, expertise within the team, and easy scalability with proper tuning.
For the realtime datastore, we did POCs on multiple databases and narrowed down to MemSQL.
MemSQL is a high-performance, in-memory and disk-based database that combines the horizontal scalability of distributed systems with the familiarity of SQL.
With proper tuning, we have seen MemSQL support very high concurrent read/write loads smoothly at our scale.
Currently, we are exploring the MemSQL column store as an OLAP database for our A/B test framework (Morpheus) and Segmentation Platform (Personify).

Sample UI screenshots

[Screenshot: realtime traffic dashboard]
[Screenshot: notification performance dashboard]

Future of Realtime Analytics at Myntra

Using realtime data with predictive analytics, machine learning and artificial intelligence opens altogether new doors to understanding user behaviour: which paths and funnels lead to commerce, and so on. Getting such information in realtime can help us boost commerce and take corrective action as soon as something goes wrong, and we are constantly working on improving and enhancing the platform.