Comparing Locale.ai and Uber’s Kepler.gl on their Capabilities

Comparison of how Locale is different from open-source Kepler.gl on capabilities

Anubhav Pattnaik
Locale
Jul 23, 2020


Introduction

Hello! Thinking about moving over to Locale.ai? Or trying to decide which platform to use for the first time? In this blog, we talk about the differences between Locale.ai and Uber’s open-source tool, Kepler.gl, to help you choose the right tool for your needs and use case.

With the advancement of GPS technology and the rise of open-source tools, collecting and mapping geospatial data has become easier than ever. Kepler.gl and Locale are two tools that companies use to derive insights from location data.

What is Kepler.gl?

Kepler.gl is a web-based platform to visualize large-scale location data. It was created and open-sourced by the visualization team at Uber with the mission to create industry-grade open source frameworks to supercharge big data.

There are four major suites underlying Kepler.gl (deck.gl, luma.gl, react-map-gl, and react-vis) that make beautiful data-driven maps. It is built with deck.gl and utilizes WebGL (a JavaScript graphics API) to render large datasets quickly and efficiently.

What is Locale.ai?

Locale.ai Console

Locale.ai is an operational analytics tool built for companies with hyperlocal operations. We ingest location data of users, supply (partners + vehicles), and operations to give real-time precise insights for city and operations teams.

Locale.ai uses a wide range of powerful open-source tools to handle large-scale datasets on both the frontend and the backend. The frontend is powered by Uber’s Deck.gl for high-performance visualizations, Nebula.gl for additional editing capabilities, and Mapbox GL for rendering maps. Locale’s backend is powered by Python, PostgreSQL, and PostGIS for powerful data processing and geospatial operations.

In the next sections, we deep-dive into a comparison of the two tools and conclude with the situations in which one is a better choice than the other.

(1) No Pre-Processing Effort with Locale.ai

Kepler.gl

Kepler is one of the best geospatial visualization tools for exploratory data analysis (EDA). The Uber team has done thorough and detailed research on the kind of visualization features relevant to mobility companies.

However, one of its biggest drawbacks is that any processing of the data needs to be done outside the tool. Kepler.gl is, unfortunately, a desktop tool that accepts data only in the form of CSV, JSON, and GeoJSON files, which means:

  • Not Apt for Large Scale of Data: Since Kepler is a desktop tool, it’s not the best choice for handling large amounts of data and, unfortunately, it crashes quite often!
  • Lots of Code + Queries for Insights: The set-up that companies have today ends up being very painful and manual. Typically, companies collect user-behavior data in event-analytics tools and supply data in different databases, which means that processing scripts need to be written and Excel sheets need to be downloaded to get an overall picture of city performance across different dimensions.

Locale.ai

A lot of our visualizations have been inspired by Kepler.gl. One of the value propositions that Locale offers is the data model we have built, which frees you from any pipelining effort. We ingest all your data across different databases, formats, and systems:

  • User events data (Mixpanel, Amplitude, Heap, Clevertap)
  • Supply ping data (Amazon S3, BigQuery, Cassandra)
  • Operations data (PostgreSQL, MongoDB)

This helps organizations in multiple ways:

  • All location data in one place: Your entire company-wide location data in one place, giving teams visibility into each other’s insights.
  • Catalogue of metrics and decisions: Create a standard vocabulary of all metrics and their definitions, in addition to keeping a log of all decisions and their impact.
  • Cross-team collaboration: Teams across departments can collaborate on how their city is performing and what they need to improve.
  • A single system of boundaries: At Locale, we use Uber’s H3 to aggregate geospatial data at different levels. Not only does this help you view data at different granularities, it also creates a standard system for all verticals.
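To illustrate the idea of aggregating geospatial data at different levels, here is a minimal Python sketch. It uses simple square grid cells as a stand-in for Uber’s H3 (which uses hexagons and its own resolution scale), and the coordinates are made up:

```python
from collections import Counter

def cell_id(lat: float, lng: float, resolution: int) -> tuple:
    """Bucket a coordinate into a square grid cell.

    A simplified stand-in for H3's hexagonal indexing: a higher
    resolution means smaller cells.
    """
    size = 1.0 / (2 ** resolution)  # cell edge length in degrees
    return (int(lat // size), int(lng // size), resolution)

def aggregate(points, resolution):
    """Count location pings per grid cell at the given resolution."""
    return Counter(cell_id(lat, lng, resolution) for lat, lng in points)

# Hypothetical GPS pings as (lat, lng) pairs
pings = [(12.9716, 77.5946), (12.9717, 77.5947), (12.9352, 77.6245)]
coarse = aggregate(pings, resolution=4)   # wide, area-level cells
fine = aggregate(pings, resolution=10)    # near-building-level cells
```

Nearby pings merge into one cell at coarse resolutions and split apart again at fine ones, which is what lets the same dataset answer both city-level and street-level questions.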

(2) Locale.ai is Built for Specific Industries and Use Cases

Kepler.gl

Kepler is built for general-purpose geospatial visualization for business users who need insights for the different kinds of decisions they take. Data scientists also use it for quick visualizations to view and analyze the results of their models. Additionally, since Kepler was built by Uber, a lot of its features are very relevant to the ride-sharing industry.

Locale.ai

We think of our target market as any company that has moving objects on the ground: users, delivery partners, vehicles, or salespeople. Our specialty lies in movement analytics. Hence, the industries that Locale targets are:

  • Ride-Sharing: Micro-mobility, Ride-hailing, Carpooling, Office Shuttles
  • Hyperlocal Delivery: On-Demand food delivery, Grocery delivery, Medicine Delivery
  • Logistics: E-Commerce, Third-party logistics providers, Shipping, Trucking
  • App-Based Workforce: Hospitality, FMCG, Insurance and Finance, Services at Home

Here are some of the ways in which we can help different teams with their decisions:

Marketing and Growth

  • Acquisition: Finding out the right areas to acquire users based on already existing latent demand.
  • Conversion: Debugging the reasons for user drop-off right before booking across different areas and increasing conversions.
  • Retention: Doing very targeted hyperlocal promotions using user order patterns or doing route-based promotions for mobility users.
  • Impact Analysis: Measuring the conversion via different offline advertisement avenues in different areas.

City and Central Ops

  • Monitoring Operations: Monitoring key metrics in real-time and getting alerted in case of anomalies. Also measuring business health across all verticals and areas.
  • Demand-Supply Gaps: Evaluating the gaps in supply re-distribution to ensure high booking fulfillment.
  • Revenue vs Cost: Improving the topline by analyzing revenue vs cost (time taken or distance traveled to reach the destination) in different areas.
  • Inventory Analysis: Analyzing inventory shelf time and damaged or returned stock at different warehouses.
  • Cancellations: Analyzing cancellation patterns and reasons across partners, couriers, and users in different areas.
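As a concrete illustration of the demand-supply gap analysis mentioned above, here is a minimal Python sketch. The area names and counts are made up, and Locale’s actual computation is certainly richer:

```python
def demand_supply_gaps(demand, supply):
    """Compute unmet demand per area (positive gap = undersupplied).

    `demand` and `supply` map an area id (e.g. a grid cell or
    neighborhood) to counts of booking requests and available
    partners respectively.
    """
    areas = set(demand) | set(supply)
    return {a: demand.get(a, 0) - supply.get(a, 0) for a in areas}

# Hypothetical per-area counts for one time window
demand = {"koramangala": 120, "indiranagar": 45, "hsr": 80}
supply = {"koramangala": 90, "indiranagar": 60, "hsr": 80}

gaps = demand_supply_gaps(demand, supply)
# Areas to redistribute supply toward, worst gap first
undersupplied = [a for a, g in sorted(gaps.items(), key=lambda kv: -kv[1]) if g > 0]
```

Ranking areas by gap is what turns raw ping and booking data into an operational decision: where to move supply next.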

Strategy and CXOs

  • Movement Patterns: Finding out how your power users move in the city and where they go.
  • Cost of Shipping: Readjusting prices or delivery fees by measuring the true cost of shipping to users in different locations.
  • Delay and SLAs: Understanding which deliveries, routes, partners, or couriers cause delays, and during which leg of the journey.
  • Productivity: Measuring fleet performance and rewarding the top performers or incentivizing the bottom ones.
  • Courier Performance: Analyzing the performance of all 3rd party logistics companies across SKUs in different areas.

(3) Features Comparison: Kepler.gl vs Locale.ai

Visualizations

Kepler.gl: Kepler.gl has different types of geospatial visualizations that are customizable to your use case. This means you can view the same metric as grids, points, or clusters! Its styling options are great and help you embellish your maps.

Locale.ai: At Locale, we have the completely opposite philosophy. Our visualizations are pre-built and fixed to the type of use case and the decision you want to take. Since we work very closely with lots of companies in one industry, we want to present the best possible visualization for each use case, helping you debug problems and focus on the right insights.

ETL Pipeline

With Locale, you get an in-built pipeline that combines location data across databases, formats, and systems. You don’t need to pre-process data from your database, download it as Excel sheets, and then visualize it.
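Conceptually, such a pipeline maps each source’s records onto one common schema. A minimal Python sketch, with hypothetical record shapes for the source systems (the field names here are illustrative, not the actual APIs of those tools):

```python
from datetime import datetime, timezone

COMMON_FIELDS = ("entity_id", "lat", "lng", "ts", "source")

def from_event_tool(rec):
    """Normalize a user-event record (event-analytics style, hypothetical shape)."""
    return {
        "entity_id": rec["distinct_id"],
        "lat": rec["properties"]["lat"],
        "lng": rec["properties"]["lng"],
        "ts": datetime.fromtimestamp(rec["time"], tz=timezone.utc),
        "source": "events",
    }

def from_ping_store(rec):
    """Normalize a supply-ping record (S3/Cassandra style, hypothetical shape)."""
    return {
        "entity_id": rec["vehicle_id"],
        "lat": rec["latitude"],
        "lng": rec["longitude"],
        "ts": datetime.fromisoformat(rec["pinged_at"]),
        "source": "pings",
    }

def ingest(records, normalizer):
    """Run one source's records through its normalizer."""
    return [normalizer(r) for r in records]

events = ingest(
    [{"distinct_id": "u1", "properties": {"lat": 12.97, "lng": 77.59}, "time": 1595462400}],
    from_event_tool,
)
pings = ingest(
    [{"vehicle_id": "v9", "latitude": 12.93, "longitude": 77.62, "pinged_at": "2020-07-23T00:00:00+00:00"}],
    from_ping_store,
)
unified = events + pings  # one schema, queryable together
```

Once every source lands in the same schema, cross-source questions (e.g. user drop-offs versus nearby supply) become simple joins instead of manual spreadsheet work.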

Data Catalog

Locale serves as a company-wide location dashboard with all metrics, dashboards, insights, and decisions across teams in one place. You can also invite other teams onto the platform.

Actionability

At Locale, actionability is a core feature. We have a workflows module that lets you take certain actions or send notifications every time a set of conditions is met. We also have features such as sharing and commenting to foster discussions. To facilitate visibility and agile experiments, we have a feature called “Decision Logs” where you can note down all the decisions taken and their impact.

Locale.ai: Workflows Feature
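The condition-action idea behind workflows can be sketched in a few lines of Python. The `Workflow` class, metric names, and threshold below are illustrative, not Locale’s actual API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Workflow:
    """Fire an action whenever a metric snapshot meets a condition."""
    name: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]

    def evaluate(self, snapshot: dict) -> bool:
        """Run the action if the condition holds; report whether it fired."""
        if self.condition(snapshot):
            self.action(snapshot)
            return True
        return False

alerts = []
wf = Workflow(
    name="low-fulfillment-alert",
    condition=lambda s: s["fulfillment_rate"] < 0.85,
    action=lambda s: alerts.append(
        f"Low fulfillment in {s['area']}: {s['fulfillment_rate']:.0%}"
    ),
)

wf.evaluate({"area": "hsr", "fulfillment_rate": 0.78})          # fires an alert
wf.evaluate({"area": "indiranagar", "fulfillment_rate": 0.95})  # no-op
```

Separating the condition from the action is what lets the same engine drive notifications, tickets, or any other downstream step.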

External Data

We are also building a “Data Marketplace” feature to provide external insights on top of internal data that companies find difficult to access, clean, and process. The marketplace allows teams to overlay external insights very quickly and get much richer context.

Data Science

Locale also has in-built intelligent data science models such as:

  • Clustering and Similarity Analysis: Our clustering model helps in dividing areas based on similar properties so that you can apply similar strategies to them irrespective of where they are.
  • Anomaly Detection: Anomaly calculation today typically doesn’t use location as a parameter. Adding location context to anomalies makes the insights far more actionable and decision-making much simpler.
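The idea of location-aware anomaly detection can be illustrated by scoring each area against its own history rather than a global baseline. A simple z-score sketch in Python (the data, area names, and threshold are made up; Locale’s actual models are certainly more sophisticated):

```python
from statistics import mean, stdev

def cell_anomalies(history, current, threshold=3.0):
    """Flag areas whose current metric deviates from that area's own history.

    Computing the z-score per area (rather than citywide) adds location
    context: a value that is normal globally may still be anomalous for
    one particular area.
    """
    flagged = {}
    for area, values in history.items():
        mu, sigma = mean(values), stdev(values)
        if sigma == 0:
            continue  # no variation on record; skip to avoid division by zero
        z = (current[area] - mu) / sigma
        if abs(z) >= threshold:
            flagged[area] = round(z, 2)
    return flagged

# Hypothetical per-area order counts: five past windows, then the current one
history = {
    "koramangala": [100, 104, 98, 102, 96],
    "hsr": [40, 42, 38, 41, 39],
}
current = {"koramangala": 101, "hsr": 70}
anomalies = cell_anomalies(history, current)
```

Here a count of 70 would look unremarkable citywide, but against HSR’s own history it is a large spike, which is exactly the kind of insight a global anomaly check misses.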

We will keep working to make our platform more robust and intelligent so that we continuously make it simpler for our users to take complex decisions.

(4) Build vs Buy: Cost and Time Comparison

Kepler.gl

Although the visualization part of Kepler is free (being open source), using it as a base for your internal tool ends up being an expensive affair.

Connecting Kepler to your internal pipeline (or Kafka queue) and building a system where analysts can write queries can take anywhere from four to six months and a team of six working full time. With the added cost of infrastructure and maintenance, it can cost you around $300k! Refer to the image below for a detailed cost break-up.

In our experience of working with tech companies around the globe, we have seen that such systems are rarely scalable and end up burning valuable cash on maintenance, a situation you would rather avoid!

Locale.ai

Locale is the most powerful end-to-end location analytics product, converting your raw lat-long data into highly contextualized and localized insights. At approximately one-third the cost, it acts as your ETL pipeline, robust enough to handle streaming data for real-time insights as well as historical analysis to go back in time.

With Locale, you get a large number of features packaged into one system, without the trouble of wasting time writing queries, building dashboards again and again, or maintaining your internal tool. After all, you wouldn’t want to work on something that is not your core business.

We have built Locale for fast-moving and high-performing business teams to focus on their core business: figuring out where the problems are with the right insights, taking decisions, measuring impact, and being agile.

Final Synopsis: When should you choose either of the tools?

(1) Kepler.gl

The complexity of the design and lack of tooling for location data makes it a daunting task for companies to build scalable maps for their operations teams. As mentioned before, quite often we have seen companies building internal tooling on top of Kepler or using Kepler itself.

It takes a sprint of developer bandwidth to get a new live dashboard up and running, mostly built for a one-time use case. As a result, what ends up happening is the following:

  • Lack of Real-time Monitoring: Quite often, we have seen companies write queries to give insights on top of Kepler, resulting in a lack of real-time monitoring of their operations.
  • Loss in Agility: Dependency on engineering and analyst teams leads to loss of agility to make quick decisions and move faster.
  • Unrealized Revenues: Often, when the right insights are not available at the right time, decisions end up being gut-driven. As a result, companies end up leaving money on the table since their decisions and strategies are not very curated.

Which brings me to my next point:

Kepler serves you very well if you don’t have large datasets and don’t have a very frequent use case, say once a month or a couple of times a quarter.

(2) Locale.ai

Locale.ai Console

Just to summarize, Locale is the fastest way to get insights on hyperlocal operations.

Locale makes sense for you if you deal with large amounts of data owing to the scale of your business, and you need to monitor your operations in real time and get insights more frequently. We offer:

  • Same-day integration: We currently have integrations with 14 data sources, including S3, MongoDB, etc. Any new integration required can be built in a week.
  • Built for Large Scale: We have built Locale for large scale, high-frequency businesses. Currently, we are handling around 55 million location events in production!
  • Real-time + Historical Insights: Our data model caters to both real-time monitoring as well as historical analysis to go back in time.
  • Granularity: We go from the country level down to as granular as a building, to help you go lower than the city or area level.
  • Industry Best Practices: Since we work across companies in a particular industry, we incorporate the learnings and best practices working with all our clients.
  • Security and Privacy: We follow strict data and privacy guidelines. Our security docs can be found on our website.

With Locale, you get an ETL pipeline, a data catalogue, visualizations, actionability features, external data, and data science models, so that you can focus on what you do best: running your business and making more dollars.

To know more, get in touch with me on LinkedIn or Twitter.

Originally published here on July 23, 2020.
