A (small) peek inside In Loco’s Location Platform

Larissa Passos
Published in Inloco Tech Blog
Mar 8, 2019 · 5 min read

Mark Weiser first proposed that Ubiquitous Computing would be the next wave of computing around thirty years ago. The advent of smartphones and the Internet of Things has finally started to bring his vision about. Here at In Loco, our mission is to make people’s lives more efficient through the use of ubiquitous technology. We hope to achieve that using our location technology. Location can be used to define context and intent, and we can then leverage our contextualized location data to empower applications. Wouldn’t it be cool to receive a discount on a book purchase just by wandering into the bookstore?

In order to accomplish such diverse goals, it is first necessary to have a location platform upon which the most varied applications can be built. This article presents a big-picture overview of In Loco’s Location Platform, its challenges, and its characteristics.

Platform Overview

Our platform is made up of three main parts: In Loco’s SDK, our classification service, and our data infrastructure. Data collected by the SDK in our partner apps is sent to the classification service, which adds context to the received location. This enriched data is then published to our message broker system, where it is routed to our applications. This enables countless applications in different areas, such as smart city planning, audience measurement, seamless authentication, etc. Our platform architecture is shown in the picture below:

In Loco's platform overview

Mobile SDKs

In Loco SDKs are available on Android and iOS, along with several hybrid platforms. They are designed in a modular way, with the Location SDK solely responsible for collecting and sending data to our services. This data can only be collected if all of the following criteria are met (a minimal sketch of this check appears after the list):

  • The user has given consent to data collection;
  • The user has allowed the use of location data through system-specific permissions;
  • The user has location enabled at that instant.
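As an illustration only, that gate could look roughly like the Android/Kotlin sketch below. ConsentStore is a hypothetical stand-in for however the host app records consent; none of this reflects the SDK’s actual internals.

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.location.LocationManager
import androidx.core.content.ContextCompat

// Hypothetical consent flag stored by the host app; not part of any real SDK API.
object ConsentStore {
    var userGaveConsent: Boolean = false
}

// Returns true only when all three criteria from the list above hold.
fun canCollectLocation(context: Context): Boolean {
    val hasConsent = ConsentStore.userGaveConsent

    val hasPermission = ContextCompat.checkSelfPermission(
        context, Manifest.permission.ACCESS_FINE_LOCATION
    ) == PackageManager.PERMISSION_GRANTED

    val locationManager =
        context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
    val locationEnabled =
        locationManager.isProviderEnabled(LocationManager.GPS_PROVIDER) ||
        locationManager.isProviderEnabled(LocationManager.NETWORK_PROVIDER)

    return hasConsent && hasPermission && locationEnabled
}
```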

Sensor data collection can be taxing on the battery. Who hasn’t used navigation to get somewhere, only to discover that the battery had dropped by 20% along the way? One of the greatest challenges for our SDK team is to collect data at the right moment in order to prevent such problems. We use several techniques to predict the right time for data collection and, even more importantly, when to avoid it. The most important events to us are visits and exits. A visit occurs when the device has remained stationary for a certain period of time; an exit means that the device is no longer stationary. Whenever the SDK detects one of those events, it sends the data to the classification service. We are able to do this using less than 0.5% of an Android phone’s battery over an entire day!
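To make the visit/exit idea concrete, here is a deliberately simplified Kotlin sketch of dwell-time-based detection. The thresholds and the single-position input are illustrative; the real SDK relies on sensor fusion and adaptive scheduling to stay battery-friendly.

```kotlin
import kotlin.math.hypot

// Simplified sketch: a visit opens after the device stays within a small radius
// for a minimum dwell time; an exit fires once it moves away again.
class VisitDetector(
    private val stationaryRadiusMeters: Double = 50.0,
    private val minDwellMillis: Long = 5 * 60 * 1000  // e.g. five minutes
) {
    private var anchorX = 0.0
    private var anchorY = 0.0
    private var anchorTime = 0L
    private var inVisit = false

    // Called with projected coordinates (meters) and a timestamp for each update.
    fun onPositionUpdate(x: Double, y: Double, timeMillis: Long): String? {
        val distance = hypot(x - anchorX, y - anchorY)
        if (distance > stationaryRadiusMeters) {
            // Device moved: reset the anchor and emit an exit if a visit was open.
            anchorX = x; anchorY = y; anchorTime = timeMillis
            return if (inVisit) { inVisit = false; "EXIT" } else null
        }
        // Device is still near the anchor: open a visit once the dwell time passes.
        if (!inVisit && timeMillis - anchorTime >= minDwellMillis) {
            inVisit = true
            return "VISIT"
        }
        return null
    }
}
```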

Visits Classification Service

The classification service receives data from the SDK and anonymizes device-specific identifiers, generating encrypted and hashed versions of the values to be used by the applications. It then processes the sensor data through a series of pipelines in order to correctly classify the visit event. We apply several clustering techniques to create what we call Environments, signal representations of a given physical space. These environments can then be attributed to Places, meaningful and delimited locations in the world (more info here). This results in more accurate localization in indoor spaces than GPS, which is subject to reflections off buildings and other signal interference.
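As a rough illustration of the anonymization step, keyed hashing of a device identifier could look like the sketch below. The actual scheme (key management, the encryption layer, identifier formats) is not described in this post, so treat everything here as an assumption.

```kotlin
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec

// Illustrative anonymization: keyed hashing of a device identifier with a
// server-side secret, so downstream applications never see the raw value.
fun anonymizeDeviceId(rawDeviceId: String, serverSecret: ByteArray): String {
    val mac = Mac.getInstance("HmacSHA256")
    mac.init(SecretKeySpec(serverSecret, "HmacSHA256"))
    val digest = mac.doFinal(rawDeviceId.toByteArray(Charsets.UTF_8))
    return digest.joinToString("") { "%02x".format(it) }
}
```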

Our environments are dynamic, improving over time as the algorithm refines them with new data from other visits. They are also created automatically, allowing us to account for real-world changes seamlessly, in the same way that Waze can account for road closures, for example.

This processing must be done in real time, as the classification is used by the SDK itself (which can adapt its detection algorithm to specific places, such as shopping malls) and by our real-time applications, such as push notifications. The classified data is then published to our data platform using Kafka.
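A minimal sketch of that publishing step is shown below. The topic name, message key, and JSON payload are assumptions; the post only says that classified visits are published to Kafka.

```kotlin
import java.util.Properties
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord

// Publishes one classified visit, keyed by the anonymized device id.
fun publishClassifiedVisit(anonymizedDeviceId: String, visitJson: String) {
    val props = Properties().apply {
        put("bootstrap.servers", "kafka:9092")
        put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    }
    KafkaProducer<String, String>(props).use { producer ->
        producer.send(ProducerRecord("classified-visits", anonymizedDeviceId, visitJson))
        producer.flush()
    }
}
```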

Data Platform

Kafka is the spine of our data platform. It allows us to isolate application responsibilities and deal with issues such as backpressure (when more data is produced than can be consumed at a time). Applications can produce and consume topics without knowing anything about the producers or consumers of their data.

After the classified visit (visit data enriched with the place classification) is published to Kafka, it is also captured by different injectors (applications that consume topics and write the data to some kind of database). This data then lands in our warehouse, where it is accessible to Spark jobs, ad hoc queries, and other APIs. More info regarding our data platform can be found here.
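An injector, in its simplest form, could be a long-running consumer like the sketch below. The group id, topic name, and `writeToWarehouse` callback are placeholders for whatever the real services use.

```kotlin
import java.time.Duration
import java.util.Properties
import org.apache.kafka.clients.consumer.KafkaConsumer

// Sketch of an "injector": moves classified visits from Kafka into the warehouse.
fun runInjector(writeToWarehouse: (String) -> Unit) {
    val props = Properties().apply {
        put("bootstrap.servers", "kafka:9092")
        put("group.id", "warehouse-injector")
        put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
        put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
    }
    KafkaConsumer<String, String>(props).use { consumer ->
        consumer.subscribe(listOf("classified-visits"))
        while (true) {
            val records = consumer.poll(Duration.ofSeconds(1))
            records.forEach { record -> writeToWarehouse(record.value()) }
        }
    }
}
```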

This data is an improvement on raw location data because our applications use the enrichment (such as place category, opening hours, etc.) to attribute semantic meaning to events. Our Ads product can, for example, use multiple visits to auto dealerships to infer intent and provide a useful advertisement for the user. The Location Platform has allowed us to provide this unique kind of location intelligence for our products.
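As a toy illustration of that kind of inference, one could count recent visits by place category. The category name, time window, and threshold below are made up for the example and do not describe the Ads product’s actual logic.

```kotlin
// Hypothetical enriched-visit shape; real payloads are not documented in this post.
data class EnrichedVisit(val hashedDeviceId: String, val placeCategory: String, val timeMillis: Long)

// Flags car-buying intent after several dealership visits within the last 30 days.
fun hasCarBuyingIntent(visits: List<EnrichedVisit>, nowMillis: Long): Boolean {
    val thirtyDaysMillis = 30L * 24 * 60 * 60 * 1000
    val dealershipVisits = visits.count {
        it.placeCategory == "auto_dealership" && nowMillis - it.timeMillis <= thirtyDaysMillis
    }
    return dealershipVisits >= 3  // illustrative threshold
}
```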

Putting it all together

Let’s analyze a specific use case to see how everything fits together. The In Loco For Apps push notification feature allows app owners to create push campaigns using real-world places as triggers. When a user of the app visits one of those places, a notification is sent and the user is engaged.

Let’s say Alice has a book club application with the In Loco SDK already installed, and she has partnered with Bob, a bookstore owner. Bob will give Alice discount coupons and she will, in turn, promote his bookstore in her app. Alice can then create a push campaign in In Loco For Apps targeting that bookstore, with the coupon info. She will also be able to use In Loco’s analytics to better understand how her users behave in the physical world.

Whenever a user arrives at that particular bookstore and remains there for a certain period of time (at least 5 minutes, for example), the SDK will detect a new visit and send sensor data to the classification service. The data sent does not contain any personal identifiers such as name, e-mail, or telephone, and the device identifiers are hashed and encrypted at the server.

The classification service will assign the brick-and-mortar bookstore to that visit, publish it to Kafka, and return the result to the SDK.

The In Loco For Apps Push service consumes visit data from Kafka and checks for campaigns matching the classified places. If all campaign prerequisites are met, it sends a push to the device using the application’s delivery system of choice (e.g., Firebase Cloud Messaging), and the user gets some sweet discounts!
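Sketched very loosely, that matching step might look like this. The data classes, the campaign lookup, and the `deliverPush` callback (standing in for an FCM client, for instance) are all assumptions, not the service’s real code.

```kotlin
// Hypothetical shapes for the data flowing through the push service.
data class ClassifiedVisit(val hashedDeviceId: String, val placeId: String)
data class Campaign(val placeId: String, val message: String)

class PushService(
    private val campaignsByPlace: Map<String, List<Campaign>>,
    private val deliverPush: (deviceId: String, message: String) -> Unit  // e.g. an FCM client
) {
    // Called for each classified visit consumed from the Kafka topic.
    fun onVisit(visit: ClassifiedVisit) {
        val campaigns = campaignsByPlace[visit.placeId].orEmpty()
        for (campaign in campaigns) {
            // Additional prerequisites (frequency caps, schedules) would be checked here.
            deliverPush(visit.hashedDeviceId, campaign.message)
        }
    }
}
```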
