The Smart Instant Book Filter

Exploring a feature designed to help guests book with confidence

Yi Hou
The Airbnb Tech Blog
8 min read · Dec 19, 2017


By Yi Hou, Li Fan & Tao Cui

People like it when things are fast. We want our deliveries within an hour, our downloads within a minute, and our news in an instant. That is why we want to help our Airbnb community book the listings they want instantly. Instant Book is a key lever in our product: it empowers guests to book places immediately, without prior host approval, and it increases acceptance rates.

The landscape of Instant Book has changed dramatically over the past few years, and today 2 out of 3 bookings are made using Instant Book. Since 2016, we have been steadily improving Instant Book, and one of the biggest levers is the smart Instant Book Filter. In the following sections, we provide an overview of the smart Instant Book Filter and shed light on how it helps our Airbnb community book instantly and belong anywhere.

Our Mission

At Airbnb, we always aim to offer a convenient and successful booking experience. Before we offered Instant Book, guests sending reservation requests could often get multiple rejections, and the situation was even worse for travelers who wanted to finalize their plans quickly. With Instant Book, guests can filter on our search page to find places where reservations are confirmed instantly. Still, we’ve realized that some guests select Request-to-Book listings without knowing the chance of rejection or the possible delay in response.

Motivated by this, we initiated the smart Instant Book Filter: intelligently applying the Instant Book filter to searches to create a better user experience.

We were faced with the following two challenges before we could move forward:

  1. When should we automatically apply the smart Instant Book Filter?
  2. How should we inform guests that the Instant Book filter has been turned on?

Our Search service has broad knowledge of the idiosyncrasies of all listings and of market dynamics. We built a decision engine embedded in the Search service that leverages this data to answer question #1. In addition, we have a powerful channel in our platform to deliver informative insights to our guests, Market Insights, which addresses question #2.

Next we will explain the workflow and experience of the smart Instant Book Filter.

Architecture and Experience

In this section, we start with the overall workflow, then we explain the data, the algorithm, and the model. Finally, we describe the user experience.

At a high level, when a guest uses Airbnb to search for homes, the API server first collects basic query information and performs an initial eligibility check. Next, the API server queries the Search service and talks with the Market Insights service to decide whether the Instant Book filter should be triggered, generating corresponding candidate market insights in real time. Finally, the API server combines the information collected from both services and reflects the result in the UI.

Figure 1. Smart Instant Book Filter workflow

As Fig. 1 illustrates, the workflow is as follows:

  1. Guest first lands on our platform.
  2. Guest searches for homes via the search box. In this phase, the API service validates basic information, for example whether the query contains check-in and check-out dates, to flag queries that are eligible for the smart Instant Book Filter.
  3. The API server sends a search request, with the smart Instant Book Filter eligibility flag, to the Search service, which talks to Nebula, our in-house storage platform, to perform ranking and generate search results. Our decision engine sits inside the Search service; it consumes the search results and uses a deterministic model to decide whether to trigger the smart Instant Book Filter (a minimal sketch of this decision flow follows the list below).
  4. The Search service passes market availability metadata to the Market Insights service, where candidate insights, including the smart Instant Book Filter insight, are ranked based on their value to the guest.
  5. The Search service returns results to the API server.
  6. The Market Insights service returns the final eligible insights to the API server.
  7. The API server combines all of this information and powers the front-end display.
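To make the flow above concrete, here is a minimal sketch of how the eligibility flag and the deterministic decision could fit together. The names (`is_eligible`, `SearchResults`, `should_apply_ib_filter`) and the thresholds are illustrative assumptions, not Airbnb's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical, simplified view of the per-search aggregates the decision
# engine consumes; field names and thresholds are illustrative only.
@dataclass
class SearchResults:
    total_listings: int
    instant_book_listings: int
    avg_discounted_utility: float  # aggregate quality of Instant Book results


def is_eligible(query: dict) -> bool:
    """Basic check done by the API service: dated searches only."""
    return bool(query.get("checkin")) and bool(query.get("checkout"))


def should_apply_ib_filter(results: SearchResults,
                           min_ib_supply: int = 50,
                           min_ib_share: float = 0.5,
                           min_quality: float = 0.7) -> bool:
    """Deterministic rule: only auto-apply the filter when there is a
    sufficient supply of high-quality Instant Book listings."""
    if results.total_listings == 0:
        return False
    ib_share = results.instant_book_listings / results.total_listings
    return (results.instant_book_listings >= min_ib_supply
            and ib_share >= min_ib_share
            and results.avg_discounted_utility >= min_quality)


# Example: a dated search with plenty of high-quality Instant Book supply.
query = {"location": "Paris", "checkin": "2018-01-05", "checkout": "2018-01-08"}
results = SearchResults(total_listings=400, instant_book_listings=280,
                        avg_discounted_utility=0.82)
if is_eligible(query) and should_apply_ib_filter(results):
    print("Auto-apply the Instant Book filter and show the tooltip")
```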

Data

Our decision engine inside the Search service incorporates heterogeneous sources of data—user data, listing data, platform data, and query dynamics—to make an informed decision.

  • Real-time data: user data, platform data, and query data are the fundamental information transferred from the API service. We’ve trained models on a per-platform basis, since the user experience varies across platforms, and we treat last-minute bookings differently by applying looser thresholds.
  • Aggregation data: per search request, we aggregate the search results in terms of demand, supply, and listing quantity and quality, together with other information such as geographic drift. Listing quality is measured by a discounted cumulative utility that encapsulates each listing’s review rating, page views, and other quality scores. Some of the aggregation data is pre-computed and supplied by an internal tool. (A sketch of this aggregation follows the list below.)
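As an illustration of the aggregation step, the sketch below computes a discounted cumulative utility over the top-ranked results and rolls up the per-search features the decision engine consumes. The rank-discounting scheme, the utility weights, and the field names are assumptions for illustration; the post does not specify the exact formulas.

```python
import math
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical listing record; field names are illustrative.
@dataclass
class Listing:
    instant_book: bool
    review_rating: float   # e.g. on a 0-5 scale
    page_views: int
    quality_score: float   # other internal quality signals, 0-1


def utility(listing: Listing) -> float:
    """Combine per-listing quality signals into one utility value.
    The weights here are placeholders, not Airbnb's actual weights."""
    return (0.5 * listing.review_rating / 5.0
            + 0.2 * min(listing.page_views / 1000.0, 1.0)
            + 0.3 * listing.quality_score)


def discounted_cumulative_utility(ranked: List[Listing], k: int = 50) -> float:
    """Rank-discounted sum of utilities over the top-k results,
    analogous to the discount used in DCG."""
    return sum(utility(l) / math.log2(i + 2) for i, l in enumerate(ranked[:k]))


def aggregate(ranked: List[Listing]) -> Dict[str, float]:
    """Per-search-request aggregates consumed by the decision engine."""
    ib = [l for l in ranked if l.instant_book]
    return {
        "supply": len(ranked),
        "ib_supply": len(ib),
        "ib_share": len(ib) / len(ranked) if ranked else 0.0,
        "dcu_all": discounted_cumulative_utility(ranked),
        "dcu_ib": discounted_cumulative_utility(ib),
    }
```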

Algorithm and Model

We aim to shift Request-to-Book to Instant Book by smartly applying the Instant Book filter. To do this, we only show Instant Book results to our guests under certain circumstances. The biggest challenge here is how to avoid booking loss due to the reduced supply size. An analogy of this scenario is shown in Fig. 2.

Figure 2. Shift Request-to-Book to Instant Book

So we consider the following problem setup. Each search is associated with a ranking score s, which is composed of an Instant Book ranking score and a Request-to-Book ranking score. We use a function g as the proxy between bookings Bs and the ranking score s, so we have Bs = g(s). Note that we do not assume any particular form for g. Thus the problem reduces to increasing Instant Book without decreasing Bs.

Let s0 be the ranking score without the smart Instant Book Filter; then we can take a Taylor expansion of g about s0, as in equation 1.

Approximating this to first order gives equation 2. The booking change percentage can then be represented as equation 3, where α is a constant depending on platform and seasonality. Similarly, we can calculate the Instant Book booking percentage change in equation 4, where β depends on the proportion of Instant Book results in the current search results.
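The equation images from the original post are not reproduced here. Based on the surrounding text, a plausible reconstruction of equations 1–4 is the following; the exact definitions of α and β are assumptions rather than the original formulas.

```latex
% Reconstruction sketch, not the original equation images.
g(s) = g(s_0) + g'(s_0)\,(s - s_0) + \tfrac{1}{2} g''(s_0)\,(s - s_0)^2 + \cdots  \quad (1)

B_s \approx g(s_0) + g'(s_0)\,(s - s_0)                                           \quad (2)

\frac{B_s - B_{s_0}}{B_{s_0}} \approx \alpha\,(s - s_0)                            \quad (3)

\frac{\Delta B_{IB}}{B_{IB}} \approx \beta\,(s_{IB} - s_{IB,0})                    \quad (4)
```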

We can backfill the values of α and β by analyzing previous Instant Book ranking experiments. With α and β, we can reliably predict a future experiment’s performance from offline-computed ranking score changes. With these statistics, we can optimize our deterministic, decision-tree-based decision engine according to a fixed strategy. More formally, we want to:

where w is our feature vector, which is described in the Data section and serves as the branching factors in our deterministic decision tree, and s1 is our target booking score.
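The image of equation 5 is likewise not reproduced above. Given the interpretation in the next paragraph, one plausible form of the optimization is the following reconstruction:

```latex
% One plausible reading of equation 5 (a reconstruction, not the original image).
\max_{w}\; s_{IB}(w) \quad \text{subject to} \quad s(w) \ge s_1  \quad (5)
```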

We can interpret equation 5 as maximizing the Instant Book ranking score while keeping the overall booking ranking score slightly positive or neutral. Conversely, we can maximize the overall booking ranking score given a target Instant Book ranking score. We computed our feature vector values offline by searching the dimension space with a Python toolbox to best promote Instant Book. One example of the dimension space distribution is illustrated in Fig. 3, where we search for the best Instant Book score gain along the distance ratio and discounted utility value dimensions.
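As an illustration of this offline search, the sketch below does a brute-force grid search over two of the dimensions mentioned (a distance ratio cap and a discounted utility floor), keeping only settings whose predicted overall booking change stays non-negative. The scoring functions, parameter names, and grid values are synthetic placeholders, not Airbnb's actual pipeline.

```python
import itertools

# Toy stand-ins for the real offline evaluation: given candidate thresholds,
# estimate the change in overall booking score and in Instant Book score.
# The formulas below are synthetic placeholders for illustration only.
def predicted_booking_change(dr_max: float, dcu_min: float) -> float:
    # Tighter thresholds shrink supply and can cost overall bookings.
    return 0.02 - 0.03 * dcu_min + 0.01 * dr_max


def predicted_ib_change(dr_max: float, dcu_min: float) -> float:
    # Tighter quality thresholds concentrate exposure on Instant Book listings.
    return 0.05 * dcu_min + 0.01 * dr_max


def grid_search():
    """Maximize the predicted Instant Book gain while keeping the predicted
    overall booking change neutral or positive (the constraint in equation 5)."""
    best, best_gain = None, float("-inf")
    for dr_max, dcu_min in itertools.product(
            [1.2, 1.5, 2.0, 3.0],      # candidate distance ratio caps
            [0.5, 0.6, 0.7, 0.8]):     # candidate discounted utility floors
        if predicted_booking_change(dr_max, dcu_min) < 0:
            continue  # would hurt overall bookings; skip this setting
        gain = predicted_ib_change(dr_max, dcu_min)
        if gain > best_gain:
            best, best_gain = (dr_max, dcu_min), gain
    return best, best_gain


print(grid_search())
```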

Figure 3. Instant Book score gain as a function of discounted utility value and distance ratio

Next, we will illustrate the user experience of smart Instant Book filter.

User Experience

When the Search service determines that a search is eligible and that there is a sufficient supply of high-quality Instant Book listings, we automatically turn on the Instant Book filter and alert guests with a tooltip.

Smart Instant Book Filter with Tooltip

If guests keep zooming in on the map to narrow the search area while the Instant Book filter is on, and consequently find fewer and fewer Instant Book results, a market insight card replaces a single listing card to remind guests that they can release the smart Instant Book filter with one click.

Contextual Market Insight Reminder Card

Key Takeaways

Rethinking the problem from the top funnel

Over the past year, we’ve closely partnered with user experience researchers, product specialists, and data scientists on how to promote Instant Book and convey to our Airbnb community the message that Instant Book is the future. A tremendous amount of effort had been dedicated to the host side; this project is where we started on the guest side, and it landed with good results. Our work has increased the Instant Book share of bookings by more than 5%, contributing meaningfully to company growth.

Bridging the gap between guests and hosts

More bookings are made via Instant Book today than via Request-to-Book. As guests find it easier to leverage the Instant Book filter, a second-order effect we identified in experiments is an increased tendency of hosts to adopt Instant Book, comparable to the effect of strong host-side Instant Book adoption campaigns. The boundary between guests and hosts is blurry; many hosts are guests as well. This shows that the interaction between guests and hosts is remarkably important, and it paves the way for more advances in this area.

Future Plans

Market dynamics are ever-changing. Airbnb has been a pioneer in exploring and leveraging advanced technologies, including building automated machine learning infrastructure and a workflow management platform. As we constantly iterate on the smart Instant Book Filter, beyond the current model we plan to leverage our automated ML infrastructure and integrate the system with an online scoring model to capture evolving conditions.

We work together, across boundaries, to fulfill the mission and deliver on our commitments. We are passionate about the future of this framework and excited about the possible directions this work has opened. Special thanks to Li, who contributed the original idea. Thanks to Yi, Li, and Hui, who iterated and shipped the first version of this product. And thanks to Tao, who helped revise the model and brought its performance to the next level.

We’d like to thank Surabhi Gupta, Ricardo Bion, Brendan Collins, Martin Nguyen, Peng Dai, and Jeff Fossett for their generous help in many ways. Shoutout to Trunal Bhanse and Lenny Rachitsky for their input, and to Adam Neary for reviewing this article.

Interested? We are hiring!
