How Uber & Lyft Use AI to Improve Ride Experience

Synced | SyncedReview | Jul 8, 2019

Publicly traded US ride-hailing companies Uber and Lyft have revolutionized urban transportation. Uber drivers complete 14 million rides each day, while Lyft’s 1.4 million drivers have served 23 million passengers with over one billion rides as of September 2018, according to prospectuses the companies provided for their IPOs this spring.

At last month’s RE•WORK Applied AI Summit 2019 in San Francisco, AI experts from Uber and Lyft shared insights on how the companies are leveraging machine learning algorithms to improve their services for both riders and drivers.

Uber: Improving Driver Communication With One-Click Chat

Imagine you’re standing curbside waiting for an Uber. As time passes, the dispatched car icon on your smartphone screen remains frustratingly far from your location, prompting you to use the app to send a message to the driver — “Is everything okay? Can you drive faster?” If your Uber driver texts you back while at the wheel, that is both a safety risk and illegal in most jurisdictions.

To provide a safe and seamless pick-up experience for both drivers and riders, Uber data scientists have created One-Click Chat, a mobile feature on UberChat that enables fast, dynamic and personalized smart responses. Uber Senior Data Scientist Yue Weng explained: “One-click chat addresses some text messaging-related driver-partner safety concerns and provides a more stress-free on-trip experience. It leverages a combination of unsupervised and supervised [machine learning] methods.”

Here is how One-Click Chat works: After a passenger writes a message to their driver, the Uber back-end service automatically sends it to Uber’s machine learning platform Michelangelo, which uses natural language processing (NLP) to preprocess and encode the message, then generates prediction scores for possible intents. The service then selects the top four suggested replies based on those scores using a reply retrieval policy and sends them to the driver, who can respond to the passenger’s question with a single click.
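The KDD paper cited below gives the full details of this pipeline; as a rough illustration only, here is a minimal Python sketch of the flow, assuming a pre-trained message encoder and intent classifier. The reply bank, intent labels and every function name are hypothetical stand-ins, not Uber’s actual Michelangelo code.

```python
from typing import Dict, List

# Hypothetical reply bank: a few canned answers per intent, as a reply
# retrieval policy might store them. Intents and replies are invented here.
REPLY_BANK: Dict[str, List[str]] = {
    "asking_eta": ["I'll be there in 5 minutes.", "Almost there!",
                   "Stuck in traffic, sorry.", "On my way."],
    "asking_location": ["I'm at the main entrance.", "Parked across the street.",
                        "Look for a white sedan.", "I'm by the corner."],
}

def suggest_replies(message: str, encoder, intent_classifier, top_k: int = 4) -> List[str]:
    """Rider message -> preprocess -> embed -> intent prediction -> top-k suggested replies."""
    tokens = message.lower().split()             # toy stand-in for the NLP preprocessing step
    vector = encoder.infer_vector(tokens)        # e.g. a Doc2vec document embedding
    intent = intent_classifier.predict(vector)   # highest-scoring intent label
    return REPLY_BANK.get(intent, [])[:top_k]    # retrieval policy: best canned replies
```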

A key component of the One-Click Chat process is intent detection. Uber researchers first trained a Doc2vec model on millions of anonymized UberChat messages to map each text message to a dense vector embedding space; the model implicitly learns relationships between words and groups messages with similar semantics. The researchers then computed a centroid for each intent cluster and trained an intent detection classifier that predicts an incoming message’s likely intent from the distance between its embedding and each intent cluster’s centroid.
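As a hedged illustration of that idea, the snippet below trains a tiny Doc2vec model with gensim, computes a centroid per intent, and classifies a new message by its nearest centroid. The toy corpus, intent labels and the Euclidean distance metric are assumptions made for the example; the production system is trained on millions of messages and the article does not specify the exact distance measure used.

```python
import numpy as np
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# Toy labelled chat messages; the real system uses millions of anonymized
# UberChat messages and far more intent classes.
corpus = [
    ("are you close", "asking_eta"),
    ("how long until you arrive", "asking_eta"),
    ("where are you parked", "asking_location"),
    ("which entrance are you at", "asking_location"),
]

docs = [TaggedDocument(words=text.split(), tags=[str(i)])
        for i, (text, _) in enumerate(corpus)]
model = Doc2Vec(docs, vector_size=32, min_count=1, epochs=50)

# Centroid of each intent cluster in the embedding space.
centroids = {}
for intent in {label for _, label in corpus}:
    vecs = [model.infer_vector(text.split()) for text, label in corpus if label == intent]
    centroids[intent] = np.mean(vecs, axis=0)

def detect_intent(message: str) -> str:
    """Nearest-centroid classification: embed the message, return the closest intent."""
    v = model.infer_vector(message.lower().split())
    return min(centroids, key=lambda k: np.linalg.norm(v - centroids[k]))

print(detect_intent("can you get here faster"))
```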

The related paper OCC: A Smart Reply System for Efficient In-App Communications has been accepted by KDD 2019, which will be held August 4–8 in Anchorage, Alaska.

Lyft: Architecting a Real-Time Optimization Platform for Driver Positioning Products

One challenge facing Lyft engineers is how to balance the immediacy and quality of responses in automated decision-making. Their answer is a real-time optimization platform for driver positioning and rider-driver matching.

In his 15-minute RE•WORK talk, Lyft Research Scientist Hao Yi Ong introduced architectures the company has built to enable fast, iterative, science-heavy model and product development for real-time workflow optimization. For example, PPZ Maps is a homegrown architecture that helps drivers earn bonuses by identifying high-demand areas (personal power zones). PPZ Maps combines inputs including a rider model, a driver model, budget constraints and demand forecasting to represent forecasted demand, and this information is then used to refine supply allocation.
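PPZ Maps’ internals are not public, so the sketch below only illustrates the general idea under assumed inputs: given a per-zone demand forecast (rider model output), an expected-supply estimate (driver model output) and a bonus budget, select the most under-supplied zones to receive bonuses. The zone names, numbers and greedy selection rule are all hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Zone:
    zone_id: str
    forecast_demand: float   # forecasted ride requests in the zone (rider model output)
    expected_supply: float   # drivers expected in the zone (driver model output)

def pick_power_zones(zones: List[Zone], bonus_per_zone: float, budget: float) -> List[str]:
    """Illustrative only: rank zones by forecast demand minus expected supply and
    activate bonuses in the most under-supplied zones until the budget runs out."""
    ranked = sorted(zones, key=lambda z: z.forecast_demand - z.expected_supply, reverse=True)
    selected, spent = [], 0.0
    for zone in ranked:
        if spent + bonus_per_zone > budget:
            break
        if zone.forecast_demand > zone.expected_supply:   # only genuinely under-supplied zones
            selected.append(zone.zone_id)
            spent += bonus_per_zone
    return selected

# Example: three zones, a $50 total budget, a $30 bonus per zone.
zones = [Zone("downtown", 120, 80), Zone("airport", 90, 95), Zone("stadium", 60, 30)]
print(pick_power_zones(zones, bonus_per_zone=30.0, budget=50.0))
```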

Ong said one lesson Lyft learned building PPZ Maps is that any new approach necessitates improved signals and the ability for researchers and engineers to quickly test and iterate on them collaboratively.

In his talk Ong offered some design principles for engineers:

  • Science development and operation is every bit as important as building the science models.
  • It’s important for research and data scientists to develop models with an understanding of how the engineering infrastructure will affect their development.
  • Similarly, it’s important for engineers to work closely with scientists to understand the infrastructure needs, and not over-index on a specific business application.
  • Developers’ challenges are as much sociological as they are technological.
  • Strategies should empower scientists to rapidly and independently iterate on models.
  • Building the platform is less about scaling a solution than scaling its development.

Journalist: Tony Peng | Editor: Michael Sarazen

