Yandex Self-Driving Meetup

Yandex Self-Driving Team
Yandex Self-Driving Group
May 22, 2019

Controlling an autonomous car is a complicated task that requires a stack of technologies including computer vision, machine learning, AI, navigation, and assorted hardware solutions. We want to introduce you to how these pieces work together to create a driverless experience.

On June 8, we will hold our first big meetup dedicated to our driverless cars. It will take place in our garage, where we assemble and test our unmanned vehicles. We will show you how our cars work from the inside out, and talk about the challenges we face. We’ll be happy to answer your questions and demonstrate our vehicles in real-world conditions on Moscow roads.

If you want to attend the meetup, please fill out this form.

What we want to share

First we’ll tell you how it all began, from choosing the right car for a prototype and passing early tests to our first attempt on city roads and testing abroad.

We’ll tell you about our hardware: how the cameras, LIDARs, radars, and other sensors work in bad weather conditions, dust storms, or snow. We’ll also explain how we deliver data from these sensors to the computer in the trunk of the car, guaranteeing seamless operation of the entire system.

We’ll also show how our car determines its location to within a couple of centimeters: which sensors are responsible for that, and how we analyze their data.
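As a rough illustration of how estimates from several sensors can be combined into one more precise position, here is a toy inverse-variance fusion sketch. This is not our actual localization stack; the `fuse` function and all of the numbers are invented for this example:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Fuse two independent position estimates (e.g. GNSS and a
    LIDAR map match) by inverse-variance weighting. The fused
    variance is always smaller than either input's."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Hypothetical inputs: GNSS says x = 10.00 m with 1.0 m^2 variance;
# a map match says x = 10.40 m with 0.04 m^2 variance (a few cm sigma).
x, var = fuse(10.00, 1.0, 10.40, 0.04)
print(x, var)  # the result is pulled toward the more certain sensor
```

The fused variance here is below 0.04 m², which is the basic reason fusing sensors can reach centimeter-level precision that no single sensor provides on its own.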

One of our most serious challenges is predicting the behavior of others on the road. Our car recognizes objects around it (as well as their speeds and trajectories) in order to take the safest, most effective route. We’ll cover how we are solving this issue.
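A heavily simplified example of trajectory prediction is extrapolating an object's motion under a constant-velocity assumption. This is a toy sketch for intuition only (real prediction models are far richer); the function name and parameters are made up for illustration:

```python
def predict_positions(pos, vel, horizon=3.0, dt=0.5):
    """Extrapolate an object's future (x, y) positions over `horizon`
    seconds in steps of `dt`, assuming constant velocity."""
    steps = int(horizon / dt)
    return [(pos[0] + vel[0] * dt * k, pos[1] + vel[1] * dt * k)
            for k in range(1, steps + 1)]

# A tracked object at the origin moving 2 m/s east and 1 m/s north.
path = predict_positions((0.0, 0.0), (2.0, 1.0))
print(path[-1])  # predicted position after 3 seconds: (6.0, 3.0)
```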

Interactive booths

Although presentations are a good way to explain things, we have also tried to visualize some of our technologies to give you a chance to interact with them.

In particular, we’ll demonstrate the virtual simulator we use to run our vehicle and monitor how it behaves. It provides a complex virtual environment with intersections, cars, pedestrians, and cyclists. You will be able to place obstacles in the car’s way and see how it reacts.

We will also examine our autonomous car hardware in detail: how the computer and network function, which resources they use, and why satellites and our own embedded devices are crucial. We will show you the difficulties and limitations we face, as well as how we overcome them.

If you’ve seen our videos, you probably noticed a tablet at the front of the car showing how it sees the world around it. This is called Yaviz, and you’ll get a chance to have a closer look at it. You will also learn how Yaviz collects data from sensors, draws a 3D model of the world, and helps us improve the system.

Simply installing the sensors is not enough: they need to be calibrated not only independently, but also relative to each other. We will go through the types of calibration, explain why it is critical for the algorithms to work properly, and discuss the challenges we face in this department.
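To give a flavor of why relative (extrinsic) calibration matters, here is a hypothetical 2-D example: a point measured in a LIDAR's own frame must be rotated and translated by the sensor's mounting pose before the rest of the system can use it. The function name and mounting numbers are made up for illustration:

```python
import math

def lidar_to_car(point, yaw, offset):
    """Apply a 2-D extrinsic calibration: rotate a LIDAR point by the
    sensor's mounting yaw, then translate by its mounting offset
    (both expressed in the car's frame)."""
    x, y = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y + offset[0], s * x + c * y + offset[1])

# A sensor mounted 1.5 m forward of the car's origin, rotated 90 degrees:
# a point 1 m "ahead" of the LIDAR is actually beside the car.
mapped = lidar_to_car((1.0, 0.0), math.pi / 2, (1.5, 0.0))
print(mapped)
```

If the yaw or offset is even slightly wrong, every point from that sensor lands in the wrong place in the car's frame, which is why calibration errors corrupt everything downstream.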

Last but not least, we will answer the most important questions: how do we see the future of our driverless cars, and why do we do what we do?

How to get to the event

If you want to spend Saturday, June 8 at our meetup, from 10:30 AM to 8:00 PM, please fill out this form. We have only 100 spots, to ensure every participant can see the booths, talk with the team, and watch the vehicle in action. In response to your application, we will send you a task that must be completed before May 29!

The task may take some time: it comes from our daily work of segmenting LIDAR point clouds based on real data. We provide examples of both real and synthetic clouds so you can test your solution more easily.
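As a hint of what point-cloud segmentation looks like in miniature, here is a toy sketch (not the actual task we will send): a small synthetic cloud is split into ground and obstacle points by a simple height threshold. All names and numbers are invented for this example:

```python
import random

def segment_ground(cloud, ground_z=0.0, tolerance=0.2):
    """Label each (x, y, z) point True (ground) or False (obstacle)
    with a simple height threshold -- a toy stand-in for real
    LIDAR segmentation."""
    return [abs(z - ground_z) <= tolerance for (_x, _y, z) in cloud]

# Build a synthetic cloud: a flat ground patch plus a box-shaped obstacle.
rng = random.Random(0)
ground = [(rng.uniform(-10, 10), rng.uniform(-10, 10), rng.uniform(-0.05, 0.05))
          for _ in range(500)]
box = [(rng.uniform(2, 3), rng.uniform(2, 3), rng.uniform(0.5, 1.5))
       for _ in range(100)]
labels = segment_ground(ground + box)

print(sum(labels))  # all 500 ground points pass, no box points do
```

Real segmentation on ground with slopes, curbs, and sensor noise is much harder than a flat threshold, which is exactly why a synthetic cloud is handy for checking a solution before running it on real data.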

See you in the garage!

Written and published by Yulia Shveyko
