Sensor fusion is the task of combining data from multiple sensors to build a robust understanding of the surrounding environment. For example, we might be building software for a vehicle with multiple radar and lidar units. One unit might tell us that we have another vehicle directly ahead of us, and then a different unit might tell us that we have another vehicle ahead of us and to the right. We use sensor fusion to determine whether those are two different vehicles, or the same vehicle.
Here’s some work Udacity students have done in this domain.
Sensor fusion turns out to be a highly mathematical discipline, and Mithi uses this post to succinctly review the linear algebra behind extended Kalman filters. This is part of a series of posts Mithi wrote about how Kalman filters work.
“For the radar, I’d have to linearize the extraction matrix H, as mentioned several times before. To linearize, I compute what is called a Jacobian matrix, which is based on the first-order partial derivatives of the function that converts cartesian to polar coordinates. You can see the derivation here. The value of this Jacobian is based on what the state is if the radar sensor measurement is correct.”
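To make the linearization concrete, here is a minimal sketch of the Jacobian for a radar measurement function that maps a cartesian state to polar coordinates. The function name `radar_jacobian` and the state ordering `[px, py, vx, vy]` (mapping to range, bearing, and range rate) are assumptions for illustration, not taken from Mithi's post:

```python
import numpy as np

def radar_jacobian(x):
    """Jacobian Hj of the cartesian-to-polar measurement function h(x),
    evaluated at the state x = [px, py, vx, vy].
    h maps the state to [rho, phi, rho_dot]."""
    px, py, vx, vy = x
    rho2 = px * px + py * py
    if rho2 < 1e-9:
        # The linearization is undefined when the object sits at the sensor origin.
        raise ValueError("state too close to origin to linearize")
    rho = np.sqrt(rho2)
    rho3 = rho2 * rho
    return np.array([
        [px / rho,   py / rho,   0.0, 0.0],
        [-py / rho2, px / rho2,  0.0, 0.0],
        [py * (vx * py - vy * px) / rho3,
         px * (vy * px - vx * py) / rho3,
         px / rho,   py / rho],
    ])
```

Each time the filter processes a radar measurement, the Jacobian is re-evaluated at the current state estimate, which is why it matters "what the state is" at that moment.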
Once upon a time I tried to teach myself how to build an extended Kalman filter by reading the Wikipedia entry. It didn’t work out.
Raul starts with the Wikipedia entry and proceeds to build a coherent explanation of how Kalman filters work, incorporating both time and uncertainty.
“As a reminder, F is the transition matrix (the one that deals with time steps and constant velocities) and Q is the covariance matrix (the one that deals with the uncertainty). Let’s say there is a relation between the uncertainty and the velocity: the covariance Q is proportional to the velocity, the uncertainty being bigger with higher velocities and smaller with lower ones. This process has a noise whose notation can be written as ν∼N(0,Q), which means zero mean and covariance Q — a Gaussian distribution, to use the proper name.”
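The roles of F and Q come together in the filter's prediction step. Below is a hedged sketch of that step for a constant-velocity model; the helper names and the acceleration-noise values `noise_ax`/`noise_ay` are illustrative assumptions, not Raul's exact code:

```python
import numpy as np

def make_F_Q(dt, noise_ax=9.0, noise_ay=9.0):
    """Transition matrix F and process-noise covariance Q for a
    constant-velocity model with random-acceleration noise.
    A longer time step dt inflates Q, matching the intuition that
    uncertainty grows the more the object could have moved."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    dt2, dt3, dt4 = dt**2, dt**3, dt**4
    Q = np.array([
        [dt4 / 4 * noise_ax, 0.0,                dt3 / 2 * noise_ax, 0.0],
        [0.0,                dt4 / 4 * noise_ay, 0.0,                dt3 / 2 * noise_ay],
        [dt3 / 2 * noise_ax, 0.0,                dt2 * noise_ax,     0.0],
        [0.0,                dt3 / 2 * noise_ay, 0.0,                dt2 * noise_ay],
    ])
    return F, Q

def predict(x, P, F, Q):
    """Kalman prediction: propagate the state estimate through F and
    grow the state covariance by the process noise Q."""
    return F @ x, F @ P @ F.T + Q
```

Note how the noise ν∼N(0,Q) never appears explicitly: its effect is entirely captured by adding Q to the predicted covariance.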
A while back, Peter messaged me on Twitter, asking about some open-source lidar data that he could fiddle with. I pointed him to some data the Voyage team had open-sourced while they were still working at Udacity. My former colleague, and current Voyage CEO, knew more about that data than I did and pointed Peter to KITTI instead, which was probably the right answer.
Nonetheless, Peter spent a while getting the point cloud files spun up and learned how to work with them.
“For my purposes, there are two datasets of interest, each of which contains camera, GPS, lidar, and steering/throttle/brake data. The driving data is stored in ROS bags and is downloaded via the torrent links found below. I recommend you start downloading these now; they are pretty large!”