Car Talk: Federated Learning Object Detection for Autonomous Vehicles
Shenghong Dai∗, S M Iftekharul Alam†, Ravikumar Balakrishnan†, Kangwook Lee∗, Suman Banerjee∗, and Nageen Himayat†
∗ UW-Madison † Intel Labs
If you live in a city with a self-driving car pilot program, you might wonder why the cars, often from a handful of startups, can't learn from each other as they roam the streets.
Federated learning (FL) is a distributed machine learning approach that lets autonomous vehicles train a model collaboratively without any single organization holding all the data. This protects data privacy while still allowing each client to contribute to the model's development. However, current FL benchmarks focus on limited datasets collected offline, which fail to capture the dynamic, real-world behavior of these applications.
So how can FL match the needs of autonomous vehicles? That’s the focus of a team of researchers from the University of Wisconsin–Madison and Intel who collaborated on a framework called Federated Learning across AutonoMous vEhicles, or FLAME. FLAME leverages the popular open source physics simulator CARLA and FL framework OpenFL to allow collection of streaming data from autonomous vehicles for online collaborative training.
FLAME enables real-time transfer of continuous data streams with automatic annotation from CARLA to OpenFL, while modeling data heterogeneity, annotating data at practically zero cost, and performing reproducible, continual FL experiments.
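The near-zero annotation cost comes from the simulator itself: every frame arrives already paired with ground-truth labels, so a buffer only has to hand annotated batches to the local trainer. Here is a minimal sketch of that idea using a plain Python queue; the names `AnnotatedFrame` and `StreamBuffer` are illustrative stand-ins, not FLAME's actual API, and real FLAME wires CARLA sensor callbacks into OpenFL rather than this toy buffer.

```python
import queue
from dataclasses import dataclass, field

@dataclass
class AnnotatedFrame:
    """One camera frame paired with simulator ground truth (hypothetical type)."""
    frame_id: int
    pixels: list                                # placeholder for image data
    boxes: list = field(default_factory=list)   # [(label, x, y, w, h), ...]

class StreamBuffer:
    """Bounded buffer between the simulator callback and the local FL trainer."""
    def __init__(self, maxsize=128):
        self._q = queue.Queue(maxsize=maxsize)

    def on_frame(self, frame: AnnotatedFrame):
        # Called per sensor tick; drop the oldest frame when full so the
        # simulator never blocks on a slow trainer.
        if self._q.full():
            self._q.get_nowait()
        self._q.put_nowait(frame)

    def next_batch(self, batch_size=4):
        # Drain up to batch_size annotated frames for one training step.
        batch = []
        while len(batch) < batch_size and not self._q.empty():
            batch.append(self._q.get_nowait())
        return batch

# Ground truth from the simulator makes annotation essentially free:
buf = StreamBuffer()
for i in range(6):
    buf.on_frame(AnnotatedFrame(i, pixels=[], boxes=[("vehicle", 0, 0, 10, 10)]))
batch = buf.next_batch()  # 4 labeled frames ready for a local training step
```

The drop-oldest policy is one simple way to keep streaming training online: stale frames are sacrificed so the trainer always sees recent driving conditions.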
To demonstrate the framework's effectiveness, the researchers implemented continual FL in FLAME and compared it against centralized training and local-only training on a popular object detection task.
Two participants in the same town drove in autopilot mode, accumulating images with annotations for object detection. The FL setup outperformed local training and came close to centralized performance, converging more quickly than local training, especially for the second driver. Driver two gained significantly from driver one's experience, highlighting the advantages of a federated collaborative learning system.
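The collaborative gain for driver two comes from the standard FL aggregation step: each round, both drivers train locally from the shared global model, and the server averages their updates back into it. A minimal FedAvg sketch with a toy linear model follows; the actual FLAME experiments train an object detector through OpenFL, so this only illustrates the aggregation mechanics.

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, steps=10):
    """A few gradient steps on one driver's local data (toy linear regression)."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * data.T @ (data @ w - labels) / len(labels)
        w -= lr * grad
    return w

def fedavg(client_weights):
    """Server aggregation: average the clients' model weights."""
    return np.mean(client_weights, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
# Two drivers observe different (heterogeneous) slices of the same environment.
data1, data2 = rng.normal(size=(20, 2)), rng.normal(size=(20, 2))
y1, y2 = data1 @ true_w, data2 @ true_w

global_w = np.zeros(2)
for _ in range(20):                      # 20 federated rounds
    w1 = local_update(global_w, data1, y1)
    w2 = local_update(global_w, data2, y2)
    global_w = fedavg([w1, w2])
# global_w converges toward the shared solution without either driver
# ever sending raw data to the other.
```

Because only model weights cross the network, each driver benefits from the other's driving without exposing their raw camera data, which is the privacy argument from the article in miniature.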
The FLAME framework is a significant step forward in enabling federated learning on OpenFL using continuous streams of data generated from autonomous vehicles running on CARLA. It provides FL researchers a way to generate new and reproducible datasets in practical settings.
FLAME was presented this year at the IEEE Consumer Communications and Networking Conference (CCNC) and garnered strong interest from the audience. If you’re interested in trying out FLAME, check out this GitHub* repo for the source code.
About the Author
Shenghong Dai is a Ph.D. candidate in the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison, with research interests in machine learning and systems. Previously an intern at Intel Labs, she focuses in her recent work on federated learning and continual learning.