Collecting CAN Bus Data with ROS 2 and Qt

Tamas Foldi
HCLTech-Starschema Blog
6 min read · Oct 25, 2023

One obvious way to learn something new is to use stuff you already have. This was my thinking when I decided to invest my time in learning sensor fusion at the edge — more precisely, performing sensing and perception in ROS 2 to see how autonomous vehicles work. I drive a Tesla as a company car, and it’s outfitted with tons of cameras and sensors, so I dusted off my Raspberries and got to wiring.

Looking at CAN frames and signals in my car was the first step in the effort to collect all required data.

To meaningfully perform perception, we need sensing, and for sensing, we need sensors. AVs typically use the following telemetry and sensor data to see and understand the vehicle and its surroundings:

  • Car telemetry like speed and distance to support odometry. These usually come from the CAN bus, the car’s internal communication backbone;
  • GPS position signals, also usually coming through CAN bus;
  • Cameras like dash cams;
  • LIDAR and RADAR point clouds.

We need to fuse all these sensor streams to gain full awareness of our own position and of the objects around us. Thankfully, ROS 2 has a lot of prebuilt packages that can do the heavy lifting, which lets me focus on collection and analysis rather than reinventing and reimplementing sensor fusion algorithms.

Collecting CAN Bus Signals

I bought a CL1000 adapter to log the CAN bus signals. To connect it to my Tesla Model 3, I ordered an OBD2 16-pin to 26-pin adapter along with an OBD2 adapter. Wiring them up wasn’t hard, but, as is mentioned everywhere, do not forget to turn off your car before this step; otherwise, you can harm your vehicle. CSS Electronics also has a nice blog post on how to connect their logger.

I know, I need to clean my car more frequently. Anyway, the logger simply connected to the Tesla’s CAN bus.

Out of the box, the logger supports Qt-based CAN bus analyzer tools like SavvyCAN, so I was able to quickly check if the connection was OK:

I was able to see CAN signals in SavvyCAN in real time.

The CL1000 already comes with a Qt5 QtCanBus plugin that makes it easy to use with any Qt application. The advantage of this QtCanBus-based connection was that I could use my Mac for debugging and reverse engineering, as SocketCAN (the standard CAN interface) is only available on Linux.

Adding Qt CanBus Support for ROS 2

The next step on my list was to feed the data from the car into a ROS 2 topic, decode it and dispatch it as standard messages like Twist, NavSatFix or DiagnosticArray to make it available to standard sensor fusion libraries.

I didn’t find any ROS 2 library that captures and decodes CAN frames the way I wanted, so I ended up building my own set of packages.

I also couldn’t find a standard CAN bus message type in ROS 2, but ROS-Industrial’s can_msgs/Frame from https://github.com/ros-industrial/ros_canopen/ was love at first sight, so I reused it as QCanBusFrame.msg.

ros2_qtcanbus_msgs/QCanBusFrame, same as can_msgs/Frame from ROS1 Industrial
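For reference, can_msgs/Frame, which QCanBusFrame.msg mirrors, defines roughly the following fields:

Header header
uint32 id
bool is_rtr
bool is_extended
bool is_error
uint8 dlc
uint8[8] data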

To build the publisher node, I used libros2qt, a small library that provides a ROS 2 executor for running a QApplication and ROS nodes in the same process.

The processing node’s code is fairly straightforward: when a frame becomes available from the QCanBusDevice, the code publishes it as a QCanBusFrame message to the /from_can_bus topic.

Publishing frames from QtCanBus callback
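The actual callback is C++ (Qt plus libros2qt), so treat the following as a rough stand-in only: a minimal Python sketch of the same publish-raw-frames idea. It swaps the QtCanBus plugin for python-can and SocketCAN (and is therefore Linux-only), and it assumes the message exposes id, dlc and data fields like can_msgs/Frame; the CanBridge class name is made up for the example.

# Minimal sketch, not the actual node: python-can/SocketCAN stands in for the
# QtCanBus plugin, and the message is assumed to have can_msgs/Frame-like fields.
import can  # pip install python-can
import rclpy
from rclpy.node import Node
from ros2_qtcanbus_msgs.msg import QCanBusFrame  # field names assumed: id, dlc, data


class CanBridge(Node):  # hypothetical class name
    def __init__(self):
        super().__init__("qtcanbus_sender")
        self.pub = self.create_publisher(QCanBusFrame, "/from_can_bus", 100)
        # "can0" on SocketCAN is an assumption; the author reads from the CL1000 Qt plugin
        self.bus = can.interface.Bus(channel="can0", interface="socketcan")
        self.timer = self.create_timer(0.001, self.poll)

    def poll(self):
        # Drain everything currently buffered on the bus
        while True:
            raw = self.bus.recv(timeout=0.0)  # non-blocking; None when empty
            if raw is None:
                return
            msg = QCanBusFrame()
            msg.id = raw.arbitration_id
            msg.dlc = raw.dlc
            msg.data = list(raw.data) + [0] * (8 - len(raw.data))  # pad to 8 bytes
            self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(CanBridge())
    rclpy.shutdown()


if __name__ == "__main__":
    main()

The real node keeps the Qt plugin in the loop, which is exactly what makes it usable on macOS as well.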

Decoding Signals with DBC Files

The next step is to make sense of the data collected from the vehicle. So far, we have the frameId (not to be confused with ROS’s frame_id) and up to 8 bytes of data per frame.

The solution is to use database definitions like DBC files. There’s a really good write-up on what DBC files are and how they work here.

DBC file entry for a frame and a signal, image from https://www.csselectronics.com/pages/can-dbc-file-database-intro

I went ahead and picked Josh Wardell’s Model3.dbc file, with additional records and collections from Greg Zimmerman’s GitHub repo (here), which had more up-to-date signal definitions.

After I had a DBC file to decode the data and a node publishing CAN bus data to a ROS topic, the next task was decoding. I chose to go with a Python node this time instead of C++, as I was somewhat familiar with the cantools library.

The process is:

1. The ros2_qtcanbus/qtcanbus_sender node publishes messages to the /from_can_bus topic in the following format:
id: 599
data: [43, 88, 34, 4, 2, 64, 17, 1]

2. The ros2_candecode/candecode_node decodes the message using the DBC database (a simplified sketch of this node follows the list).

For message #599 from above, the DBC frame definition is:

BO_ 599 ID257DIspeed: 8 VehicleBus
SG_ DI_speedChecksum : 0|8@1+ (1,0) [0|255] "" Receiver
SG_ DI_speedCounter : 8|4@1+ (1,0) [0|15] "" Receiver
SG_ DI_uiSpeed : 24|9@1+ (1,0) [0|510] "" Receiver
SG_ DI_uiSpeedHighSpeed : 34|9@1+ (1,0) [0|510] "" Receiver
SG_ DI_uiSpeedUnits : 33|1@1+ (1,0) [0|1] "" Receiver
SG_ DI_vehicleSpeed : 12|12@1+ (0.08,-40) [-40|285] "kph" Receiver

VAL_ 599 DI_uiSpeedUnits 1 "DI_SPEED_KPH" 0 "DI_SPEED_MPH" ;

Decoding then means extracting each signal’s bits and applying its scale and offset (for example, DI_vehicleSpeed is the raw 12-bit value multiplied by 0.08, minus 40, giving kph), which translates to:

{'DI_speedChecksum': 221, 
'DI_speedCounter': 13,
'DI_vehicleSpeed': 46.96000000000001,
'DI_uiSpeed': 48,
'DI_uiSpeedUnits': 1,
'DI_uiSpeedHighSpeed': 0}

3. If we enable the decode_choice option, numbers will be replaced with text values according to the DBC database.

# replaces the number value in the message with the matching string value
{...'DI_uiSpeedUnits': 'DI_SPEED_KPH', ...}

4. All frames are published to the /diagnostics topic with the type DiagnosticArray. One array corresponds to one CAN frame, with all of its signals as KeyValue pairs. While I don’t like the idea of storing values as strings, this seemed like the most standard message type for the decoded data.

5. For GPS coordinates and speed/IMU data, the package supports translating these messages into Twist and NavSatFix messages for sensor fusion, for example to localize the car.
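To make the decoding step concrete, here is a stripped-down Python sketch of such a node (the one referenced in step 2 above). The cantools calls are real, but the class name, parameter handling and error handling are simplified compared to the actual ros2_candecode package, and the incoming message is assumed to carry id and data fields.

# Simplified sketch of a candecode-style node; not the actual ros2_candecode code.
import cantools
import rclpy
from rclpy.node import Node
from diagnostic_msgs.msg import DiagnosticArray, DiagnosticStatus, KeyValue
from ros2_qtcanbus_msgs.msg import QCanBusFrame  # id/data fields assumed


class CanDecode(Node):  # hypothetical class name
    def __init__(self):
        super().__init__("candecode_node")
        # The real node takes the DBC path and options as ROS parameters
        self.db = cantools.database.load_file("Model3.dbc")
        self.pub = self.create_publisher(DiagnosticArray, "/diagnostics", 10)
        self.create_subscription(QCanBusFrame, "/from_can_bus", self.on_frame, 100)

    def on_frame(self, frame):
        try:
            # decode_choices=True replaces raw numbers with the VAL_ texts from the DBC
            decoded = self.db.decode_message(frame.id, bytes(frame.data),
                                             decode_choices=True)
        except KeyError:
            return  # frame id not present in the DBC database
        status = DiagnosticStatus(
            name=self.db.get_message_by_frame_id(frame.id).name,
            level=DiagnosticStatus.OK)
        status.values = [KeyValue(key=k, value=str(v)) for k, v in decoded.items()]
        msg = DiagnosticArray()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.status = [status]
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(CanDecode())
    rclpy.shutdown()


if __name__ == "__main__":
    main()

The Twist and NavSatFix conversions from step 5 follow the same pattern, just mapping specific decoded signals onto the corresponding message fields.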

Default config file with parameters for my ros2_candecode package
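I won’t reproduce the exact file here, but a ROS 2 parameter file for a node like this generally looks as follows; the parameter names below are illustrative, not necessarily the ones the package actually uses.

candecode_node:
  ros__parameters:
    dbc_file: "Model3.dbc"      # DBC database used for decoding (illustrative name)
    decode_choices: true        # replace raw numbers with VAL_ text values
    publish_navsatfix: true     # emit NavSatFix from the GPS signals (illustrative name)
    publish_twist: true         # emit Twist from the speed signals (illustrative name)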

After running the two nodes (ros2_qtcanbus and ros2_candecode), all the configured topics should be populated with the respective decoded messages.
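For convenience, both nodes can also be started from a single ROS 2 launch file. A minimal sketch, assuming the executable names match the node names used above:

# Hypothetical launch file; executable names assumed to match the node names above.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(package="ros2_qtcanbus", executable="qtcanbus_sender"),
        Node(package="ros2_candecode", executable="candecode_node",
             parameters=["candecode.yaml"]),  # parameter file path is illustrative
    ])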

Now, let’s have a look at what we’ve collected.

Visualizing CAN Bus Data with Foxglove

Don’t get your hopes up too much just yet: there isn’t much we can do with the data we have, as the fancier sensors (cameras, LIDARs) are still down the line. Still, seeing is believing, so let’s look inside:

Speed, pedal position, and steering wheel angle with a neat map — all from our Tesla’s CAN bus in Foxglove Studio.

Using Foxglove Studio (or RViz2), we can see the freshly populated topics, including hundreds of signals from different systems in our very own car.

Summary

We’ve taken the first step to collect the necessary data to understand how autonomous vehicles work. With not-so-expensive cables and CAN loggers, you too can easily start your ROS 2 and AV journey, with Foxglove providing a great tool to see where exactly you are in your physical journey.

Find the source code of the packages here: https://github.com/tfoldi/ros2_qtcanbus/
