The Decision Behind Using Articulating Sensors on Cruise AVs

JM Fischer
Jul 1, 2020 · 8 min read

Written by JM Fischer, Engineering Manager, Embedded Systems and Charlie Mooney, Senior Software Engineer, Embedded Systems

Cruise self-driving vehicles use sensors as their eyes and ears. When we introduced the Cruise Origin earlier this year, we showcased an articulating sensor that can pivot 360 degrees and “see” in daylight or pitch black at superhuman speed. Our sensors will be able to see beyond what is humanly possible and react more quickly and safely than a human driver can.

Cruise co-founder, CTO and President Kyle Vogt introduces the articulating sensors on the Cruise Origin during a live event.

In this post, we’ll dive into some of the software we created for the articulating sensor on our existing fleet — the Articulating Radar Assembly (ARA). The ARA is one of many exciting pieces of our vehicles that our team interacts with every day. Let’s start with a high-level view of what the ARA is and why we use it.


Two Articulating Radars sit in front of the A-pillar on both sides of our Track 3 vehicles. Each holds a radar that our self-driving software uses to intelligently look ahead (and back, and to the side) during various maneuvers. We use the articulating radars to drive safely through certain maneuvers, such as unprotected left turns.

An archival clip of a Cruise self-driving vehicle performing an unprotected left turn.

During an unprotected left turn, oncoming traffic may approach at speed (e.g., approaching a green light) and have the right-of-way. To ensure the path stays clear long enough to complete the turn, we use a long-range radar to look far down the road and scan for any approaching vehicles. This is especially valuable in bad weather, where camera- or LIDAR-based solutions fall short.

Why Articulating Radar?

Long-range radars (LRRs) are special among radars in that they are very directional — like zooming in all the way with your camera. They let you see farther away at the expense of a smaller field of view. You can detect far-away objects in great detail, but may completely miss something just one degree to either side.
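To make that tradeoff concrete, the lateral width a sensor covers at a given range follows from simple trigonometry. A quick sketch — the field-of-view and range numbers here are purely illustrative, not actual Cruise radar specs:

```python
import math

def coverage_width(range_m: float, fov_deg: float) -> float:
    """Lateral width covered by a sensor's field of view at a given range."""
    return 2.0 * range_m * math.tan(math.radians(fov_deg / 2.0))

# Illustrative numbers only -- not actual Cruise radar specs.
coverage_width(200.0, 4.0)   # a narrow ~4-degree LRR sees a swath ~14 m wide at 200 m
coverage_width(200.0, 90.0)  # a wide 90-degree sensor sees ~400 m wide at the same range
```

A few degrees of field of view means a swath only a lane or two wide at long range — hence the "one degree to either side" problem.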

The AV needs to sense distant objects clearly, but those objects are not always in the same spot. High-resolution, low-latency, long-distance sensing over a large area is hard to achieve in general (although we’re bringing that capability to the Cruise Origin), and meeting all of those requirements at once is a demanding task.

Imagine that you’re a wildlife photographer trying to snap pictures of birds — you need the highest quality pictures possible of distant objects in extremely dynamic settings. This is quite a challenge, and one surprisingly comparable to the one we’re solving with the ARAs.

There are two basic approaches to solving a problem like this:

  • You might set up many sensors, all pointed at a multitude of different angles. Done carefully, this can cover a very large area and work very well: if the whole area where the AV might need to detect an obstacle is covered by at least one sensor at all times, you can’t miss. Exactly how many sensors you would need depends on the resolution required and the distance at which targets are expected.

This approach has the obvious benefit of never missing your shot, and it’s very simple to use and maintain. You’re effectively building one mega-sensor.

The downside of this kind of solution is just as obvious. You might need a lot of sensors to cover the whole area. Many problems scale with the number of sensors. Cost concerns, developing a network that can handle the sheer amount of data being produced, and even just physically placing the sensors on an AV can easily be showstoppers.

  • Alternatively, you could set up one sensor, but move it quickly to point in whichever direction you’re interested in at any given moment. In our case, this takes the form of an articulating radar.

The benefits and downsides are now completely flipped. You only have one sensor to deal with, but at the cost of higher technical complexity, including an entirely new system that must decide which direction to point. You also need to develop reliable actuators to do the pointing.
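To see why the first approach scales so badly, a back-of-the-envelope count of how many fixed sensors it takes to tile a full 360 degrees of azimuth — the field-of-view numbers are hypothetical:

```python
import math

def sensors_for_full_coverage(fov_deg: float, overlap_deg: float = 0.0) -> int:
    """How many fixed sensors are needed to tile 360 degrees of azimuth,
    with optional overlap between adjacent fields of view."""
    effective_fov = fov_deg - overlap_deg
    return math.ceil(360.0 / effective_fov)

# With a hypothetical 4-degree long-range radar, static coverage is costly:
sensors_for_full_coverage(4.0)        # 90 sensors
sensors_for_full_coverage(4.0, 1.0)   # 120 sensors with 1 degree of overlap
```

Ninety-plus long-range radars per vehicle is exactly the kind of cost, bandwidth, and packaging showstopper described above — which is what motivates moving one sensor instead.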

At Cruise, we have developed a sensing solution that we believe strikes a balance between these two approaches. Cruise’s current AVs carry three ARAs, each consisting of a radar on a motor that can rotate it left and right to make sure it’s always pointed in the best direction.


Springing Into Action

The Embedded Systems team handles the first layer of software that runs on our vehicles. We connect the AV to reality, bridging the gap between the AV’s sensors and actuators and the highly sophisticated self-driving stack — all while maintaining stringent automotive-grade quality.

Using their suite of sensors and a highly detailed map, our vehicles know where they are in the world — this is called “localization.” Then, our planning and routing systems look ahead and determine where the vehicle should go. Combining this information, the software anticipates upcoming traffic patterns and commands our radar system into action.

As demonstrated by the red arrows, radar assemblies activate on a Cruise AV during an unprotected left turn. The ARAs monitor for potential hazards in various directions before returning home after the turn is completed.

With a specific notion of where to look, Embedded Systems sends commands to point the ARA in the designated direction. We maintain the software in the self-driving brain that acts as the bridge from the conceptual world of pure logic to the physical world of metal, wires, rubber, and plastic — the “ARA Bridge.”

The brain of the AV decides when and where it needs additional radar data from the ARAs, and transmits that request to the ARA Bridge. We take it from there, interpreting the request and using the current location and pose of the vehicle to determine the perfect angle to turn the motor. This is a critical application and an extremely challenging task: assembly tolerances, sensor and motor controller versions, and the speed and direction of the AV all factor into making these computations as precise as possible. All of this happens before a single command reaches the actual ARA hardware.
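As a toy illustration of the kind of geometry involved (not Cruise's actual computation, and with invented names throughout), converting a world-frame target bearing into a motor command angle has to account for the vehicle's heading, the radar's mounting offset, and a per-unit calibration term:

```python
def motor_angle(target_bearing_deg: float,
                vehicle_heading_deg: float,
                mount_offset_deg: float,
                calibration_deg: float = 0.0) -> float:
    """Convert a world-frame bearing into a motor command angle.
    A simplified 2D model: the real computation also handles assembly
    tolerances, hardware versions, and the AV's motion."""
    # Angle of the target relative to the vehicle body frame
    relative = target_bearing_deg - vehicle_heading_deg
    # Remove the radar's mounting offset and per-unit calibration error
    angle = relative - mount_offset_deg - calibration_deg
    # Normalize into [-180, 180) so the motor takes the short way around
    return (angle + 180.0) % 360.0 - 180.0

motor_angle(10.0, 350.0, 0.0)  # target 10 deg, heading 350 deg -> motor +20 deg
```

Even in this stripped-down sketch, a stale heading or an uncompensated mounting offset shows up directly as a pointing error — which is why the real computation folds in so many correction terms.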

For the command to reach the electric motor, it must traverse the vehicle’s electrical system, crossing multiple protocols (CANopen, Ethernet, and IPC mechanisms) and flowing through several electronic control units before the motor actually turns. This pathway is owned by Embedded Systems, and it’s our job to make sure every command makes it to the other end accurately and reliably. Positional encoder information, error states, debugging info, and diagnostic data need to be transmitted back over this same path to the AV brain as well.

An additional challenge we tackle with the ARA system is integrating with upstream perception and calibration subsystems. Vehicles, bicycles, and pedestrians are different at each new intersection and often moving quickly. It’s critical that we not only detect them rapidly, but also provide positional data back to our perception systems accurate to within milliseconds and fractions of a degree. We also need to quickly detect and react to faults, such as corrupted packets. On the Embedded Systems team, we work to ensure our systems are both highly performant and highly robust.
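One common way to catch corrupted packets on a multi-hop path like this is to frame each command with a sequence number and a checksum, so the receiver can detect both dropped and damaged frames. A minimal sketch, assuming a hypothetical frame layout rather than Cruise's actual wire format:

```python
import binascii
import struct

def frame_command(seq: int, angle_centideg: int) -> bytes:
    """Pack a pointing command (sequence number + angle in hundredths of a
    degree) and append a CRC32 over the payload. Layout is illustrative."""
    payload = struct.pack("<Hi", seq, angle_centideg)
    return payload + struct.pack("<I", binascii.crc32(payload))

def check_frame(frame: bytes):
    """Verify the CRC and unpack; raise on a corrupted frame."""
    payload, crc = frame[:-4], struct.unpack("<I", frame[-4:])[0]
    if binascii.crc32(payload) != crc:
        raise ValueError("corrupted frame")
    return struct.unpack("<Hi", payload)
```

Gaps in the sequence numbers reveal dropped frames; a CRC mismatch reveals corruption in transit — two of the fault classes a bridge like this has to react to quickly.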

Managing moving parts and machine reliability

With safety being the top priority across teams at Cruise, the Embedded Systems team writes and maintains software that evaluates if the ARAs are in good condition from the moment the vehicle is started.

At startup

  • Is the entire communication path from the main computer to the motor controller and all intermediate steps connected and reachable?
  • Is the motor properly calibrated?
  • Can the ARA move through its full range of motion?
  • How much resistance is encountered when moving the ARA? Is it possibly broken, jammed, or wearing out?
  • …and many other tests and diagnostics to make sure the system is safe to operate.
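A simplified sketch of how such a boot-time gauntlet might be structured — the check names mirror the questions above, but the implementations here are stubs, not real diagnostics that talk to the motor controller:

```python
def run_startup_checks(checks: dict) -> list:
    """Run each boot-time diagnostic and return the names of any failures."""
    return [name for name, check in checks.items() if not check()]

# Hypothetical checks; real ones would exercise the actual hardware.
checks = {
    "comm_path_reachable": lambda: True,
    "motor_calibrated": lambda: True,
    "full_range_of_motion": lambda: True,
    "resistance_in_limits": lambda: False,  # e.g. a jammed or worn bearing
}
run_startup_checks(checks)  # ["resistance_in_limits"]
```

Structuring the checks as a table keeps the pass/fail policy in one place, so adding a new diagnostic doesn't touch the sequencing logic.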

During runtime

  • Loss of communications with the intermediate gateways along the communication paths or the motor controller itself
  • Over or under voltage
  • Temperatures of various components out of range
  • Operating currents on the motor growing too high
  • …and many others

If any of these runtime parameters enters a fault condition, we take the appropriate action. Our sub-system looks at each fault and reacts by sending a message to the vehicle’s brain. Each potential fault is mapped to an action, which could range from “return to the garage after this ride ends” to “pull over immediately” depending on the severity. We call these “degraded states,” a topic interesting and broad enough to warrant its own post.
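A minimal sketch of such a fault-to-action mapping — the fault names, actions, and escalation policy here are entirely illustrative, not Cruise's actual degraded-state logic:

```python
from enum import Enum

class Action(Enum):
    CONTINUE = 0          # log it and keep driving
    RETURN_TO_GARAGE = 1  # finish the current ride, then go in for service
    PULL_OVER = 2         # stop as soon as it is safe to do so

# Illustrative severity table mapping runtime faults to degraded states.
FAULT_ACTIONS = {
    "gateway_comm_loss": Action.PULL_OVER,
    "motor_overcurrent": Action.PULL_OVER,
    "temperature_out_of_range": Action.RETURN_TO_GARAGE,
    "undervoltage": Action.RETURN_TO_GARAGE,
}

def react(fault: str) -> Action:
    # Unknown faults escalate to the most conservative action.
    return FAULT_ACTIONS.get(fault, Action.PULL_OVER)
```

Defaulting unknown faults to the most severe response is a deliberately conservative choice: a fault the table has never seen is exactly the one you don't want to drive through.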


Looking ahead to a clear path

Join us

The Embedded Systems team lives in C, C++, and HDL, in the world of ECUs, FPGAs, ASICs, bare metal, the Linux kernel, HALs, device drivers, and all sorts of embedded applications. We often find ourselves discussing the physical, data link, network, and transport layers, using deeply embedded interfaces such as CAN, I2C, 100BASE-T1, UART, and LIN, as well as high-speed interfaces such as FPD-Link, PCIe, and 10GBASE-T.

We are hiring! Come join us in solving new challenges every day.

Cruise

A self-driving service designed for the cities we love.

JM Fischer

Written by

Engineering Manager, Embedded Systems at Cruise, 12+ years in software development

Cruise

Cruise is building the world’s most advanced self-driving vehicles to safely connect people with the places, things and experiences they care about. Join us in solving the engineering challenge of a generation: https://getcruise.com/careers
