AMAC — Another Step Towards an Autonomous Future

We are a team of Berkeley students building autonomous mobile robots.

Autonomous Motion at Cal (AMAC) is a team of Berkeley students working to change how autonomy is pursued. We aim to accelerate autonomous innovation by developing mobile robots that can navigate the densely populated UC Berkeley campus.

There is currently a huge influx of capital and research resources in the autonomous vehicle space. Unfortunately, the barrier to accessing them is steep for undergraduate students, as these assets are typically funneled into research labs. We wanted to work on actualizing these advances while making them more accessible to students.

We’ve built a 1/8th scale autonomous RC car from the ground up as an initial project. Our initial goal is for the vehicle to navigate multiple distinct routes and varied terrain around campus. These cars would be durable, adaptable, and agile. Our end goal is multi-agent cooperation through a fleet of mobile robots.

Current Progress

Hardware Included

High-Level Scheme

All Implemented Software Modules

For this car, we prioritized sensors that would aid in environment mapping, localization, and nearby object detection, so we chose a LiDAR, an Inertial Navigation System (INS), and three cameras. Most of the RC car components were purchased off the shelf so we could focus on sensing, planning, and actuation. The computer, an Intel NUC, was essential for processing all of the sensor data into vehicle control commands. The battery (7-cell, 29.4 V, 6,000 mAh, 144 Wh) was selected to account for the heavy power demand from the computer, drivetrain, and sensors. The design criteria for the car include robust components as well as features such as path planning and object detection, following, and avoidance.
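To sanity-check the battery sizing, here is a minimal runtime estimate. The 144 Wh capacity is from our pack above, but every per-component power draw in this sketch is an illustrative assumption, not a measurement from our car.

```python
# Rough runtime estimate from the battery's energy capacity.
# Every per-component draw below is an illustrative assumption,
# not a measurement from our car.

BATTERY_WH = 144  # 7-cell pack, 6,000 mAh

ESTIMATED_DRAW_W = {
    "Intel NUC": 40,        # assumed average compute load
    "LiDAR": 15,            # assumed
    "INS and cameras": 10,  # assumed
    "drivetrain": 60,       # assumed average while driving
}

total_w = sum(ESTIMATED_DRAW_W.values())
print(f"Total draw: {total_w} W")
print(f"Estimated runtime: {BATTERY_WH / total_w:.1f} h")
```

Even with these rough numbers, the pack comfortably supports about an hour of driving, which matches why we sized it for the computer and drivetrain together.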

Implementation Details

Constructing the Vehicle

Power Budget
3D Print of Electronics Chassis

Drive-by-Wire Communication

Drive-by-Wire zooms

The Software Stack

ROS Modules

LiDAR

We’d also like to thank Ouster for supporting our team with a LiDAR. They’ve been incredibly helpful in providing any support we’ve needed.

INS

Shout out to InertialSense for supporting us with their sensor!

URDF

URDF Capture
TF Tree

The navigation stack builds local and global cost maps from the occupancy grid and the “tf” transforms, and uses a TEB (Timed Elastic Band) local planner configured for an Ackermann steering model. When a navigation goal is selected, the global and local planners find a path, and the local planner publishes velocity commands (in the form of a Twist) to the cmd_vel topic. These commands are then converted to PWM values for our steering servo and drive motor; the drive-by-wire communication node subscribes to these values and sends the necessary signals to the ESC and servo.
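Below is a minimal sketch of that Twist-to-PWM conversion, written as a rospy node. The cmd_vel topic and the Ackermann model come from the setup above; the wheelbase, speed limit, steering limit, and PWM range are illustrative placeholders, not our car’s actual values.

```python
#!/usr/bin/env python
# Minimal sketch: convert Twist commands on cmd_vel into PWM values
# for an ESC (throttle) and steering servo, using Ackermann geometry.
# Wheelbase, limits, and PWM ranges here are placeholders.
import math
import rospy
from geometry_msgs.msg import Twist

WHEELBASE_M = 0.33    # assumed 1/8-scale wheelbase
MAX_SPEED_MPS = 3.0   # assumed top speed
MAX_STEER_RAD = 0.5   # assumed steering limit

def to_pwm(value, lo=-1.0, hi=1.0):
    """Map a normalized value in [lo, hi] to a 1000-2000 us PWM pulse."""
    value = max(lo, min(hi, value))
    return int(1500 + 500 * (value - (lo + hi) / 2) / ((hi - lo) / 2))

def on_cmd_vel(msg):
    v = msg.linear.x
    # Ackermann model: steering angle from commanded curvature,
    # delta = atan(L * omega / v).
    if abs(v) > 1e-3:
        steer = math.atan(WHEELBASE_M * msg.angular.z / v)
    else:
        steer = 0.0
    throttle_pwm = to_pwm(v / MAX_SPEED_MPS)
    steer_pwm = to_pwm(steer / MAX_STEER_RAD)
    rospy.loginfo("throttle=%d us, steering=%d us", throttle_pwm, steer_pwm)
    # In our car, the drive-by-wire node forwards these to the ESC/servo.

if __name__ == "__main__":
    rospy.init_node("cmd_vel_to_pwm")
    rospy.Subscriber("cmd_vel", Twist, on_cmd_vel)
    rospy.spin()
```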

Ackermann Steering

Cameras

3 Cameras working on one USB port

Our current object detection implementation is built with OpenCV. It can detect a set of predefined known objects and respond with one of four actions: forward, reverse, turn, and stop. For example, the vehicle will stop when in close proximity to a person. This information is published to the “cmd_vel” topic as a Twist and is processed by our drive-by-wire communication node to move the car.
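As a rough illustration of the stop-on-person behavior (not our exact pipeline), the sketch below uses OpenCV’s built-in HOG person detector and publishes a zero Twist when a detection appears close. The camera index and the “close” threshold are assumptions.

```python
#!/usr/bin/env python
# Sketch: detect people with OpenCV's built-in HOG detector and publish
# a stop command when one appears close. This illustrates the pattern;
# our actual detector, thresholds, and camera setup differ.
import cv2
import rospy
from geometry_msgs.msg import Twist

CLOSE_FRACTION = 0.5  # assumed: a box taller than half the frame is "close"

def main():
    rospy.init_node("person_stop")
    pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    cap = cv2.VideoCapture(0)  # assumed camera index

    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if not ok:
            continue
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        close = any(h > CLOSE_FRACTION * frame.shape[0]
                    for (_, _, _, h) in boxes)
        if close:
            pub.publish(Twist())  # zero velocities: stop the car
        rate.sleep()

if __name__ == "__main__":
    main()
```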

Software and Visualization

Team:

  1. Travis Brashears, Engineering Physics
  2. Philipp Wu, MechE and EECS
  3. Malhar Patel, EECS
  4. Bradley Qu, EECS
  5. Gan Tu, CS
  6. Amanda Chung, Poli-Sci and Journalism
  7. David Yang, MechE
  8. Daniel Shen, MechE
  9. Carl Cante, MechE
  10. Andy Meyers, MechE

Thanks for all the love and support from our friends in Supernode!

Next Steps

We believe that the road to autonomous motion is endless. In the future, we hope to put a fully autonomous rover on the moon. While our work is largely focused on the ground right now, we’re excited to shoot for the stars in the near future.

If you have any thoughts, suggestions or comments about what we’re working on, please feel free to contact us at trbrashears@berkeley.edu and malhar@berkeley.edu. We’re also looking for additional support (contributors and sponsors) so let us know if you’d like to get involved!

Other Cool Projects

If you’re interested in reading about our first step, check it out below.
