I Raced a Donkey in China (for Big Data)

Erich Clark
Jul 26, 2018 · 11 min read

The Invitation

My brother Chris and I flew to China this March to race hobby-scale autonomous cars at the Guiyang Big Data Expo for Pix Moving, a self-driving car startup. You can read about the event here in the words of the fantastic Nancy Lee. They flew us to China, put us up in a brand-new hotel, and fed us like kings!

Challenge Accepted

The competition was called DIY RoboCars Kuiakai, and there were two categories: one for small cars and one for full-sized cars. Small-car autonomous racing is becoming extremely popular these days, a fun hobby that is generating useful data for larger car systems. Without the need to worry about pedestrian safety, expensive crashes, and prohibitive costs, small-car racing allows innovation on problems that would be too hazardous and costly to test at full scale.

Chris was just finishing his Udacity Intro to Self-Driving Cars Nanodegree when he heard about the competition, and we decided to take on the challenge. After all, we were playing to our strong suits. Chris has an astounding knack for shattering software learning curves, and I see the world with an inventor's eye and touch it with a builder's hands.

We were advised to build on the very popular and successful Donkey Car platform, which has a $250-$300 base build cost and a helpful, vibrant community to back it. We looked over the documentation and decided we could definitely pull it off. Chris lives in Taiwan, but he flew out to our house north of Boston so that we could build two cars together.

Before the body kits and paint, they were just Donkey Cars…

Donkey Car?

The Donkey Car uses a Raspberry Pi brain to control throttle and steering on an RC race car. Its sole input is a front-mounted, wide-angle camera atop a 3D-printed frame, and you pair it with a laptop over WiFi for all its essential functions using SSH (Secure Shell, a command-line interface in a Terminal window).

The car operates on a “stateless” model, which means that its decisions are based on moment-to-moment reactions to what it sees. It doesn’t have a plan or know where it is. If it did, that capability would be called “localization.”
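
If you want to picture it, a stateless drive loop boils down to something like the sketch below. This is just an illustration of the idea, not the actual Donkey Car code; the camera, model, and car objects are hypothetical stand-ins for the real parts.

```python
# Minimal sketch of a stateless drive loop (an illustration, not Donkey Car's actual code).
# `camera`, `model`, and `car` are hypothetical stand-ins for the real parts.
import time

def drive_loop(camera, model, car, hz=20):
    """React to each frame in isolation: no map, no memory, no localization."""
    period = 1.0 / hz
    while True:
        frame = camera.capture()                   # grab one 160x120 image
        steering, throttle = model.predict(frame)  # decide from this frame alone
        car.set_steering(steering)
        car.set_throttle(throttle)
        time.sleep(period)                         # then do it all again
```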

To train the model, a human operator drives around the track while the Donkey takes pictures, each paired with the throttle and steering values at that moment. The data is organized into sets called tubs. Afterwards, the tubs are run through software on a more powerful computer that builds a driving model: an algorithm that says "if you see something like this, use this much throttle and steer like this." It makes guesses, then compares them to what the human driver actually did. Then it adjusts its guesses and compares again. Each of these passes is called an epoch, and training continues until the guesses stop improving. That last set of guesses becomes your final model, a file you load onto your car before giving it the command to start driving on its own.
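
For the software-curious, the whole training idea can be sketched in a few lines of Keras. To be clear, this is not the actual Donkey training code, and the network shape here is just a guess at a typical small end-to-end model; but the pattern of guessing, comparing, adjusting, and stopping once the guesses stop improving is the same.

```python
# Rough sketch of the training idea in Keras (not the actual Donkey Car trainer).
# Assumes `images` is an array of 120x160x3 camera frames and `labels` holds the
# matching [steering, throttle] pairs pulled from the tubs.
import tensorflow as tf

def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(24, 5, strides=2, activation="relu",
                               input_shape=(120, 160, 3)),
        tf.keras.layers.Conv2D(32, 5, strides=2, activation="relu"),
        tf.keras.layers.Conv2D(64, 3, strides=2, activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(100, activation="relu"),
        tf.keras.layers.Dense(2),  # outputs: steering, throttle
    ])

def train(images, labels):
    model = build_model()
    model.compile(optimizer="adam", loss="mse")
    # Each full pass over the data is one epoch; stop once the guesses stop improving.
    stop_early = tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True)
    model.fit(images, labels, validation_split=0.2,
              epochs=100, callbacks=[stop_early])
    model.save("mypilot.h5")  # the file you then copy onto the car
```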

Hats off to Will Roscoe, Tawn Kramer, Doug LaRue, acbot, and the incredible community behind Donkey Cars today! They were so patient and helpful with us as we worked, and I am sorry if my layman's introduction doesn't do it justice. Like I said, I was the hardware guy. (For those interested, Donkey Cars rely on an end-to-end neural network). You can hear about the project's inception here from Roscoe himself.

Yeah, I went nuts on the bodywork.

The Road To Kuiakai

As you can see in the pictures, I went a bit nuts with our build, so keep in mind that those are not $300 cars. I would like to think I made some cool improvements along the way. We took a few days and got our cars up and running on a small track marked out with craft ribbon in our driveway. Everything worked straight out of the box, and we were pretty confident as we left for China.

We went a few days early to see the contestants in the other part of the event: the full-sized autonomous car race! The task was insurmountable for small teams, so they combined forces into one large team, building a self-driving Honda Civic for use on a semi-closed road. We marveled at the harmony between the engineers from such different cultures and backgrounds; there was an air of truly inspiring cooperation and respect that grew into warm friendships. For us, befriending the international teams and working alongside them was worth the trip alone.

The Pix Moving office is in Guiyang and they let us play in their fully featured maker space. But this is not your local “maker space,” like a couple of 3D printers and a t-shirt station. This is a full-on prototype car lab, so of course I fell instantly in love.

Smooth Acceleration

As we shadowed the full sized car team, we puttered on our software. My biggest ask from Chris was an improvement to the throttle system for the human driver. The stock Donkey uses a PS3 controller’s analog sticks for both steering and throttle. As a video game racing enthusiast, I was dissatisfied with this arrangement, so Chris put his mind to it and soon gave me an ultra-smooth, trigger-based throttle.
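
I can't show Chris's actual code, but the idea is simple enough to sketch: read the controller's analog trigger instead of the stick and smooth the value a little. Here's a rough illustration using pygame; the axis number and smoothing factor are assumptions that vary by controller and taste.

```python
# Rough sketch of a trigger-based throttle (not Chris's actual code).
# The trigger axis index and the smoothing factor are assumptions;
# they differ by controller and by driver preference.
import pygame

TRIGGER_AXIS = 5      # hypothetical: right trigger on many gamepads
SMOOTHING = 0.2       # 0 = frozen, 1 = raw trigger value

pygame.init()
pygame.joystick.init()
js = pygame.joystick.Joystick(0)
js.init()

throttle = 0.0
clock = pygame.time.Clock()
while True:
    pygame.event.pump()
    # Triggers usually rest at -1.0 and read +1.0 when fully pressed,
    # so rescale that range to 0..1 for throttle.
    raw = (js.get_axis(TRIGGER_AXIS) + 1.0) / 2.0
    # Exponential smoothing keeps the throttle from jumping around.
    throttle += SMOOTHING * (raw - throttle)
    # send_throttle(throttle)  # hand off to the car's PWM output here
    clock.tick(50)             # ~50 updates per second
```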

It felt so good to drive! I zipped around with my flashy car, and everyone watched and took pictures of it. Pix Moving featured me prominently in their publicity videos and even put pictures of my cars on their Twitter feed. At the Expo, some excited college students approached me for pictures and showed me the tweet!

This is not the Tweet. It’s from Nancy’s article.

However, we ended up just training with a constant throttle value. The problem is that Donkey Cars record throttle data, not speed data, and throttle only loosely correlates with speed. Say you are driving your car down a hill or into a steep turn and you take your foot off the gas. That's a zero accelerator value, but your car has not stopped. Or say you are towing your Uncle Larry's trailer home off your property again; you hit the gas hard just to make it up the hill, but you normally wouldn't floor it past your neighbor's house. In the same way, a Donkey Car records near-zero throttle through the curves even though it is still rolling, and when its battery runs low it goes a great deal slower at the same throttle than it does on a full charge. In short, the model learns the wrong things.
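
One blunt way to do that is to overwrite every recorded throttle value with the same constant before training, so the model only really learns steering. The sketch below shows that idea; the record filenames and key names are my assumptions about the tub layout, so check your own tubs before trying anything like it.

```python
# Sketch of the constant-throttle workaround (illustration only).
# Assumes each tub record is a small JSON file with a "user/throttle" key;
# verify your tub's actual layout before running anything like this.
import glob
import json

CONSTANT_THROTTLE = 0.3  # hypothetical value; tune for your track and battery

def flatten_throttle(tub_dir):
    for path in glob.glob(f"{tub_dir}/record_*.json"):
        with open(path) as f:
            record = json.load(f)
        record["user/throttle"] = CONSTANT_THROTTLE  # erase the misleading signal
        with open(path, "w") as f:
            json.dump(record, f)

flatten_throttle("data/tub_1")  # hypothetical tub path
```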

So all Chris’s good work didn’t help us at all in the race, but man it looked cool to have all that throttle precision when I tore around the Pix Moving lab.

This was BIG

Friends All Over The World

Then the other small car engineers arrived, and the whole thing went to another level. It had that great chummy hackathon feeling, with sharp young men from around the world thrown together on a great adventure. It was magic. We met some incredible people there, made some friends, and were even able to help some other racers to get moving.

Race Prep

We finally saw the racetrack two days before the Expo, marked out in tape on a shiny concrete floor in one of the massive convention halls. Right away, we asked our host Taniya if she could talk to the Expo staff about lighting. Would these be the race-day conditions? Fully half of the ceiling lights were off. Would they remain off during the Expo? Taniya was a fantastic host, and true to form (anything for her Small Car Racers!) she set off to find someone in charge. She had a talk with them, and they assured her that everything would be the same.

The day before the race, Chris and I ran our cars around the track over and over; we let them go for about five minutes at a time, then pulled them off to let others practice and train. They worked flawlessly. We tried not to look too elated, but perhaps did. Blame my over-expressive facial muscles.

Race Day: DISASTER!

The first disaster was massive light interference. Apparently, the Expo staff had just said whatever would make us stop asking questions. They'd had no intention of using more lights during practice or fewer during the Expo; they just didn't want to deal with us. Remember, our Donkeys used only visual input to drive, and suddenly the room looked entirely foreign. The slick floor now reflected hundreds of new and confusing things: many new high-intensity ceiling lights, plus a concert-style booth next to us, complete with scanning lasers and sweeping searchlights. We could even see halogen booth lights reflected in the floor from across the hall. Our poor cars were so confused. They had no idea how to follow the course.

Tiny print, but hey, we were in business!

The second disaster was the WiFi. Donkey Cars use WiFi to start up, train, pull data, load a model, or start autonomous driving. With so many attendees at the Expo, cellular data and WiFi channels were clogged solid; we couldn’t communicate with our cars.

Because Donkey brains are Raspberry Pis, it is possible to hack into your car using its HDMI out and a USB keyboard. Chris and I luckily had a keyboard and a 5" monitor with us, so we could start one car at a time without a network. I drove five minutes, then we ran back to our laptop and tried to pull the data. I made a WiFi hotspot with my iPhone on Airplane Mode (not an option on Android, sorry), and we connected to the car, but the data transfer was less like a trickle and more like humidity. It was taking minutes to transfer each 160 x 120 pixel image, and we had over 20,000.
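
For a sense of scale, the back-of-envelope math looked roughly like this (the per-image size is a guess for small JPEGs, not a measurement):

```python
# Back-of-envelope math for the transfer (estimates, not measurements).
IMAGES = 20_000
KB_PER_IMAGE = 5             # rough guess for a 160x120 JPEG
total_mb = IMAGES * KB_PER_IMAGE / 1024
print(f"~{total_mb:.0f} MB of images to pull")        # ~98 MB

# On a healthy WiFi link, that's a couple of minutes...
print(f"at 1 MB/s: ~{total_mb / 60:.1f} minutes")
# ...but at the rate we were seeing (minutes per image), it was hopeless.
seconds_per_image = 120      # "taking minutes to transfer each image"
print(f"at 2 min/image: ~{IMAGES * seconds_per_image / 3600:.0f} hours")
```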

The Faraday Lift

“I wish we had a Faraday Cage,” I said. Maybe if we got away from the crowd a bit, we reasoned, just maybe we could get better transfer speed. We found a possible solution over in the corner near the bathrooms, where furtive smokers lit up under a huge No Smoking sign. There stood a big piece of convention equipment, a telescoping platform lift car for setting up scaffolding and changing the lights in the 100 foot ceiling. Well, it was no cage, but it was big and metal and electrical… We placed the laptop and the car between its telescoped legs and prayed.

It worked! Suddenly our bandwidth spiked; we tried to stifle our glee and enjoyed an unspoken agreement with the illicit smokers: we won’t tell if you won’t. Twelve epochs later, we got our model onto our cars and hurried back to the track.

Crash and Burn

The new model was junk; the wheels jerked around haphazardly. We have no idea why, and it was a tough blow.

This one’s a PixMoving Tweet. Notice the nice, shiny floor?!

We fell back on our practice-day model, because it was better than nothing. We had had high hopes that we would place; we had never considered that we might not even finish. All the contestants drew sticks of wood with laser-engraved numbers on them, and with our hopes dashed even before the race began, we waited for our turns.

Most other racers faced even grimmer realities. Many had been unable to even start their cars, let alone train them. Some wandered hopelessly between the dispirited huddles, pleading for WiFi. The wheels had fallen off.

The Final Four

Out of 13 cars that made it to race day, only four managed even a single lap. The gray car, Pit Viper, did not complete a lap. We knew it wouldn't, because we'd seen it fail earlier. But the red car above, dubbed Stone Tiger, using the exact same model, miraculously made a complete lap! Wrong car, wrong day, but hey!

That lap got us into the Finals, since only four cars had qualified, but Stone Tiger never completed another lap. Well, at least she looked hot. The finals were a long, emotional roller coaster. So many attempts and failures…

The Winners!

However, the top three teams definitely deserved their winnings! The winning team, two Chinese computer vision students, put their camera on a pole like a ship's mast and converted the camera output to black and white. Theirs was the only car to keep at least one wheel inside the track for an entire lap. The other two winning teams focused on error recovery, training their cars to get back on track when they wandered off. Second and Third went to an independent Chinese student and a group of local Guiyang university students. The Guiyang group was by far the most jovial team; it was fun to see them celebrate, and their professor was radiant!
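
We don't know the winning team's exact pipeline, but the black-and-white trick is easy to sketch with OpenCV, and you can see why it would help under those lights: dropping color and keeping only bright-versus-dark throws away a lot of the glare and stage-light noise.

```python
# Sketch of a black-and-white preprocessing step (our guess at the idea,
# not the winning team's actual code).
import cv2

def to_black_and_white(frame_bgr, threshold=128):
    """Drop color, then keep only bright-vs-dark so floor glare matters less."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, bw = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return bw

# Example use, with a hypothetical camera helper:
# frame = camera.capture()
# model_input = to_black_and_white(frame)
```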

Rematch?

Perhaps we could have a better, more competitive race under more controlled conditions. I would love to participate in another Pix race, and they tentatively expressed interest in an indoor event at their workspace, which would help with both the environmental consistency and the WiFi problems. Also, I would dearly love to return! The friends are wonderful, and the noodles are very spicy but so, so delicious.

Judging by the people I met at Kuiakai, we have the minds to solve these problems and more. I have high hopes again. See you soon, I hope!

Even its name was BIG.

The View From Here (my daydreams and current course):

Perhaps a non-reflective mat could be used to eliminate glare from the road surface, and perhaps it could be rolled up and used in different locations. But let’s be honest, real cars drive in rain and snow and down the Las Vegas strip. And really, who wouldn’t love a nice Maple Valley race track replica for one of these races? How about some scenery?

Perhaps we could make a Faraday tent? Would a metal net or foil tarp work if it were draped over a pop tent? People could then go inside to start their cars and transfer data.

We need an affordable speed sensor to replace the throttle input. I’m building one soon. I know I’m not the first, but maybe you’ll like my design?
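
Just to give a flavor of the idea, here's a rough sketch of how a simple wheel-encoder speed sensor could feed a Raspberry Pi. This is not my actual design; the GPIO pin, pulse count, and wheel size are placeholders.

```python
# Rough sketch of a wheel-encoder speed sensor on a Raspberry Pi
# (a design idea, not my actual sensor; pin and constants are placeholders).
import time
import RPi.GPIO as GPIO

ENCODER_PIN = 17              # hypothetical GPIO pin
PULSES_PER_REV = 20           # slots on the encoder disc
WHEEL_CIRCUMFERENCE_M = 0.26  # measured wheel roll-out

pulse_count = 0

def on_pulse(channel):
    global pulse_count
    pulse_count += 1

GPIO.setmode(GPIO.BCM)
GPIO.setup(ENCODER_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(ENCODER_PIN, GPIO.RISING, callback=on_pulse)

try:
    while True:
        pulse_count = 0
        time.sleep(0.5)                       # sample window
        revs = pulse_count / PULSES_PER_REV
        speed_m_s = revs * WHEEL_CIRCUMFERENCE_M / 0.5
        print(f"speed: {speed_m_s:.2f} m/s")  # this, not throttle, is the label we want
finally:
    GPIO.cleanup()
```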

For now, Chris and I are switching our priorities to the Duckietown project. It's a much slower self-driving environment built on ROS (Robot Operating System), where robot cars called Duckiebots drive around a town with traffic rules, multiple cars, and pedestrians (yup, the pedestrians are little rubber duckies).

Also, I’m cooking up a pretty cool surprise app for you guys, but I can’t talk about it just yet.
