Steering With Vision

Tom Jacobs
Aug 23, 2017

When driving, finding where the road is and deciding where to steer is easy. For humans. For machines, not so much. Here’s my progress so far on writing an OpenCV lane finding system. It works, and next up, I’ll make it more robust to different lighting conditions, lane boundaries, and angles. Oh, and yes, see if you can guess what I’m using for the “road.”

First, we detect vertical edges:

So dizzy.
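
Roughly, the vertical-edge step boils down to a Sobel filter in the x direction. A minimal OpenCV sketch, not the exact code; the filename and threshold are placeholders:

```python
import cv2
import numpy as np

# Load a frame and convert to grayscale (the path is a placeholder)
frame = cv2.imread("road_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Sobel in the x direction responds to vertical edges like lane lines
sobel_x = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
scaled = np.uint8(255 * np.absolute(sobel_x) / np.max(np.absolute(sobel_x)))

# Keep only strong responses (the threshold of 40 is a guess)
edges = np.zeros_like(scaled)
edges[scaled >= 40] = 255
cv2.imwrite("edges.jpg", edges)
```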

Then, we accidentally run edge detection on the wrong image format and briefly experience a psychedelic freakout.

2001: A driving odyssey.

Then we go back to a sensibly tuned and optimised edge detection system:
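
In sketch form, "sensibly tuned" mostly means blurring a little before detecting edges and keeping the thresholds conservative; something like this, with the numbers as guesses rather than the actual values:

```python
import cv2

frame = cv2.imread("road_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# A light blur first, so texture noise doesn't show up as edges
blurred = cv2.GaussianBlur(gray, (5, 5), 0)

# Canny with roughly a 1:3 low/high threshold ratio (values are guesses)
edges = cv2.Canny(blurred, 50, 150)
cv2.imwrite("edges_tuned.jpg", edges)
```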

Next, we classify vertical-edge pixels on the left as the left lane (red) and those on the right as the right lane (blue). We run a small window up from the bottom of the image on both sides, tracking each line as the window moves up.

Some tuning required.
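
The window-tracking idea looks roughly like this; a sketch with assumed window counts, margins, and pixel thresholds rather than the real tuned values:

```python
import numpy as np

def track_lane(edges, x_start, n_windows=10, margin=50, min_pixels=30):
    """Walk a small window up from the bottom of the edge image,
    collecting the pixel positions that belong to one lane line."""
    h, w = edges.shape
    window_height = h // n_windows
    ys, xs = edges.nonzero()
    x_current = x_start
    lane_points = []

    for i in range(n_windows):
        y_high = h - i * window_height
        y_low = y_high - window_height
        x_left, x_right = x_current - margin, x_current + margin

        # Edge pixels that fall inside this window
        inside = ((ys >= y_low) & (ys < y_high) &
                  (xs >= x_left) & (xs < x_right))
        if inside.sum() > min_pixels:
            # Re-centre the next window on the mean x of these pixels
            x_current = int(xs[inside].mean())
        lane_points.append((x_current, (y_low + y_high) // 2))

    return lane_points

# Start the left window near the left quarter of the frame and the
# right window near the right quarter (the starting points are assumptions)
# left_lane  = track_lane(edges, x_start=edges.shape[1] // 4)
# right_lane = track_lane(edges, x_start=3 * edges.shape[1] // 4)
```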

After a bit of tuning, we get it working pretty consistently for sensibly angled lane lines:

Then I try some perspective warping with an overlay of the original image, which gives the windows this cool receding-perspective look, but doesn’t detect the lanes as well:
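
The warp itself is the usual OpenCV four-point perspective transform; here’s a sketch, with the source and destination corners made up, since you’d pick them off your own camera view:

```python
import cv2
import numpy as np

frame = cv2.imread("road_frame.jpg")
h, w = frame.shape[:2]

# Four points on the road in the camera image (placeholders)
src = np.float32([[w * 0.4, h * 0.6], [w * 0.6, h * 0.6],
                  [w * 0.95, h], [w * 0.05, h]])
# Where those points should land in the top-down view
dst = np.float32([[w * 0.2, 0], [w * 0.8, 0],
                  [w * 0.8, h], [w * 0.2, h]])

M = cv2.getPerspectiveTransform(src, dst)
warped = cv2.warpPerspective(frame, M, (w, h))

# Overlay the warped view on the original at half opacity
overlay = cv2.addWeighted(frame, 0.5, warped, 0.5, 0)
cv2.imwrite("overlay.jpg", overlay)
```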

And putting it all together, without the warping, and adding a centre line so we can then calculate a steering angle, we get:

Driving down the rainbow remote control road.
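
The steering angle then comes from how far the lane centre sits from the image centre. A rough sketch; the arctan mapping and the gain are assumptions you’d tune on the actual car:

```python
import numpy as np

def steering_angle_deg(left_x, right_x, image_width, gain=1.0):
    """Steer toward the midpoint of the two lane lines near the
    bottom of the frame. Positive means steer right."""
    lane_centre = (left_x + right_x) / 2.0
    image_centre = image_width / 2.0
    # Offset in pixels, normalised by half the image width
    offset = (lane_centre - image_centre) / image_centre
    # Map the normalised offset to an angle; the gain is a tuning guess
    return np.degrees(np.arctan(gain * offset))

# e.g. lanes detected at x=200 and x=500 in a 640-pixel-wide frame
# angle = steering_angle_deg(200, 500, 640)
```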
