How we made our vision system almost 100% reliable, without a Limelight

The most common vision solution in FRC: the Limelight.

Our team loves to be in control of everything that happens on the robot, from basic code to the vision system running on our coprocessor. Being in control of the whole vision pipeline reduces how much you have to trust other people's code to work reliably, while also teaching the team about a vast range of topics. This desire for control is one of the primary reasons we've repeatedly decided not to get a Limelight. With no ability to SSH in and poke around, and no open source repository of the code running on the system, we would have no way to change anything to fit our use case or audit the code for possible bugs. Essentially, this left us with one option: using our own vision code, which we've been doing with varying degrees of success since 2016.

Being in full control introduces a fairly significant issue, too: you need to be able to trust your own code, or have monitoring systems in place to help debug any potential problems. This year, we decided to introduce a new system to help monitor the status of our vision pipeline: a small, 5-inch screen on the side of our robot with color-coded status messages about each subsystem.

The status screen on the side of our robot, seen here in its 3D printed case.

The screen (attached to our coprocessor, a Jetson TX1) monitors four different systems and fail-points: radio connection (ping), robot connection (ping), NetworkTables connection, and whether the vision program is actually running. If any of these systems is down or not functioning, the status message and its color are updated accordingly.
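The first two checks are plain pings and the last is a process lookup, so they can be sketched with nothing but the standard library. This is a minimal sketch, not our exact code; the IP addresses and process name are placeholders, and the NetworkTables check is left out since it depends on the pynetworktables client.

```python
import subprocess

# Placeholder addresses for an FRC robot network (10.TE.AM.x scheme);
# substitute your own team number's addresses.
RADIO_IP = "10.0.0.1"
RIO_IP = "10.0.0.2"

def ping(host, timeout_s=1):
    """Return True if a single ICMP ping to `host` succeeds."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), host],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0

def process_running(name):
    """Return True if a process matching `name` shows up in pgrep."""
    result = subprocess.run(["pgrep", "-f", name],
                            stdout=subprocess.DEVNULL)
    return result.returncode == 0

def status_color(ok):
    """Map a health-check result to the screen's color coding."""
    return "green" if ok else "red"
```

Each check boils down to a boolean, which `status_color` turns into the green/red coding shown on the screen.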

The various statuses of our vision system.

Having this fancy screen on the side of our robot helped immensely in debugging problems with the robot and vision system. While we didn't record the exact number of times it was used, I can specifically remember at least a dozen occasions when the screen was used to fix problems in the vision software or its communication with the roboRIO, including a few times on the field before a match.

And it didn't just help us in debugging. Having the colorful (or rather, hopefully all-green) status screen displayed prominently on the side of our robot drew the attention of a few big-name teams and judges alike, who contacted us to ask more about it. It even helped us win the Autonomous Award at one of our events!


The code for the status screen was written with appJar, a super simple GUI library for Python. If you’re interested in checking out the code, contributing, or using it for yourself, it’s all available in our vision repository, here.
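For a sense of how little appJar code a screen like this takes, here is a hedged sketch: one label per monitored system, colored green or red. The `CHECKS` dict uses hard-coded placeholder results where the real program would call its ping/NetworkTables/process checks, and the import is guarded in case appJar (`pip install appjar`) isn't available.

```python
# Sketch of a color-coded status screen with appJar (pip install appjar).
# Guarded import so the module still loads where appJar isn't installed.
try:
    from appJar import gui
except ImportError:
    gui = None

# Placeholder check results; the real program would compute these from
# its ping, NetworkTables, and process checks.
CHECKS = {
    "Radio": True,
    "Robot": True,
    "NetworkTables": False,
    "Vision": True,
}

def build_app():
    """Build the status window: one label per system, green if healthy."""
    if gui is None:
        return None
    app = gui("Vision Status", "800x480")
    for name, ok in CHECKS.items():
        app.addLabel(name, "{}: {}".format(name, "OK" if ok else "DOWN"))
        app.setLabelBg(name, "green" if ok else "red")
    return app

if __name__ == "__main__":
    app = build_app()
    if app is not None:
        app.go()  # blocks, running the GUI event loop
```

In the real system the labels would be refreshed periodically as the checks rerun, rather than set once at startup.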


