Light Rail

Tracking the MBTA in real time, on my wall


Residents of the Boston area will instantly recognize the ragtag collection of lines pictured above — it’s a rapid transit map of the MBTA, commonly known as “the T”. Since I started school here two years ago, I’ve become really interested in transit systems, in particular Boston’s, which is rich with history and character. At the same time, I’ve also been working towards a degree in electrical engineering and computer science. It was just a matter of time until these two interests collided, and I’d like to present the result.

This is eight or nine meters of Adafruit NeoPixel strips driven by an Arduino Uno, which in turn takes orders from a Python script running on a Raspberry Pi. Every ten seconds or so it calls the MBTA API to grab the GPS coordinates of all the trains in the system. It maps those to some LEDs, decides which ones actually need to be changed, and then sends that information to the Arduino, which does the bit pushing. In addition, I’m writing a tiny webapp that lets me change visualizations and adjust the brightness for when I need to sleep. I’ve put together a full explanation with some photos below, so read on if you’re interested!

The Map

Every transit map represents a compromise between geographic accuracy and diagrammatic simplicity. For my project I needed something simple that would still show distances at an accurate scale. By far the best MBTA map I could find for that purpose is this one by designer Cam Booth, whose Transit Maps blog I check on a regular basis:

Credit: Cam Booth

In particular, I appreciate how this map handles the Green Line, which breaks into four branches of drastically different size. Many maps show the E branch, rightmost, as being the same size as the adjacent D branch, which is many times its length. Since I wanted the speeds at which trains move across the wall to be roughly correct relative to one another, a map showing the true lengths of the branches was essential. I tightened the angles and made some simplifications to Cam’s map to save myself some soldering, but overall I credit it as the basis for my project. I omitted the Silver Line and the Ashmont-Mattapan trolley both for simplicity and because I originally believed those branches didn’t have real-time data, though I’ve since found out that the Silver Line does (why don’t they put it up on the signs?). I used a projector to add reference markings made of masking tape, which remained unexplained on my wall for about a week:

It’s for…a thing.

The Hardware

I bought most of the stuff I used from Adafruit, which sells its NeoPixel brand of individually-addressable LED strips along with a dead-simple Arduino library, recommendations for other components you’ll need, and a large library of support material. I went with the 60 LED/meter strips, figuring the light density, price, and power requirements were about right. Adafruit says that each NeoPixel draws 60mA of current when fully lit, but in typical installations will take about 20mA (that’s one of red, green, or blue on at full blast). My calculations indicated that this put me juuuuust under 10A of current (something like 480 LEDs at 20mA apiece), so I splurged and bought two 5V/10A power supplies for some extra juice. The voltage drop across hundreds of LEDs really adds up, so I also needed a bunch of splitters to parallelize power distribution so that no segment runs more than two meters without a shot of Vdd. Completing the circuit are some lovely DC barrel jack to 2-pin wire adapters. I made this wiring diagram to explain the power distribution to my idiotic future self trying to transplant the installation to another wall:

Red is power and blue is data. Yeah. That sounds like something an electrical engineer would say.
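If you’re curious, the back-of-the-envelope power math goes something like this (lengths and counts are approximate, and the exact numbers depend on how much strip you actually hang):

```python
LEDS_PER_METER = 60
METERS = 8              # call it eight-ish meters of strip
TYPICAL_MA = 20         # Adafruit's typical per-pixel estimate
WORST_CASE_MA = 60      # all three channels at full white

n_leds = LEDS_PER_METER * METERS                  # ~480 pixels
print(n_leds * TYPICAL_MA / 1000, "A typical")    # ~9.6 A: juuuuust under 10
print(n_leds * WORST_CASE_MA / 1000, "A worst")   # ~28.8 A if it's all white
```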

The Grind

I had a rare week of totally free time to get this project done and it occupied almost all of my waking hours. After a trip to buy solder, capacitors, and an embarrassingly large quantity of assorted LEDs “for testing purposes”, I was ready to roll when the Adafruit goodies arrived on Tuesday. By “ready to roll” I mostly mean, “panicking, because I have never soldered anything in my life”. This seemed like an embarrassing gap in experience for a third-year EE major, and it was a huge problem because I was going to have to do a lot of soldering. The NeoPixel strips don’t bend laterally at all, so I needed to make a flexible joint for power, data, and ground at every bend or gap in the map. I spent about an hour practicing soldering wire first to itself and then to a few sacrificial NeoPixels (RIP) before I was confident enough to try it on the real thing.

also RIP my phone camera, 2013–2013

But solder is very forgiving, and within [REDACTED] minutes I had my first joint soldered together and ready to rumble. I conscripted an Arduino at this point — I thought I’d just use it for a few tests before moving on to direct control from the Raspberry Pi, but, just like my life, it soon became consumed by this project and is now mounted to the wall.

The testing apparatus. I ran power through some 2-pin JST connectors sold by Adafruit.

I plugged it in and held my breath…it worked! One down, nineteen or something to go. It was at this point that my roommate Ostin came by and gave me some excellent soldering tips, er, pointers, but I still spent dozens of hours over the next couple of days soldering the strips together and prodding them gingerly with a multimeter.

Artsy

I joined each line separately, also breaking the northern half of the Red Line off from its branches because the two parts would receive power in parallel. Days of thankfully non-lead-based soldering later, my desk looked like this:

I guess this is my life now.

And I looked like this:

On the internet, nobody knows you’re a penguin.

But I pressed on. At this point Ostin realized that the whole thing would be over with much faster if he helped, and so we agreed to stay up all of Friday night mounting these things to the wall. He soldered a bunch of bullet connectors from his own stash to run the data lines together between different strips and the Arduino, for which I am immensely grateful. At about four in the morning, we also had to make a trip to MITERS down the street, where a sleepy wizard opened the door for us and let us take as much wire as we wanted. The sun was up and we were delirious by the time this sad partial timelapse was taken of us mounting the strips to the wall with purpose-made silicone clips from Adafruit:

Not actual speed

The Second Coming of Ra, the Sun God

Wear sunscreen.

I tested the bejeezus out of the strips with some sample code that comes with the NeoPixel Arduino library. Damn are these things bright! My eyes hurt after staring in awe for a few minutes, so I took a break for some sleep. With all of the strips and part of my brain working, I could now start the fun (or at least less carpal-tunnely) part: writing the code.

Enlarged to show texture

The Code

As I mentioned earlier, I originally planned to drive the strips directly from the GPIO pins on a Raspberry Pi. This is made complicated by the fact that NeoPixels have strict timing requirements and require an 800kHz data stream, where the computer sends a new bit every 1.25 microseconds and then holds the line low for fifty microseconds to say “I’m done”. Regular old user-mode programs on Linux can’t do this, because they are constantly paused so OS-level interrupts can run. There’s some deep magic from before the dawn of time that lets you drive NeoPixels from inside an interrupt, but in the interest of simplicity I decided to use the Arduino instead.

Kernel interrupts are like these ducks, but less cute in every way.

Meanwhile, on the other end of the pipeline, I wrote some code to read train data from the MBTA API and to determine the color of each pixel in the map. The API was surprisingly well-documented and easy to use, and when I emailed the MBTA asking them to raise my request limit, they bumped it the next day from 10,000 to 100,000 requests with no questions asked. Overall I was really impressed with how well they accommodate developers, though to be fair most of them are writing much more useful clients than mine.
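The fetch-and-poll loop at the heart of the pipeline looks something like this (a sketch: the endpoint and helper names are illustrative rather than lifted from my actual code, and error handling is omitted):

```python
import time
import requests  # third-party: pip install requests

# Illustrative endpoint; the real MBTA-realtime API also wants an API key.
API_URL = "http://realtime.mbta.com/developer/api/v2/vehiclesbyroutes"

def fetch_trains(api_key, routes):
    """Grab the current GPS position of every train on the given routes."""
    resp = requests.get(API_URL, params={
        "api_key": api_key,
        "routes": ",".join(routes),
        "format": "json",
    })
    resp.raise_for_status()
    return resp.json()

def poll_forever(api_key, routes, render, send_to_arduino, interval=10):
    """Poll the API, render trains to LED colors, and ship only the changes."""
    previous = None
    while True:
        frame = render(fetch_trains(api_key, routes))  # GPS -> per-LED colors
        if frame != previous:
            send_to_arduino(frame)  # the Arduino does the actual bit pushing
            previous = frame
        time.sleep(interval)  # the API only updates every ~10 seconds anyway
```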

Mapping the precise GPS coordinates provided by the API to rough locations on a one-dimensional line was an interesting problem. I considered working with GPS coordinate paths of each train line or with the arrival sign data from consecutive stations to determine where each train should appear on the map, before deciding on a simpler approach. Approximating the track between each pair of adjacent stations as a line segment, I found the segment with the minimum geographic distance to the train, which told me which two stations the train was currently between.

I’ve exaggerated the track curve in this animation — the linear approximation is very good pretty much everywhere in the system.
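In code, the segment test is just a clamped vector projection. Here’s a minimal sketch of the idea (it assumes an ordered list of station coordinates per line, and treats latitude/longitude as planar coordinates, which is close enough over the couple of kilometers between stops):

```python
import math

def point_to_segment(px, py, ax, ay, bx, by):
    """Distance from point P to segment AB, plus P's fraction along AB."""
    dx, dy = bx - ax, by - ay
    # Project P onto AB, clamping so we never overshoot either station
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy), t

def locate_train(train, stations):
    """Return (segment index, fraction along it) for a train's GPS position."""
    distances = [
        point_to_segment(*train, *stations[i], *stations[i + 1])
        for i in range(len(stations) - 1)
    ]
    best = min(range(len(distances)), key=lambda i: distances[i][0])
    return best, distances[best][1]  # e.g. (3, 0.4): 40% from stop 3 to stop 4
```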

From here I could take the ratio of train-station to station-station distance and pretty much know where the train should appear on a line of pixels. After counting and recounting LEDs I had a representation of which lights appear on which lines. I also had to tell the program which lights represented specific terminus, transfer, and branching stations, but I let it interpolate most station positions automatically for now. The output of the program at this point was an ASCII map of each line:

It was finally time to connect the client application to the Arduino over serial! At this point I had to get realistic about timing and memory and make some tradeoffs. Should I treat the Arduino as a dumb client that does nothing more than pass color values from the Pi to the pixels? This would mean I’d need to send data about a lot of pixels over serial very quickly. Or should the Arduino take a more active role in changing the state of the lights? If so, I’d be fragmenting controller logic across languages and devices, and I’d have just a few hundred bytes of memory left over on the Arduino to work with. In the end, I went with the first option, having Python serialize its map representation and send it to the Arduino. At most I had 115,200 bits per second to work with, so I came up with a two-step “diff” that compares the previous state with the new state and consolidates adjacent pixels as much as possible. This saved a lot of bytes and made it really easy to set the color of entire strips very quickly.
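The diff itself looks roughly like this (a sketch of the idea; the real wire format packs each run into a handful of bytes):

```python
def diff_frame(previous, current):
    """Compress a new frame into (start, color, length) runs of changed pixels.

    Both frames are lists of (r, g, b) tuples, one per pixel. Step one skips
    pixels that didn't change; step two merges adjacent pixels that share a
    color into a single run, so repainting a whole strip is one command.
    """
    runs = []
    i = 0
    while i < len(current):
        if previous is not None and current[i] == previous[i]:
            i += 1  # unchanged pixel: send nothing
            continue
        start, color = i, current[i]
        while i < len(current) and current[i] == color:
            i += 1  # extend the run as long as the color holds
        runs.append((start, color, i - start))
    return runs
```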

Two Green Line trains pass (or collide?)

The last piece of the puzzle, which I’m gonna mention real casually even though it consumed my life for two days, is the timing of data across the serial port. Writing to 500 LEDs takes about 15 milliseconds, an eternity in processor time. Thus, I had to coordinate the Python script and the Arduino to make sure that I wouldn’t be overflowing the serial port with data while this was happening. In practice this just meant the system is in one of two modes, writing to the Arduino or writing to the LED strips, and I had to spend a long time tweaking millisecond “sleep” values to make the timing as tight as possible. At this point it can update the main visualization about 15 times per second, enough to make it look pretty smooth. All the code for this project is available on GitHub.
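The Python side of that handshake boils down to something like this (the device path, byte framing, and exact sleep value are all illustrative; the real numbers took days of tweaking):

```python
import time
import serial  # third-party: pip install pyserial

STRIP_WRITE_S = 0.015  # ~500 LEDs at ~30 microseconds apiece

def send_frame(port, runs):
    """Write one frame of (start, color, length) runs, then go quiet."""
    for start, (r, g, b), length in runs:
        # Two bytes of start index, three of color, one of run length
        port.write(bytes([start >> 8, start & 0xFF, r, g, b, min(length, 255)]))
    port.flush()
    # The Arduino now switches to "writing to LED strips" mode. Interrupts
    # are off while it bit-bangs the NeoPixels, so bytes sent during this
    # window would be dropped; sleep until it's listening again.
    time.sleep(STRIP_WRITE_S)

port = serial.Serial("/dev/ttyACM0", 115200)  # example device path
```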

Duuuuuuude

Prior Art

Several times I’ve googled “LED transit map” to see if anyone else has come up with this idea. I didn’t find anything quite like mine, but I did find a couple of cool pieces of art based on the “L” in Chicago, including this neon parallax sculpture hanging in CTA headquarters and a crazy cool stock ticker in Motorola’s Chicago office. There are also these neat neon interpretations of well-known subway maps by Petr Koll. Finally, a friend of mine taking MAS.863 “How to Make Almost Anything” at the MIT Media Lab came across a post from one of last year’s students who tried to build a very similar map as a final project but didn’t get it working in time, because the class isn’t called “How to Buy Almost Anything From Adafruit”.

Conclusion

It’s become apparent that this is one of those projects with a first 90% and a second 90%. Although I’m really happy with how the installation works now, there are already a lot of improvements that I’m looking to make:

  • I’m working on a web app that lets me turn the lights on and off from my phone so I don’t have to use SSH (or the power cord) every time.
  • The MBTA API only updates the position of a train on average once every 10–15 seconds. I’ve tried using the position and velocity of trains to interpolate where I think they are during that interval (sketched below, after this list), but it often ends up being wrong and the effect is jarring (sometimes trains jump back a few lights, or the approach between two trains is replayed several times). If you live in Boston you already know that the T defies any kind of prediction except possibly the kind that involves compacting with the Old Gods. Regardless, I think that if I train a state estimator on train position data for a week or so, it might give me better predictions than a simple linear interpolation. Or the T may realize that I’m on to it and simply catch fire all at once.
  • There are some improvements that I could make to the way I send data over serial. The strip diffing technique I explained above is neat but not necessarily optimal, and finding the best solution sounds like a vaguely interesting dynamic programming problem. In addition, the serial protocol is still kinda dumb, and I think that if I make it a little bit smarter, especially by caching colors (3 bytes apiece) on the Arduino, I could coax it into being roughly twice as efficient — the reason I haven’t done this yet is just because the current incarnation is good enough for now.
  • I have just enough LED strip left over for an extension of the Green Line to, let’s say, Route 16 and Union Square. Your move, Charlie Baker.
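For reference, the simple linear interpolation mentioned above amounts to dead reckoning (a sketch with illustrative units: position as a fraction along the line, velocity in fractions per second):

```python
import time

def dead_reckon(last_seen, position, velocity, now=None):
    """Guess where a train is between API updates by extrapolating linearly.

    `position` is where the train was at time `last_seen`. When the next
    real update disagrees, the train visibly snaps to its true position,
    which is exactly the jarring effect described above.
    """
    elapsed = (now if now is not None else time.time()) - last_seen
    return position + velocity * elapsed
```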

Thanks for taking the time to read this! My name is Ian Reynolds and I’m a junior in course 6–2 at MIT. If you want to get in touch (especially if you want to build something like this yourself), drop me a line at idreyn [at] gmail.com.

Many thanks to Michael Harradon, Ostin Zarse, Ben Chrobot, Kenny Friedman, and Tiffany Chen for their help with this project.