Using Go and Braille to Render Images and Video to a Terminal

Or: I’m Sad and Jealous to be Missing GopherCon This Year, so I Wrote This in Absentia

Kevin Cantwell
5 min read · Jul 11, 2017

I generally hate going to tech conventions but I do love me some GopherCon. The talks are rarely boring or a waste of time, the community is generous and kind, and the passion for the language is driven less by fanaticism and more by pragmatic endeavor. I’ve been to every single GopherCon so far and even had the honor of speaking at one, but sadly I’m going to miss this year’s event. So, in a nod to GopherCon’s tradition of hosting a day of lightning talks (a series of short talks by many presenters, each lasting 5 minutes or less) I present you with a brief story about a fun little tool I created — Dotmatrix — which uses braille Unicode characters to render images, animated gifs, and even streaming video. I hope you enjoy it :)

Inspiration

A while ago I was tickled by a small update to the Heroku CLI, which added a unique spinner I hadn’t seen before:

I didn’t know about the braille Unicode set before then and for some reason my brain wouldn’t shut up about it. I wondered: If arranged in a matrix, couldn’t this character set be used to render pixel-perfect monochromatic images? You know, just like one of those old dot matrix printers? I knew of tools like cacaview and imgcat that print images to the terminal using ANSI color codes, but the braille method could display images at twice the resolution. I had just found my new side project!

A Type Alias is Born

Turns out, each braille Unicode pattern can be calculated from a simple algorithm [source]. I declared a type alias of a 2x4 int array and implemented a method to calculate the corresponding Unicode rune:

https://en.wikipedia.org/wiki/Braille_Patterns#Identifying.2C_naming_and_ordering
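In Go, that mapping boils down to something like the following sketch. The names here are mine and the real Dotmatrix code may differ, but the bit ordering is the one from the Wikipedia page above: dots 1–6 fill the top three rows column by column, and dots 7–8 fill the bottom row.

```go
package main

import "fmt"

// Braille holds a 2x4 grid of dots (indexed [column][row]);
// a non-zero cell means the dot is raised.
type Braille [2][4]int

// Rune maps the grid onto the braille Unicode block, which begins at
// U+2800. Each dot contributes one bit of the offset from U+2800.
func (b Braille) Rune() rune {
	bits := [8]int{
		b[0][0], b[0][1], b[0][2], // dots 1-3: left column, rows 0-2
		b[1][0], b[1][1], b[1][2], // dots 4-6: right column, rows 0-2
		b[0][3], b[1][3], // dots 7-8: bottom row, left then right
	}
	offset := 0
	for i, bit := range bits {
		if bit != 0 {
			offset |= 1 << uint(i)
		}
	}
	return '\u2800' + rune(offset)
}

func main() {
	fmt.Printf("%c %c %c\n",
		Braille{}.Rune(),                           // no dots: U+2800
		Braille{{1, 1, 1, 1}, {0, 0, 0, 0}}.Rune(), // left column only
		Braille{{1, 1, 1, 1}, {1, 1, 1, 1}}.Rune(), // all dots: U+28FF
	)
}
```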

I then wrote a command-line tool that would accept an image as input, carve it into a grid of Braille types, print the result as lines of text, and…voilà!

Well, would you look at that! Just like those old dot matrix printers but without the perforated margins. I could have left it there, considered it a job well-enough done, but I was unsatisfied. Although this program worked for high-contrast images, it kinda sucked for anything with lots of shading or color. Even this image of Saturn loses all detail on the planet and its rings.

Diffusion Confusion

I knew I was limited to strictly black and white pixels, but I thought there must be some way to “shade” neighboring areas of color in grayscale, like our man David here on the left. I’m pretty ignorant when it comes to image manipulation techniques, so I called in some backup. My brother is a skilled photographer, and when he took a look at my work he said many things, most of which went right over my head. However, I did catch at least one phrase that made my ears perk up: “Floyd-Steinberg diffusion”. Hmm, where have I seen that name before… Oh! That’s right, it’s right there in the draw package:

https://golang.org/pkg/image/draw/#Drawer

Thank you standard lib! Now I could achieve much more accurate results without any additional work, which was perfect because I hate work ;)

Look how much more detail the below rendering captures using the FloydSteinberg drawer:

You can make out the rings and even some of the surface detail. Cool!

A few more examples:

High contrast black and white images look best.
R2 looks good in any setting.

Animated Gifs

At this point static images looked good and I was pretty happy with that. I even added a few image manipulation functions to my tool such as brightness and contrast adjustments. But the image/gif package is capable of decoding animated gifs and I just had to have that feature. Scope creep!

Each frame of the gif had to be composited according to the disposal method specified in the gif file, converted to a Braille matrix, and printed to the terminal with the correct delay. I also had to use ANSI escape sequences to reposition the cursor after each frame.

This took quite a bit of extra time to get right, but the results are pretty satisfying (if I do say so myself). I also learned a lot about the various gif-rendering algorithms in the process (AMA). These are some screen-caps from OS X’s Terminal app, scaled down to reduce file size:

Beyoncé’s Lemonade and Akira

Video Streaming

I’m nowhere close to being able to decode most video formats, but there is one common format — used by many webcams — that was a natural fit for the Dotmatrix tool: Motion JPEG (or mjpeg). This format is nothing more than a bunch of jpeg files concatenated together by a special delimiter. It seems that most webcams can deliver their video in this format when captured from a browser, but if you own a Mac you can also capture the iSight camera feed and convert it to an mjpeg stream using ffmpeg:

ffmpeg -r 30 -f avfoundation -i FaceTime -vf hflip -f mjpeg -loglevel error pipe:1

Dotmatrix accepts this mjpeg stream as input and you can use it to render live video to your terminal. Here’s me while writing this article and “printing” myself to iTerm at 12 frames per second :)

Thanks for Reading!

That’s it, short and sweet. This was fun to write and I hope you enjoyed it. If you want to take a closer look at the code responsible for this, check out my GitHub or feel free to DM me via Gmail or Twitter. To all my peeps at GopherCon 2017, especially those organizing and speaking, I salute you. I shall be living vicariously through you on Twitter over the next few days :)
