Very Slow Movie Player

Walking around Brasília some years ago I had the distinct feeling that I was doing it “wrong” because, of course, I was. The center of Brasília is organized along the Eixo Monumental, featuring an array of government and other important buildings that form a long spine. This is a place designed to be “read” at the speed of a vehicle, so taking in Brasília on foot is like watching a movie in slow motion. It turns out, both can be rewarding in unexpected ways.

With a little bit of patience, the details of both reveal unexpected and delightful moments. In Brasília, pedestrians are rewarded with an opportunity to discover the subtle variations between what look to be mega-scaled buildings. Rhythmic reflections and shadows bring surfaces to life under the tropical sunlight in beautiful and nuanced ways. Just don’t forget to put on sunscreen, because the distances are intended to be enjoyed from the comfort of a motor vehicle.

It may not be an obviously human-scaled city, but Brasília holds many delights for the curious pedestrian. Photos from 2010.
The Quintet of the Astonished (2000) by Bill Viola

On the other hand, watching movies in slow-mo is not something that I’ve had experience with outside of seeing the occasional Bill Viola installation. Until, that is, I started to tinker with ePaper components and JavaScript in the depths of a Michigan winter, looking for a way to celebrate slowness.

Can a film be consumed at the speed of reading a book? Yes, just as a car city can be enjoyed on foot. Slowing things down to an extreme measure creates room for appreciation of the object, as in Brasília, but the prolonged duration also starts to shift the relationship between object, viewer, and context. A film watched at 1/3,600th of the original speed is not a very slow movie, it’s a hazy timepiece. A Very Slow Movie Player (VSMP) doesn’t tell you the time; it helps you see yourself against the smear of time.

I’ve described VSMP in more detail below, but this video explains it more readily.

VSMP is an object that contains a Raspberry Pi computer, custom software, and a reflective ePaper display (similar to a Kindle), all housed inside a 3D-printed case. Every 2.5 minutes a frame from the film stored on the computer’s memory card is extracted, converted to black and white using a dithering algorithm, and then communicated to the reflective ePaper display. This adds up to playing the film at a rate of 24 frames per hour, in contrast to the traditional speed of 24 frames per second. That’s the slow part, obviously.
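Back-of-the-envelope arithmetic makes the scale of that slowdown concrete (the two-hour runtime below is an illustrative assumption, not any particular film):

```javascript
// At 24 frames per second a film holds 86,400 frames per hour of runtime;
// played back at 24 frames per hour, every second of film becomes an hour.
const SOURCE_FPS = 24;   // traditional playback speed
const PLAYBACK_FPH = 24; // VSMP playback speed, in frames per hour

function playbackHours(runtimeMinutes) {
  const totalFrames = runtimeMinutes * 60 * SOURCE_FPS;
  return totalFrames / PLAYBACK_FPH;
}

// A two-hour film: 172,800 frames, shown one every 2.5 minutes.
console.log(playbackHours(120));      // 7200 hours
console.log(playbackHours(120) / 24); // 300 days
```

That 3,600× ratio between seconds and hours is where the “1/3,600th of the original speed” figure comes from.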

Films are vain creatures that typically demand a dark room, full attention, and eager eyeballs ready to accept light beamed from the screen or projector to your visual cortex. VSMP inverts all of that. It is impossible to “watch” in a traditional way because it’s too slow. In a staring contest with VSMP you will always lose. It can be noticed, glanced at, or even inspected, but not watched. That’s one of the things I like about the Bill Viola pieces. You don’t watch them because they’re not films; they’re portraits, so you see them, and it just so happens that you see them in four dimensions.

The screen technology used in VSMP is reflective like a Kindle, instead of emissive like a television or computer, which means that the image you see when you look at VSMP is always a compromise with the environment. Hang an LCD on the wall and it looks more or less the same in darkness or light. LCD, OLED, and other emissive screen technologies do not respect context, they obliterate it. Displays using ePaper are still largely black and white, but here’s something ePaper can do that emissive displays cannot: work politely in the dark. The images below show VSMP progressing from full afternoon sun into civil twilight. If the room is dark, VSMP’s imagery is dark.

If the light is warm, VSMP is warm. If the room is bright, VSMP is bright. The device’s interactions with shadow and ambient light are its most distinguishing feature. In theory a movie is the same wherever it is played. Cinephiles have strong opinions about projector types, screen quality, audio (and everything else) but the idea is that the film is the film because ultimately the light that shoots out of the projector is the light, regardless of when and where it’s projected. With VSMP the weather, location, and orientation are all factors that define what you see.

Delightful moments of unexpected cooperation between celestial and digital show up, such as this spotlight (below) that the sun added to Jimmy in Quadrophenia at 10:55 AM on March 15, 2018 in Detroit, MI, with a north-north-east-facing exposure. That’s a mouthful, right? But the video content, date of playback, time, location, and orientation of the device all coalesce to produce the image that you see, so the whole mouthful is the only useful way to describe the performance.

Shadow, color, temperature. This reminds me of Alvin Lucier’s “I am Sitting in a Room,” in that VSMP is a performance that relies on place. Lucier’s loops gradually overwhelm human speech with an assist from the shape, size, and materials of a specific room. On the other hand, VSMP is fundamentally feeble without an assist from the light that floods through windows, bounces off walls and other surfaces, and eventually illuminates its display. Rather than create a battle between context and content, as in Lucier’s recording, VSMP collaborates with place in a small way by borrowing the light of the room it lives within.

The nice thing about a technology that is enhanced by place is that it immediately has architectural implications, so I’m excited to bring some of this exploratory work into future projects at Dash Marshall. We would like to experiment with larger-format displays as wall surfaces for both interior and exterior use.

The precedents for ePaper at an architectural scale are limited, with notable examples including OMA’s renovation of the Delegates Lounge at the UN, which features a wall of small ePaper panels. Future Cities Catapult did some top-notch speculative work demonstrating the politeness and sensibility of ePaper in public spaces, including some new interactions which are lovely. There’s still plenty of work to be done in this area, particularly in light of the low power requirements of ePaper.

Sydney deployed real-time ePaper bus stops by Visionect earlier this year (below left) and they’re beautiful. Compared to LinkNYC (below right), which uses big bright LED screens, the ePaper displays are more urbane because they are significantly more deferential to the public realm. As an urban display, the ePaper totems don’t do as much as Link, no doubt, but it’s still not quite clear what itch Link is scratching on behalf of the public. I’m interested in ePaper displays as a low-cost, easy-to-maintain way to bring neighborhood content back into shared physical spaces: parks, libraries, ice cream shops, and the like.

Left: Sydney bus stops by Visionect. Right: LinkNYC. Images: Visionect, AM NY

VSMP also suggests a pull in a different direction towards new possibilities of ornament and decoration, towards ways to make our digital lives present in our physical world in more subtle ways. This ePaper-tiled parking garage at San Diego International Airport is one example of the ornamental potential for ePaper. And while the parking garage does change appearance, there is further room to explore the possibilities of a building or surface whose appearance evolves slowly over time. At a pace of 24 frames per hour, an evolving wall would change almost imperceptibly.

As the architectural equivalent of Ninja Cat (a non-binary between motion and stillness), ePaper walls offer a way to play with a handful of interwoven timelines: the minutes of human rituals, the hours of our planet’s rotation, and the months of Earth’s annual orbit around the sun. We might create a wall of William Morris wallpaper patterns that blossom and wither at the annual pace of the seasons. Or use footfall sensors to track visitors and add one black pixel to the walls for each person who enters, very slowly turning the walls from black to white in a digital obliteration room. Or add fantastical digital supplements to the predictable shadows that fall across the facade of a building. Shadow puppets at the scale of a building? Stay tuned, or get in touch if you would like to collaborate with us on this.


Building a Very Slow Movie Player

I did not intend to have to build everything from scratch, but that’s how it ended up, so in case this is useful to others, here’s how a very slow movie player comes together. It started with ordering a 7.4" screen from Pervasive Displays and a USB Timing Control Module. The module is supposed to mount like a USB device and accept a proprietary “.epd” binary file format, which is then flashed onto the screen. Knowing nothing about binary files or image processing, I was eventually able to refactor this Python code into Node.js (the best programming language for the job is the one you already know, right?), which led, after some real stumbling, to this miraculous sight:

Proof of EPD

Node-fluent-ffmpeg is the library that VSMP uses to extract frames from a video file. These are then passed to a dithering module that prepares the full-color video still for display on a 1-bit screen. Tanner Helland’s dithering algorithms were useful here, and it was fun, if indulgent, to spend an afternoon playing with the various algorithms that formed the scenography of a childhood spent in front of various computers.

Still from Quadrophenia…
… and now the same still, dithered with the Floyd-Steinberg algorithm. The level of detail is pretty remarkable, actually.
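For the curious, here is a minimal sketch of how Floyd-Steinberg error diffusion works on a grayscale buffer; the tiny array in the example is a toy stand-in for a real video still, and the function is mine, not lifted from the VSMP source:

```javascript
// Floyd-Steinberg dithering: quantize each pixel to black (0) or white (255),
// then push the quantization error onto neighbors that haven't been visited
// yet, so local averages of the 1-bit output track the original grays.
function floydSteinberg(pixels, width, height) {
  const buf = Float64Array.from(pixels); // floats carry fractional error
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = y * width + x;
      const old = buf[i];
      const quantized = old < 128 ? 0 : 255;
      buf[i] = quantized;
      const err = old - quantized;
      // Classic weights: right 7/16, below-left 3/16, below 5/16, below-right 1/16.
      if (x + 1 < width) buf[i + 1] += (err * 7) / 16;
      if (y + 1 < height) {
        if (x > 0) buf[i + width - 1] += (err * 3) / 16;
        buf[i + width] += (err * 5) / 16;
        if (x + 1 < width) buf[i + width + 1] += (err * 1) / 16;
      }
    }
  }
  return Uint8Array.from(buf); // every pixel is now exactly 0 or 255
}
```

A flat 50% gray field comes out as an even scatter of black and white pixels, which is the texture that lets a 1-bit screen suggest continuous tone.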

With sequential video stills being extracted from an mp4, dithered, and converted to EPD files, I excitedly hacked together a “case” for the screen and its controller so both could hang above my desk where they would be easily visible, and then fired up a cron job to copy frame after frame to the device, updating the screen each time. What I discovered was some kind of conflict between the Raspberry Pi and the controller module that produces an error every 10 frames or so, which is quite problematic if you’re interested in continuous playback. To make matters worse, the constantly running Node.js script I used to occasionally push new frames to the screen ran out of memory about once a day.

As an aside: at the time of filming this time lapse, VSMP was running at one frame per minute because that seemed sensible and plenty slow. However, when I tried to show friends the project, it took a lot of effort to explain the idea. Conceptually, it’s easier to describe a slowdown from 24 fps to 24 fph (frames per hour) than to some arbitrary speed like 60 fph. Regardless of playback speed, the recurring glitch and regular crashes were an impediment.

Original VSMP test hacked together in the sunlight of a July afternoon. A glitch appeared on the last of these nine frames and repeated every 9–13 frames. Therefore I decided to write my own code to control the ePaper display.

My original plan was to make the screen module as compact as possible, which I planned to accomplish by using a Raspberry Pi 3B (RPI3B) to power the device and to let the RPI3B be attached to the wall plug or to sit on the floor near it. The slim chassis below was an experiment to see how that would work. In the end, it was not possible to continue down this route because the display glitch above required switching from USB to GPIO.

Abandoned case concept that favored a slim profile

If you’re asking WTF is GPIO, worry not; I was in the same situation. The only good thing about GPIO is that it means your device literally has a rainbow inside of it. Small victories?

GPIO stands for General-Purpose Input/Output, and the name is accurate. It’s a way to send and receive signals between your computer and any kind of device; in this case, a screen. Signals are sent as bits, so with an ePaper screen you’re transmitting instructions to turn each pixel on or off. It took me forever to figure out the right way to physically connect the screen to the RPI3B. After a good bit of trial and error and some minor hair loss, I succeeded in porting Node-ePaper, which was written for BeagleBone, to work with Raspberry Pi hardware. That’s now on Github as Node-ePaper-RPI. But then the rainbow cables!
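To give a flavor of what “transmitting instructions to turn each pixel on or off” means at the byte level, here’s a hypothetical packing helper; the MSB-first ordering is an assumption for illustration, since the panel’s datasheet dictates the real framing:

```javascript
// Pack an array of 1-bit pixels (0 or 1) into bytes, most significant bit
// first: the kind of payload a driver shifts out to the panel bit by bit.
function packPixels(bits) {
  const bytes = new Uint8Array(Math.ceil(bits.length / 8));
  bits.forEach((bit, i) => {
    if (bit) bytes[i >> 3] |= 0x80 >> (i & 7); // set this pixel's bit
  });
  return bytes;
}

// Eight pixels fit in one byte: a lone leading white pixel becomes 0x80.
console.log(packPixels([1, 0, 0, 0, 0, 0, 0, 0]));
```

At 800×480 pixels that’s 48,000 bytes per frame, which is why each refresh takes a noticeable moment to shuffle across the wires.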

The two images below probably look similar, but they document an important upgrade. On the left, the 10 GPIO cables are loose and unruly and easily came undone from the RPI board. On the right, VSMP is wired with custom jumper cables in exactly the dimensions and connection pathways I needed. The only disappointing thing about the custom cables is that I could not find a right-angle connector, so the cables emerge perpendicularly from the RPI3B. Ultimately this makes the case deeper than it needs to be.

Adafruit’s housing blocks and custom jumper wires making the inside of VSMP nice and tidy

With the EPD files working and GPIO under control, we now have something like the magic seen below in the terminal window. Having built this thing from the ground up (with a lot of help from open source code), the progress bar seen below is one that I feel the weight of more than any other progress bar I’ve ever watched. It’s like I can feel each pixel jumping from computer to screen. The pace of it is always a little different, making it feel like a physical process. Which it is on some level: electrical signals zapping literally bit by bit. I still enjoy watching the progress bar stutter to completion.

Watching VSMP’s software pleasantly do its thing
Raspberry Pi Zero is very small and very slow

If you caught the name of the script as “single_frame.js,” it’s because the software that controls VSMP processes one frame and then quits. This is scheduled with a cron job to run every 2.5 minutes, which allows the computer to manage its memory better and prevents the regular crashes we were experiencing previously. Each frame takes roughly 25 seconds to process and transfer to the screen, at which point the display flashes and the new frame appears. That leaves the computer about two minutes of downtime between frames, so I wanted to see if it could run on a Raspberry Pi Zero instead, letting the natural slowness of that device provide a roughly 2.5-minute spacing between frames by working as hard as it could on a constant loop. I installed everything on a Pi Zero and briefly dipped back into USB transfer mode to test speeds. The experiment was moot: the Pi Zero took five long minutes to process a single image. Too slow.
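Since standard cron only fires on whole-minute boundaries, a 2.5-minute cadence takes a small trick. One way to sketch it (the paths here are hypothetical, not VSMP’s actual configuration) is to pair two five-minute schedules, the second delayed by 150 seconds:

```
# Fire on every 5-minute boundary...
*/5 * * * * node /home/pi/vsmp/single_frame.js
# ...and again 150 seconds after each boundary, for a net 2.5-minute cadence.
*/5 * * * * sleep 150 && node /home/pi/vsmp/single_frame.js
```

Because each run processes a single frame and exits, a missed or overlapping tick costs one frame at most rather than crashing a long-lived process.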

Here’s a compilation of all of the time lapse test shots I took during the development of this thing. Lots and lots of Quadrophenia, which is Laura’s favorite movie and one that I’ve still only seen at the plodding pace of VSMP.

As a final step I tested a few case configurations to find something that minimizes “digital picture frame” vibes. I need to spend more time on this because I’m not happy with the physical design of the case, but for now this cowl-like profile has some advantages. The notches cut into each corner of the cowl create opportunities for VSMP to cast shadows onto itself, which is a bit showy but has a nice effect. It’s printed in Shapeways’ “Strong & Flexible” black nylon, which has a rough surface that soaks up light and produces a matte appearance.

VSMP casting shadows on itself

Below is a sequence of images from the assembly of VSMP. One of the things I find uncanny about ePaper is the way it keeps an image even when it’s disconnected from power. Even though VSMP is very clearly a digital object that is either on or off, the sustained presence of the image gives the device a rounder presence in our world. It does not obviously turn on and off; instead, VSMP is always present. It renders the digital continuous. To me, it’s in these half-assembled states when VSMP is most clearly what Robin Sloan calls a flip flop. In this case, a film captured with the physical light of one place, turned into digitally manipulated images, which are then turned back into physical images, reflective and constant under a new sun.

VSMP presented a rewarding set of technical challenges to noodle through, but ultimately I built it because I wanted to be more cognizant of time. I wanted to see slowly, and to enjoy the way context blooms around an object when doing so. So it seemed fitting that the details would be important.

The back of the device is really never seen, but it is meant as a reminder of the original intention behind the project as a tool for human use rather than some kind of blunt device. Ridges are formed on the back depicting the intersection of two sets of lines: one curving in a circle and one horizontal. One cyclical and one straightforward. Fast and slow meeting at a contemplative intersection.