Creating a Very Slow Movie Player
Rendering digital images in a new way holds a lot of possibilities
Walking around Brasília some years ago, I had the distinct feeling that I was doing it “wrong”—because, of course, I was. The center of the capital city of Brazil is organized along the Eixo Monumental, or Monumental Axis, and features an array of important buildings that form a long spine. It is a place designed to be “read” at the speed of a vehicle, so taking it in by foot is like watching a movie in slow motion.
But this can be rewarding in unexpected ways. Pedestrians in Brasília have an opportunity to discover the subtle variations between the seemingly mega-scaled buildings. Rhythmic reflections and shadows bring surfaces to life under the tropical sunlight in beautiful and nuanced ways—just don’t forget to put on sunscreen.
Slowing things down to such an extreme creates room for appreciation of the object, but the prolonged duration shifts the relationship between object, viewer, and context. A film watched at 1/3,600 of the original speed is not merely a very slow movie; it’s a hazy timepiece. And while a thing like Very Slow Movie Player (VSMP) won’t tell you the time, it helps you see yourself against the smear of time.
How VSMP Works
VSMP is an object that contains a Raspberry Pi computer, custom software, and a reflective ePaper display (similar to a Kindle), all housed in a 3D-printed case. Every two-and-a-half minutes, a frame from a film that’s stored on the computer’s memory card is extracted, converted to black and white using a dithering algorithm, and then communicated to the ePaper display. The video below explains the process, but essentially, the film is played at a rate of 24 frames per hour in contrast to the traditional 24 frames per second. That’s the slow part, obviously.
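To make the arithmetic concrete, here’s a small sketch of the frame bookkeeping implied above (illustrative names, not the actual VSMP source): a 24 fps source shown at 24 frames per hour is a 3,600× slowdown, with one new frame every 150 seconds.

```javascript
// Illustrative sketch (not the actual VSMP code): frame bookkeeping for a
// 24 fps source film displayed at 24 frames per hour.
const SOURCE_FPS = 24;        // frames per second in the source film
const PLAYBACK_FPH = 24;      // frames per hour on the ePaper display
const SECONDS_PER_FRAME = 3600 / PLAYBACK_FPH; // 150 s, i.e. 2.5 minutes

// Which source frame should be on screen after `elapsedSeconds` of playback?
function frameIndex(elapsedSeconds) {
  return Math.floor(elapsedSeconds / SECONDS_PER_FRAME);
}

// How many wall-clock days does a film of `minutes` length take to play?
function playbackDays(minutes) {
  const totalFrames = minutes * 60 * SOURCE_FPS;
  return totalFrames / PLAYBACK_FPH / 24; // hours of playback, divided by 24
}
```

At this rate, a two-hour film runs for 300 days.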
Films are vain creatures that typically demand a dark room, full attention, and eager eyeballs ready to accept light beamed from the screen or projector to your visual cortex. VSMP inverts all of that. It is impossible to “watch” it in a traditional way because it’s too slow. It can be noticed, glanced at, or even inspected, but not watched. That’s one of the things I like about Bill Viola’s slow-motion video pieces. You don’t watch them because they’re not films; they’re portraits, and you see them. It just so happens that you see them in four dimensions.
The screen technology used in VSMP is reflective like a Kindle instead of emissive like a television or computer, which means that the image one sees when looking at it is always a compromise with the environment. Hang an LCD or OLED on the wall, and it looks more or less the same in darkness or light; context is obliterated. While ePaper displays are largely black and white, one thing they can do that emissive screens cannot is work politely in the dark. The images below show VSMP progressing from full afternoon sun into twilight. If the room is dark, VSMP’s imagery is dark.
If the light is warm, VSMP is warm. If the room is bright, VSMP is bright. The device’s interactions with shadow and ambient light are its most distinguishing feature. Movies are mostly intended to be the same wherever they are played. Cinephiles have strong opinions about projector types, screen quality, audio, and everything else, but the idea is that the film is the film because of the light that shoots out of the projector, regardless of when or where the film’s projected. With VSMP, however, the weather, location, and orientation are all factors that define what you see.
Delightful moments of unexpected cooperation between celestial and digital show up, like in the image below with a spotlight that the sun added to “Jimmy in Quadrophenia at 10:55 a.m. on March 15, 2018, in Detroit, Michigan, with a north-northeast-facing exposure.” That’s a mouthful, right? But the video content, date of playback, time, location, and orientation of the device all coalesce to produce the image you see, so the whole mouthful is the only useful way to describe it.
These images bring together shadow, color, and temperature, and they remind me of Alvin Lucier’s I Am Sitting in a Room in that VSMP is a performance that relies on a place. Lucier’s loops gradually overwhelm human speech with an assist from the shape, size, and materials of a specific room. VSMP is fundamentally feeble without an assist from the light that floods through windows, bounces off walls and other surfaces, and eventually illuminates its display. Rather than creating a battle between context and content, as in Lucier’s recording, VSMP collaborates with a place in a small way by borrowing the light of the room it is in.
Scaling Up Slowness
The nice thing about a technology enhanced by place is that it immediately has architectural implications. We can experiment with larger-format displays as wall surfaces, both interior and exterior. The precedents for ePaper at an architectural scale are limited; one notable example is OMA’s renovation of the North Delegates Lounge at the UN, which features a wall of small ePaper panels. Future Cities Catapult did some top-notch speculative work demonstrating the politeness and sensibility of ePaper in public spaces, including some new interactions that are lovely.
There’s still plenty of work to be done in this area, particularly in light of the low power requirements of ePaper. Sydney deployed real-time ePaper bus stop displays by Visionect earlier this year, and they’re beautiful. Compared with LinkNYC, which uses big, bright LED screens, the ePaper displays are more urbane because they are significantly more deferential to the public realm. As an urban display, the ePaper totems don’t do as much as Link, no doubt, but it’s still not quite clear what itch Link is scratching on behalf of the public. I’m interested in ePaper displays as a low-cost, easy-to-maintain way to bring neighborhood content back into shared physical spaces—parks, libraries, ice cream shops, etc.
VSMP also suggests a pull toward new possibilities of ornament and decoration, toward ways to make our digital lives present in our physical world in more subtle ways. This ePaper-tiled parking garage at San Diego International Airport is one example of the medium’s ornamental potential. And while the parking garage tiles are intended to change the structure’s appearance readily, there is room to explore the possibilities of a building or surface whose appearance evolves slowly over time. At a pace of 24 frames per hour, an evolving wall would change almost imperceptibly.
It would be the architectural equivalent of a ninja cat making moves to sneak up on you when you aren’t looking: something between motion and stillness. Walls with ePaper can offer a way to play with interwoven timelines: the minutes of human rituals, the hours of our planet’s rotation, and the months of Earth’s annual orbit around the sun. We might create a wall of William Morris wallpaper patterns that blossom and wither at the actual pace of the seasons. Or use footstep sensors to track visitors and change one pixel of a wall for each person who enters, very slowly turning the room from black to white in a digital obliteration room. Or add to the predictable shadows that fall across the facade of a building with fantastical digital supplements—shadow puppets at the scale of a building.
I did not intend to have to build everything from scratch, but that’s how it ended up, so in case this is useful to others, here’s how a very slow movie player comes together.
It started with ordering a 7.4" screen and a USB timing control module. The product is supposed to mount as a USB device and accept a proprietary EPD binary file format, which is then flashed onto the screen. Knowing nothing about binary files or image processing, I was eventually able to refactor this Python code into Node.js—the best programming language for the job is the one you already know, right?—which led, after some real stumbling, to a miraculous sight: It worked.
Node-fluent-ffmpeg is the library that VSMP uses to extract frames from a video file. These are then passed to a dithering module that prepares the full-color video still for display on a one-bit screen. Tanner Helland’s dithering algorithms were useful here, and it was fun, if indulgent, to spend an afternoon playing with the various algorithms from the scenography of my childhood spent in front of various computers.
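As a sketch of the dithering idea (this is generic Floyd–Steinberg, one of the algorithms Helland catalogs, not the actual VSMP module), each pixel is thresholded to black or white and the quantization error is diffused to its unprocessed neighbors:

```javascript
// Illustrative Floyd–Steinberg dithering for a 1-bit display. Takes a
// grayscale buffer (values 0–255) and returns 0/1 per pixel.
function ditherFloydSteinberg(gray, width, height) {
  const px = Float32Array.from(gray);         // working copy carrying error
  const out = new Uint8Array(width * height); // 0 = black, 1 = white
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = y * width + x;
      const old = px[i];
      const bit = old < 128 ? 0 : 1;
      out[i] = bit;
      const err = old - bit * 255; // how far off the 1-bit value is
      // Diffuse the error to neighbors not yet processed.
      if (x + 1 < width) px[i + 1] += (err * 7) / 16;
      if (x > 0 && y + 1 < height) px[i + width - 1] += (err * 3) / 16;
      if (y + 1 < height) px[i + width] += (err * 5) / 16;
      if (x + 1 < width && y + 1 < height) px[i + width + 1] += (err * 1) / 16;
    }
  }
  return out;
}
```

The diffusion is what turns flat gray regions into the speckled texture that reads as a midtone from across the room.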
With sequential video stills being extracted from an MP4, dithered, and converted to EPD files, I excitedly hacked together a “case” for the screen and its controller so both could hang above my desk where they would be easily visible. I then fired up a cron job to copy frame after frame to the device, updating the screen each time. But I discovered a conflict between the Raspberry Pi and the controller module that produced an error every 10 frames or so, which is problematic if you’re interested in continuous playback. To make matters worse, the constantly running Node.js script I used to occasionally push new frames to the screen ran out of memory about once a day. The recurring glitch and regular crashes were an impediment.
My original plan was to make the screen module as compact as possible by using a Raspberry Pi 3B (RPI3B) to power the device, with the RPI3B attached to the wall plug or sitting on the floor near it. In the end, it was not possible to continue down this route because the display glitch required switching from USB to GPIO.
The only good thing about GPIO is that it means your device literally has a rainbow inside it. GPIO stands for general-purpose I/O, and the name is accurate. It’s a way to send and receive signals between your computer and any kind of device—in this case, a screen. Signals are sent as bits, so with an ePaper screen, you’re transmitting instructions to turn each pixel on or off. It took me forever to figure out the right way to physically connect the screen to the RPI3B. After a good bit of trial and error, I succeeded in porting Node-ePaper, which is written for the BeagleBone, to work with Raspberry Pi hardware. That’s now on GitHub as Node-ePaper-RPI. But then—the rainbow cables.
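The pixel-to-bits step is easy to sketch in isolation. This illustrative helper (not taken from Node-ePaper-RPI) packs a one-bit framebuffer into bytes, most significant bit first, which is the general shape of the data that gets clocked out over the GPIO pins:

```javascript
// Illustrative only: pack a 1-bit framebuffer (one entry per pixel, 0 or 1)
// into bytes, MSB first, for transmission to the display controller.
function packBits(bits) {
  const bytes = new Uint8Array(Math.ceil(bits.length / 8));
  for (let i = 0; i < bits.length; i++) {
    if (bits[i]) {
      bytes[i >> 3] |= 0x80 >> (i & 7); // set bit 7..0 within byte i/8
    }
  }
  return bytes;
}
```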
The two images above probably look relatively similar, but they document an important upgrade. On the left, the 10 GPIO cables are loose and unruly and easily came undone from the RPI board. On the right, VSMP is wired with custom jumper cables in exactly the dimensions and connection pathways I needed. The only disappointing thing about the custom cables I made is that I could not figure out where to find a right-angle connector, so the cables emerge perpendicularly from the RPI3B. Ultimately, this makes the case deeper than it needs to be.
With the EPD files working and GPIO under control, I had something magical. The name of the script is “single_frame.js” because the software that controls VSMP processes one frame and then quits. This is scheduled with a cron job to run every two-and-a-half minutes, which allows the computer to manage its memory better and prevents regular crashes. Each frame takes roughly 25 seconds to process and transfer to the screen, at which point the display flashes and the new frame appears.
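Cron’s finest granularity is one minute, so a 150-second cadence can’t be expressed in a single entry; one common workaround is two staggered five-minute entries, the second delayed by `sleep`. A hypothetical crontab (the paths here are illustrative, not VSMP’s actual layout) might look like:

```
# Run single_frame.js every 2.5 minutes via two offset 5-minute entries.
*/5 * * * * /usr/bin/node /home/pi/vsmp/single_frame.js
*/5 * * * * sleep 150 && /usr/bin/node /home/pi/vsmp/single_frame.js
```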
Because the computer has about two minutes of downtime between frames, I wanted to see if VSMP could run on a Raspberry Pi Zero instead, letting the natural slowness of that device provide the spacing between frames, but even then it was too slow.
As a final step, I tested a few case configurations to find something that minimizes the digital picture frame vibe. I’m not happy with the physical design of the case, but for now, this cowl-like profile has some advantages. The notches cut into each corner create opportunities for VSMP to cast shadows onto itself, which is a bit showy but has a nice effect. It’s printed in Shapeways’ Strong and Flexible black nylon, which has a rough surface that soaks up light and produces a matte appearance.
The video below is a compilation of all the time-lapse test shots I took during the development of this thing. There’s lots of Quadrophenia, which I’ve still only seen at the plodding pace of VSMP.
One of the things I find uncanny about ePaper is the way it keeps an image even when it’s disconnected from power. The sustained presence of the image gives the device a rounder presence in our world and renders the digital continuous.
To me, it’s in these half-assembled states that VSMP is most clearly what Robin Sloan calls a flip-flop: a film captured with the physical light of one place, turned into digitally manipulated images that are then turned back into physical images, reflective and constant under a new sun.
The back of the device is rarely seen, but it is meant to be a reminder of the project’s original intention: a tool for human use rather than some kind of blunt device. Ridges formed on the back depict the intersection of two sets of lines: one curving in a circle and one horizontal; one cyclical and one straightforward. It’s fast and slow meeting at a contemplative intersection.
VSMP presented a rewarding set of technical challenges to noodle through, but ultimately, I built it because I wanted to be more cognizant of time. I wanted to see slowly and enjoy the way context blooms around an object when doing so.