Michael Ziomek

Jordan Howell

ENGL110 046

October 3, 2017

Hyper What? Hyperlapse, from Instagram

A couple of weeks ago we learned about “hyperlinks,” and for me that was one of the first hyper-anythings I had ever heard of. Well, today you're going to learn about just one more “hyper” topic: Hyperlapse, an app released by Instagram. Hyper-lapses are a different kind of time-lapse that requires the camera to move as well. Instagram is known for amazing high-quality pictures that push people to unleash their creative side. But imagine being able to post a short 10–15 second video capturing a beautiful sunrise… the possibilities are endless!

Hyper-lapses are a different kind of video, but they are often compared to time-lapse videos, so it is important to know the distinction between the two. According to Lisa Keith of Central Michigan University, “In time-lapse videos, the frequency of film frames are captured at a much lower rate than that used to view the sequence, so when the video is played at normal speed, time appears to be moving faster and thus lapsing. The camera is either static or moving only very short distances. In hyper-lapse videos, the position of the camera is aimed at a defined, fixed point and moved over considerable distances in order to create a tracking shot in time-lapse sequences.” In other words, a hyper-lapse is a kind of time-lapse in which the camera is also moving. In the past, creating a hyper-lapse was a demanding process that required a unique plan, special editing, and different kinds of camera mounts. With Hyperlapse, this process is no longer a struggle.
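To make the speed-up Keith describes concrete, here is a tiny sketch of the arithmetic. The capture interval and playback rate are made-up example numbers, not values from the Hyperlapse app:

```python
# Hypothetical numbers illustrating why time appears to "lapse":
# one frame captured every 2 seconds, played back at 30 frames per second.
capture_interval_s = 2.0   # seconds of real time between captured frames (assumption)
playback_fps = 30          # standard playback rate

# Each second of playback shows 30 frames, which took 30 * 2 = 60
# real-world seconds to record, so time appears 60x faster.
speedup = playback_fps * capture_interval_s
print(f"{speedup:.0f}x speed-up")  # 60x
```

A hyper-lapse applies the same math, except the camera also travels a long distance between those captured frames.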

One of the most important parts of creating a Hyperlapse video is video stabilization, which allows smooth-running videos to be captured. To get this smoothness, the Hyperlapse team created a video stabilization algorithm called Cinema, which keeps the video from shaking while recording. On a real movie set, the camera operator wears a harness to get rid of this shake; the Hyperlapse team instead uses Cinema to access your phone's gyroscope, which lets it cancel out the shakiness of the camera. Describing the layout of the algorithm, the Instagram Engineering account on Medium says, “We feed gyroscope samples and frames into the stabilizer and obtain a new set of camera orientations as output. These camera orientations correspond to a smooth ‘synthetic’ camera motion with all the unwanted kinks and bumps removed.”

So basically, the Hyperlapse team is saying they feed gyroscope samples and video frames into their stabilizer, which gives them a new, smooth set of orientations. They continue, “These orientations are then fed into our video filtering pipeline shown below. Each input frame is then changed by the IGStabilizationFilter according to the desired synthetic camera orientation.”

What the Hyperlapse team means by this is that they take the orientations from the stabilizer and run them through the stabilization filter, which matches each orientation with the right frame, producing steady frames.
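The idea of turning a shaky recorded camera path into a smooth “synthetic” one can be sketched in a few lines. This is a heavily simplified 1-D illustration under our own assumptions: real pipelines like Cinema work with 3-D rotations (quaternions), while here a single angle per frame stands in for the camera orientation and a moving average stands in for the smoothing step. All names are ours, not Instagram's:

```python
# Minimal 1-D sketch of gyroscope-based stabilization: smooth the
# recorded per-frame orientations, then rotate each frame by the
# difference between the smooth path and the recorded path.

def smooth_orientations(angles, window=5):
    """Return a smoothed 'synthetic' camera orientation for each frame."""
    smoothed = []
    for i in range(len(angles)):
        lo = max(0, i - window)
        hi = min(len(angles), i + window + 1)
        neighborhood = angles[lo:hi]
        smoothed.append(sum(neighborhood) / len(neighborhood))
    return smoothed

# Shaky recorded orientations (degrees) from an imagined gyroscope trace:
recorded = [0.0, 2.1, -1.8, 0.4, 2.5, -0.9, 0.2]
synthetic = smooth_orientations(recorded)

# The correction each frame needs; applying it is the 1-D analogue of
# what the quoted IGStabilizationFilter does with full 3-D orientations.
corrections = [s - r for s, r in zip(synthetic, recorded)]
```

The synthetic path varies far less than the recorded one, which is exactly the “kinks and bumps removed” effect the engineers describe.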

Another big part of creating Hyperlapse is adaptive zoom. Zoom is needed to counteract the shake that can occur; all digital video stabilization algorithms trade resolution for stability. However, according to Instagram Engineering, “Cinema picks the zoom intelligently based on the amount of shake in the recorded video.” Instagram Engineering goes on to explain the zoom: “Since zooming in reduces the field of view, there is a tradeoff between effective resolution and the smoothness of the camera motion. Our adaptive zoom algorithm is fine-tuned to minimize camera shake while maximizing the effective resolution on a per-video basis. Since motion, such as a slow pan, becomes more rapid at higher time lapse levels (i.e. 12x), we compute the optimal zoom at each speedup factor.” In other words, “Instagram says its zoom is intelligent and responds to the amount of shake in a video, so you’re better off trying to hold your phone a little steadier if you’re worried about seeing a big dip in resolution. The less shaking the Cinema algorithm detects, the less it has to zoom in to find an unaffected section,” says Steven Tweedie of Tech Insider.
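The resolution-for-stability tradeoff can be sketched with a toy rule: the largest per-frame correction determines how far the video must be cropped in so no frame shows an empty border. The function, numbers, and frame width below are illustrative assumptions of ours, not Instagram's actual Cinema algorithm:

```python
# Toy version of "pick the zoom based on the amount of shake":
# bigger stabilization shifts force a bigger crop, hence a bigger zoom.

def adaptive_zoom(corrections_px, frame_width=1080):
    """Smallest zoom factor that hides the largest stabilization shift."""
    worst_shift = max(abs(c) for c in corrections_px)
    # Cropping `worst_shift` pixels from each side requires this zoom:
    return frame_width / (frame_width - 2 * worst_shift)

steady_video = [2, -3, 1, 2]     # small shifts -> mild zoom, more resolution
shaky_video = [40, -55, 30, 48]  # big shifts -> heavy zoom, less resolution
print(adaptive_zoom(steady_video))  # ~1.006
print(adaptive_zoom(shaky_video))   # ~1.11
```

This is why Tweedie's advice to hold the phone steadier works: smaller shifts mean a smaller crop and a smaller dip in resolution.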

The final aspect of Hyperlapse is putting it all together. As Tom Cargill of Bell Labs famously put it, “The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.” The Hyperlapse app is controlled by a slider, and every time the slider is moved a process occurs. Instagram Engineering says, “First we request frames from the decoder at the new playback rate. Then we simultaneously kick off the Cinema stabilizer on a background thread to compute a new optimal zoom and a new set of orientations for the new zoom and time lapse amount. Following that we continue to play the video while we wait for new stabilization data to come in. Finally we use the orientations we computed at the previous time lapse amount along with spherical interpolation to output orientations for the frames we’re going to display.” This is the process that happens every time the slider is moved, and it is, more or less, the process of Hyperlapse.
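The four quoted steps can be sketched as a runnable toy simulation. Every class, method, and number below is a stand-in of ours, not Instagram's real API: a background thread plays the stabilizer recomputing, and playback falls back on the previous orientations until the new data arrives (the real app blends them with spherical interpolation on 3-D rotations):

```python
import threading
import time

class ToyStabilizer:
    """Stand-in for the Cinema stabilizer recomputing on a background thread."""
    def __init__(self):
        self.ready = threading.Event()
        self.orientations = None

    def recompute(self, rate, n_frames):
        time.sleep(0.01)  # pretend this is expensive stabilization work
        self.orientations = [0.0] * n_frames  # "new" smooth orientations
        self.ready.set()

def on_slider_changed(new_rate, frames, stabilizer, old_orientations):
    """Play `frames` at `new_rate` while new stabilization data is computed."""
    # Steps 1 + 2: request the new rate and kick off the background recompute.
    worker = threading.Thread(
        target=stabilizer.recompute, args=(new_rate, len(frames)))
    worker.start()
    rendered = []
    # Steps 3 + 4: keep playing; use old orientations until new ones land.
    for i, frame in enumerate(frames):
        if stabilizer.ready.is_set():
            rendered.append((frame, stabilizer.orientations[i]))
        else:
            rendered.append((frame, old_orientations[i]))
    worker.join()
    return rendered

result = on_slider_changed(12, list(range(20)), ToyStabilizer(), [1.0] * 20)
```

The key design point the quote makes is that playback never stalls: the expensive recompute happens off the playback thread, and every frame still gets some orientation to render with.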

Now you’ve learned about Hyperlapse and how it could apply to your everyday life. It changes the way videos are looked at and really opens up a creative outlet. I suggest downloading it and unleashing the creative side we all have. Whether it’s waves crashing, your dog out on a walk, or the sun rising, there are plenty of moments to capture. Get hype and use Hyperlapse!