LeapSnap at the Google DevFest Hackathon
From Saturday morning, November 8th, through Sunday night, Anubhav Mishra and I worked to build LeapSnap at the DevFest Hackathon in Vancouver. The idea for LeapSnap came from a Sci-Fi short I watched where everyone had chips implanted that would store their memories via video. Whenever they wanted to share their memories with friends, they would “cast” them onto any screen within view. While sitting at the dinner table, they could then wirelessly control the TV, cycle through to the memory they wanted to share, and then play it.
Here’s what we ended up building. The videos that play were shot on my phone and uploaded in real time. Each screen is controlled independently by the gestures, and the displays can be placed anywhere, as long as they are connected to the internet.
Our project involved three parts: recording the video, showing the list of videos on multiple displays, and then controlling all of the displays at once with simple hand gestures.
To record the video we used an Android app called Spydroid that wirelessly streamed video from my phone using the Real Time Streaming Protocol (RTSP). We then ran ffmpeg on my laptop, which read the video stream from the phone over wifi and saved it as video files.
Displaying the Videos
As the videos were coming in, we needed a way to display them across multiple screens. We wrote a simple website that showed the list of videos in a cover flow style view.
Playing with Gestures
Leap Motion is a device that lets you control an application with hand gestures. We connected a Leap Motion controller to one laptop, which registered the gestures and pushed the commands down to all the connected displays, controlling every screen at once!
Swiping to the left or right cycled through the videos, poking played the video, and a simple down-tap would stop the video.
There were a ton of great projects that came out of the hackathon, from a distributed multiplayer game built with Arduinos to a virtual reality game using Google Cardboard. The projects were judged on how they solved a difficult technical problem in an innovative way, and in the end we were lucky enough to be picked as the winners! We won two Chromecasts and an LG G Watch.
You can view our presentation along with the rest of the projects here. Thanks to Yaniv Talmor for organizing, all the judges for donating their time, and Google for sponsoring this awesome event!
Recording the Video
As mentioned above, the video was streamed from my phone using the Spydroid Android app over RTSP. I initially tried to connect to the stream and save MP4 files with VLC, but most of the time it never wrote a valid MP4. On a friend’s advice I switched to ffmpeg, which worked perfectly. I wrote a simple bash script to
- connect to the video stream
- record for 10 seconds (for demo purposes)
- generate a thumbnail
- save the video and thumbnail in a dated directory
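The steps above can be sketched as a small script. The stream address and file layout here are assumptions; the real script followed the same four steps.

```shell
#!/usr/bin/env bash
# Sketch of the capture script (stream address and directory layout are hypothetical).
set -eu

STREAM_URL="rtsp://192.168.1.10:8086"   # Spydroid's RTSP endpoint (address made up)

# Dated directory plus a per-clip name, e.g. videos/2014-11-08/093015
dated_path() {
  echo "videos/$(date +%Y-%m-%d)/$(date +%H%M%S)"
}

record_clip() {
  base="$(dated_path)"
  mkdir -p "$(dirname "$base")"
  # Connect to the stream and record 10 seconds (demo length), copying codecs
  ffmpeg -i "$STREAM_URL" -t 10 -c copy "$base.mp4"
  # Grab a single frame one second in as the thumbnail
  ffmpeg -i "$base.mp4" -ss 1 -vframes 1 "$base.jpg"
}

# record_clip   # uncomment to capture a clip
```

Using `-c copy` avoids re-encoding, so a 10-second clip is saved almost instantly, which mattered for the real-time demo.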
List of Videos API
When users open up the webpage, the front-end needed to get the list of videos. I hacked out a simple PHP script to provide an API endpoint that would provide a list of the videos and their urls.
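The real endpoint was that quick PHP script; since the rest of our stack was Node, here is the same idea sketched in JavaScript instead. The file names, the dated layout, and the base URL are made up for illustration.

```javascript
// Build the API payload: each recorded clip becomes a pair of URLs,
// one for the MP4 and one for its thumbnail.
// (The dated file layout and base URL here are hypothetical.)
function videoList(files, baseUrl) {
  return files
    .filter((f) => f.endsWith('.mp4'))
    .map((f) => ({
      video: `${baseUrl}/${f}`,
      thumbnail: `${baseUrl}/${f.replace(/\.mp4$/, '.jpg')}`,
    }));
}

// Example: the two files saved per clip by the recording script.
// Only the MP4 survives the filter, paired with its thumbnail URL.
// videoList(['2014-11-08/093015.mp4', '2014-11-08/093015.jpg'], '/videos')
```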
Leap Motion Integration
Mishra is the Leap Motion pro, and he integrated the Leap Motion controller with our project. He wrote a Node app that listened for Leap Motion gestures and pushed the commands down to all the connected clients over websockets. Leap Motion provides some built-in gestures, but to get the direction of a swipe he had to do some math. Because the Leap Motion runs at 300 fps, we had to debounce the events before sending them down to the clients.
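A minimal sketch of the two fiddly parts: turning a swipe’s direction vector into a command, and debouncing the flood of per-frame events. The function names, command strings, and the 300 ms window are our assumptions; in the real app this logic was fed frames from leapjs and broadcast over a websocket server.

```javascript
// Leap reports a swipe's direction as an [x, y, z] vector;
// the sign of the dominant x component tells left from right.
// (Command names 'next'/'prev' are made up for this sketch.)
function swipeCommand(direction) {
  const [x] = direction;
  return x > 0 ? 'next' : 'prev';
}

// A gesture spans many frames, so the same swipe fires repeatedly.
// Fire at most once per `waitMs` and drop the rest.
function debounce(fn, waitMs) {
  let last = 0;
  return (...args) => {
    const now = Date.now();
    if (now - last >= waitMs) {
      last = now;
      fn(...args);
    }
  };
}

// Wiring sketch: broadcast each debounced command to every connected display.
// const send = debounce((cmd) => wss.clients.forEach((c) => c.send(cmd)), 300);
// Leap.loop({ enableGestures: true }, (frame) =>
//   frame.gestures.forEach((g) => {
//     if (g.type === 'swipe') send(swipeCommand(g.direction));
//   }));
```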
I think that sharing experiences and memories is something we all desire to do. Being able to do so effortlessly is the future.
This project was an attempt to get somewhere near that idea. Stay tuned…