My first hackathon experience

12 hours of iOS hacking and building the essentials

James Tang
Oct 27, 2013 · 5 min read

Oct 19, 2013 was the AngelHack Hong Kong event. It was my first
time attending a real hackathon, after an equally fantastic event, StartLab, back in June.

I didn’t prepare anything beforehand, but at the opening ceremony I got teamed up with an entrepreneur, a designer, and two web engineers.

We came up with a location-based chatting idea called Ripple (Updates: you can already download Ripple on the App Store), and here I’d like to walk you through our story.

The Prototype

Flinto is a really well-crafted and unique tool that lets designers upload mockups and define connections between them. It comes with native-like transitions between screens, and it adapts to different screen heights by allowing scrollable middle content, which makes for highly expressive prototypes.

Screens can be replaced on the fly: once the designer updates a screen, engineers see the changes live on their phones or in the web browser.

But there were two screens that we couldn’t really demonstrate without a working example.

Part 1 — Signup screen with the live Camera

This screen presented two main technical challenges:

  • Displaying the live camera feed in part of the view
  • Blurring the live picture and blending it into the background

I started with a Single View Application (I liked how Xcode provides a minimal template with a Main.storyboard properly set up) and began implementing the video layer.

I had already heard about BradLarson/GPUImage a few years earlier; it went viral around iOS 4, when Apple introduced the AVFoundation framework.

The framework is on CocoaPods, so adding it was a quick one-line configuration.
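As a sketch, the Podfile entry was roughly this (the platform line reflects our iOS 7 target; version pinning is omitted):

```ruby
# Podfile
platform :ios, '7.0'

pod 'GPUImage'
```

After that, a `pod install` pulls the framework into the workspace.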

This was the first time I had actually used GPUImage. The sample code contains quite a number of subprojects and live camera filters, so it took me some time to understand, but it turned out that initializing a camera and displaying it on screen was pretty easy.
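A minimal sketch of that setup, assuming a `videoCamera` property and an `avatarView` that is a GPUImageView placed over part of the screen (the preset and orientation values are illustrative):

```objc
// In the view controller, after the views are loaded.
self.videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionFront];
self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

// Render the live feed into the partial-screen avatar view.
[self.videoCamera addTarget:self.avatarView];
[self.videoCamera startCameraCapture];
```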

The blurry live background

My initial thought was to manually capture an image from the self.avatarView, run the blurring operation on it, and assign the result back to the background.

I quickly realized executing this way would probably turn the code into an ugly, slow, complicated beast.

I double-checked the filter gallery and found GPUImageGaussianBlurFilter. Yes, we could process the live picture directly on the GPU; we just needed to set up our view properly with the filtered output.

I added a full screen GPUImageView, and did some proper setup in code:
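Sketched from memory, the setup looked roughly like this (`backgroundView` is assumed to be the full-screen GPUImageView sitting behind the rest of the UI, and the blur radius value is illustrative):

```objc
GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
blurFilter.blurRadiusInPixels = 8.0; // illustrative value

// The same camera feeds both the sharp avatar view
// and, through the blur filter, the full-screen background.
[self.videoCamera addTarget:self.avatarView];
[self.videoCamera addTarget:blurFilter];
[blurFilter addTarget:self.backgroundView];
[self.videoCamera startCameraCapture];
```

Because the blur runs on the GPU in the same pipeline as the preview, there is no round trip through UIImage at all.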

Boom! No performance issue, clean code, happy engineer! :)

To add an extra touch, we masked the avatar into a perfect circle; thanks to our teammate’s research, it was just a single line of code.
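The trick was, in essence, a Core Animation corner radius (shown here with the accompanying masking flag; this assumes avatarView is square):

```objc
// Round a square view into a perfect circle.
self.avatarView.layer.cornerRadius = CGRectGetWidth(self.avatarView.bounds) / 2.0;
self.avatarView.layer.masksToBounds = YES;
```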

Part 2 — Chat screen and the springy message bubbles

The next screen we wanted to create was the chat room view. We wanted to provide the best chatting experience for our users.

We loved the iMessage-style floating bubbles very much. At WWDC 2013 we were told that the physics were built right into UIKit as UIKit Dynamics, available to developers through the iOS 7 SDK.

It could easily have taken a few hours to learn the sample code, and at the end of the day, it was a hackathon. I googled for quick solutions and libraries, came across THSpringyCollectionView, and got it smoothly compiled and running on the simulator:

We needed no further investigation. The README’s mentions of UICollectionViewFlowLayout and UIAttachmentBehavior indicated that it was implemented the right way.
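The core idea behind that kind of layout, sketched and simplified (an `animator` property is assumed, and the damping and frequency values are illustrative): each cell gets a UIAttachmentBehavior anchored at its resting position, so scrolling stretches the attachments and the bubbles spring back.

```objc
// Inside a UICollectionViewFlowLayout subclass.
- (void)prepareLayout {
    [super prepareLayout];
    if (!self.animator) {
        self.animator =
            [[UIDynamicAnimator alloc] initWithCollectionViewLayout:self];
        for (UICollectionViewLayoutAttributes *item in
             [super layoutAttributesForElementsInRect:CGRectInfinite]) {
            // Attach each cell to its resting center with a soft spring.
            UIAttachmentBehavior *spring =
                [[UIAttachmentBehavior alloc] initWithItem:item
                                          attachedToAnchor:item.center];
            spring.length = 0.0;
            spring.damping = 0.8;
            spring.frequency = 1.0;
            [self.animator addBehavior:spring];
        }
    }
}

- (NSArray *)layoutAttributesForElementsInRect:(CGRect)rect {
    // Let the dynamic animator drive the cell positions.
    return [self.animator itemsInRect:rect];
}
```

The real library also nudges the attached items as the bounds change during scrolling, which is what produces the stretch-and-settle effect.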

I asked our designer to export the message bubbles together with sample text. With previous experience working with UICollectionView, declaring an array to back the UICollectionViewDataSource was fairly straightforward.
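The data source boiled down to an array of message strings (the cell class and property names here are illustrative; the real cell also carried the bubble image):

```objc
- (NSInteger)collectionView:(UICollectionView *)collectionView
     numberOfItemsInSection:(NSInteger)section {
    return self.messages.count;
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    // MessageCell is a hypothetical custom cell with a text label.
    MessageCell *cell =
        [collectionView dequeueReusableCellWithReuseIdentifier:@"MessageCell"
                                                  forIndexPath:indexPath];
    cell.textLabel.text = self.messages[indexPath.item];
    return cell;
}
```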

The sliding keyboard

The sliding keyboard was also first introduced by Apple in iMessage. Facebook adopted the feature with its chat heads implementation in the Facebook iOS app. I couldn’t think of a reason not to do the same in our app.

Daniel Amitay has a fairly popular open-source project called DAKeyboardControl that quickly did the job. A warm reminder: you are required to manually take care of the retain cycles, and also to remove the observer properly in dealloc.
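Usage looked roughly like this, as best I recall from the library’s README (the `toolbar` property is illustrative; note the weak reference that breaks the retain cycle, and the cleanup in dealloc):

```objc
- (void)viewDidLoad {
    [super viewDidLoad];
    __weak typeof(self) weakSelf = self;
    [self.view addKeyboardPanningWithActionHandler:^(CGRect keyboardFrameInView) {
        // Keep the input toolbar glued to the top of the sliding keyboard.
        CGRect toolbarFrame = weakSelf.toolbar.frame;
        toolbarFrame.origin.y =
            keyboardFrameInView.origin.y - toolbarFrame.size.height;
        weakSelf.toolbar.frame = toolbarFrame;
    }];
}

- (void)dealloc {
    // DAKeyboardControl observes keyboard notifications; detach it here.
    [self.view removeKeyboardControl];
}
```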

You can download all the code as a gist.


We were very lucky to be able to build an MVP that demonstrated our idea and highlighted our attention to UX and detail, which brought us into the final top three teams. Plus, we had a good sleep. ;P

I’m very happy to have been part of the hackathon and the team. We’ll continue developing Ripple, and we’ll see how it goes!

Updates: Ripple is live on the App Store

Special thanks to Greg Gopman, Joshua Slayton, and Cassy Lau for hosting, sponsoring, and giving advice to AngelHack attendees. Thanks to Ken for inviting me and for the free tickets. And also to my teammates Christopher, Benny Ng, Vincent, and Meng To, who let me use the cover images and supported this blog post!

iOS Apprentice

We know we have more to learn. It’s a long way.
