Cover designed by Meng To

My first hackathon experience

12 hours of iOS hacking and building the essentials

James Tang
iOS Apprentice
5 min read · Oct 27, 2013


Oct 19, 2013 was the AngelHack Hong Kong event. This was my first time attending a real hackathon, after an equally fantastic event, StartLab, back in June.

I didn’t prepare anything beforehand, but I got teamed up with one entrepreneur, one designer, and two web engineers at the opening ceremony.

We came up with a location-based chatting idea called Ripple (update: you can already download Ripple on the App Store), and here I’d like to walk you through our story.

The Prototype

Meng described the design concept with a well-crafted Flinto prototype

Flinto is a really well-crafted and unique tool that allows designers to upload mockups and define connections between them. It comes with very native-like transitions between screens, and it adapts to different screen heights by allowing scrollable middle content, which makes for highly expressive prototypes.

Screens can be replaced on the fly: once the designer updates a screen, engineers can see the changes live on their phones or in a web browser.

But there were two screens that we couldn’t really demonstrate without a working example.

Part 1 — Signup screen with the live camera

A bare-minimum signup screen contains one input field for the screen name and a live camera view to capture the user’s face.

This screen had two main technical barriers:

  • Displaying the live camera in a portion of the view
  • Blurring the live picture and blending it into the background

I started with a Single View Application, since I liked how Xcode provides a minimal template with Main.storyboard properly set up, and began implementing the video layer.

I had already heard about BradLarson/GPUImage a few years back; it had been popular since iOS 4, when Apple introduced the AVFoundation framework.

The framework is on CocoaPods, so installing it was a quick one-line configuration:

pod 'GPUImage'

This was the first time I actually used GPUImage. The sample code contains quite a number of subprojects and live camera filters, so it took me some time to understand, but it turned out that initializing a camera and displaying it on screen was pretty easy.

- (void)viewDidLoad {
    [super viewDidLoad];

    self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                           cameraPosition:AVCaptureDevicePositionFront];
    self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    // self.avatarView is a non-fullscreen GPUImageView instance
    // created in Storyboard
    [self.videoCamera addTarget:self.avatarView];
    [self.videoCamera startCameraCapture];
}
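
For reference, here is a minimal sketch of the declarations the snippet above assumes. The controller name is hypothetical, but the two properties match the code (the import follows the CocoaPods header layout of the time):

#import "GPUImage.h"

// Hypothetical controller name; the properties match the snippet above.
@interface SignupViewController : UIViewController

// Strong reference so the camera stays alive while capturing.
@property (nonatomic, strong) GPUImageVideoCamera *videoCamera;

// The non-fullscreen GPUImageView laid out in the storyboard.
@property (nonatomic, weak) IBOutlet GPUImageView *avatarView;

@end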

The blurry live background

My initial thought was to manually capture an image out of self.avatarView, run the blurring operation on it, and assign the result back to the background.

I quickly realized that doing it that way would probably turn the code into an ugly, slow, complicated beast.

I double-checked the filter gallery and found GPUImageGaussianBlurFilter. Yes, we could process the live picture directly on the GPU; we just needed to properly set up our view with the filtered output.

I added a fullscreen GPUImageView and did the proper setup in code:

self.blurFilter = ({
    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
    blurFilter.blurSize = 8;
    blurFilter;
});
[self.videoCamera addTarget:self.blurFilter];

// self.blurView is a fullscreen GPUImageView instance configured in Storyboard
[self.blurFilter addTarget:self.blurView];

Boom! The camera now feeds two targets at once: the avatar view directly, and the blur filter, whose output drives the fullscreen background view. No performance issues, clean code, happy engineer! :)

To add an extra touch, we masked the avatar into a perfect circle. Thanks to the research by our teammate, it took hardly any code at all.

self.avatarView.layer.cornerRadius = self.avatarView.frame.size.width / 2;
self.avatarView.layer.masksToBounds = YES; // needed so the corner radius actually clips the content

Part 2 — Chat screen and the springy message bubbles

Springy message bubbles are very fun to play with.

The next screen we wanted to create was the chat room view. We wanted to provide the best chatting experience for our users.

We loved the iMessage-style floating bubbles very much. At WWDC 2013 we were told that the physics were built right into UIKit, in a subsystem called UIKit Dynamics, and would be available to developers through the iOS 7 SDK.

It could easily have taken a few hours to learn from the sample code, and at the end of the day, it was a hackathon. I googled for quick solutions and libraries, came across THSpringyCollectionView, and got it compiled and running smoothly on the simulator:

The runtime result: a block of springy yellow rectangles

We needed no further investigation. The README mentioned keywords like UICollectionViewFlowLayout and UIAttachmentBehavior, which indicated it was the right way to implement the effect.
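
For the curious, here is a condensed sketch of the general technique behind such layouts (a reconstruction of the pattern Apple demonstrated with UIKit Dynamics, not THSpringyCollectionView’s actual source; the class name and tuning constants are illustrative): a UICollectionViewFlowLayout subclass hands its layout attributes to a UIDynamicAnimator, attaches each cell to its resting position with a UIAttachmentBehavior, and displaces the cells against those springs while scrolling.

#import <UIKit/UIKit.h>

@interface SpringyFlowLayout : UICollectionViewFlowLayout
@property (nonatomic, strong) UIDynamicAnimator *animator;
@end

@implementation SpringyFlowLayout

- (void)prepareLayout {
    [super prepareLayout];
    if (self.animator) {
        return;
    }
    // Attach every cell to its resting position with a spring.
    self.animator = [[UIDynamicAnimator alloc] initWithCollectionViewLayout:self];
    CGRect contentRect = (CGRect){CGPointZero, self.collectionViewContentSize};
    for (UICollectionViewLayoutAttributes *item in
            [super layoutAttributesForElementsInRect:contentRect]) {
        UIAttachmentBehavior *spring =
            [[UIAttachmentBehavior alloc] initWithItem:item attachedToAnchor:item.center];
        spring.length = 0;
        spring.damping = 0.8;
        spring.frequency = 1.0;
        [self.animator addBehavior:spring];
    }
}

- (NSArray *)layoutAttributesForElementsInRect:(CGRect)rect {
    // The animator, not the flow layout, now owns the cell positions.
    return [self.animator itemsInRect:rect];
}

- (UICollectionViewLayoutAttributes *)layoutAttributesForItemAtIndexPath:(NSIndexPath *)indexPath {
    return [self.animator layoutAttributesForCellAtIndexPath:indexPath];
}

- (BOOL)shouldInvalidateLayoutForBoundsChange:(CGRect)newBounds {
    // On scroll, displace each cell in proportion to its distance from the
    // touch; the springs then pull everything back, giving the bouncy feel.
    CGFloat delta = newBounds.origin.y - self.collectionView.bounds.origin.y;
    CGPoint touch = [self.collectionView.panGestureRecognizer
                        locationInView:self.collectionView];
    for (UIAttachmentBehavior *spring in self.animator.behaviors) {
        CGFloat resistance = fabs(touch.y - spring.anchorPoint.y) / 500.0;
        UICollectionViewLayoutAttributes *item = [spring.items firstObject];
        CGPoint center = item.center;
        center.y += (delta > 0) ? MIN(delta, delta * resistance)
                                : MAX(delta, delta * resistance);
        item.center = center;
        [self.animator updateItemUsingCurrentState:item];
    }
    return NO; // the animator triggers layout invalidation itself
}

@end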

I asked our designer to export the message bubbles together with some sample text. With previous experience working with UICollectionView, declaring an array to back the UICollectionViewDataSource was fairly straightforward.
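
A minimal sketch of what that looks like, assuming a messages array of NSString instances and a custom MessageCell exposing a bubbleLabel, registered under the @"MessageCell" reuse identifier (those names are illustrative, not from our actual project):

- (NSInteger)collectionView:(UICollectionView *)collectionView
     numberOfItemsInSection:(NSInteger)section {
    // self.messages is the NSArray driving the whole chat view.
    return self.messages.count;
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath {
    MessageCell *cell =
        [collectionView dequeueReusableCellWithReuseIdentifier:@"MessageCell"
                                                  forIndexPath:indexPath];
    cell.bubbleLabel.text = self.messages[indexPath.item];
    return cell;
}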

The sliding keyboard

When you drag your finger from the top way down to the bottom, the keyboard also follows your touch and slides down.

The sliding keyboard was first introduced by Apple in iMessage. Facebook adopted the feature in its chat heads implementation for their iOS app. I couldn’t think of a reason not to do the same in ours.

Daniel Amitay has a fairly popular open-source project called DAKeyboardControl that quickly did the job. A warm reminder: make sure you manually take care of the retain cycle, and remove the observer properly in dealloc.

- (void)viewDidLoad {
    [super viewDidLoad];

    __weak typeof(self) weakSelf = self;
    [self.view addKeyboardPanningWithActionHandler:^(CGRect keyboardFrameInView) {
        // Be careful of the retain cycle: use weakSelf instead of self
        // when you need to add custom logic in the block.
        CGRect tableViewFrame = weakSelf.messageView.frame;
        tableViewFrame.size.height = keyboardFrameInView.origin.y;
        weakSelf.messageView.frame = tableViewFrame;
    }];
}

- (void)dealloc {
    [self.view removeKeyboardControl];
}

You’ll be able to download all the code as a gist.

Conclusion

We were very lucky to be able to build an MVP that demonstrated our idea and highlighted our thoughts and attention to UX and details. It brought us into the final top three teams, and, as a bonus, we got a good night’s sleep. ;P

I’m very happy to have been part of the hackathon and the team. We’ll continue developing Ripple, and we’ll see how it goes!

Update: Ripple is live on the App Store

Special thanks to Greg Gopman, Joshua Slayton, and Cassy Lau for hosting, sponsoring, and giving advice to AngelHack attendees. Thanks to Ken for inviting me and for the free tickets. And also to my teammates Christopher, Benny Ng, Vincent, and Meng To, who let me use the cover images and supported me throughout this blog post!
