HackMIT 2015 Winners

Claire Nord · Published in HackMIT Stories · Oct 9, 2015

Congratulations to all of the teams that competed in this year’s HackMIT! Teams had 24 hours to use their ingenuity and technical skills to build and test their hack, then present it. The teams below were selected by judges through our voting algorithm as this year’s winners.

1. Air Guitar

G. Beams, K. Leidal, M. McEachern, N. Matthews

Air Guitar uses the Apple Watch and iPhone to reimagine the air guitar concept. The Apple Watch's accelerometer, combined with the iPhone's multitouch interactivity, provides the perfect mechanism for creating air guitar music by tracking strum and finger patterns. The simple, elegant iOS application is the result of a highly technical implementation involving noise filtering of accelerometer data, modification of the Karplus-Strong string synthesis algorithm, and integration of C++ code into Swift for efficient audio buffering. Air Guitar will be on the App Store soon!
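The team's implementation isn't shown here, but the core of Karplus-Strong synthesis is small enough to sketch. This minimal Python version (function name and parameters are illustrative, not from the project) excites a pitch-length delay line with a burst of noise, then repeatedly low-passes and decays the feedback to produce a plucked-string tone:

```python
from collections import deque
import random

def karplus_strong(freq_hz, sample_rate=44100, duration_s=1.0, decay=0.996):
    """Synthesize a plucked-string tone with the Karplus-Strong algorithm."""
    n = int(sample_rate / freq_hz)  # delay-line length determines the pitch
    # Excite the string: fill the delay line with a burst of white noise.
    buf = deque(random.uniform(-1.0, 1.0) for _ in range(n))
    out = []
    for _ in range(int(sample_rate * duration_s)):
        first = buf.popleft()
        # Average the two oldest samples (a simple low-pass filter),
        # scale by the decay factor, and feed the result back in.
        buf.append(decay * 0.5 * (first + buf[0]))
        out.append(first)
    return out

# e.g. one second of a 440 Hz "pluck"
samples = karplus_strong(440.0)
```

The averaging filter damps high frequencies faster than low ones, which is what gives the output its characteristic string-like timbre; in a real app the samples would be written to an audio buffer rather than a Python list.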

2. Kinarity

B. Baker, D. Pham, H. Wu, M. Wang

We’ve created a way for the blind to navigate and discover the world around them more clearly than ever before. Using a Kinect, Clarifai, and your headphones, we can help you avoid obstacles and experience your surroundings in a unique way.

3. Sensei

A. Gupta, P. Mukherjee, A. Zhang, J. Luo

Sensei provides analytics for the real world using security camera footage.

Left: The Sensei team. Right: Sensei in action.

And the rest of the 2015 HackMIT finalists, in alphabetical order:

ap(eye)

A. Rao, H. Kemburu, J. Patel, T. Hyon

ap(eye) allows anyone with just a smartphone to easily experience intelligent augmented reality. Simply point your camera at an object around you, and ap(eye) provides relevant, contextual connections to your favorite apps such as Uber, Fitbit, and Tinder. This system is extensible and designed to support many more APIs in the future.

CelluScope

A. Winfree, D. Huculak, P. Ho, K. Lampotang

CelluScope is an iOS application that uses an attachable lens to take 200–300X magnified photos of blood smears. The image is uploaded to our server using Parse and processed with Clarifai’s machine learning API. We trained the image classifier to recognize healthy red blood cells, sickle cells, and malarial cells. The blood sample’s classification is then returned and displayed in the mobile application. CelluScope can diagnose these two diseases in the field at relatively low cost compared with existing methods or hiring a full-time expert — and blood diagnosis is just the application we chose to focus on; the approach generalizes to other image classification problems. The CelluScope team is currently developing an Android application to complement the iOS one.

inCync

R. Delaney, P. Sathyanarayanan, M. Surajiwale

inCync helps you deliver presentations like a professional. Use inCync to set up your presentation’s duration and notification times, and you and all your friends will receive gentle vibrations in your pockets so that everyone knows exactly how far into the presentation you are.

LeanOnMe

A. Trattner, L. Jing, N. Buduma

We built a peer support network for universities to reduce the barrier to entry for mental health resources on campus.

LiSense

P. Zhang, A. Kumar, E. Shin, S. Nahar

LiSense generates powerful analytic insights about people in a venue. It pairs a laser sensor with a machine learning model to detect individuals and crowds in a large space, then visualizes customer engagement metrics on a web interface. LiSense has many practical applications, from physical A/B testing of store promotions and quantitatively measuring exhibit popularity to optimizing foot traffic through public spaces.

LiSense visualization of people and objects in an environment: the black cells represent objects and the orange cells represent people.

ReacTV

V. Mayar, L. Araujo, P. Dhariwal, P. Nagaraj

Enjoy watching TV with your loved ones even when they’re not there! Post your reactions during a show or movie, and see your friends’ reactions during dramatic or funny moments directly on screen.

Tinder+

C. Denny, X. Ren, Y. Zhang, Y. Liu

We use the Clarifai deep learning image service to teach our Azure server your Tinder preferences. The server can then automatically decide to swipe left or right on potential baes.

Tinder+ presenting their hack (photo by Michelle Zheng).

Thanks to all the hackers, mentors, judges, sponsors, and team members that made this year’s HackMIT amazing. We look forward to all the hacks and hullabaloo to come next year!
