One Shoe Hackathon: Conny

12 developers, two teams, 24 hours. The first One Shoe Hackathon started on Friday, February 17, at noon (yesterday, in fact, at the time of writing, although it feels like a week ago). The subject picked by our organizing colleagues (thanks to Yaron and Robin) was Artificial Intelligence.

We started out with three teams brainstorming ideas. After the presentations and a vote, two ideas remained, and the third team was split between the two winning ideas.

Our team implemented Conny, the next generation of people tracking. Based on camera images, Conny will keep track of where people are in the building. The Minimum Viable Product was slightly less ambitious: personalized greetings when entering the building.

Conny is based on several key technologies. The AI heavy lifting is performed by Amazon Rekognition. The frontend is a fairly simple HTML/CSS/JavaScript application that does not use any framework, although it does use the tracking.js JavaScript library for facial tracking. It communicates with a Laravel backend.

That’s me, not Marc. Just as handsome, though. This is a fairly late prototype, but the messaging had not been tied to the recognition yet.

Amazon Rekognition

Rekognition is one of Amazon’s Web Services (AWS). It has several image recognition tricks up its sleeve, and we were most interested in facial recognition. The service allows for storing a set of images (typically containing faces), and later using a new image (also containing a face) to find the stored face that most resembles the face in the new image. Using attributes of the original image, it is then possible to identify the person in the new image with a high degree of certainty. One important gotcha that caused us some grief: the Amazon documentation states that the PHP SDK needs image data to be base64 encoded before it is offered to the API, whereas it turns out the SDK takes care of that now, just as it does for most other language integrations. Other than that, the service was extremely easy to use.
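To illustrate that gotcha, here is a minimal sketch of the kind of search call we made with version 3 of the AWS SDK for PHP. The collection name, region and file name are made up for the example; the point is that the raw image bytes go straight into the Bytes field, without any manual base64 encoding.

```php
<?php

require 'vendor/autoload.php';

use Aws\Rekognition\RekognitionClient;

// Region and collection name are placeholders for this example.
$client = new RekognitionClient([
    'region'  => 'eu-west-1',
    'version' => 'latest',
]);

// Raw image bytes from the webcam snapshot. The SDK handles the encoding
// itself, so no base64_encode() here, despite what the docs suggest.
$result = $client->searchFacesByImage([
    'CollectionId'       => 'conny-faces',
    'Image'              => ['Bytes' => file_get_contents('snapshot.jpg')],
    'FaceMatchThreshold' => 90,
    'MaxFaces'           => 1,
]);

if (!empty($result['FaceMatches'])) {
    // The matched FaceId is what the backend maps back to a person
    // in order to show the personalized greeting.
    $faceId     = $result['FaceMatches'][0]['Face']['FaceId'];
    $similarity = $result['FaceMatches'][0]['Similarity'];
}
```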

Tracking.js

Tracking.js is a JavaScript library that “brings different computer vision algorithms and techniques into the browser environment.” We’ve used it for the initial facial tracking on the frontend: a snapshot is taken with the client’s webcam (fully automatically, I might add), sent to the backend, and matched against the data in Amazon Rekognition.

Laravel

Laravel is used to do the grunt work: it fills Rekognition with the initial data and stores user information (including which images in Rekognition go with which user), using version 3 of the AWS SDK for PHP to talk to Rekognition.
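Filling Rekognition works roughly the other way around: index a known photo of each colleague and remember which FaceId ends up belonging to which user. The sketch below assumes a hypothetical faces relation and photo_path attribute on the User model; our actual Laravel models looked different, but the idea is the same.

```php
<?php

use Aws\Rekognition\RekognitionClient;
use App\User;

// Same placeholder region and collection as in the earlier example.
$client = new RekognitionClient([
    'region'  => 'eu-west-1',
    'version' => 'latest',
]);

foreach (User::all() as $user) {
    // Index a known photo of this colleague into the collection.
    $result = $client->indexFaces([
        'CollectionId'    => 'conny-faces',
        'ExternalImageId' => (string) $user->id,
        'Image'           => ['Bytes' => file_get_contents($user->photo_path)],
    ]);

    // Remember which Rekognition FaceId belongs to which user, so a later
    // searchFacesByImage() match can be translated back to a person.
    // The faces() relation is made up for this example.
    foreach ($result['FaceRecords'] as $record) {
        $user->faces()->create(['face_id' => $record['Face']['FaceId']]);
    }
}
```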

The first One Shoe Hackathon was a great success. Thanks to the other team members: Robert (who now, in addition to Drupal, also knows Laravel), Robin (thanks for organizing, man), Johan, Dylan and Marc (who turns out not to be full of it when claiming he “used to be a web developer”). We’ll all need some sleep before considering a second edition of the Hackathon.