Don’t leave your models to rot into obscurity

So you’ve deployed your machine learning model to the cloud and all of your apps and services are able to fetch predictions from it, nice! You can leave that model alone to do its thing forever… maybe not. Most machine learning models are modeling something about this world, and this world is constantly changing. Either change with it, or be left behind!

What is model rot?

Model rot, data rot, AI rot, whatever you want to call it, it’s not good! Let’s say we’ve built a model that predicts if a zombie is friendly or not. We deploy it to the cloud and now apps all over the world are using it to help the general public know which zombies they can befriend without getting bitten. Amazing, people seem super happy with your model, but after a couple of months you start getting angry emails from people who say that your model is terrible! Turns out that the zombie population mutated! Now your model is out of date, or rotten! …
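If you want to catch the rot before the angry emails arrive, the basic idea is simple: keep scoring the deployed model against freshly labelled examples and compare that score to the accuracy it had at launch. Here is a minimal sketch of that check; `model`, `fetch_recent_labelled_data` and the thresholds are hypothetical stand-ins for illustration, not anything from a real monitoring stack.

```python
# A minimal, hypothetical rot check: score the live model on freshly labelled
# data and compare against the accuracy measured at launch.
from sklearn.metrics import accuracy_score

LAUNCH_ACCURACY = 0.92   # accuracy measured when the model was first deployed
ROT_TOLERANCE = 0.05     # how much decay we tolerate before retraining

def check_for_rot(model, fetch_recent_labelled_data):
    """Return True if the model looks rotten on recent, labelled data."""
    X_recent, y_recent = fetch_recent_labelled_data()
    recent_accuracy = accuracy_score(y_recent, model.predict(X_recent))
    rotten = recent_accuracy < LAUNCH_ACCURACY - ROT_TOLERANCE
    print(f"recent accuracy: {recent_accuracy:.2f} (launch was {LAUNCH_ACCURACY:.2f})")
    return rotten
```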



Recently a friend got me into basketball. Turns out, it’s a lot harder than it looks. No matter, I can over-engineer a solution using machine learning. If you’re into ML and shooting hoops, there’s also this article that combined TensorFlow and basketball in a simulation.

nothing but net… if there was a net

The task is to find the exact angle of my shots. Then I can hopefully use that information in a proactive way to get better.

psst! the code for all of this is on my GitHub

Task 1: collecting data

I didn’t need to follow the seams of the ball, but it looks cool

I don’t have access to 3D tracking studios fitted with 200 cameras, but I do have eBay. It’s quite easy to buy reflective tape online and stick it to the ball. Then (thanks to the lack of lighting at my local court) I can record some footage of myself practising in the evening and capture the ball’s movement. …
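To give a feel for how footage like that can be turned into numbers, here is a rough sketch of the idea (not the exact pipeline from my GitHub): in dark frames the reflective tape shows up as the brightest blob, so a simple threshold plus contour detection gets you surprisingly far. The filename, threshold value and the crude launch-angle estimate are all placeholder assumptions.

```python
# Rough sketch: in dark footage the reflective tape is the brightest thing in
# frame, so threshold for bright pixels and track the biggest blob.
# "evening_practice.mp4", the 220 threshold and the 6-point angle estimate are
# placeholder assumptions.
import cv2
import numpy as np

cap = cv2.VideoCapture("evening_practice.mp4")
positions = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, bright = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(bright, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        ball = max(contours, key=cv2.contourArea)      # biggest bright blob = the ball
        (x, y), _ = cv2.minEnclosingCircle(ball)
        positions.append((x, y))

cap.release()

# crude launch-angle estimate from the first few tracked points
# (image y grows downwards, so flip the sign)
if len(positions) > 5:
    dx = positions[5][0] - positions[0][0]
    dy = positions[0][1] - positions[5][1]
    print(f"launch angle ~ {np.degrees(np.arctan2(dy, dx)):.1f} degrees")
```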


At the risk of alienating a lot of readers… I grew up with GUIs, so I never needed to learn the ways of the terminal! This is socially acceptable in today’s society of friendly user interfaces, except maybe if you’re in software engineering… whoops!

Manager — “Oh this is easy, just use this command line app”, Me — ”Yes…the command line… I’ll use that…”

Getting back my software engineering street cred’

I’ve gotten pretty far just by copying and pasting full command-line operations from Stack Overflow, but I’ve never become comfortable enough to use a command-line app “properly”. …


Manually labelling data is nobody’s favourite machine learning chore. You needn’t worry about asking others to help out, though, provided you can give them a pleasant tool for the task. Let me present to you: Google Forms generated using Google Apps Script!

Google Apps Script lets you build automation between Google apps

The regular way people might label data is just by typing the labels into a spreadsheet. I would normally do this as well; however, in a recent task I needed to label paragraphs of text. Have you ever tried to read paragraphs of text in a spreadsheet? It’s hell! …


Here’s a dataset that is designed to help showcase when a Recurrent Convolutional Neural Network (RCNN) will outperform its non-recurrent counterpart, the Convolutional Neural Network (CNN).

A little primer

Recurrent models are specially designed to use a sequence of data in making their predictions (e.g. a stock market predictor that uses a sequence of data points from the past 3 days).

Convolutional models are specially designed to work well with image data.

So a Recurrent Convolutional model is a model specially designed to make predictions using a sequence of images (more commonly known as video). …
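As an illustration only (not the architectures used with the dataset), here is roughly what the two competitors could look like in Keras: a plain CNN that classifies a single frame, versus a recurrent CNN that runs the same convolutional feature extractor over every frame of a short clip and lets an LSTM reason over the sequence. Frame size, sequence length and layer sizes are arbitrary placeholders.

```python
# Hedged sketch contrasting a frame-at-a-time CNN with a recurrent CNN that
# sees a short sequence of frames. All shapes and layer sizes are placeholders.
from tensorflow.keras import layers, models

FRAME_SHAPE = (32, 32, 1)   # a single small greyscale frame
SEQ_LEN = 5                 # frames per video clip

# plain CNN: input is one frame, no notion of motion
cnn = models.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=FRAME_SHAPE),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),
])

# recurrent CNN: the same convolutional feature extractor is applied to every
# frame (TimeDistributed), then an LSTM reasons over the sequence of features
rcnn = models.Sequential([
    layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu"),
                           input_shape=(SEQ_LEN, *FRAME_SHAPE)),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Flatten()),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])

cnn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
rcnn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

The recurrent version only has an edge when the label genuinely depends on motion across frames, which is exactly the situation the dataset is built to showcase.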



A time might arise when you’re going to need to predict using rotational data, either as a feature or as a target. Plugging degrees straight into your model might seem to work, but be wary: it’s not doing what you want.

Why machine learning algorithms hate degrees

Simply put, they are not smooth! What I mean is that the degrees scale teleports from 359 back to 0 degrees as it progresses. Look:


In this format a gradient descent algorithm won’t know that 350 degrees is only 10 degrees away from 0 degrees, and therefore will never produce a robust model.

Sin and Cos to the rescue!

To overcome these issues we just convert the degrees to sin and cos and use those as our features. Now there is no teleporting and distances are appropriately…
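For concreteness, here is a minimal NumPy sketch of that encoding (the example angles are arbitrary):

```python
# Replace a raw degrees column with its sine and cosine so that 359 degrees
# and 0 degrees end up numerically close.
import numpy as np

degrees = np.array([0, 10, 180, 350, 359])
radians = np.deg2rad(degrees)

angle_sin = np.sin(radians)
angle_cos = np.cos(radians)

# feed these two columns to the model instead of the raw degrees
features = np.column_stack([angle_sin, angle_cos])

# 350 and 0 degrees are now close in feature space, unlike on the raw scale
print(np.linalg.norm(features[3] - features[0]))   # small distance
print(abs(degrees[3] - degrees[0]))                # looks huge (350)
```

If the angle is the target rather than a feature, the same trick works in reverse: predict the sin/cos pair and recover the angle with np.degrees(np.arctan2(sin, cos)).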


Having recently finished my stint in the British educational system, I’ve moved into the field of machine learning. Whilst learning the ropes I’ve come to see extremely strong parallels between how we train machine learning systems and how we teach in the classroom. Some of these parallels help highlight potential flaws in the ways we teach.

1. Learn by examples (the super obvious one)

As you may remember from your textbooks, sometimes the only way you can grasp a technique is by seeing multiple answers and working out for yourself the general pattern that must be followed. …



Hollywood has made many big promises about artificial intelligence (AI) in the past: how it will destroy us, how it will save us, and how it will pass us butter. One of the less memorable things that has been promised is how cool it will look.

Jarvis AI interacting with Ultron AI ©Marvel

A great example of amazing AI visualisation is in Avengers, when Tony Stark’s AI butler Jarvis interacts with Ultron: we see this organic floating network of light morphing and pulsing.

I wanted to make something similar to fill some blank space on my apartment wall (it should be better than the usual IKEA art). Obviously I’m not going to be able to recreate anything as amazing as the floating orb of light that is Jarvis. However, there is a machine learning algorithm that could look interesting with some quirky data visualisation: a Neural Network! …

About

Zack Akil

(www.zackakil.com) Generalist programmer with knowledge in artificial intelligence, electronics and 3D printing. Developer Advocate of Machine Learning @ Google
