CONTEXT: Invisible is adopting Machine Learning, and our first goal is to use ML to aid us in routing client delegations. When a client sends us an email, we’ll use the email’s content to infer whether the task should be routed to the “Calendar” team, the “Lead Sourcing” team, or to any of our other capabilities.
The Pareto principle states that, for many events, roughly 80% of the effects come from 20% of the causes.
We’re not out to become Machine Learning experts (yet), so we will try to make the most of existing frameworks and services. I built one proof of concept with each of these three solutions and will give you my analysis.
1. TensorFlow.js
My earlier article, “Finding intent to buy from Instagram comments with TensorFlow.js,” covers my first hands-on experience with it, from my work with Facebook’s Instagram Graph API at an influencer marketing company.
2. AWS SageMaker
Getting from the AWS Console to a working Jupyter notebook based on sample code was easier than expected. SageMaker only accepts Python, but importing libraries is easy. I appreciated the JupyterHub interface: it gives you a local filesystem, which makes for a very ergonomic place to do data massaging.
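To give a flavor of the kind of data massaging I mean, a first notebook cell might check how past delegations are distributed across teams before any training happens. This is a minimal sketch with made-up rows and hypothetical column names, not our actual schema:

```python
import csv
import io
from collections import Counter

# Hypothetical export of past delegations: one row per email,
# with its text and the team it was ultimately routed to.
SAMPLE = """text,team
"Please schedule a call with Acme for Tuesday",Calendar
"Find 50 SaaS founders in Berlin",Lead Sourcing
"Move my 3pm to Thursday",Calendar
"""

# Count labeled examples per team -- the first sanity check
# to run on a dataset before training any classifier on it.
rows = list(csv.DictReader(io.StringIO(SAMPLE)))
counts = Counter(row["team"] for row in rows)
print(counts.most_common())
```

In a real notebook, `SAMPLE` would be replaced by reading a file from the instance’s local filesystem.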
3. Google AutoML
I’ll tell you right off the bat: AutoML sounds too good to be true. I just dropped in a CSV file, and it did everything by itself. One of the coolest things is that Google AutoML gave me insight into my dataset and told me what to fix in order to get good predictions out of it. Under different circumstances, figuring that out could have taken me quite some time.
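For reference, the CSV I dropped in was nothing fancy. Here is a sketch of producing one, with made-up rows; the exact column layout AutoML expects is an assumption here, so check its import documentation before relying on it:

```python
import csv
import io

# Made-up labeled delegations: (email text, target team).
examples = [
    ("Please schedule a call with Acme for Tuesday", "Calendar"),
    ("Find 50 SaaS founders in Berlin", "Lead Sourcing"),
]

# AutoML-style upload: plain rows of text and label, no header row
# (assumption -- verify against the current AutoML import format).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerows(examples)
print(buf.getvalue())
```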
Comparison and Decision
For now, we have decided to go with Google AutoML:
* Google AutoML forced me to remove qualifications for which we had fewer than 100 datapoints (i.e. delegations), which left it with a less noisy dataset than the ones used for AWS SageMaker and TensorFlow.js.
* Google AutoML requires no code to be written.
* Even though AWS SageMaker does not completely abstract ML into a GUI like AutoML does, having a hosted JupyterHub does make things a lot nicer.
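The 100-datapoint cutoff AutoML imposed on us is easy to replicate by hand before uploading to any of the three services. A sketch of that filtering step, where the threshold is AutoML’s but the data is hypothetical:

```python
from collections import Counter

# Hypothetical labeled delegations: (email text, qualification).
delegations = (
    [("email body...", "Calendar")] * 150
    + [("email body...", "Rare Task")] * 12
)

# Drop qualifications with fewer than 100 datapoints, as AutoML required.
MIN_DATAPOINTS = 100
counts = Counter(label for _, label in delegations)
filtered = [(text, label) for text, label in delegations
            if counts[label] >= MIN_DATAPOINTS]

kept_labels = sorted({label for _, label in filtered})
print(kept_labels)  # only sufficiently represented qualifications survive
```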
I’m stoked about being able to leverage the power of Machine Learning to implement features that will have an immediate impact on the way we work.