Google Cloud Just Changed the World
Imagine if you created a unique way to increase the performance of enterprise software sales teams by using machine learning and artificial intelligence to help them focus on the leads most likely to close. You clean up a massive dataset, run it in the cloud in less than a day, package the result up as a service, and sell the best-clustered centroid to the highest bidder.
Within years, a new generation of entrepreneurs will be taking their AI ideas to the cloud, processing those ideas into real-world services and products, and reselling them online — and Google is leading the way to make that a reality even sooner than we imagined.
Jeff Hawkins, the inventor of the Palm Pilot and founder of the extraordinary AI firm Numenta, wrote a book in 2004 called On Intelligence. It was a fascinating critique of the trends of the day in artificial intelligence, which focused primarily on rule-based interactions to mimic intelligence. Jeff and his team have proposed a more biological approach to true artificial intelligence, modeled on the inner workings of the human brain: sparse distributed representations, or SDRs.
SDRs are essentially groups of patterns, frameworks, and adjacent memory troves feeding backwards and forwards within the structures of our brains to bring understanding and apply intelligence to our world. I believe that the most recent roll-out of Google Cloud injects significant energy into the worlds of machine learning and artificial intelligence, and gets us closer to the SDR approaches that Hawkins believes will change the world.
Google Cloud has existed in one form or another for a while and encompasses a host of services exposed for application development. Their newest iteration includes serious advancements. Essentially, Google is going toe-to-toe with Amazon Web Services.
Two recent deployments, however, have set Google ahead of the others.
- Google has deployed services and frameworks customized around specific functional, topical and informational models.
- They’ve given anyone with a Gmail account a $300 credit to start hacking away with these services!
The last piece is a fun development for any aspiring technologist.
For example, in the last 24 hours I spent approximately $7.60 of that $300 credit running a Linux virtual machine on an Intel Broadwell 8-virtual-CPU machine with 30 GB of memory. I used that configuration on their Compute Engine to train a recurrent neural network on a dataset of tweets from President Trump, simulating a twitterbot in his milieu.
(Here’s an example when I primed the trained bot with the phrase: “I love twitter.”)
I love twitter. For President Obama a marine ordered that @donlemon is doing arrogant irrelevant stuff. The debate tonight. @TrumpSoHo. THANK YOU!
Ummm… obviously I still have some refinements to do, but you get the idea :)
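I can't share the trained TensorFlow model here, but the core idea of priming a text generator with a seed phrase can be sketched with a much simpler stand-in: a character-level Markov chain. (Everything below — the toy corpus, the function names, the order-3 context — is my own simplified illustration, not the actual RNN code.)

```python
import random
from collections import defaultdict

# Toy stand-in for the trained RNN: a character-level Markov chain.
# The real model was a TensorFlow recurrent network; this just shows
# how a generator primed with a seed phrase continues the text.

def train(corpus, order=3):
    """Map each `order`-character context to the characters that follow it."""
    model = defaultdict(list)
    for i in range(len(corpus) - order):
        context = corpus[i:i + order]
        model[context].append(corpus[i + order])
    return model

def generate(model, seed, length=80, order=3, rng=None):
    """Prime with `seed`, then sample one character at a time."""
    rng = rng or random.Random(0)  # fixed seed for repeatability
    out = seed
    while len(out) < length:
        context = out[-order:]
        choices = model.get(context)
        if not choices:  # unseen context: stop early
            break
        out += rng.choice(choices)
    return out

# Made-up miniature corpus standing in for the tweet dataset
corpus = "I love twitter. THANK YOU! The debate tonight was great. I love winning."
model = train(corpus)
print(generate(model, "I love "))
```

The real RNN learns a far richer representation than these fixed three-character contexts, which is why it can produce longer (if still garbled) runs of Trump-flavored prose.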
I can only recall one time in my 20+ year career as an internet strategist when I've had access to such power — and that was on-site, setting up a server cage. Yesterday I summoned a massive machine from the cloud, ran an advanced TensorFlow deep-learning job in Python, and saved nearly 3 GB of processed data. Now I can download that data model, stop the VM, and turn it back on when I'm ready to take on the next big CPU-intensive bout.
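That start-train-stop workflow looks roughly like this with the gcloud command-line tool. The instance name, zone, and file path are placeholders I've made up; `n1-standard-8` is the machine type matching the 8-vCPU / 30 GB configuration above.

```shell
# Create an 8-vCPU, 30 GB VM (n1-standard-8); name and zone are examples
gcloud compute instances create tweet-bot-vm \
    --machine-type=n1-standard-8 \
    --zone=us-central1-a

# ...train the model on the VM, then copy the results down...
gcloud compute scp tweet-bot-vm:~/model.tar.gz . --zone=us-central1-a

# Stop the VM so you pay only for storage, not compute
gcloud compute instances stop tweet-bot-vm --zone=us-central1-a

# Restart it for the next CPU-intensive bout
gcloud compute instances start tweet-bot-vm --zone=us-central1-a
```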
The second advancement Google Cloud has introduced is purpose-built hardware, software, and virtual environments for specific operations — somewhat akin to what Hawkins aims for with SDRs.
For example, in their “Big Data” product set of Google Cloud they have created a set of APIs and quick-start applications called Google Genomics. Genomics is one of my favorite topics, but it requires very specific datasets to run these comparisons. Something like the CDC CDI analysis I did can probably be performed on the regular Compute Engine, but a dataset of gene data, even your own downloaded DNA analysis from AncestryDNA, runs to tens of thousands of rows.
Google Genomics on Google Cloud provides detailed APIs hooking up to the Human Genome Project among other resources for quick data processing.
Other tools include frameworks for publishing, machine-learning engines, and data flows. Google has taken a thematic approach to building its infrastructure, and with a host of tutorials you can spin up your own VM within minutes.
As Hawkins and others noted in their article last year:
We believe you can’t get to machine intelligence by incrementally building upon the simple neuron approach, but instead must throw it away and start over with a more realistic biological approach.
Imagine a world where your incredible idea could take life within minutes because of the power of distributed cloud computing at your fingertips!
— about the author:
Justin Hart is a senior executive consultant.
His primary objective: plumb the depths of cutting-edge technologies and translate them into C-suite strategies to improve marketing and sales teams.
shorter version: mktg + bizdev + ai
Justin is a recognized industry speaker on modern marketing trends. He is currently working with several companies applying advanced tech tools like machine learning and artificial intelligence to business funnel basics.
You can find his work online at justinhart.biz.
Email Justin at justinhart.biz at gmail.
On twitter @justin_hart.
On Medium: Justin Hart.
Justin has over 20 years’ experience as a senior executive at established companies, start-ups, and even political campaigns (as senior digital director for the Mitt Romney campaign). He currently resides in Southern California.