TensorFlow 1.0 — Top 3

Participating in the first-ever TensorFlow Summit was interesting: lots of enthusiasm, the TF team has the right vibe, and the timing is right. I am sure we all have our own favorites in TF 1.0; let me enumerate mine.

Note: I have captured the slides from the respective presentations; you should watch the videos (links at the end of this blog) to get a good understanding.

The framework has matured into a robust, feature-rich & deployable artifact

Models, Interfaces & Templates

I like the architecture with the three upper boxes; it makes it a lot easier to develop deep-learning-based applications. The naming of the boxes is a little uninspired, though; I hope the engineers at Google come up with more expressive names. For example, I am using the word "boxes" whenever I mean "layers," because "Layers" is itself a layer in TF; I rest my case.

Layers

The "Layers" layer makes it easier to construct models directly from neural-network concepts without a lot of impedance mismatch. This is the vacuum that Keras filled.

It is not clear where tf.layers() stops and tf.keras() starts.

It is also not clear whether all of the Keras functionality and APIs are available. For example, Keras's model.summary() has been very valuable for me, both for getting an overview of a model and as an overall sanity check. I need to see what other summary tools TF 1.0 provides.

The other great part of Layers is the embedding of best practices: initialization, scoping, and so on. One caveat: I need to see how flexible this abstraction is for customization, for example when we come across an interesting initialization or normalization scheme in a paper and want to implement it in TF.
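To make the customization question concrete, here is a minimal NumPy sketch of what a dense layer with a pluggable initializer looks like conceptually. This is illustrative only, not the TF API; the function and initializer names here (`dense`, `scaled_uniform`) are my own, standing in for the kind of `kernel_initializer` hook one would hope the Layers abstraction exposes.

```python
import numpy as np

def dense(x, units, kernel_initializer, rng):
    """Sketch of a dense layer with a pluggable weight initializer.

    The initializer is passed in as a function, so a custom scheme
    from a paper can be dropped in without touching the layer itself.
    """
    in_dim = x.shape[-1]
    w = kernel_initializer((in_dim, units), rng)  # custom init hook
    b = np.zeros(units)                           # standard zero-bias default
    return x @ w + b

# A hypothetical custom initializer (scaled uniform), as one might
# port from a paper; purely illustrative.
def scaled_uniform(shape, rng, scale=0.05):
    return rng.uniform(-scale, scale, size=shape)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))        # batch of 4 examples, 8 features
y = dense(x, 16, scaled_uniform, rng)  # project to 16 units
print(y.shape)                         # (4, 16)
```

The point is the shape of the abstraction: if the layer API accepts an initializer as an argument, the "best practices by default" and "custom scheme from a paper" cases coexist cleanly.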

ML Toolkit

Finally, we can use TF for traditional machine learning as well! Not that we couldn't do it until now, but the toolkit makes it a lot easier.

The ML toolkit has the usual suspects but is still somewhat limited; it reminds me of the early evolution of MLlib in Apache Spark!

It would be instructive to compare the weighted ALS in TF with the ALS in Spark.
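For readers unfamiliar with what "weighted" adds, here is a NumPy sketch of one weighted-ALS half-step: solving for the user factors given fixed item factors, with a per-entry weight on each observation. This is my own illustration of the algorithm, not the TF or Spark API; the function name `wals_update_users` is hypothetical.

```python
import numpy as np

def wals_update_users(R, W, V, lam):
    """One weighted-ALS half-step: solve for user factors U given
    fixed item factors V, minimizing the weighted squared error
    sum_ij W[i,j] * (R[i,j] - U[i] @ V[j])**2 + lam * ||U||^2.
    Plain ALS is the special case where all weights are equal."""
    k = V.shape[1]
    U = np.zeros((R.shape[0], k))
    for i in range(R.shape[0]):
        Wi = np.diag(W[i])                  # weights for user i's row
        A = V.T @ Wi @ V + lam * np.eye(k)  # regularized normal equations
        b = V.T @ Wi @ R[i]
        U[i] = np.linalg.solve(A, b)        # closed-form per-user solve
    return U

rng = np.random.default_rng(0)
R = rng.random((5, 4))           # 5 users x 4 items of ratings
W = np.ones_like(R)              # uniform weights: reduces to plain ALS
V = rng.standard_normal((4, 2))  # item factors, rank 2
U = wals_update_users(R, W, V, lam=0.1)
print(U.shape)                   # (5, 2)
```

Alternating this step with the symmetric item-factor solve is the whole algorithm; the per-entry weights are what let implicit-feedback systems down-weight unobserved entries, which is where a TF-versus-Spark comparison would get interesting.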

Incidentally, I thought the co-training example was a little clunky. Let us see how the ML layer matures.

The ML toolkit has a good performance story, again very similar to the Apache Spark presentations.

Trivia

“We may be able to overcome all our AI problems, but the AV is still out of our reach” — Pete Warden, when the clicker stopped working

Links

  1. TF Summit 2017 Videos: https://www.youtube.com/playlist?list=PLOU2XLYxmsIKGc_NBoIhTn2Qhraji53cv
  2. Google Announcement: https://developers.googleblog.com/2017/02/announcing-tensorflow-10.html
  3. TensorForest Paper (NIPS 2016): https://sites.google.com/site/mlsysnips2016/accepted-papers
  4. TensorForest Code: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/tensor_forest
  5. TensorForest Poster: https://twitter.com/Reza_Zadeh/status/807614308016357376