TensorFlow 1.0 vs 2.0, Part 3: tf.keras

Why tf.keras? One ring to rule them all!

Yusup
AI³ | Theory, Practice, Business
5 min read · Nov 22, 2019


In the first two parts of this series, I covered Computation Graphs, plus Eager Execution and AutoGraph. However, as powerful as Graphs and Eager Execution may be, they simply aren’t that pleasant to use.

Whether you’re a seasoned researcher or simply a newcomer, the complexity of the TensorFlow ecosystem can easily overwhelm you. To lower the barrier to entry, the TensorFlow developers have placed tf.keras center stage.

In this post, I will introduce and discuss:

  • The differences between Keras and tf.keras;
  • Architecture and an example of tf.keras in action;
  • Layers, and the two ways of building models: Sequential and Functional.

Keras vs tf.keras

François Chollet, a Google software engineer, created Keras as an independent machine learning framework. It instantly gained popularity with its elegant API design and its support for multiple backends such as TensorFlow, CNTK, and Theano. tf.keras, on the other hand, is the implementation of the Keras API that ships with TensorFlow and, as of 2.0, TensorFlow’s central high-level API.

At a recent Keras SIG meeting, Mr. Chollet mentioned that multi-backend Keras will be discontinued after version 2.3.0 so that development can focus on tf.keras. The images below detail some important takeaways from that meeting.

In short, let’s stick to tf.keras! :)

tf.keras Architecture

Before I add anything else to make your head spin, let me sketch out the topology graph.

As the graph shows, tf.keras sits above the Graph, AutoGraph, and Eager execution modes. Among all the submodules of the Keras API (Engine, Layers, Losses, Metrics, Callbacks, Optimizers, Constraints, Regularizers, Model, and Sequential), the most important core abstractions, which we’ll discuss right after the example, are Layer, Model, and Sequential.

tf.keras Example

Without further ado, let’s see tf.keras in action:
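Here’s a minimal sketch: a small MNIST image classifier built with the Sequential API (the layer sizes and training settings here are illustrative, not prescriptive).

    import tensorflow as tf

    # Load and normalize the MNIST dataset
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0

    # A small feed-forward classifier built with the Sequential API
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(10, activation='softmax'),
    ])

    # Configure training: optimizer, loss, and metrics
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # Train and evaluate
    model.fit(x_train, y_train, epochs=5, validation_split=0.1)
    model.evaluate(x_test, y_test)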

This tf.keras example code is almost self-explanatory. Not only do we write less cumbersome Python code to implement the algorithm, the result is also easier to read and less error-prone!

tf.keras strikes a perfect balance between API clarity, brevity, and customizability. Let’s go through tf.keras core abstractions one by one.

1. Layer

Layers are the building blocks of tf.keras.

What’s so useful about layers? I can think of six things:

  1. Computation from a batch of inputs to a batch of outputs:
  • Works in eager and graph execution;
  • Supports a training mode and an inference mode;
  • Supports masking (for time series or missing features).

  2. Management of state (trainable weights, non-trainable weights).

  3. Tracking of losses and metrics (which can be created during ‘call’), and of updates, which can be disabled on demand.

  4. “Type checking” (automated compatibility checks when calling a layer), optionally with static shape inference.

  5. Ability to be frozen or unfrozen (for fine-tuning, GANs).

  6. Ability to be serialized/deserialized or saved/loaded (weight values).
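To make this list concrete, here is a minimal sketch of a custom layer: a Dense-like layer that subclasses tf.keras.layers.Layer, creates its weights in build, and defines its computation in call (the class name MyDense is purely for illustration).

    import tensorflow as tf

    class MyDense(tf.keras.layers.Layer):
        """A minimal Dense-like layer, for illustration only."""

        def __init__(self, units, **kwargs):
            super().__init__(**kwargs)
            self.units = units

        def build(self, input_shape):
            # State management: trainable weights are created lazily,
            # once the input shape is known
            self.kernel = self.add_weight(
                name='kernel',
                shape=(input_shape[-1], self.units),
                initializer='glorot_uniform',
                trainable=True)
            self.bias = self.add_weight(
                name='bias',
                shape=(self.units,),
                initializer='zeros',
                trainable=True)

        def call(self, inputs):
            # Computation from a batch of inputs to a batch of outputs
            return tf.matmul(inputs, self.kernel) + self.bias

Once defined, MyDense can be dropped into a Sequential or Functional model just like any built-in layer.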

2. Sequential vs Functional

There are two ways to build a model in tf.keras: with the Sequential class or with the Functional API.

Sequential

As the previous example showed, we can create a model by using the Sequential class. It’s a pretty intuitive way to build a model.

Because it’s a layer-by-layer architecture, it’s not suitable for complex scenarios such as:

  • Multiple differing input sources;
  • Production of multiple output destinations;
  • Any complex models with branches involved;
  • Models that reuse layers.

Functional API

An even more powerful way to build a model is to use the Functional API.

We can replace the Sequential model with a Functional model as such:
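Using the earlier MNIST sketch as the reference point, the Functional version would look roughly like this:

    import tensorflow as tf

    # Define the graph of layers explicitly, starting from an Input
    inputs = tf.keras.Input(shape=(28, 28))
    x = tf.keras.layers.Flatten()(inputs)
    x = tf.keras.layers.Dense(128, activation='relu')(x)
    x = tf.keras.layers.Dropout(0.2)(x)
    outputs = tf.keras.layers.Dense(10, activation='softmax')(x)

    # Tell tf.keras which tensors are the model's inputs and outputs;
    # it wires up everything in between
    model = tf.keras.Model(inputs=inputs, outputs=outputs)

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])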

Functional API is much more flexible than Sequential API because:

  • We can connect layers at will;
  • We can have multiple inputs and outputs;
  • Finally, we configure the model with its inputs and outputs, and tf.keras takes care of the rest.

Since Functional models can do all Sequential models do and more, Functional API is the way to go.

There is one thing to note, though, whether you use the Sequential or the Functional API. Starting from TensorFlow 2, tf.keras runs models with graph execution by default for the sake of performance. If you prefer eager mode for ease of use and debugging, you can opt in via the run_eagerly parameter when configuring the model:
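For example, continuing the sketch above (the same switch can also be toggled later through the model.run_eagerly property):

    # Opt in to eager execution for this model's train/eval/predict steps
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'],
                  run_eagerly=True)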

Other Useful Tools

tf.keras is not the only goodie the keras-team created; here are a few more nuggets:

Keras Tuner

Keras Tuner is a hyperparameter tuner created specifically for tf.keras with TensorFlow 2.0.

To tune the example above, we can change the code into something like this:

Building the model:

We start by defining a model-building function (build_model). It takes an argument, hp, from which you can sample hyperparameters such as hp.Int('units', min_value=32, max_value=512, step=32).
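A sketch of such a function, reusing the MNIST model from earlier (the kerastuner import path matches the package name at the time of writing; the hyperparameter range is illustrative):

    from tensorflow import keras
    from kerastuner.tuners import RandomSearch  # pip install keras-tuner

    def build_model(hp):
        model = keras.Sequential()
        model.add(keras.layers.Flatten(input_shape=(28, 28)))
        # Sample the hidden-layer width from the search space
        model.add(keras.layers.Dense(
            units=hp.Int('units', min_value=32, max_value=512, step=32),
            activation='relu'))
        model.add(keras.layers.Dense(10, activation='softmax'))
        model.compile(optimizer='adam',
                      loss='sparse_categorical_crossentropy',
                      metrics=['accuracy'])
        return model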

Configuring the tuner:

Our goal is the best validation accuracy, which is set via the objective parameter, and max_trials is the maximum number of hyperparameter combinations the tuner will test.
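Something along these lines (directory and project_name are just illustrative bookkeeping settings):

    tuner = RandomSearch(
        build_model,
        objective='val_accuracy',   # what "best" means for this search
        max_trials=10,              # how many hyperparameter combinations to try
        directory='tuner_logs',     # where trial results are stored
        project_name='mnist_tuning')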

Start tuning & getting the best models:

The parameters of tuner.search are identical to those of a regular model.fit call.
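Roughly, reusing the MNIST arrays from the first sketch:

    # Same call signature as model.fit
    tuner.search(x_train, y_train,
                 epochs=5,
                 validation_data=(x_test, y_test))

    # Retrieve the best model found during the search
    best_model = tuner.get_best_models(num_models=1)[0]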

tf.keras is pretty declarative, and Keras Tuner is not that much of a hassle, either.

AutoKeras

Auto-Keras is an open-source software library for automated machine learning (AutoML). It is extremely cool and promising. Maybe I will introduce this one in one of my upcoming posts.

In Summary

In short, tf.keras is awesome. It is user-friendly, modular, and extensible as advertised.

Now that we’ve covered tf.keras, let’s recap the ground you’ve gained in my series so far:

  • You have a clear understanding of TensorFlow computational graphs;
  • You know how Eager execution works;
  • You know how AutoGraph does its magic.

In part 4, I will cover TensorBoard. Stay tuned! :)

Thanks for reading! If you enjoyed this article, please hit the clap button as many times as you can. It would mean a lot and encourage me to keep sharing my knowledge.

Feel free to share your questions and comments here, and follow me so you don’t miss the latest content!
