Visualize Your Networks in Tensorboard

Gary (Chang, Chih-Chun)
Published in Deep Learning
May 29, 2018 · 3 min read
https://www.tensorflow.org/programmers_guide/graph_viz

I’m implementing some prominent CNN architectures in TensorFlow, and I found that they are hard to debug, especially when there are thousands of lines of code. Fortunately, TensorFlow provides a visualization tool, TensorBoard, which makes it easy to inspect the network and the data flow. In this article, I’ll demonstrate how to build a network graph.

The example code is from here.

tf.name_scope()

To begin with, I have to introduce the scope function tf.name_scope(), which creates a scope for the names of the ops you create in the network. This affects how you refer to the tensors, how the graph is displayed in TensorBoard, and so on.

Check here for more details.
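As a quick illustration of how tf.name_scope() changes op names (a minimal sketch; the scope and tensor names here are illustrative, and the output names are what graph mode produces by default):

```python
import tensorflow as tf

# Build a graph explicitly so op naming behaves as in TF1 graph mode.
graph = tf.Graph()
with graph.as_default():
    with tf.name_scope('layer1'):
        weights = tf.constant([[1.0]], name='weights')
    with tf.name_scope('layer1'):
        # Reusing a scope name makes TensorFlow uniquify it ('layer1_1').
        other = tf.constant([[2.0]], name='weights')

print(weights.name)  # layer1/weights:0
print(other.name)    # layer1_1/weights:0
```

In the TensorBoard graph view, every op created under the same scope is collapsed into a single expandable block named after that scope.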

tf.summary.scalar()

tf.summary.histogram()

def variable_summaries(var):
    with tf.name_scope('summaries'):
        mean = tf.reduce_mean(var)
        tf.summary.scalar('mean', mean)
        with tf.name_scope('stddev'):
            stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
        tf.summary.scalar('stddev', stddev)
        tf.summary.scalar('max', tf.reduce_max(var))
        tf.summary.scalar('min', tf.reduce_min(var))
        tf.summary.histogram('histogram', var)

We use tf.summary.scalar and tf.summary.histogram to record scalars derived from variables (such as the mean, stddev, max, and min) and their histograms in TensorBoard. With the function defined above, we can pass any important variable in for visualization.

...
variable_summaries(weights)
variable_summaries(bias)

When the training and testing steps are finished, we can visualize the results by entering tensorboard --logdir=THE_SUMMARY_SAVE_PATH to launch TensorBoard.

Starting TensorBoard 41 on port 6006
(You can navigate to http://xxx.xx.xx.x:6006)
[Figures: the scalar summaries of the weights and biases, and the histograms]
...
tf.summary.scalar('cross_entropy', cross_entropy)
tf.summary.scalar('accuracy', accuracy)

To define the graph

with tf.name_scope('input'):
    ...
with tf.name_scope('input_reshape'):
    ...
with tf.name_scope('layer1'):
    ...
with tf.name_scope('dropout'):
    ...
with tf.name_scope('layer2'):
    ...
with tf.name_scope('cross_entropy'):
    ...
with tf.name_scope('train'):
    ...
with tf.name_scope('accuracy'):
    ...

With these scopes in place, we can see the graph in TensorBoard.

●The input block holds the placeholders for the training images and their corresponding labels.

●The input_reshape block reshapes the 1D inputs back into the original 2D images for visualization.

●Network forward path: input → layer1 → dropout → layer2 → accuracy/cross_entropy (to calculate the accuracy and the loss).

●The train block minimizes the loss and updates the weights and biases, so there are backward paths to the layers.
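To make the skeleton above concrete, each layer block can be written along the lines of TensorFlow’s official mnist_with_summaries example. This is a sketch, not the article’s exact code: the helper name nn_layer and the layer dimensions are illustrative, variable_summaries is restated so the snippet is self-contained, and the TF1-compatible API shim is used:

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()


def variable_summaries(var):
    """Attach mean/stddev/min/max/histogram summaries to a tensor."""
    with tf.name_scope('summaries'):
        mean = tf.reduce_mean(var)
        tf.summary.scalar('mean', mean)
        with tf.name_scope('stddev'):
            stddev = tf.sqrt(tf.reduce_mean(tf.square(var - mean)))
        tf.summary.scalar('stddev', stddev)
        tf.summary.scalar('max', tf.reduce_max(var))
        tf.summary.scalar('min', tf.reduce_min(var))
        tf.summary.histogram('histogram', var)


def nn_layer(input_tensor, input_dim, output_dim, layer_name, act=tf.nn.relu):
    """A fully connected layer wrapped in a name scope for TensorBoard."""
    with tf.name_scope(layer_name):
        with tf.name_scope('weights'):
            weights = tf.Variable(
                tf.truncated_normal([input_dim, output_dim], stddev=0.1))
            variable_summaries(weights)
        with tf.name_scope('biases'):
            biases = tf.Variable(tf.constant(0.1, shape=[output_dim]))
            variable_summaries(biases)
        with tf.name_scope('Wx_plus_b'):
            preactivate = tf.matmul(input_tensor, weights) + biases
            tf.summary.histogram('pre_activations', preactivate)
        activations = act(preactivate, name='activation')
        tf.summary.histogram('activations', activations)
        return activations


# Two blocks named layer1 and layer2 then appear in the graph view.
x = tf.placeholder(tf.float32, [None, 784], name='x-input')
hidden = nn_layer(x, 784, 500, 'layer1')
y = nn_layer(hidden, 500, 10, 'layer2', act=tf.identity)
```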

Merge and Save

At the end, we merge all the summaries and write them out to the saving path.

merged = tf.summary.merge_all()
train_writer = tf.summary.FileWriter(FLAGS.log_dir + '/train',
                                     sess.graph)
test_writer = tf.summary.FileWriter(FLAGS.log_dir + '/test')
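During training, the merged summary op is evaluated together with the training op, and the result is written out tagged with the step index. Here is a self-contained toy example (the quadratic “loss” on a single variable is a stand-in for the real model, and the log directory is a temporary path; only the merged/FileWriter pattern is the point):

```python
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

# Tiny stand-in graph: a scalar variable we "train" toward zero.
x = tf.Variable(5.0, name='x')
loss = tf.square(x, name='loss')
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)
tf.summary.scalar('loss', loss)

merged = tf.summary.merge_all()
log_dir = tempfile.mkdtemp()

with tf.Session() as sess:
    writer = tf.summary.FileWriter(log_dir + '/train', sess.graph)
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        # Evaluate the merged summaries alongside the training op,
        # then write them out with the current step index.
        summary, _ = sess.run([merged, train_step])
        writer.add_summary(summary, step)
    writer.close()
    final_loss = sess.run(loss)

print(final_loss)  # close to 0 after 100 steps
```

Pointing tensorboard --logdir at log_dir then shows the loss curve over the 100 steps under the Scalars tab.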

This is the end of this brief introduction to TensorBoard. If there are any questions, please let me know. Thank you!

If you like this article and consider it useful for you, please support it with 👏.
