How to use Tensorboard with PyTorch in Google Colab

Andrew B. Martin
Published in Looka Engineering
3 min read · Feb 28, 2019


Why you’re here

PyTorch is the fastest-growing deep learning framework. It offers several benefits over the more established TensorFlow.

However, one area where PyTorch falls short of TensorFlow is ecosystem support. TensorFlow has a rich ecosystem of libraries that PyTorch lacks — libraries for serving models, deploying to mobile, and visualizing training. This last one is what interests me today: PyTorch doesn’t have a native training visualization tool like TensorFlow’s TensorBoard. That means it can be more time-consuming to set up training visualization with PyTorch than with TensorFlow, and you might decide not to set it up at all.

Cool TensorFlow visualisation

In this post I’ll show you two ways you can visualize your PyTorch model training when using Google Colab. The first uses the new Jupyter TensorBoard magic command, and the second uses the library tensorboardcolab. You can find a link to an example Colab notebook at the end of each section.

Tensorboard Colab magic

Magic commands come from the IPython kernel and are meant to concisely solve common problems in data processing. Conveniently, there’s now a TensorBoard magic command (just make sure you install the latest TensorFlow build).

# Install latest Tensorflow build
!pip install -q tf-nightly-2.0-preview
from tensorflow import summary
%load_ext tensorboard.notebook

Then instantiate the summary writers. In this case I have one for recording training and another for recording testing.

import datetime

current_time = str(datetime.datetime.now().timestamp())
train_log_dir = 'logs/tensorboard/train/' + current_time
test_log_dir = 'logs/tensorboard/test/' + current_time
train_summary_writer = summary.create_file_writer(train_log_dir)
test_summary_writer = summary.create_file_writer(test_log_dir)

And you can write to TensorBoard from your PyTorch training loop just like you would in TensorFlow code, by calling the summary writers after each batch or epoch.
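As a sketch (the metric names and helper function here are my own illustration, not from the original notebook), logging scalars through the writers created above looks like this:

```python
# Illustrative sketch: logging training metrics with tf.summary writers.
# Assumes a writer created with summary.create_file_writer as above;
# the function name and metric names are placeholders.
from tensorflow import summary

def log_metrics(writer, step, loss, accuracy):
    # Writing inside as_default() directs the scalars to this writer's
    # log directory, where TensorBoard picks them up.
    with writer.as_default():
        summary.scalar('loss', loss, step=step)
        summary.scalar('accuracy', accuracy, step=step)
```

In your training loop you would call something like log_metrics(train_summary_writer, step, loss.item(), acc) for training batches, and the same with test_summary_writer during evaluation.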

Then run

%tensorboard --logdir logs/tensorboard

It can take about five seconds for TensorBoard to load and start listening to the logs/tensorboard folder. Once TensorBoard is up, run:

train(model, train_loader, ...)

And you’ll see something like this:

Tensorboard running simultaneously with training!

Congratulations 🎉. You’re using PyTorch with TensorBoard in Colab. Note that this should also work in any Jupyter notebook using the IPython kernel as long as you install the latest TensorFlow.

To see how everything works together, check out this example Colab notebook.

The tensorboardcolab Library

The second way to use TensorBoard with PyTorch in Colab is the tensorboardcolab library. It works independently of the TensorBoard magic command described above.

This approach is similar to the magic command, except that instead of running TensorBoard inside your Colab notebook, it uses ngrok to tunnel the TensorBoard server running on the Colab VM out to a public URL. See this Stack Overflow answer for more details.

To use tensorboardcolab we’ll instantiate TensorBoardColab and then save values to it during training.

!pip install tensorboardcolab

from tensorboardcolab import TensorBoardColab
tb = TensorBoardColab()

While training, you save values to the TensorBoardColab instance by calling its save_value method inside your training loop.
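A minimal sketch (the loop structure and names are my own illustration; save_value(graph_name, value_name, step, value) is the tensorboardcolab logging call):

```python
# Illustrative PyTorch training loop that logs loss to TensorBoardColab.
# `tb` is a TensorBoardColab instance (or anything exposing save_value);
# model, loader, optimizer, and criterion are standard PyTorch objects.
import torch
from torch import nn

def train(model, loader, optimizer, criterion, tb, epoch=0):
    model.train()
    for step, (inputs, targets) in enumerate(loader):
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
        # Log the training loss; it appears under the "Loss" graph
        # in the TensorBoard dashboard.
        tb.save_value('Loss', 'train_loss',
                      epoch * len(loader) + step, loss.item())
```

Here tb is the TensorBoardColab instance created above; any object with the same save_value signature would work, which also makes the loop easy to test without a live tunnel.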

Then when you run

train(model, train_loader, ...)

You’ll see something like

Tensorboard running on an ngrok link

When you go to the link, you’ll see the TensorBoard dashboard you know and love.

Tensorboard dashboard using tensorboardcolab

To see how everything works together, check out this example Colab notebook.


There you have it, two ways to visualize your PyTorch training using TensorBoard and Google Colab. I’ve made example Colab notebooks for each of the approaches.



Good luck 😄

You can follow me on Twitter for more content @andrewbrownmart