How to use Tensorboard with PyTorch in Google Colab
Why you’re here
PyTorch is one of the fastest growing deep learning frameworks, and it offers several benefits over the more established TensorFlow.
However, one area where PyTorch falls short of TensorFlow is ecosystem support. TensorFlow has a rich ecosystem of libraries that PyTorch lacks, for example for serving models, deploying to mobile, and visualizing training. That last one is what interests me today. In particular, PyTorch doesn’t have a native training visualization tool like TensorFlow’s TensorBoard. This means it can be more time consuming to set up a visualization of your training with PyTorch than with TensorFlow, and you might decide to not set up visualization at all.
In this post I’ll show you two ways you can visualize your PyTorch model training when using Google Colab. The first uses the new Jupyter TensorBoard magic command, and the second uses the library tensorboardcolab. You can find a link to an example Colab notebook at the end of each section.
Tensorboard Colab magic
Magic commands come from the IPython kernel and are meant to concisely solve common problems in data processing. Conveniently, there’s now a TensorBoard magic command (just make sure you install the latest TensorFlow build).
# Install the latest TensorFlow build
!pip install -q tf-nightly-2.0-preview

from tensorflow import summary
%load_ext tensorboard.notebook
Then instantiate the summary writers. In this case I have one for recording training and another for recording testing.
import datetime

current_time = str(datetime.datetime.now().timestamp())
train_log_dir = 'logs/tensorboard/train/' + current_time
test_log_dir = 'logs/tensorboard/test/' + current_time
train_summary_writer = summary.create_file_writer(train_log_dir)
test_summary_writer = summary.create_file_writer(test_log_dir)
And during training you can write to TensorBoard just like you would in your TensorFlow code, using the summary writers created above.
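As a minimal sketch of what that looks like (the model, data, and loss here are stand-ins for illustration; in your own code you'd use your real model and `train_loader`), the key part is logging a scalar inside the writer's `as_default()` context:

```python
import datetime

import torch
import torch.nn as nn
from tensorflow import summary

current_time = str(datetime.datetime.now().timestamp())
train_summary_writer = summary.create_file_writer(
    'logs/tensorboard/train/' + current_time)

model = nn.Linear(4, 1)              # stand-in for your real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

x = torch.randn(32, 4)               # stand-in for a batch from train_loader
y = torch.randn(32, 1)

for epoch in range(3):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

    # The key part: write the scalar inside the writer's context,
    # tagged with the step so TensorBoard can plot it over time.
    with train_summary_writer.as_default():
        summary.scalar('loss', loss.item(), step=epoch)
```

You'd do the same with `test_summary_writer` for your evaluation metrics; TensorBoard overlays the two runs because they live in sibling log directories.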
Then run
%tensorboard --logdir logs/tensorboard
It can take a few seconds for TensorBoard to load and start listening to the logs/tensorboard folder. Once you see TensorBoard up, run:
train(model, train_loader, ...)
And you’ll see something like this:
Congratulations 🎉. You’re using PyTorch with TensorBoard in Colab. Note that this should also work in any Jupyter notebook using the IPython kernel as long as you install the latest TensorFlow.
To see how everything works together, check out this example Colab notebook.
The tensorboardcolab Library
The second way to use TensorBoard with PyTorch in Colab is the tensorboardcolab library. This library works independently of the TensorBoard magic command described above.
This approach is similar to the TensorBoard magic command, except that instead of running TensorBoard inside your Colab notebook, it uses ngrok to tunnel TensorBoard to localhost. See this Stack Overflow answer for more details.
To use tensorboardcolab we’ll instantiate TensorBoardColab and then save values to it during training.
!pip install tensorboardcolab

from tensorboardcolab import TensorBoardColab
tb = TensorBoardColab()
During training, you save values to TensorBoardColab by calling its save_value method inside your training loop.
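A minimal sketch of such a loop, assuming a `train` function of your own design (the model, data, and loss here are stand-ins; `save_value(graph_name, value_name, step, value)` is the tensorboardcolab method that plots a named curve):

```python
import torch
import torch.nn as nn

def train(model, batches, tb, epochs=3):
    """Train `model` on `batches`, pushing each epoch's loss to `tb`
    (a TensorBoardColab instance)."""
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    for epoch in range(epochs):
        for x, y in batches:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
        # Draw a curve named 'train_loss' on the 'Loss' graph
        tb.save_value('Loss', 'train_loss', epoch, loss.item())
```

In Colab you'd then call it with the TensorBoardColab instance created above, e.g. `train(model, train_loader, tb)`.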
Then when you run
train(model, train_loader, ...)
you’ll see something like this:
When you go to the link, you’ll see the TensorBoard dashboard you know and love.
To see how everything works together, check out this example Colab notebook.
Conclusion
There you have it, two ways to visualize your PyTorch training using TensorBoard and Google Colab. I’ve made example Colab notebooks for each of the approaches.
Magic Tensorboard: https://colab.research.google.com/drive/1NbEqqB42VSzYt-mmb4ESc8yxL05U2TIV
tensorboardcolab: https://colab.research.google.com/drive/1hR-DQvve8uEX2zH8h4y1XgP1atKRUl0g
Good luck 😄
You can follow me on Twitter for more content @andrewbrownmart