Using Tensorboard in Google Colab with PyTorch
TensorBoard is a visualization toolkit for machine learning.
By using TensorBoard you can:
- Track and visualize loss and accuracy with graphs
- Visualize the model graph
- View histograms of tensors and track how they change
And so on.
This article is not a TensorBoard tutorial. It is about how to use TensorBoard in Google Colab, including the available methods.
If you want to learn more about TensorBoard, see this video: Hands-on TensorBoard (TensorFlow Dev Summit 2017).
Now PyTorch
Though TensorBoard is a visualization library for TensorFlow, useful for understanding training runs, tensors, and graphs, we can still use it with PyTorch. Before official support, some third-party libraries were available, such as tensorboardX.
But now TensorBoard is supported officially…
It is available under the torch.utils.tensorboard package.
If you want to know more, check the TensorBoard documentation for PyTorch.
Use in Google Colab
If you want to use TensorBoard in Google Colab, there are many available methods, but here are just the top 2:
- TensorBoard notebook extension
- ngrok
You can also use the third-party library TensorboardColab.
Let's dig into those methods.
TensorBoard notebook extension
Install TensorBoard notebook extension
!pip install -q tf-nightly-2.0-preview
%load_ext tensorboard  # Load the extension
Now create a directory for logs
import os
logs_base_dir = "runs"
os.makedirs(logs_base_dir, exist_ok=True)
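If you plan to launch several training runs, it can help to give each run its own timestamped subdirectory under the log root, so TensorBoard shows them side by side. Here is a minimal sketch using only the standard library; the directory naming scheme is my own assumption, not something the setup above requires:

```python
import os
from datetime import datetime

logs_base_dir = "runs"

def make_run_dir(base=logs_base_dir):
    """Create a timestamped subdirectory so each run logs separately."""
    run_dir = os.path.join(base, datetime.now().strftime("%Y%m%d-%H%M%S"))
    os.makedirs(run_dir, exist_ok=True)
    return run_dir

run_dir = make_run_dir()
print(run_dir)  # e.g. runs/20240101-120000
```

You could then pass this path to SummaryWriter(log_dir=run_dir) while still pointing --logdir at the base directory.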
We are ready to go. When you want to launch TensorBoard, use this command:
%tensorboard --logdir {logs_base_dir}
Before launching, write something to TensorBoard using SummaryWriter — for instance the model graph, some images, and a dummy scalar value:
from torch.utils.tensorboard import SummaryWriter

# Load a batch of data from the loader
images, labels = next(iter(train_loader))

tb = SummaryWriter()
tb.add_images("Image", images)
tb.add_graph(model, images)
for i in range(1, 20):
    tb.add_scalar("Tag", 10 / i, i)
tb.close()
Now use the magic command to launch TensorBoard:
%tensorboard --logdir {logs_base_dir}
And the output looks like this:
Now switch to the tab and play with TensorBoard.
This is a pretty easy method, but the problem is that TensorBoard opens inside the cell output. Personally, I don't like that kind of output.
If you feel the same, jump to the second method. This time it generates a link, and by using that link you can browse TensorBoard in its own tab.
Ngrok
Start by downloading ngrok and unzipping it:
!wget https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip
!unzip ngrok-stable-linux-amd64.zip
Let’s assume the TensorBoard log path is “runs”, and fire up TensorBoard in the background:
import os
LOG_DIR = 'runs'
os.makedirs(LOG_DIR, exist_ok=True)
get_ipython().system_raw(
    'tensorboard --logdir {} --host 0.0.0.0 --port 6006 &'.format(LOG_DIR)
)
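get_ipython().system_raw is Colab/IPython-specific. Outside a notebook, the usual way to get the same "launch and keep going" behavior is subprocess.Popen, which returns immediately instead of blocking. A sketch — the TensorBoard command line is shown in a comment (it assumes tensorboard is installed and on PATH), and the demo spawns a harmless Python child process instead so it runs anywhere:

```python
import subprocess
import sys

def launch_background(cmd):
    """Start a process without waiting for it; its output is discarded."""
    return subprocess.Popen(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

# For TensorBoard this would be:
# launch_background(["tensorboard", "--logdir", "runs",
#                    "--host", "0.0.0.0", "--port", "6006"])

# Demonstrated here with a short-lived Python process:
proc = launch_background([sys.executable, "-c", "import time; time.sleep(1)"])
print(proc.poll())  # None while the child is still running
```

The design point is the same as the trailing & in the shell command: the cell finishes immediately while the server keeps serving.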
Run ngrok to tunnel TensorBoard's port 6006:
get_ipython().system_raw('./ngrok http 6006 &')
As the last step, generate a public URL; by using that URL we can access the Colab TensorBoard web page:
! curl -s http://localhost:4040/api/tunnels | python3 -c \
"import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])"
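For reference, here is what the Python part of that one-liner does: it parses the JSON the ngrok agent serves at http://localhost:4040/api/tunnels and pulls out the first tunnel's public URL. A trimmed sample payload is used below so the sketch runs without ngrok (the real URL will of course differ):

```python
import json

# A trimmed sample of the JSON the ngrok agent returns
# (the real response carries more fields per tunnel).
sample = '{"tunnels": [{"public_url": "https://abc123.ngrok.io"}]}'

def public_url(api_json):
    """Extract the first tunnel's public URL, as the curl one-liner does."""
    return json.loads(api_json)["tunnels"][0]["public_url"]

print(public_url(sample))  # https://abc123.ngrok.io
```

In Colab you would feed it the real payload, e.g. urllib.request.urlopen("http://localhost:4040/api/tunnels").read(), instead of the sample string.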
We will get output like this:
Use this link to launch TensorBoard, and the webpage looks like this:
We didn't write any data yet, so write something to TensorBoard using SummaryWriter — for instance the model graph, some images, and a dummy scalar value (like the previous one):
from torch.utils.tensorboard import SummaryWriter

# Load a batch of data from the loader
images, labels = next(iter(train_loader))

tb = SummaryWriter()
tb.add_images("Image", images)
tb.add_graph(model, images)
for i in range(1, 20):
    tb.add_scalar("Tag", 10 / i, i)
tb.close()
Refresh the TensorBoard web page (the default refresh rate is 30 seconds), and the page is ready with data. See the screenshots.
Finally, we achieved our goal.
Now play with TensorBoard.
Here is the notebook with all the code.
Thanks for reading…
Happy coding 😍