Tutorial 0: Setting Up Google Colab, TPU Runtime, and Cloud Storage

David Yang
Fenwicks
Apr 15, 2019 · 3 min read

Google provides a free cloud-based Python programming environment called Colab. Colab runs on a virtual machine in the cloud with a decent hardware configuration: a Xeon 2.3GHz CPU core, 12 GB of memory and 47 GB of disk space, as of April 2019. Anyone can use Colab to run arbitrary Python programs for up to 12 hours. After 12 hours, Google terminates all your programs, erases memory and deletes all files on disk. Then you get another 12 hours for free. Colab can access your Google Drive (15 GB of free space) and Google Cloud Storage (5 GB free). Save your files to either of these services, and you won't lose anything when the virtual machine gets reset every 12 hours.

In addition, as of April 2019, Colab offers a “TPU runtime”, which adds free access to a Cloud TPUv2. This is remarkable since, on Google Cloud Platform, a TPUv2 is currently priced at $4.50 per hour. In other words, every hour you use Colab's TPU runtime, you save $4.50. To enable it, open the “Runtime” menu, select “Change runtime type”, and set “Hardware accelerator” to “TPU”.
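After switching runtimes, you can check that a TPU is actually attached. On TPU runtimes, Colab exposes the TPU's network address through the `COLAB_TPU_ADDR` environment variable; here is a minimal sketch that relies only on that variable:

```python
import os

def colab_tpu_address():
    """Return the TPU's gRPC address if Colab attached one, else None.
    On a Colab TPU runtime, COLAB_TPU_ADDR holds the TPU's host:port."""
    addr = os.environ.get('COLAB_TPU_ADDR')
    return 'grpc://' + addr if addr else None

print(colab_tpu_address())  # e.g. 'grpc://10.101.0.2:8470' on a TPU runtime
```

If this prints `None`, the runtime type is still CPU or GPU and you need to switch it as described above.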

“TPU” stands for Tensor Processing Unit, a piece of hardware that Google designed specifically to run TensorFlow workloads. A TPU looks like this:

Tensor Processing Unit (TPU), available free on Colab. ©Google

A Cloud TPUv2 has a computing power of 180 teraflops. To put this into context, the Tesla V100, the state-of-the-art GPU as of April 2019, delivers around 125 teraflops. So, on paper, a TPU is roughly 1.4x faster than a V100. This should be more than enough for most deep learning tasks.

Unlike a GPU, a TPU is not a card plugged into your computer. Instead, TPUs are packed into special “pods”, similar to racks of cryptocurrency miners, like the following:

TPU Pods. © Google

Since the TPU isn't part of your computer, it can't read or write files on your disk, access your memory, or talk to your CPU directly. Instead, you communicate with it over the network. Moreover, in the TensorFlow library, TPUs read data and write model files on Google Cloud Storage (GCS). Bypassing GCS is possible, but tricky. To keep things simple, it is much better to register a Google Cloud Platform account, which gives you $300 in credits and 5 GB of storage on GCS for free, as of April 2019.
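Concretely, every input file and checkpoint path you hand to TensorFlow on a TPU must be a `gs://` URL pointing into your GCS bucket. A tiny illustrative helper (the bucket name `gs_colab` matches this tutorial series; the file names are just examples):

```python
# Minimal sketch: building gs:// paths for files in a GCS bucket.
BUCKET = 'gs_colab'  # the bucket used throughout this tutorial series

def gcs_path(*parts):
    """Join path components under the tutorial's GCS bucket."""
    return 'gs://' + '/'.join((BUCKET,) + parts)

print(gcs_path('data', 'train.tfrec'))  # gs://gs_colab/data/train.tfrec
print(gcs_path('model', 'model.ckpt'))  # gs://gs_colab/model/model.ckpt
```

TensorFlow's file APIs accept such `gs://` URLs directly, so data pipelines and checkpointing work on GCS the same way they do on a local disk.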

Registering a Google Cloud Platform account is straightforward: you fill out a form, give Google your credit card number, and collect your free credits. After that, log in to GCS and create a “bucket”, which is like a home folder. Throughout this tutorial series, our bucket is called “gs_colab”, though the specific name of the bucket does not matter.
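One caveat when picking a name: bucket names are global across all of GCS and follow strict rules, roughly 3 to 63 characters of lowercase letters, digits, dashes, underscores and dots, beginning and ending with a letter or digit. A rough sanity check (a sketch only; the real rules have a few more corner cases):

```python
import re

def plausible_bucket_name(name):
    """Rough check of GCS bucket naming rules: 3-63 characters of
    lowercase letters, digits, '-', '_' and '.', beginning and ending
    with a letter or digit. Not exhaustive."""
    return bool(re.fullmatch(r'[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]', name))

print(plausible_bucket_name('gs_colab'))  # True
print(plausible_bucket_name('GS-Colab'))  # False: uppercase not allowed
```

If GCS rejects your chosen name, it is usually because the name breaks one of these rules or is already taken by someone else.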

Although the TPU is free, the GCS bucket is not. Even if you never exceed the 5 GB storage limit, Google still charges you a cent or two once in a while. Fortunately, the fees are low, a few cents per week, so your $300 in free credits are more than enough to cover a whole year. In my own experience, I spent around $0.10 per month while making this tutorial series.

That’s it — you are all set. Time to start your first deep learning tutorial on TPU.

All tutorials:
