Using TPU in Google Colab for FREE

Tommy Tao
1 min read · Dec 10, 2018


TPU, a chip designed for AI

TPU is now free in Google Colab

Although its speed is currently similar to the Colab GPU (throttled by Google, perhaps?), it is still worth trying, because Google may increase TPU speed in the future.

To use the TPU, unfortunately, some code changes are required.

Code change

Firstly, get the device name and build the TPU address:

import os

try:
    device_name = os.environ['COLAB_TPU_ADDR']
    TPU_ADDRESS = 'grpc://' + device_name
    print('Found TPU at: {}'.format(TPU_ADDRESS))
except KeyError:
    print('TPU not found')
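The detection step can be tried outside Colab by simulating the environment variable; `COLAB_TPU_ADDR` holds the TPU worker's host:port, and the address below is made up purely for illustration.

```python
import os

# Simulate Colab's TPU runtime for illustration; this address is made up.
os.environ['COLAB_TPU_ADDR'] = '10.0.0.2:8470'

try:
    device_name = os.environ['COLAB_TPU_ADDR']
    TPU_ADDRESS = 'grpc://' + device_name  # gRPC endpoint of the TPU worker
    print('Found TPU at: {}'.format(TPU_ADDRESS))
except KeyError:
    TPU_ADDRESS = None
    print('TPU not found')
```

In a real Colab session the variable is set by the runtime, so the `except` branch simply tells you the TPU runtime is not selected (Runtime → Change runtime type → TPU).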

Secondly, convert the Keras model into a TPU model:

tpu_model = tf.contrib.tpu.keras_to_tpu_model(
    model,
    strategy=tf.contrib.tpu.TPUDistributionStrategy(
        tf.contrib.cluster_resolver.TPUClusterResolver(TPU_ADDRESS)))
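Putting the two steps together, a minimal end-to-end sketch might look like the following. This assumes a TensorFlow 1.x runtime (where `tf.contrib` still exists); the model architecture, batch size, and the `x_train`/`y_train` data are arbitrary placeholders, not part of the original post.

```python
# Sketch only: assumes TensorFlow 1.x, where tf.contrib is available.
import os
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

TPU_ADDRESS = 'grpc://' + os.environ['COLAB_TPU_ADDR']

# An arbitrary toy model; note the tensorflow.keras imports above.
model = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),
    Dense(10, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

# Convert to a TPU model, then train as usual. A batch size divisible
# by 8 lets each of the 8 TPU cores take an equal shard.
tpu_model = tf.contrib.tpu.keras_to_tpu_model(
    model,
    strategy=tf.contrib.tpu.TPUDistributionStrategy(
        tf.contrib.cluster_resolver.TPUClusterResolver(TPU_ADDRESS)))
tpu_model.fit(x_train, y_train, batch_size=128, epochs=1)  # placeholder data
```

The returned `tpu_model` exposes the familiar Keras `fit`/`evaluate`/`predict` interface, so the rest of a training script needs little change beyond the conversion call.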

Note

To build the Keras model, the following Keras objects (among others) should not be imported from keras.*

They should be imported from tensorflow.keras.* instead.

e.g. (wrong)

from keras.models import Sequential, Model
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D

e.g. (correct)

from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import Dense, Dropout, Flatten
from tensorflow.keras.layers import Conv2D, MaxPooling2D

