Useful snippets for Google Colaboratory. Free GPU included.

Andrey Nikishaev
Machine Learning World
3 min read · Apr 23, 2018

For people who are new or haven't used it yet: Google Colab is a Jupyter notebook service with a dedicated server (with a GPU) that you get for free for about 12 hours (after that you need to reinitialize it). So it's a good place to test your Python scripts, or to use while you are learning Machine Learning & Data Science.

In my work I often use Colab, for two reasons. First, it gives free computation power and even a free Nvidia Tesla K80. Second, it's a very convenient way of sharing ML & DS examples with people (which I do in every article on Medium).

Here I have gathered the useful snippets that I use every day and that make my life easier :)

Connecting Google Drive to your instance

# Create drive folder
!mkdir -p drive
!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse
# Authorize instance to use Google Drive
from google.colab import auth
auth.authenticate_user()
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}
# Connect drive to folder
!google-drive-ocamlfuse drive

In newer releases of Colab, all of the above can be replaced with:

from google.colab import drive
drive.mount('./drive')

Don't forget to copy any data you read intensively from Drive to the instance's local disk to make access faster.
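
For example, copying a folder to local storage (the dataset path is hypothetical; with drive.mount('./drive') your files live under "./drive/My Drive/", with the ocamlfuse mount directly under "./drive/"):

# Copy a dataset folder from the mounted Drive to fast local storage
!cp -r "./drive/My Drive/datasets/my_dataset" /content/my_dataset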

Upload/download files to/from the instance

from google.colab import files

def upload(path):
    # Opens a browser file picker and writes the uploaded file to `path`
    uploaded = files.upload()
    with open(path, 'wb') as fp:
        fp.write(uploaded[next(iter(uploaded))])

def download(path):
    # Triggers a browser download of the file at `path`
    files.download(path)
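
Usage is then just (the file names are only examples):

upload('data.csv')    # pick a file in the browser, it is saved as data.csv on the instance
download('model.h5')  # send model.h5 from the instance to your machine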

Use Facets

Source: https://github.com/PAIR-code/facets
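
The setup snippet from the original article is missing in this copy. FacetsOverview and FacetsDive are small helpers rather than library functions; a minimal sketch of how they can be defined, following the Facets README (the facets-overview pip package and the HTML template below are assumptions, not necessarily what the article used):

!pip install -q facets-overview

import base64
from IPython.display import display, HTML
from facets_overview.generic_feature_statistics_generator import GenericFeatureStatisticsGenerator

# HTML template from the Facets README, rendered inline in the notebook output
_FACETS_TEMPLATE = """
<link rel="import" href="https://raw.githubusercontent.com/PAIR-code/facets/master/facets-dist/facets-jupyter.html">
{body}
"""

def FacetsOverview(train_data, test_data):
    # Build feature statistics for both dataframes and feed them to facets-overview
    gfsg = GenericFeatureStatisticsGenerator()
    proto = gfsg.ProtoFromDataFrames([{'name': 'train', 'table': train_data},
                                      {'name': 'test', 'table': test_data}])
    protostr = base64.b64encode(proto.SerializeToString()).decode('utf-8')
    body = ('<facets-overview id="fo"></facets-overview>'
            '<script>document.querySelector("#fo").protoInput = "%s";</script>' % protostr)
    display(HTML(_FACETS_TEMPLATE.format(body=body)))

def FacetsDive(data):
    # Embed the dataframe as JSON records into a facets-dive element
    jsonstr = data.to_json(orient='records')
    body = ('<facets-dive id="fd" height="600"></facets-dive>'
            '<script>document.querySelector("#fd").data = %s;</script>' % jsonstr)
    display(HTML(_FACETS_TEMPLATE.format(body=body)))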

Then just write (example):

FacetsOverview(df_train, df_test)
FacetsDive(df_train.head(500))

Running TensorBoard on the instance
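
The article's original snippet for this step is missing here; a minimal sketch of the usual approach, assuming ngrok for the tunnel and a ./log directory for the TensorBoard logs:

# Download ngrok (this download URL may be outdated, see https://ngrok.com/download)
!wget -q https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip
!unzip -o -q ngrok-stable-linux-amd64.zip

LOG_DIR = './log'  # point this at your own TensorBoard log directory

# Start TensorBoard and the ngrok tunnel in the background
get_ipython().system_raw(
    'tensorboard --logdir {} --host 0.0.0.0 --port 6006 &'.format(LOG_DIR))
get_ipython().system_raw('./ngrok http 6006 &')

# Ask ngrok's local API for the public URL of the tunnel
!sleep 2 && curl -s http://localhost:4040/api/tunnels | python3 -c "import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])"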

Then just open the URL that gets printed, and that's all.

Connecting to the instance over SSH

All your data is in the /content/ directory.
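
The article's original gist for this step is not included in this copy; a sketch of one common way to do it (the steps below are assumptions: install an SSH server, set a root password, and tunnel port 22 with ngrok, which must already be downloaded as in the TensorBoard step):

import random, string

# Generate a throwaway root password and print it
password = ''.join(random.choice(string.ascii_letters + string.digits) for _ in range(16))
print('root password:', password)

# Install and configure the SSH server (Colab runs as root, so this just works)
!apt-get -qq install -y openssh-server > /dev/null
!mkdir -p /var/run/sshd
!echo root:{password} | chpasswd
!echo "PermitRootLogin yes" >> /etc/ssh/sshd_config
!service ssh restart > /dev/null

# ngrok TCP tunnels need an account token; replace the placeholder with your own
!./ngrok authtoken YOUR_NGROK_AUTHTOKEN
get_ipython().system_raw('./ngrok tcp 22 &')
!sleep 2 && curl -s http://localhost:4040/api/tunnels | python3 -c "import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])"

Then ssh to the printed host and port as root, using the generated password.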

Right now a free ngrok account doesn't support two simultaneous tunnels, so if you are already using one for TensorBoard, you will have to kill it.
You can do that with this command:

!kill $(ps aux | grep './ngrok' | awk '{print $2}')

Accessing the webcam
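
The article's gist for this is not included in this copy. Colab's own camera-capture pattern, which runs a bit of JavaScript through google.colab.output.eval_js, looks roughly like this (a sketch, not the article's exact code):

from base64 import b64decode
from IPython.display import display, Javascript
from google.colab.output import eval_js

def take_photo(filename='photo.jpg', quality=0.8):
    # Show a video element with a Capture button, return the frame as a JPEG data URL
    js = Javascript('''
      async function takePhoto(quality) {
        const div = document.createElement('div');
        const capture = document.createElement('button');
        capture.textContent = 'Capture';
        div.appendChild(capture);

        const video = document.createElement('video');
        video.style.display = 'block';
        const stream = await navigator.mediaDevices.getUserMedia({video: true});

        document.body.appendChild(div);
        div.appendChild(video);
        video.srcObject = stream;
        await video.play();

        // Wait for the Capture button to be clicked
        await new Promise((resolve) => capture.onclick = resolve);

        const canvas = document.createElement('canvas');
        canvas.width = video.videoWidth;
        canvas.height = video.videoHeight;
        canvas.getContext('2d').drawImage(video, 0, 0);
        stream.getVideoTracks()[0].stop();
        div.remove();
        return canvas.toDataURL('image/jpeg', quality);
      }
    ''')
    display(js)
    data = eval_js('takePhoto({})'.format(quality))
    with open(filename, 'wb') as f:
        f.write(b64decode(data.split(',')[1]))
    return filename

Call take_photo() in a cell, click Capture, and then read the saved file with OpenCV or PIL.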

Downloading/Uploading data to Kaggle

For this purpose you just need the Kaggle API client: https://github.com/Kaggle/kaggle-api
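
A sketch of a typical session (the competition name and file names below are just examples; the kaggle.json token is created at kaggle.com under Account -> Create API Token):

!pip install -q kaggle

# Upload your kaggle.json API token and put it where the client expects it
from google.colab import files
files.upload()  # pick kaggle.json in the browser; it is saved to the current directory
!mkdir -p ~/.kaggle && cp kaggle.json ~/.kaggle/ && chmod 600 ~/.kaggle/kaggle.json

# Download competition data and push a submission
!kaggle competitions download -c titanic -p ./data
!kaggle competitions submit -c titanic -f submission.csv -m "first try"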

Support

Become a Patron and support our community to make more interesting articles & tutorials

Get interesting articles every day — Subscribe on Telegram Channel

Read my other fresh articles
