Deep learning without expensive hardware using Google Colab and connecting it with GitHub

Tushar Dhyani
Oct 1 · 5 min read
[Figure: Assembling dependencies for deep learning is like arranging a jigsaw puzzle]

When it comes to deep learning, the first thing that comes to mind is an expensive GPU. But buying an expensive machine while we are still learning is not feasible for most of us. Deep learning does require a fair amount of computational power, even for basic classification models. We could take the path of getting a decent gaming laptop with a reasonable GPU, but even after purchasing one, setting up the environment for a deep learning task is a hassle. It is not rocket science, but it is time-consuming and involves a lot of trial and error when the dependencies are poorly documented. That time is precious, and we should not waste it wiring up individual dependencies just to build one simple solution. So what can be done to get access to decent machines and train all our models without the hassle of arranging these bits and pieces of dependencies?

Kaggle and Colab provide above-average hardware that we can use to train our models for a limited time. The only downside is that access is restricted to Jupyter notebooks or, at best, a single script, which becomes a problem when we are building a complete solution or an API. Recently I came across a GitHub project that solves this by giving us an entire IDE built on top of these platforms. The project is called ColabCode, and it can help us solve a number of these issues. So let's have a look at how we can set it up:

Installing and Running ColabCode

Once you have installed ColabCode, you can import it and start the server with a single call, as shown in figure 1.0.

[Figure 1.0: Installing and starting the VS Code server]

In the ColabCode class, the port argument specifies the port on which the backend server is served. Please note that certain ports are reserved or commonly claimed by other services (for example 80 or 8080), so make sure you choose a free, non-conflicting port number; otherwise the VS Code server may behave unpredictably.
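As a minimal sketch of the install-and-run step (assuming the ColabCode package's documented interface, where the constructor takes a port and starts the server), a Colab cell would look like this:

```python
# Install ColabCode in the Colab runtime; the leading "!" runs a
# shell command from inside a notebook cell.
!pip install colabcode

from colabcode import ColabCode

# Start the code server on a free, non-reserved port.
# 10000 is an arbitrary choice; avoid conflicting ports like 80 or 8080.
ColabCode(port=10000)
```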

Once you run the cell, it returns a server URL of the form https://&lt;random-token&gt;.ngrok.io, which opens a full-fledged VS Code server that you can use as an IDE, as shown in figure 2.3.

[Figure 1.1: The server starts and returns the server link]

Attaching Google Drive

[Figure 2.0: Mounting Drive with the VS Code server]
[Figure 2.1: Authorizing Google Drive]
[Figure 2.2: Authorized VS Code server up and running]

Once the authorization is complete and the backend has connected, ColabCode generates the ngrok URL again, as shown above in figure 2.2. You can use the generated link to open the fully-fledged VS Code server, this time with your Drive files available.
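ColabCode can trigger the Drive mount as part of server start-up via a mount_drive flag. A minimal sketch, assuming your installed version supports this argument:

```python
from colabcode import ColabCode

# mount_drive=True runs the standard google.colab Drive authorization
# flow before the code server starts, so your Drive files appear in
# the IDE's file explorer.
ColabCode(port=10000, mount_drive=True)
```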

[Figure 2.3: VS Code running on the generated address]

Connecting VS Code to GitHub

[Figure 3.0: VS Code with GitHub for development]
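No extra ColabCode machinery is needed here: once the VS Code server is up, you can configure git from its integrated terminal, or from a notebook cell with a leading "!". A minimal sketch, where the username, email, token, and repository names are all placeholders you would substitute with your own:

```python
# One-time git identity setup for commits made from this runtime.
!git config --global user.name "your-username"
!git config --global user.email "you@example.com"

# Clone over HTTPS; embedding a personal-access token in the URL is one
# way to authenticate pushes from a headless Colab runtime.
# <token>, <user>, and <repo> are placeholders, not real values.
!git clone https://<token>@github.com/<user>/<repo>.git
```

From there, the Source Control panel in VS Code should pick up the cloned repository, and you can stage, commit, and push just as you would on a local machine.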

Start using VS Code today in place of your usual Jupyter notebooks, and thank me later for the faster workflow 😉. You can check this Colab notebook for the code.
