How to run Merlin on Google Colab

Radek Osmulski
Jan 24, 2023 · 4 min read


Colab provides new learners with free access to GPUs, and it gives experienced practitioners an easy way to experiment with the Merlin recommender system!

Follow along with this blog to learn how to install RAPIDS and the Merlin Framework on Colab and run the example notebooks.

So if you’d like to take the Merlin Framework for a spin

  • without altering your environment
  • using freely available resources and running on the GPU

this tutorial has got you covered!

Installing RAPIDS and Merlin on Google Colab

Navigate to Google Colab and pick a Merlin example notebook you would like to run. The Merlin Models introductory track is great to start with.

Let’s start with 01-Getting-started!

Navigate to File > Open notebook, switch to the GitHub tab at the top, and paste the path to the repository into the search field.

Next, click the magnifying glass on the right or press enter.
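As an aside, Colab can also open a GitHub-hosted notebook directly via a URL of the form `https://colab.research.google.com/github/<org>/<repo>/blob/<branch>/<path>`. A small sketch of building such a link (the example repository path is illustrative; check the Merlin repositories for the exact notebook location):

```python
def colab_url(org: str, repo: str, branch: str, path: str) -> str:
    """Build a Colab link for a notebook hosted on GitHub."""
    return (
        "https://colab.research.google.com/github/"
        f"{org}/{repo}/blob/{branch}/{path}"
    )

# Illustrative example -- the path below is an assumption, not a verified location:
url = colab_url("NVIDIA-Merlin", "models", "main",
                "examples/01-Getting-started.ipynb")
```

Pasting such a URL into the browser drops you straight into the notebook without going through the Open dialog.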

Before proceeding, make sure you are using an environment with a GPU. On the menu bar on top, go to Runtime > Change runtime type and select GPU for Hardware accelerator.
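Once the runtime is set, it is worth sanity-checking that you actually landed on a GPU instance; the usual way is running `!nvidia-smi` in a cell. A rough equivalent check from Python, under the assumption that the NVIDIA driver CLI is only present on GPU runtimes:

```python
import shutil

def gpu_runtime_available() -> bool:
    """Rough proxy: nvidia-smi is on PATH only when the runtime has a GPU."""
    return shutil.which("nvidia-smi") is not None

if not gpu_runtime_available():
    print("No GPU detected -- switch the runtime type before installing RAPIDS.")
```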

Then copy the code below into your notebook as the first cell, run it, and you should be all good to go!

# This gets the RAPIDS-Colab install files and checks your GPU.  Run this and the next cell only.
# Please read the output of this cell. If your Colab Instance is not RAPIDS compatible, it will warn you and give you remediation steps.
!git clone
%cd rapidsai-csp-utils
!git checkout patch-22.12
%cd ..
!python rapidsai-csp-utils/colab/
!python rapidsai-csp-utils/colab/

# Install the Merlin Framework
!pip install -U git+
!pip install -U git+
!pip install -U git+
!pip install -U git+
!pip install -U git+
!pip install -U git+
!pip install -U xgboost lightfm implicit

Please note: we have tested this with the 23.04 release, and it should generally continue to work going forward. We have automated tests in our repositories, but unfortunately we cannot test our code against Colab, so compatibility is not guaranteed. If you notice something not working, please let us know by opening an issue on GitHub.
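RAPIDS and Merlin both use calendar-style `YY.MM` release tags (22.12, 23.04, ...), so if you ever want to guard a notebook against an accidentally older pin, a tiny comparison helper suffices. This is a hypothetical convenience, not part of any Merlin API:

```python
def parse_release(tag: str) -> tuple:
    """Turn a 'YY.MM' release tag into a sortable (year, month) tuple."""
    year, month = tag.split(".")
    return (int(year), int(month))

# The release this guide was tested against (see the note above).
TESTED = parse_release("23.04")

def at_least_tested(tag: str) -> bool:
    """True if the given release is the tested one or newer."""
    return parse_release(tag) >= TESTED
```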

It might take a few moments for everything to install. You should, however, see logs streaming across your screen, so do not be alarmed.

And that’s really it!

You can now execute the notebook and step through each cell to inspect what each line of code is doing.

All our examples are self-contained, which means they will either generate or download the data they need to run (you might need to execute them in order: 01, 02, 03, and so on).
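The zero-padded numeric prefixes (01, 02, ...) encode the intended execution order, so if you ever script a run over the notebooks (for example with papermill), a plain lexicographic sort is enough. A sketch with illustrative filenames:

```python
# Illustrative filenames -- the actual set lives in the Merlin example folders.
notebooks = [
    "03-Exploring-different-models.ipynb",
    "01-Getting-started.ipynb",
    "02-Merlin-Models-and-NVTabular-integration.ipynb",
]

# Zero-padded prefixes make alphabetical order identical to execution order.
ordered = sorted(notebooks)
for nb in ordered:
    print(nb)
```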

Now, you won’t be able to run all the example notebooks. For instance, there is no way to run the Triton Inference Server on Colab, so serving your predictions will be a no-go!

But that is just a small part of the Merlin Framework. If you’d like to go on a tour of it, here are several tutorials that are great places to dive in:

  • meet the all-new and powerful Merlin Dataloader and run it with your choice of vanilla deep learning model (be that using PyTorch or TensorFlow)
  • experience the speed and convenience of preprocessing your data using the GPU when you dive into these NVTabular introductory notebooks
  • get your feet wet with Merlin Models, learn how to define your own architectures, or use one of the many predefined models, in this introductory track (we used the first notebook as the example for this blog post)
  • experiment with one of the hottest topics in recommender systems, that is session-based recommendations, in this introductory set of notebooks leveraging the Transformer architecture

We do our best to ensure our example notebooks execute flawlessly across all environments; however, we can’t guarantee that future releases of Colab won’t break some functionality.


In this blog post, we looked at how to install RAPIDS and the Merlin Framework on Google Colab. Using our example notebooks is just a starting point — the next step might be customizing them or running them on different datasets!

Thank you for reading! If you’d like to learn more about the cutting-edge Merlin Framework, you can do so here.



Radek Osmulski

I ❤️ ML / DL ideas — I tweet about them / write about them / implement them. Recommender Systems at NVIDIA