Setting Up a Data Lab Environment — Part 3 — Bashing and composing

In part 2 of this series, we used two scripts and a docker-compose.yml file to serve Jupyter notebooks from AWS.

Now, let’s walk through how each of these files works.


First, the bash script —

A bash script is a sequence of commands run by the shell in a Unix/Linux environment. Ours helps us install software, set the necessary permissions and create some folders.

We first update the package index. sudo lets us run updates and installations with administrator privileges. We also install tree, which visualises a folder hierarchy as a tree.

# Update the package index
sudo apt-get update

# Install tree
sudo apt-get install -y tree

Next, we download and install Docker Compose.

# Download the docker-compose binary for this OS and architecture
# (pin whichever release you want; 1.29.2 shown here)
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o docker-compose
sudo mv docker-compose /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
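The two uname calls in the curl line pick the right binary for your machine; you can see what they expand to yourself (on a typical EC2 Ubuntu instance the result is docker-compose-Linux-x86_64):

```shell
# uname -s prints the kernel name, uname -m the machine architecture;
# together they form the suffix of the release filename to download
echo "docker-compose-$(uname -s)-$(uname -m)"
```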

And create some directories.

mkdir docker
mkdir docker/jupyter
mkdir notebook
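The same layout can be created in a single idempotent command with -p, which makes parent folders as needed and doesn’t complain if they already exist; the tree we installed earlier then shows the result:

```shell
# Create docker/, docker/jupyter/ and notebook/ in one go
mkdir -p docker/jupyter notebook

# Running 'tree' here prints a hierarchy along these lines:
# .
# ├── docker
# │   └── jupyter
# └── notebook
```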

And restart.

sudo reboot


Next, docker-compose.yml.

This barely scratches the surface of what Docker can do. But it’s good to start simple.

version: '3'
services:
  jupyterone:
    image: jupyter/tensorflow-notebook
    ports:
      - "8888:8888"
    volumes:
      - .:/home/jovyan/work

Here we ask Docker to start a container named ‘jupyterone’, using the image ‘jupyter/tensorflow-notebook’ pulled from Docker Hub. Port 8888 in the container is mapped to port 8888 on the AWS EC2 instance, and the current folder is mounted at ‘/home/jovyan/work’ inside the container.
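Putting the pieces together, the compose file can be written with a heredoc (which keeps the YAML indentation intact) and the stack launched from the shell. The docker-compose commands are shown as comments because they need the Docker daemon running; this is a sketch, assuming the file sits next to your notebooks:

```shell
# Write the compose file described above
cat > docker-compose.yml <<'EOF'
version: '3'
services:
  jupyterone:
    image: jupyter/tensorflow-notebook
    ports:
      - "8888:8888"
    volumes:
      - .:/home/jovyan/work
EOF

# Start the container in the background (requires Docker to be running):
#   docker-compose up -d
# Find the notebook login token in the container logs:
#   docker-compose logs jupyterone
# Stop and remove the container when done:
#   docker-compose down
```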

And that’s it. Pretty simple, isn’t it?