Good evening, everyone!
I recently faced a task that required me to configure CI/CD for a project’s test environment.
I started by looking for a vendor-hosted solution with a free plan.
I found two options:
Shippable seemed better at first sight, but after spending a lot of time just trying to run a script on my VM, I gave up.
Bitbucket Pipelines, by contrast, turned out to be very easy to set up: it took me less than 30 minutes to get it running my project’s restart scripts.
So I will describe how I did it, hoping it will help you in the future.
You will need:
- A configured server. I used a DigitalOcean Droplet with Ubuntu 16.04, but any VM provider will do.
- Git installed on the VM (it will be used to update the code).
- An SSH client on your local machine (if not available by default).
Configure SSH access to the remote machine
First of all, you have to configure SSH access to your remote machine so that the Bitbucket Pipelines agent can connect to it and run scripts.
You can do that with the ssh-keygen and ssh-copy-id tools.
Use the following command to generate a new key; it will ask you for a key name, which will be used later:
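For example, a sketch (the key name mykey is a placeholder; passing the file name and an empty passphrase via -f and -N makes the command non-interactive, while plain ssh-keygen would prompt for them):

```shell
# Create the ~/.ssh directory if it does not exist yet
mkdir -p "$HOME/.ssh"
# Generate a new 4096-bit RSA key pair; an empty passphrase lets the
# Pipelines agent use the key without prompting for input
ssh-keygen -t rsa -b 4096 -N "" -f "$HOME/.ssh/mykey"
```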
The key files are usually stored in the ~/.ssh directory.
To copy the key to the remote server and enable SSH access, use the following command:
ssh-copy-id -i ~/.ssh/<mykey> <user>@<host>
Replace <mykey>, <user> and <host> with the appropriate values.
If the command finishes successfully, you can test the connection to your server:
ssh -i ~/.ssh/<mykey> <user>@<host>
Once you can connect to your machine over SSH, you can use this key to set up Bitbucket Pipelines.
Go to your Bitbucket settings page and open the Pipelines / SSH keys tab there.
The “Use my own keys” button will lead you to the following screen.
Here you should paste the previously generated private and public keys, which you can find in the ~/.ssh directory: the <mykey> file holds the private part and <mykey>.pub the public part.
Bitbucket Pipelines uses a bitbucket-pipelines.yml file to store its configuration, so let’s take a look at how it works.
The first section is ‘image’, the base Docker image used to run your build job. In my case, it is a Python image.
The next section is ‘pipelines’, which in my case contains only the ‘default’ pipeline that runs for any branch, but you can easily configure separate pipelines for specific branches if you need to.
The default section consists of a set of steps, each of which includes a script section. These commands will be run on your build agent machine. You can see how it looks in my case:
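The layout is roughly the following (the exact image tag and the install/test commands here are illustrative placeholders, not the literal file):

```yaml
# bitbucket-pipelines.yml
image: python:3.6            # base Docker image for the build job

pipelines:
  default:                   # runs for every branch
    - step:
        script:              # commands run on the build agent
          - pip install -r requirements.txt
          - python -m unittest discover
```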
The next step in the default section is the most important one, because it is the deployment step.
In its script section I pass my shell script to be run on the remote machine. At this point the pipeline works; the only thing left is updating the code on the remote machine.
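Such a deployment step can be sketched as an additional step under default; the user, host and script name below are placeholders, and Pipelines makes the SSH key configured earlier available to the build agent:

```yaml
    - step:
        script:
          # run the restart script that lives on the remote machine;
          # the key from the Pipelines SSH keys settings is used for auth
          - ssh <user>@<host> 'bash ./restart.sh'
```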
Updating the code on the remote machine
As you may remember, you were required to have git on your remote machine.
To make deployment work, you first have to git clone your repository on the remote machine. After that, a simple git pull in the deployment step is enough, which is exactly what my shell script does. This is how the script looks:
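For reference, here is a hypothetical version of such a restart script; the project directory and the start/stop commands are assumptions and will differ for your project:

```shell
#!/bin/bash
# restart.sh -- update and restart the application on the remote machine.
# Assumes the repository was cloned to $APP_DIR beforehand.
set -e
APP_DIR="$HOME/myproject"   # placeholder: path of the cloned repository

if [ -d "$APP_DIR/.git" ]; then
  cd "$APP_DIR"
  git pull                                # fetch the code just pushed
  pkill -f 'python app.py' || true        # stop the old process, if any
  nohup python app.py >/dev/null 2>&1 &   # start the new version
  echo "restarted"
else
  echo "repository not found at $APP_DIR"
fi
```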
One thing worth mentioning, which took me some time to fix even though it is not directly related to this article, is environment variables. If you use any environment variables in your project, make sure they are actually exported, so that when you connect with a non-interactive shell you still see all the variables you need.
This can easily be checked with:
ssh -i ~/.ssh/<mykey> <user>@<host> env
- ssh-keygen / ssh-copy-id tutorial: https://www.ssh.com/ssh/copy-id
- Bitbucket Pipelines SSH keys: https://confluence.atlassian.com/bitbucket/use-ssh-keys-in-bitbucket-pipelines-847452940.html
- SSH environment variables issue: https://stackoverflow.com/questions/216202/why-does-an-ssh-remote-command-get-fewer-environment-variables-then-when-run-man