How to deploy from Gitlab CI

Web hints
5 min read · Nov 4, 2018


Gitlab CI is Gitlab’s built-in continuous integration tool. In this tutorial, we will explain how to set up auto-deployment for any kind of system.

Still using FTP, SFTP or SSH to deploy your websites? Forgotten which files you updated since your last deployment? Sound familiar? Look no further and start using Gitlab CI for your deployments.

Why you should be using auto-deployment

Auto-deployment has been around for years, but it has become much simpler with Gitlab CI and its Docker integration.

If you’ve ever deployed a website to both staging and live, your mileage may vary, but there’s usually a problem: you forgot to upload something, or overwrote something that shouldn’t have been uploaded yet.

This is where auto-deployment comes in, and since you already know Git it’s really easy to get started.

Is Gitlab the only option?

It surely isn’t; as with every other software solution, there are plenty of alternatives.

To give some alternatives:

* Jenkins CI
* Circle CI
* Bitbucket Pipelines
* …

These all work quite well. Jenkins is also open source but requires your own server to install it on, and Circle CI is only free for open source projects.

But since Gitlab is the online repo platform I’m using, and it has a built-in CI integration which is completely free, it makes sense for me to go this route.

You should be able to switch easily since Gitlab CI and many others use the Docker platform behind the scenes.

How does Gitlab CI work?

When you push any code to your repository, Gitlab will check if you have a .gitlab-ci.yml file in your root directory.

This file will define how Gitlab CI should interact with your project.

Gitlab CI is based on the Docker environment, which means you can make use of the thousands of Docker images hosted on Docker Hub.

In the file you define which Docker image should be used, and build your commands from there.

Demo time!

First of all, here’s an example of my very basic Gitlab CI file.

image: php:7.2-apache

stages:
  - tests
  - staging
  - production

before_script:
  - ...

stylecheck:
  stage: tests
  script:
    - ...

to_staging:
  stage: staging
  script:
    - git push ssh://<username>@<server-ip>:<port>/~/<path-to-online-git-repo> HEAD:refs/heads/master

to_production:
  stage: production
  ...

The Docker image chosen here is php:7.2-apache, which is set with the image key.

You can see the stages as steps: if the first step fails, it won’t go on to the next one. That’s good, because if a test fails you don’t want to actually deploy anything.

The before_script in the root will be executed before every job, in case you need to run composer install or npm install every time; see the sketch below.
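
A rough sketch of such a root-level before_script (the commands are just placeholders; adjust them to your project):

before_script:
  - composer install # hypothetical: only if your project uses Composer
  - npm install # hypothetical: only if your project uses npm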

The jobs here are stylecheck, to_staging and to_production, each with their own stage defined. Of course, you can also have multiple jobs per stage, as sketched below.
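
For example, two checks could share the tests stage; a hedged sketch with made-up job names and commands:

stylecheck:
  stage: tests
  script:
    - vendor/bin/phpcs src/ # hypothetical style check

unittests:
  stage: tests
  script:
    - vendor/bin/phpunit # hypothetical unit tests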

The auto-deployment feature

The deployment of different apps can vary quite a bit, but one thing is always certain: you’ll need to get your code onto your server somehow. This can be done using FTP, SFTP or SSH.

Prepare your server

Preparing your server is half the work. Most servers and even shared hosting services have git pre-installed (if not, contact your hosting provider).

To transfer the files easily from Gitlab to the server, I use a bare repository which waits for a push and then transfers the new code to the right location.

cd ~
mkdir -p repos/<my-site>.git && cd repos/<my-site>.git
git init --bare
nano hooks/post-receive

You have now created a bare repository in the ~/repos/<my-site>.git directory. Now, in nano, paste in the following code:

#!/bin/bash
echo "Pushing code to working directory.."
GIT_WORK_TREE=<absolute-path-to-directory> git checkout -f
cd <absolute-path-to-directory>
composer install # only needed when using Symfony
php bin/console doctrine:schema:update --force # only needed when using Symfony

This file acts as a hook that runs when you push to your server. It takes the code sent from Gitlab and copies it to the defined directory.

If you need commands like composer install to run on your server, you can simply add them as shown above.

To finish, give the hook execute rights: chmod +x hooks/post-receive

Setup Gitlab

At this point you could already use the post-receive hook by pushing directly from your local repository (see the sketch below); this defeats the purpose of using Gitlab CI though, since you’re not running any tests before deploying.
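
A quick sketch of such a direct push, using the same placeholders as above:

# Add the bare repository on your server as a remote called "live" (the name is arbitrary)
git remote add live <username>@<server-address>:~/repos/<my-site>.git
# Push the current branch; the post-receive hook checks the code out into your web directory
git push live HEAD:refs/heads/master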

Depending on whether you’re using FTP or SSH, you’ll need either an FTP password or an SSH private key to continue.

Since we don’t want to put this valuable information in cleartext in our .gitlab-ci.yml file, I add it to the Gitlab CI secret variables.

The secret variables can be found under your project: Settings -> CI/CD -> Secret variables

Add the variable SSH_PRIVATE_KEY and paste in your private ssh key (cat ~/.ssh/id_rsa). You first need to generate an ssh key with ssh-keygen; see the sketch below.
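
Keep in mind that the matching public key also has to be accepted by your server, otherwise the push from Gitlab CI will be rejected. A minimal sketch, assuming a standard OpenSSH setup:

# Generate a key pair without a passphrase (Gitlab CI cannot type one in)
ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa -N ""
# Show the private key so you can paste it into the SSH_PRIVATE_KEY secret variable
cat ~/.ssh/id_rsa
# Authorise the public key on the server
ssh-copy-id -i ~/.ssh/id_rsa.pub <username>@<server-address>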

Setup the .gitlab-ci.yml file

Now that your server and Gitlab are ready, it’s time to set up the .gitlab-ci.yml for deployment!

Remember, we just need to push our code to the hook made on our server, that’s it!

to_staging:
  before_script:
    # php:7.2-apache is Debian-based, so install the ssh client with apt-get if it is missing
    - "which ssh-agent || ( apt-get update -y && apt-get install -y openssh-client )"
    - eval $(ssh-agent -s)
    - mkdir -p ~/.ssh && chmod 700 ~/.ssh && rm -f ~/.ssh/id_rsa && touch ~/.ssh/id_rsa
    - ssh-keyscan -t rsa <server-address> >> ~/.ssh/known_hosts
    - echo "$SSH_PRIVATE_KEY" >> ~/.ssh/id_rsa && chmod 400 ~/.ssh/id_rsa
  script:
    - git push <username>@<server-address>:~/repos/<my-site>.git HEAD:refs/heads/master

That’s it! Let’s explain a little.

  1. You push your code from your local repository to Gitlab
  2. Gitlab sees your .gitlab-ci.yml file and will check for syntax errors.
  3. It will loop through the stages and their jobs, so in this example the test jobs run first.
  4. Once in the staging stage, it validates the ssh connection between Gitlab CI and your server using the $SSH_PRIVATE_KEY defined in the secret variables.
  5. Now that it can connect to the server, it executes a push command to it using the current branch.
  6. The server will receive an incoming push command on its hook and will copy all of the code to the correct directory.
  7. It may or may not execute composer install depending on what you've set up.
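
In day-to-day use, the whole chain above is kicked off by a normal push to Gitlab, for example:

git add .
git commit -m "Some change" # any commit works
git push origin master # Gitlab CI picks this up and starts the pipeline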

Pro tip: I suggest building your own Docker image with composer, npm and all your other dependencies pre-installed. This decreases the time needed to execute the commands; see the sketch below.
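
A hedged sketch of such a custom image (the package list is an assumption; pick whatever your jobs actually need, node/npm for instance if you build front-end assets):

# Hypothetical Dockerfile for a custom CI image based on the image used above
FROM php:7.2-apache

# Tools the pipeline needs: git for pushing, ssh client for deployment, unzip for composer packages
RUN apt-get update -y && apt-get install -y git openssh-client unzip && rm -rf /var/lib/apt/lists/*

# Composer copied in from the official Composer image
COPY --from=composer:latest /usr/bin/composer /usr/bin/composer

Push it to a registry (for example the Gitlab Container Registry) and point the image key of your .gitlab-ci.yml at it.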

If you have any questions, like always feel free to comment below!

Get involved on Slack!

Have any more questions? Come and join us on Slack and get involved in the web community! I will help with any questions I can regarding web development, SEO and hosting.

Join us on Slack

Originally published at www.web-hints.com.
