Getting Started With Laravel On Codeship

Laravel is one of the most popular PHP frameworks around today, if not the most popular. Using GitHub as my source of truth, Laravel has 24,543 stars, 690 of which it has received this month alone. This is admittedly a simplistic, dare I say even rudimentary, measure of success or quality. But it does show both momentum and uptake.

Given this momentum, and given that Laravel is a framework for the web’s most-used software development language, it’s important to consider how significant it may become over time, with an increasing number of applications based upon it or affected by it in some way. For that reason, and given that I’m so passionate about continuous integration and deployment, today I’m launching a series of articles demonstrating how to integrate Laravel with Codeship.

My intent isn’t to further popularize Laravel. Whether it’s Laracon US, Laracon EU, Laracasts, or one of the many dedicated podcasts or blogs, the community’s doing a fantastic job of promoting it without needing my help. My intent is to show you how to use Codeship as part of your CI workflow so that you can deploy your applications with a minimum of fuss and effort.

The Laravel Series Overview

If we lay a solid foundation for future success, we’ll be able to gradually build in complexity without getting overwhelmed in the process. Let’s start at the beginning, with a simple, uncomplicated application.

First, we’ll install the application from a project repository, which you can find on GitHub. Then we’re going to link the GitHub project to our Codeship account and create a project to manage it. We’ll then work through the required pipeline configuration and related settings so that our application can be tested and deployed.

The application we’re going to build is a basic URL shortener. By the time we’re done, the application will support the following functionality:

  • List all routes
  • Add new shortened routes
  • Update existing routes
  • Delete existing routes

As you can see, it’s essentially a simple CRUD application. I say simple because the core of the application, the shortening code, won’t be written by me. For that, I’ll be using an external library. Also, we’ll just be managing records in a database, and not a very complicated one at that. However, this will give us enough to work with to emulate a normal CI workflow, which will include:

  • Making a code change
  • Pushing changes to the version control system of choice
  • Triggering a build
  • Deploying if the build succeeds

We’re also going to need to take environment variables into consideration, as we would in a normal deployment workflow. One aspect, which is very close to my heart, is that the application is going to be test-driven.

Given these elements, I’m confident that this series won’t fall into the “Hello World” category. There’s more than enough meat on its bones to cover aspects that we’d encounter on a regular basis.

The Application Foundation

The application will be based on Laravel 5.2 and require only one external package, laravelcollective/html 5.2. This is so we can quickly build the forms for manipulating the route information. The rest will be stock Laravel. To keep things simple, the database will be SQLite 3.

Over the course of the series, I’ll also demonstrate how to integrate external vendors, such as ElephantSQL. This isn’t strictly necessary, but I feel it’s important to show how it’s done.

And finally, to follow along with this series and to deploy the application yourself, you’re going to need to have an API key for Google’s URL Shortener API. If you don’t already have one, then take the time now to set one up.

The routes

As you can see in app/Http/routes.php, the application has two routes:

  • /view-urls. Links to UrlController@viewUrls and renders all of the routes available in the database.
  • /manage-url/. Links to UrlController@manageUrl and will be responsible for adding new routes.
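
For reference, here’s a minimal sketch of what those definitions could look like in app/Http/routes.php. The HTTP verbs are assumptions on my part, since manageUrl handles both displaying the form and processing the submission:

// app/Http/routes.php (sketch; verbs are assumed, not taken from the repository)
Route::get('/view-urls', 'UrlController@viewUrls');
Route::match(['get', 'post'], '/manage-url/', 'UrlController@manageUrl');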

The controller

If we dig into app/Http/Controllers/UrlController.php, we see that the two methods are very simple. viewUrls makes a call to the Url model’s all() method, making a blanket request to retrieve all available routes. After retrieval, it sets them as a template variable called urls.

manageUrl is also a simple method. If the request is a POST request and data is present, it will retrieve the url provided by the user and create a shortened version of it by using the Laravel Shortener package. It then sets that information in a new Url entity and saves the record in the database by making a call to the Url model’s save() method. It then redirects to /view-urls, where the new URL and its shortened equivalent will be displayed in the URL list.

On purpose, it performs no validation or defensive coding at this stage. That may sound a bit insane, but we’re starting off light. Over the course of this series, proper validation and input sanitization will be added.
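
To make that description concrete, here’s a rough sketch of how the controller could be laid out. Treat it as illustrative only: the view names, the request field, and the shortenUrl() helper are stand-ins I’ve made up rather than code from the repository, and the real application delegates the shortening to the external package.

<?php

namespace App\Http\Controllers;

use App\Url;
use Illuminate\Http\Request;

class UrlController extends Controller
{
    // Fetch every stored URL and hand the collection to the template
    public function viewUrls()
    {
        return view('view-urls', ['urls' => Url::all()]);
    }

    // Display the form and, on POST, shorten and store the submitted URL
    public function manageUrl(Request $request)
    {
        if ($request->isMethod('post') && $request->has('url')) {
            $original = $request->get('url');

            // Persist the original URL alongside its shortened version
            $url = new Url();
            $url->original_url = $original;
            $url->shortened_url = $this->shortenUrl($original);
            $url->save();

            return redirect('/view-urls');
        }

        return view('manage-url');
    }

    // Stand-in for the external shortener package, so the sketch stays self-contained
    private function shortenUrl($url)
    {
        return 'http://short.example/' . substr(md5($url), 0, 7);
    }
}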

The database configuration

Now let’s look at the database configuration. Given that we’re using SQLite initially, it’s not going to be too involved. This extract, taken from config/database.php, shows that I’ve set the default database as sqlite and updated the configuration to point to the database file located under /database.

'default' => env('DB_CONNECTION', 'sqlite'),

'sqlite' => [
    'driver'   => 'sqlite',
    'database' => env('DB_DATABASE', database_path('database.sqlite')),
],

The database migrations

Database migrations are one of the key aspects that I’ve come to like about Laravel. It bundles in migrations as part of the default install.

Now applications are diverse in scope and nature, and as a package maintainer, it’s hard to know exactly what kind of needs your end users will have. But if you’re prototyping an application, then using a database as your initial data source makes a lot of sense. Given that, the choice of bundling a database migration component as part of the core install makes a lot of sense as well. Here’s my initial migration.

Schema::create('urls', function (Blueprint $table) {
    $table->increments('id');
    $table->string('shortened_url')->comment('Stores shortened url');
    $table->string('original_url')->comment('Stores original url');
    $table->timestamps();
});

You can see that it creates one table, called urls. It has an auto-incrementing id field, a field for the original URL, one for the shortened URL, and, using the timestamps method, fields recording when the record was created and last updated.

The tests

Now to the tests. Here’s a sample, available in tests/UrlControllerTests.php. The first test checks the view-urls route.

First off, we’ll load a record into the database, just one to keep things simple, then visit the page and run some checks on it. We go to the page, check the response code, the URL, and that the page contains the information that should have been rendered in the template, which was extracted from the database.

public function testViewUrlsPage()
{
    // Load some sample data
    $url = factory(App\Url::class)->create([
        'shortened_url' => 'http://tSQ1r84',
        'original_url' => '',
    ]);

    $this->visit('/view-urls')
        ->see('Manage URLs')
        ->see("This form let's you shorten a new url, or update an existing one.")
        ->click('View URLs');
}

Following that, we perform a minor check to ensure that we can follow a link on the page. These aren’t exhaustive tests, but then, the application’s quite young! Over time, they’ll become more full-featured. If you are keen to know more about the testing functionality, definitely check out the Laravel testing documentation.

Adding The Application to Codeship

Now that we’ve looked at the application, let’s create the initial integration with Codeship. For this series, I’ll assume that you already have an account at your disposal.

After logging in, create a new project from your dashboard and choose Connect with GitHub repository. Then enter the URL of the repository, which you can find in Clone or download, in your fork of the project. After that’s done, you are ready to configure the project.

Codeship’s pretty handy in that it offers a default set of options based on some broad project types, such as PHP, Ruby on Rails, Python, Go, Dart, and Java. But it doesn’t offer a framework-specific set of configuration options. So, we’re starting off with the PHP default, which you can see below.

It’s setting the PHP binary to be version 5.6 (arguably it should be 7.0) and to install the code from the repository. If we were to try this, as I originally did, it would result in a series of failures. To set up the project properly, we’re going to need the following configuration:

phpenv local 7.0
mkdir ./bootstrap/cache
composer install --prefer-source --no-interaction
cp -v .env.example .env
php artisan key:generate
touch database/database.sqlite
php artisan migrate

Let’s step through that. We’re setting our environment to use PHP 7.0. We’re then creating the directory bootstrap/cache, which is required by Laravel’s command-line tool, Artisan. We need to do this because the directory is normally created as part of the package creation process, which won’t happen when we install from source.

After that’s done, we can then install from source, using Composer. We then need to set the application key and create the SQLite database file, called database.sqlite and located in /database. With these steps complete, we can now run the database migration, again using Artisan.

Configuring the Environment

There’s one task left to do, which is to configure the project environment. We need to tell Laravel what the default database connection is; otherwise, it’s going to assume that we’re using MySQL, and the build will fail.

To do that, navigate to the environment settings by clicking Environment under Project Settings. From there, we provide a KEY of DB_CONNECTION and a value of sqlite and save the configuration. We then add a second setting, called GOOGLE_SHORTENER_TOKEN_1. For the value, we set the API key, which you can retrieve from your Google APIs account.
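
Once saved, those two settings amount to key/value pairs along these lines (the token value here is a placeholder for your own API key):

DB_CONNECTION=sqlite
GOOGLE_SHORTENER_TOKEN_1=your-google-api-key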

Triggering a Build

With the project now configured, it’s time to put it to the test. To do that, I made a minor change to the tests and pushed the changes to GitHub. The build, I’m happy to say, was successful.

And that’s the first part of how to set up a continuous deployment workflow for a basic Laravel application. What we’ve not yet done is cover deploying the code to a test or production server, such as Heroku, Amazon, or DigitalOcean, or how to configure notifications.

That’s what we’ll be covering in part two of this series. We’ll also see how to work with an external service provider, such as ElephantSQL. Stay tuned, as this will start to flesh out the application and its accompanying complexity.

Wrapping Up Part I

As this is the first time that a deploy’s being made, I see no harm in the last three steps. However, we could have approached the deployment configuration a little differently.

We could have, and will in future, store the database file in the Git repository. We could also have committed the cache directory up front, with an empty .gitkeep file in it, so that the directory could be managed by Git. This way, we could remove both of those steps, simplifying the configuration. So long as the database file was there, there’d be no problem in running the migration.
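
To illustrate, assuming the cache directory and the SQLite database file were both committed to the repository, the Setup Commands would shrink to something like this:

phpenv local 7.0
composer install --prefer-source --no-interaction
cp -v .env.example .env
php artisan key:generate
php artisan migrate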


In the first part of this series, I stepped through the basics of creating a continuous-deployment pipeline for a Laravel-based application using Codeship. We had a look at the application’s overall structure and its configuration, created the database migrations, and wrote some acceptance tests. We finished up by working through an initial Codeship project creation, so that the test suite could be run whenever a push was made to the project’s master branch in its GitHub repository.

As I mentioned at the end of that article, it wasn’t the whole story. We’ve only gotten halfway down our continuous deployment pipeline. What’s missing is the part where we actually deploy the application to a production (or testing or staging) server. That’s what we’re going to do in this, the second and final, part of the series.

Specifically, we’ll cover how to refactor the application, so that it uses a PostgreSQL database, instead of SQLite. I’ll demonstrate how to update the environment configuration, so that it includes the relevant variables necessary to test the build, using Codeship’s PostgreSQL database, available in all projects.

We’ll wrap things up by deploying the project to a DigitalOcean droplet. We’ll also see how to use ElephantSQL as our host, instead of a PostgreSQL installation on the droplet.

Updating the Environment Configuration

The first thing we need to do is update the environment configuration variables. Specifically, we need to change the database-related ones and then add some extra variables to support the deployment process.

Did you know that every Codeship build contains an installation of PostgreSQL? At the time of writing, Codeship provides support for versions 9.2 through 9.5.

Regardless of what version you need, it’s easy to access. The port number you use has symmetry with the PostgreSQL version number. So, version 9.2 is available on port 5432, version 9.3 is available on port 5433, and so on.

So the first thing we’re going to do is change the DB_CONNECTION environment variable to be pgsql and add a new variable, called DB_PORT, giving it the value 5435.

With that done, we need to add two extra variables, so that we can authenticate with the database. These are DB_USERNAME and DB_PASSWORD. The values for these settings will be set from two pre-existing environment variables. These are $PG_USER and $PG_PASSWORD. Add these in, and we’ve made the required changes for the database.
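
Taken together, the database-related settings now amount to the following key/value pairs; this is just a summary of the values described above, with $PG_USER and $PG_PASSWORD being the variables Codeship already provides:

DB_CONNECTION=pgsql
DB_PORT=5435
DB_USERNAME=$PG_USER
DB_PASSWORD=$PG_PASSWORD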

Now we need to add a further set of environment variables to support the deployment tool we’ll use, which we’ll see shortly. The deployment environment variables we’ll need are:

  • REPOSITORY: The git repository storing our code
  • RELEASES: The number of deployed releases to keep
  • DEPLOY_DIR: The directory to deploy the code to
  • DEPLOY_USER: The operating system user who will run the deploy
  • PRODUCTION_IP_ADDRESS: The IP address of the server to deploy to
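
To make those concrete, here’s an illustrative set of values. Every value below is hypothetical; substitute your own repository URL, path, user, and server address:

REPOSITORY=git@github.com:your-user/your-url-shortener.git
RELEASES=5
DEPLOY_DIR=/var/www/html
DEPLOY_USER=deploy
PRODUCTION_IP_ADDRESS=203.0.113.10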

A quick note on the DEPLOY_USER setting: Make sure that this user has permission to write to the deployment directory, and that a public key for the user is stored on your GitHub account. This way, the user can access the repository, if it’s private.

Updating the Setup Commands

With the new environment variables set, we need to update the Setup Commands. We’re not going to do much, other than to remove one line — touch database/database.sqlite — which set up the SQLite database file.

We’re not using SQLite in testing anymore, so we don’t need that configuration. Under Project Settings -> Test -> Setup Commands, remove that line from the configuration.

Updating the Code

With these changes made, we need to make a few small updates to the application code and configuration so that it uses PostgreSQL instead of SQLite.

First, we need to create a copy of .env that will be copied to somewhere on our hosting server, say /opt/application/config. In there, we need to set DB_CONNECTION to have the value pgsql. With that done, copy it to your web host.
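
A fragment of that server-side .env might look like the following. Apart from DB_CONNECTION, which is the change we actually need, the other values are purely illustrative:

APP_ENV=production
APP_DEBUG=false
DB_CONNECTION=pgsql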

The reason for this is that we’re not going to store it under version control. If we did, there’s a risk that, at some stage, the credentials may leak out. Plus, storing it under version control violates one of the tenets of a Twelve-Factor app.

The next change is optional, depending on how you’re going to handle caching. By default, Laravel uses a filesystem cache. If we continue down that road, the build will fail when we deploy the application. The reason is that bootstrap/cache is listed in .gitignore. Given that, it won’t exist when a deploy is run, leading to an error when the application tries to use that directory.

So, you have two choices. You can add extra configuration and environment variables to use a caching service. Or you can remove bootstrap/cache from .gitignore.


As I mentioned in part one, in this, the final post in the series, we’re going to be using an external service, ElephantSQL.

I’ll assume that you don’t already have an account with them, and step through the process of setting one up. Happily, their interface makes it almost trivial. What’s more, they have a free plan, so that you don’t have to pay anything to give them a try.

  1. In your browser, navigate to the ElephantSQL website.
  2. Scroll down and click Try now for FREE, under the Tiny Turtle option. You’ll be taken to a login and signup page.
  3. Under Sign up, you’ll see a section titled Sign up or login. Click the button labeled Sign in with GitHub. You’ll be asked to authorize ElephantSQL’s access to your GitHub repositories.
  4. Click Authorize application. With that completed, you’ll be redirected and logged in to your ElephantSQL dashboard.
  5. Click the blue Create button on the right-hand side.
  6. Add a name for your database, pick a data center, and click Create. You’ll see your database in the list.
  7. Finally, we need to get the connection details, so that we can update our environment settings. Click Details, and make a note of the URL, hostname, database name, username, and password.
  8. With those at hand, you need to add them to the production server configuration environment. I have a sample in the next section. You also need to set them in your environment settings.

I know, this has largely been taken care of quite effortlessly so far. But this part takes a little bit of manual intervention, depending on how you’re handling server provisioning. The simplest way that I can recommend is by using Ansible.

If you’d like a more comprehensive guide, check out Using Packer and Ansible to Build Immutable Infrastructure by Marko Locher, here on the Codeship blog. Whichever way you go, ensure that those environment variables are set in your production environment.

The Deployment Server

Before we get on to working through how we’re going to deploy the code, we need to cover the deployment server and its requirements.

Make sure that the server supports either PHP 5.6 or 7.0 and has the pgsql extension installed and enabled. It also needs to have a recent version of Git installed.

Next, make sure that the NGINX or Apache virtual host configuration is set up to host a Laravel application. Also, make sure that the DB_ environment variables, which I showed you how to set earlier, have been set.

If you have all that done, there are no other critical services, packages, or configurations required. Here’s a minimal, sample, Apache VirtualHost configuration:

<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/html/current/public

    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined

    SetEnv DB_USERNAME your_database_username
    SetEnv DB_PASSWORD your_database_password
    SetEnv DB_DATABASE your_database_name
    SetEnv DB_HOST your_database_hostname

    <Location />
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} -s [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]
        RewriteRule ^.*$ /index.php [NC,L]
    </Location>
</VirtualHost>

Deploying the Application

Now that the environment configuration is ready to go, we need to step through the deployment process. Like all things in PHP, there are a host of ways in which you can do it.

There’s Phing, Capistrano, SCP, Rsync over SSH, and so on. But the one that I’m going to be covering is a newer tool, one written specifically in PHP, called Deployer.

Deployer is a handy library that works in a similar way to Phing, Make, and Ant. You create one or more tasks that have to be run during deployment. You provide details of the deployment servers you’re going to deploy to, whether that’s a single server (such as production) or multiple (such as testing, staging, and production).

Then you run the deployment. All the tasks are run in the order in which you’ve specified them. All going well, the deployment works just as you expected.

What’s extra nice about Deployer, aside from being written in PHP, is that it comes with a series of prepackaged tasks and it supports rollback and deployment versioning. As far as its prepackaged tasks go, there are task recipes for working with Composer-based builds as well as a variety of services and tools, such as New Relic, RabbitMQ, Slack, and HipChat.

To make Deployer available, copy the phar version of the tool into your repository.
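
One way to do that, assuming you’re happy fetching the phar straight from deployer.org, is:

curl -LO https://deployer.org/deployer.phar

Commit deployer.phar alongside your code so it’s available during the build.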


This will bring in the latest version (3.3.0 at the time of writing). Next, we need to create a Deployer configuration file called deploy.php in the root of our project. It’s a little lengthy, so I’ll work through it a piece at a time.

require __DIR__ . '/vendor/autoload.php';
require __DIR__ . '/vendor/deployer/deployer/recipe/composer.php';

set("sudo_for_composer_is_wrong", true);
First, we pull in Composer’s autoloader and Deployer’s Composer recipe, and specify that we’re not going to attempt to use sudo with Composer.
server('digitalocean', getenv('PRODUCTION_IP_ADDRESS'))
    ->env('deploy_path', getenv('DEPLOY_DIR'));

Next, we create a server configuration; in this case just one, for our DigitalOcean droplet. In the call to server(), we give the server a name and specify its IP address, pulled from the PRODUCTION_IP_ADDRESS environment variable.

We next call user() and password() to authenticate to the server as a given user. Alternatively, we could use a combination of user() and identityFile() to instead use SSH public keys. We then use stage() to give a name to the server and provide a custom environment variable, by making a call to env().

set('repository', getenv('REPOSITORY'));
set('keep_releases', getenv('RELEASES'));

Next, we set the repository where the code is located, as well as the number of releases to keep in addition to the current one, by making two calls to the set() method. This is responsible for setting Deployer configuration settings.

At this point, you can now see where all of the extra environment variables we added before will be used. By default, unless we set it explicitly, Deployer will keep up to five releases of our application. I know that we set RELEASES to five.

But I wanted to be explicit, particularly since this is a key setting. If something goes wrong, we can quickly roll back. Ideally, we’ll never have to do that, but it’s crucial to have it there should we need to.
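
If a release ever does go bad, and assuming the rollback task from Deployer’s bundled recipes is available in the version you’ve installed, reverting to the previous release should be a one-liner along these lines:

php deployer.phar rollback production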

task('deploy:copy-env-file', function () {
    $deployPath = '/var/www/html/current/';
    $envFile = '/path/to/your/.env/file';
    run("cp {$envFile} {$deployPath}");
})->desc("Copy the env file");

task('deploy:database-migrations', function () {
    $deployPath = '/var/www/html/current/';
    run("cd {$deployPath} && php artisan migrate");
})->desc("Run the database migrations");

task('deploy:done', function () {
    write('Deploy done!');
})->desc("Confirm that the deployment finished");

task('deploy', [
    // recipe tasks plus our three custom tasks go here, in order (see the example below)
])->desc("The main deployment process");

Now we write some custom tasks and compose the deployment pipeline. A task is composed of three parts:

  • A name
  • A callback, or list of tasks, which handles the task’s work
  • A task description, which is optional

In the first task, we’re copying the environment file to our new installation. In the second task, we’re running the database migrations on our database hosted on ElephantSQL. And in the third task, we’re writing a message to provide visual confirmation that the deployment has completed.

Now for the deployment pipeline. In this one, I’ve passed an array of pre-existing tasks (or recipes) to run which form the deployment process, along with our three custom tasks.

These will prepare the release, check out the latest version from the repository, run a composer update, symlink the latest copy as the current version, and remove any unnecessary releases, if there are more than the required number still present.
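
As an illustration of how that composition might read, here’s a version of the deploy task with the recipe tasks filled in. The recipe task names are what I believe Deployer 3’s Composer recipe provides, so double-check them against your installed version; the two custom tasks sit after deploy:symlink because they reference the current/ path:

task('deploy', [
    'deploy:prepare',
    'deploy:release',
    'deploy:update_code',
    'deploy:vendors',
    'deploy:symlink',
    'deploy:copy-env-file',
    'deploy:database-migrations',
    'cleanup',
])->desc("The main deployment process");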

// The task to run after deployment
after('deploy', 'deploy:done');

Now for the last step. Here, we’re specifying what to do when a release finishes. In this case, we’re running our custom notification task, which we saw just a moment ago.

The Deployment Pipeline

We now have one final task to take care of before we can run the deployment: create a deployment pipeline.

  1. Under Project Settings -> Deployment, click Custom Script. You’ll see a text box, similar to what we’ve seen previously when setting the Test configuration.
  2. Add the following, adding in the details for your deployment server.
export DEPLOY_DIR=
php deployer.phar deploy production

This will export the environment settings that Deployer needs and then run the Deployer phar, invoking the deploy task and specifying that the server we’ll be deploying to is the one we named production. That will then run the deployment pipeline we established and finalize the deployment.

Deploying the Application

With all that done, we’re now ready to deploy our application. Assuming you’ve not yet pushed the changes to the application to the repository, push them now.

A few minutes after you’ve pushed the code, a new build will be triggered. If all goes well, your shiny Laravel application will be deployed.


And that is how you can set up a deployment pipeline for Laravel with Codeship. Admittedly, it’s a bit of a long pipeline, but it’s completely transparent about what’s going on and how it’s been set up. There are a number of moving parts, but combined, they work very well.

