Containerized WebSphere app deployment pipeline

Nico Meisenzahl
Jan 12

Another post from the “use the quieter days after Christmas” series. :-)

This time I will write about a use case from our own (panagenda) environment. It had been on my list for a long time, and I finally found some time to implement it: a containerized pipeline, triggered by another pipeline, that deploys a WebSphere application to different WebSphere environments. For us, it was enough to update an existing application, so the pipeline only supports updates so far.

I decided to build a clientless solution to be able to deploy the application to different environments. That’s why I decided not to use a Gitlab Runner locally on the WebSphere machines (yes, once again I’m using Gitlab CI to build the pipeline). Instead of a local Gitlab Runner, I built a WebSphere Docker image which I then use to connect remotely to the different WebSphere instances. The Docker container is deployed (and destroyed after the pipeline has finished) on our Kubernetes cluster, which is integrated with the Gitlab platform. This is handled by the Gitlab Runner for Kubernetes (more details).
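For reference, registering a runner with the Kubernetes executor against such a cluster can look roughly like this (a sketch with placeholder URL, token and namespace; your own runner may be set up differently, for example through the Gitlab Kubernetes integration):

```bash
# Sketch: register a Gitlab Runner that uses the Kubernetes executor.
# URL, registration token and namespace are placeholders.
gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.com/" \
  --registration-token "REGISTRATION_TOKEN" \
  --executor "kubernetes" \
  --description "websphere-deploy-runner" \
  --kubernetes-namespace "gitlab-ci"
```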

What is needed?

First of all, we need a WebSphere Docker image. I built the Docker image with the following Dockerfile. So far, this part is not automated. It wouldn’t be hard to automate it (more details), but in this case it’s just not needed.
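Roughly, such a Dockerfile boils down to the following sketch: pull the installation sources over HTTP (see the wget note below), install IBM Installation Manager silently and let it install WebSphere from the response file. The base image, host, paths, versions and package IDs are placeholders, not necessarily the exact ones used here:

```dockerfile
# Sketch of a WebSphere image build; base image, host, paths and
# package IDs are placeholders.
FROM ubuntu:18.04

# Pull the installation sources from a local HTTP server (see below) instead
# of ADDing them into intermediate layers, then install IBM Installation
# Manager silently (installc -acceptLicense).
RUN apt-get update && apt-get install -y wget unzip && rm -rf /var/lib/apt/lists/* \
 && wget -q http://192.168.0.10:8000/agent.installer.linux.zip -O /tmp/im.zip \
 && wget -q http://192.168.0.10:8000/was-repo.zip -O /tmp/was-repo.zip \
 && unzip -q /tmp/im.zip -d /tmp/im \
 && unzip -q /tmp/was-repo.zip -d /tmp/was-repo \
 && /tmp/im/installc -acceptLicense \
 && rm -rf /tmp/im /tmp/im.zip

# Install WebSphere in silent mode using the response file, then create a
# default profile so wsadmin and retrieveSigners can be used later.
COPY install_response_file.xml /install_response_file.xml
RUN /opt/IBM/InstallationManager/eclipse/tools/imcl input /install_response_file.xml -acceptLicense \
 && /opt/IBM/WebSphere/AppServer/bin/manageprofiles.sh -create -profileName AppSrv01 \
      -templatePath /opt/IBM/WebSphere/AppServer/profileTemplates/default \
 && rm -rf /tmp/was-repo /tmp/was-repo.zip
```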

I saved some disk space by using wget to copy all my installation sources instead of using multiple ADD commands. The easiest solution is to use a python web server to provide the installation files. It can simply be started from the command line within your local software directory:
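For example, with Python’s built-in web server (the port is arbitrary):

```bash
# serve the current directory over HTTP (Python 3)
python3 -m http.server 8000
# or, with Python 2:
# python -m SimpleHTTPServer 8000
```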

Anyway, the image is still not a small one.

I decided to go for a full WebSphere profile instead of a client installation to be able to use the retrieveSigners tool, which is not part of the client package. Without it, you would need to manually handle and copy the certificates from the remote WebSphere installation. For the installation itself, you will need a response file (/install_response_file.xml) to be able to install WebSphere in silent mode:
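The exact contents depend on your WebSphere version, but a silent-install response file roughly follows this pattern (repository location, install location, offering id and features below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of a silent-install response file; repository location,
     install location and offering id are placeholders. -->
<agent-input>
  <server>
    <repository location="/tmp/was-repo"/>
  </server>
  <profile id="IBM WebSphere Application Server V9.0"
           installLocation="/opt/IBM/WebSphere/AppServer"/>
  <install>
    <offering profile="IBM WebSphere Application Server V9.0"
              id="com.ibm.websphere.BASE.v90"
              features="core.feature,ejbdeploy,thinclient"/>
  </install>
</agent-input>
```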

Now you are ready to build your WebSphere image and push it to the Docker registry of your Gitlab repository.
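Building and pushing works as usual; the registry path below is a placeholder for your own repository:

```bash
# log in to the Gitlab container registry, then build and push the image
docker login registry.gitlab.com
docker build -t registry.gitlab.com/<group>/<project>/websphere:latest .
docker push registry.gitlab.com/<group>/<project>/websphere:latest
```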

The pipeline will use our image to call the wsadmin client, which will execute a Python script. The script (/upgrade.py) itself handles the update process:
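As a rough sketch (the argument handling and options are assumptions; wsadmin runs the script with Jython and passes the script arguments via sys.argv), the update boils down to a call to AdminApp.update followed by saving the configuration:

```python
# upgrade.py -- sketch of a wsadmin (Jython) script that updates an
# existing application; argument handling and options are assumptions.
import sys

# wsadmin passes the script arguments via sys.argv (without the script name)
appName = sys.argv[0]
earFile = sys.argv[1]
contextRoot = sys.argv[2]

print 'Updating application %s from %s (context root %s)' % (appName, earFile, contextRoot)

# replace the complete application with the new EAR file
AdminApp.update(appName, 'app',
    '[-operation update -contents ' + earFile + ' -usedefaultbindings]')

# persist the configuration change
AdminConfig.save()
```

Depending on your topology, you may additionally need to synchronize the nodes before the change becomes active.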

Finally, we are ready to build the pipeline itself. This is the pipeline definition I used:
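A minimal sketch of what this can look like; the image path, variable names, profile path and SOAP port are assumptions, and the three script steps correspond to the list below:

```yaml
# Sketch of the deployment job; image path, variable names, profile path
# and ports are placeholders. WAS_USER/WAS_PASSWORD are CI/CD variables
# stored in the repository.
deploy:
  stage: deploy
  image: registry.gitlab.com/<group>/<project>/websphere:latest
  variables:
    WAS_HOST: ""       # WebSphere environment to deploy to
    EAR_URL: ""        # URL of the ear file to download
    APP_NAME: ""       # name of the application to update
    CONTEXT_ROOT: ""   # context root of the application
  script:
    # 1. fetch and store the signer certificates of the remote cell
    - /opt/IBM/WebSphere/AppServer/profiles/AppSrv01/bin/retrieveSigners.sh CellDefaultTrustStore ClientDefaultTrustStore -conntype SOAP -host $WAS_HOST -port 8879 -autoAcceptBootstrapSigner
    # 2. download the ear file from the Maven repository
    - curl -o /tmp/app.ear "$EAR_URL"
    # 3. run the update script through wsadmin
    - /opt/IBM/WebSphere/AppServer/profiles/AppSrv01/bin/wsadmin.sh -lang jython -conntype SOAP -host $WAS_HOST -port 8879 -user $WAS_USER -password $WAS_PASSWORD -f /upgrade.py $APP_NAME /tmp/app.ear $CONTEXT_ROOT
  only:
    - triggers
```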

As you can see, I used four different variables to define the WebSphere environment, the ear URL, the application name and the context root. They are empty by default because we specify the values when we call the pipeline. The WebSphere credentials used to connect to WebSphere are stored in the repository itself and are valid for all environments.

The pipeline will start a container using our image on the integrated Kubernetes cluster and will then execute three different commands:

  1. The retrieveSigners script downloads and saves the remote certificates (they are needed to successfully connect to the remote WebSphere instance)
  2. curl is used to download the ear file from a different repository (in our case it’s a Maven repository)
  3. wsadmin calls the upgrade.py script which updates the application

Now we are ready to run the pipeline and update our application. In our case, this is triggered by another build pipeline which calls this pipeline with a single curl command (you will need to create a trigger token, which is required for authentication):
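With the trigger token in place, such a call can look roughly like this (project ID, ref, token, variable names and values are placeholders):

```bash
# Trigger the deployment pipeline via the Gitlab pipeline trigger API;
# project ID, token, ref and variable values are placeholders.
curl -X POST \
  -F "token=TRIGGER_TOKEN" \
  -F "ref=master" \
  -F "variables[WAS_HOST]=was01.example.com" \
  -F "variables[EAR_URL]=https://maven.example.com/repo/app/1.2.3/app.ear" \
  -F "variables[APP_NAME]=MyApplication" \
  -F "variables[CONTEXT_ROOT]=/myapp" \
  "https://gitlab.example.com/api/v4/projects/42/trigger/pipeline"
```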
