Flask Jenkins Pipeline with SonarQube on Kubernetes

shubham kumar singh · DevOps-Journey · Aug 16, 2020

Coverage with Flask

Problem introduction

Recently I started working on a Flask project. One of the challenges I faced was executing test cases in CI/CD. I used the Flask-SQLAlchemy ORM to connect to a MySQL database; however, the CI/CD environment has no access to that database, causing the pipeline to fail. We evaluated multiple ways to solve this. Possible strategies:

  1. Set up MySQL on the CI/CD container. This makes the container heavier and costs time and resources.
  2. Since we use a Kubernetes deployment for execution, we can run a MySQL container within the same execution pod. This is probably the best approach in a Kubernetes environment, and preloading the data reduces execution time.
  3. Set up an in-memory database such as SQLite, similar to Java's H2. This reduces both the resource and time requirements.
  4. A related issue is running the test cases on the pod while also being able to push the Docker image to the Docker registry. To solve both, we can run two containers: one for building the Docker image and pushing it to the registry, and another for Python-specific tasks such as executing the test cases. This is a lightweight approach, since we already use a Docker container (with the Docker binary) for a few services.

This document walks through the following steps:

  1. Explain the deployment topology
  2. Problems importing MySQL data into SQLite
  3. The solution for importing data into SQLite3
  4. Set up test profiles and how to use them
  5. Prepare the pipeline execution environment
  6. Set up and execute the pipeline

Build topology

Basic import problem

After some reading, I realized that the best way to execute unit tests is to run against an in-memory database, like H2 for Java/Hibernate. Initially, we tried to create a SQL dump and load it into SQLite3.

$ mysqldump -u<user> -p<password> -h <host> db > testdump.sql
$ cat testdump.sql | sqlite3 mysqlite3.db
Error: near line 25: near "AUTO_INCREMENT": syntax error
Error: near line 38: near "LOCK": syntax error
Error: near line 41: near "UNLOCK": syntax error
Error: near line 50: near "ENGINE": syntax error
Error: near line 60: near "LOCK": syntax error
Error: near line 62: no such table: alembic_version
Error: near line 64: near "UNLOCK": syntax error
Error: near line 73: near "AUTO_INCREMENT": syntax error
Error: near line 92: near "LOCK": syntax error
Error: near line 94: no such table: apps
Error: near line 96: near "UNLOCK": syntax error
Error: near line 105: near "AUTO_INCREMENT": syntax error

Set up SQLite3 with mysql-to-sqlite3

As you can see, a MySQL dump does not load cleanly into an SQLite database: the MySQL-specific syntax (AUTO_INCREMENT, ENGINE, LOCK) is rejected. We found a tool, mysql-to-sqlite3, that converts MySQL data into SQLite data. This is a simple way to create a fixture.

Steps to solve this:

  1. Install the mysql-to-sqlite3 utility.
  2. Use the utility to create the SQLite file:
$ pip install mysql-to-sqlite3
$ mysql2sqlite -f testdb.sqlite -d db_name -u<user> -p<password> -h <host>
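
To confirm the conversion worked, a quick sanity check is to open the generated file and count the rows per table. Below is a minimal sketch using Python's built-in sqlite3 module; the script name and the row-count idea are mine, not part of the project:

# verify_fixture.py - hypothetical sanity check for the converted fixture
import sqlite3

conn = sqlite3.connect("testdb.sqlite")  # file produced by mysql2sqlite above
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")
for (table,) in tables.fetchall():
    # Table names come from sqlite_master itself, so interpolation is safe here
    count = conn.execute("SELECT COUNT(*) FROM {}".format(table)).fetchone()[0]
    print("{}: {} rows".format(table, count))
conn.close()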

Set up a test profile

To set up Flask tests, we can create a config file that works like a Maven profile.

# Config file aka profile configurations
import os

DEBUG = True  # Turns on debugging features in Flask
BCRYPT_LOG_ROUNDS = 12  # Configuration for the Flask-Bcrypt extension
basedir = os.path.abspath(os.path.dirname(__file__))


class Config(object):
    DB_URL = os.environ.get("DB_URL", '0.0.0.0')
    DB_USER = os.environ.get("DB_USER", 'root')
    DB_PASS = os.environ.get("DB_PASS", 'root')
    DB_NAME = os.environ.get("DB_NAME", 'app')
    SQLALCHEMY_DATABASE_URI = "mysql+pymysql://{DB_USER}:{DB_PASS}@{DB_URL}/{DB_NAME}?charset=utf8mb4".format(
        DB_URL=DB_URL, DB_USER=DB_USER, DB_PASS=DB_PASS, DB_NAME=DB_NAME)
    SQLALCHEMY_TRACK_MODIFICATIONS = False
    SQLALCHEMY_ECHO = False
    BUNDLE_ERRORS = False
    CELERY_BROKER_URL = 'sqs://'
    sns_queue_region = os.environ.get('SNS_QUEUE_REGION', 'us-west-2')

    SQLALCHEMY_POOL_RECYCLE = 3600
    SQLALCHEMY_ENGINE_OPTIONS = {
        'pool_size': 50,
        'pool_recycle': 120,
        'pool_pre_ping': True
    }


class ProductionConfig(Config):
    DEBUG = False


class StagingConfig(Config):
    DEVELOPMENT = True
    DEBUG = True


class DevelopmentConfig(Config):
    DEVELOPMENT = True
    DEBUG = True


class TestingConfig(Config):
    TESTING = True
    DB_NAME = os.environ.get("DB_NAME", 'testdb')
    SQLALCHEMY_DATABASE_URI = "sqlite:///{DB_NAME}.sqlite".format(DB_NAME=DB_NAME)
    SQLALCHEMY_POOL_RECYCLE = 0
    SQLALCHEMY_ENGINE_OPTIONS = {}
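
Because every profile inherits from Config, only the overridden keys change; everything else (the Celery broker URL, bundle errors, and so on) is reused. A quick check in a Python shell, assuming the file above is saved as config.py and DB_NAME is not set in the environment:

>>> import config
>>> config.TestingConfig.SQLALCHEMY_DATABASE_URI
'sqlite:///testdb.sqlite'
>>> config.TestingConfig.CELERY_BROKER_URL  # inherited unchanged from Config
'sqs://'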

Once the above is set up, we need to add a line in app.py to load the configuration via an APP_SETTINGS environment variable. In the configuration below, the default is production. The configuration classes use inheritance to define defaults for the different profiles, so you only need to override the environment variable to switch to a different profile.

# Load the default configuration
app.config.from_object(os.environ.get("APP_SETTINGS", "config.ProductionConfig"))

Once this is configured, we can override the profile using the environment variable, as below, and execute whatever command we like.

export APP_SETTINGS="config.ProductionConfig"
python app.py
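
For tests, the same mechanism applies: export APP_SETTINGS="config.TestingConfig" and the app transparently switches to the SQLite fixture. A minimal sketch of what tests/__init__.py (the file the test runner in the next section points at) could look like; the route and assertion are placeholders, since the real tests are project-specific:

# tests/__init__.py - minimal sketch; the real tests are project-specific
import os

# Select the SQLite-backed profile before the app module is imported
os.environ["APP_SETTINGS"] = "config.TestingConfig"

from app import app  # noqa: E402


def test_app_responds():
    client = app.test_client()
    response = client.get("/")  # placeholder route
    assert response.status_code < 500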

Set up run.py

Additionally, we need a way to execute coverage. I prefer to run the tests through a small management script, so coverage can be invoked against a single entry point. This is a decent way to run coverage and store the results.

# run.py
import pytest
from flask_script import Manager

from app import app

manager = Manager(app)


@manager.command
def test():
    """Runs the tests."""
    pytest.main(["-s", "tests/__init__.py"])


if __name__ == "__main__":
    manager.run()

Once the above is complete, one can execute coverage using the commands below. coverage html produces an HTML report for browsing, but SonarQube requires the XML report produced by coverage xml.

$ PYTHONPATH=. coverage run --source=. run.py test
$ coverage report
$ coverage html
$ coverage xml
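
For reference, the pipeline shown later rewrites two placeholder values in sonar-project.properties, so that file needs to exist at the repository root. A minimal sketch is below; the project key and name are placeholders to adjust for your SonarQube setup:

# sonar-project.properties
sonar.projectKey=<project_key>
sonar.projectName=<app_name>
sonar.projectVersion=build-number
sonar.branch.name=branch_name
sonar.sources=.
sonar.python.coverage.reportPaths=coverage.xml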

Pipeline execution environment

We can set up a Docker container as the execution environment, activate a profile, and execute against it. The Dockerfile for the application container looks like this:

FROM python:3.6-stretch
RUN mkdir -p /app && apt-get update && apt-get install -y libcurl4-openssl-dev libssl-dev vim
WORKDIR /app
COPY requirements.txt /app
RUN pip install --no-cache-dir -r requirements.txt
ENV cmd=""
ENV APP_SETTINGS="config.TestingConfig"
COPY . /app
EXPOSE 8080

Additionally, we need a Python container image for executing the test cases.

# Dockerfile
FROM python:3.6-stretch
RUN mkdir -p /app && apt-get update && apt-get install -y libcurl4-openssl-dev libssl-dev vim
WORKDIR /app
COPY requirements.txt /app
RUN pip install --no-cache-dir -r requirements.txt

# Build the test image:
$ docker build . -t hub.docker.com/shubhamkr619/python-testing-image:v1
# Push the image:
$ docker push hub.docker.com/shubhamkr619/python-testing-image:v1
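
Before wiring the image into Jenkins, it can be sanity-checked locally by running the same commands the pipeline will run, assuming the repository is mounted at /app:

$ docker run --rm -v $(pwd):/app -e APP_SETTINGS="config.TestingConfig" \
    hub.docker.com/shubhamkr619/python-testing-image:v1 \
    sh -c 'PYTHONPATH=. coverage run --source=. run.py test && coverage report'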

Set up and execute the Jenkins pipeline

To execute this pipeline, we need to set up a Jenkins job.

First, we will set up a Jenkins.yaml file to be used by the Jenkins Kubernetes plugin.

apiVersion: v1
kind: Pod
metadata:
  labels:
    Application: app
spec:
  containers:
  - name: docker
    image: docker:1.11
    command: ['cat']
    tty: true
    env:
    - name: NODE_IP
      valueFrom:
        fieldRef:
          fieldPath: status.hostIP
    volumeMounts:
    - name: dockersock
      mountPath: /var/run/docker.sock
  - name: python
    image: hub.docker.com/shubhamkr619/python-testing-image:v1
    tty: true
    command: ['cat']
    env:
    - name: NODE_IP
      valueFrom:
        fieldRef:
          fieldPath: status.hostIP
    - name: APP_SETTINGS
      value: "config.ProductionConfig"
    volumeMounts:
    - name: dockersock
      mountPath: /var/run/docker.sock
  volumes:
  - name: dockersock
    hostPath:
      path: /var/run/docker.sock

The step above ensures that we run two containers within the pod: a Docker container with the Docker binary, and a Python container with Python and coverage available.

//Add the shared library
library identifier: 'cicdjenkins@master', retriever: modernSCM(
    [$class: 'GitSCMSource',
     remote: 'github.com/jenkinpiple/',
     credentialsId: 'svc.devops-ut']
)

//Defines the build CI pipeline for the app
pipeline {
    agent {
        kubernetes {
            label 'app'
            defaultContainer 'jnlp'
            yamlFile 'Jenkins.yaml'
        }
    }

    environment {
        DOCKER_REGISTRY = '<registry_url>'
        APP_NAME = '<app_name>'
        DOCKER_REGISTRY_CRED_ID = "<registry_cred>"
        DOCKER_REPOSITORY = 'docker-local'
    }

    stages {
        //Purpose: notify Slack that the job has started
        stage('General') {
            steps {
                notify('STARTED')
                githubstatus('STARTED')
                echo sh(script: 'env|sort', returnStdout: true)
                script {
                    USER = wrap([$class: 'BuildUser']) {
                        return env.BUILD_USER
                    }
                    GIT_REVISION = sh(returnStdout: true, script: 'git rev-parse --short HEAD').trim()
                }
            }
        }

        //Docker build
        stage('Build') {
            steps {
                container('docker') {
                    script {
                        def git_branch = "${GIT_BRANCH}".replace("/", "_")
                        DOCKER_BUILD_NUMBER = BUILD_NUMBER + "-${git_branch}"
                        docker.build("${DOCKER_REGISTRY}/${DOCKER_REPOSITORY}/${APP_NAME}:" + DOCKER_BUILD_NUMBER)
                    }
                }
            }
        }

        //Run the test suite with coverage inside the Python container
        stage('Test') {
            steps {
                container('python') {
                    script {
                        output = sh(returnStdout: true, script: 'pip install --no-cache-dir -r requirements.txt && export APP_SETTINGS="config.TestingConfig" && export PYTHONPATH=. && coverage run --source=. run.py test && coverage xml').trim()
                        echo output
                    }
                }
            }
        }

        stage("SonarQube Analysis") {
            steps {
                //Execute the SonarQube analysis inside the Python container
                container("python") {
                    script {
                        sh "sed -i 's/sonar.projectVersion=build-number/sonar.projectVersion=${BUILD_NUMBER}/g' sonar-project.properties"
                        sh "sed -i 's@sonar.branch.name=branch_name@sonar.branch.name=$BRANCH_NAME@g' sonar-project.properties"
                        withSonarQubeEnv('SonarQube') {
                            echo "===========Performing Sonar Scan============"
                            def sonarqubeScannerHome = tool 'SonarQube Scanner 3.3.0.1492'
                            sh "${sonarqubeScannerHome}/bin/sonar-scanner"
                        }
                    }
                }
            }
        }

        //Quality gate
        stage("Quality Gate") {
            steps {
                script {
                    timeout(time: 1, unit: 'HOURS') {
                        waitForQualityGate abortPipeline: true
                    }
                }
            }
        }

        //Docker push
        stage('Artifactory Push') {
            steps {
                container('docker') {
                    script {
                        dockerpush("${DOCKER_REPOSITORY}/${APP_NAME}", "${DOCKER_BUILD_NUMBER}", "${DOCKER_REPOSITORY}")
                    }
                }
            }
        }
    }
}
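
Note that notify, githubstatus, and dockerpush are steps provided by the shared library loaded at the top of the file, not Jenkins built-ins. A hypothetical sketch of what dockerpush might look like (the real implementation lives in the cicdjenkins repository):

// vars/dockerpush.groovy - hypothetical sketch of the shared-library step
def call(String image, String tag, String repository) {
    // Authenticate against the registry, then push the already-built tag
    docker.withRegistry("https://${env.DOCKER_REGISTRY}", env.DOCKER_REGISTRY_CRED_ID) {
        docker.image("${env.DOCKER_REGISTRY}/${image}:${tag}").push(tag)
    }
}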
