Sentiment Analysis with Docker Compose: Building a Machine Learning App with Database Persistence

Hemachandran Dhinakaran
6 min read · Feb 13, 2024

--

I’m confident that I no longer need to advocate for Docker, but if you’re still curious, feel free to explore my earlier post, Containerize and Deploy ML Models with FastAPI & Docker.

However, our focus here will be on Docker Compose and its role in crafting a Machine Learning App with Data Persistence. Docker Compose simplifies the orchestration of multi-container Docker applications. With Compose, developers can define and manage complex environments using a single YAML file, streamlining the process of setting up interconnected services. Whether it’s for local development, testing, or production deployments, Docker Compose offers a seamless solution for containerized applications.

Let’s get straight into the subject. In a prior post, I developed an ML application using FastAPI and Docker. Here, I’ll reuse the same application while incorporating data persistence for its output.

Set Up a Python Virtual Environment (optional, but recommended)

Python:

pipenv shell
pipenv install <packages>
pipenv install fastapi uvicorn spacy spacytextblob

Conda:

conda create -n <envname> python=x.x <packages>
conda create -n mldocker python=3.8 fastapi uvicorn spacy
conda activate mldocker
pip install spacytextblob  # install via pip inside the activated env

To ensure app dependencies are carried from your virtual environment/host machine into your container, run ‘pip freeze > requirements.txt’ in the terminal to generate (or overwrite) the ‘requirements.txt’ file.
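For reference, the resulting requirements.txt should list at least the packages below (pin whatever versions pip freeze reports in your environment; the names come from the install commands above, plus the MySQL driver the API will import later):

```
fastapi
uvicorn
spacy
spacytextblob
mysql-connector-python
```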

Build the Model

Create the Machine Learning Application Script named model.py in the ML_Docker_Compose_App directory:

We’ll employ spaCy with spacytextblob to create a straightforward sentiment classifier that returns polarity and subjectivity scores for a text input, from which we derive the sentiment of the sentence.

import spacy
from spacytextblob.spacytextblob import SpacyTextBlob

class SentimentModel:
    def __init__(self):
        # Load the pipeline once; reloading it on every request is slow
        self.nlp = spacy.load('en_core_web_sm')
        self.nlp.add_pipe("spacytextblob")

    def get_sentiment_analysis(self, text):
        doc = self.nlp(text)
        polarity = doc._.polarity
        subjectivity = doc._.subjectivity
        if polarity > 0:
            result = 'POSITIVE'
        elif polarity < 0:
            result = 'NEGATIVE'
        else:
            result = 'NEUTRAL'
        return polarity, subjectivity, result
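The polarity-to-label mapping is plain Python and can be unit-tested without loading spaCy at all. A minimal sketch (the thresholds mirror the class above; `polarity_to_label` is a helper name I’ve made up for illustration):

```python
def polarity_to_label(polarity: float) -> str:
    """Map a TextBlob polarity score (-1.0 to 1.0) to a sentiment label."""
    if polarity > 0:
        return 'POSITIVE'
    elif polarity < 0:
        return 'NEGATIVE'
    return 'NEUTRAL'

print(polarity_to_label(0.8))   # POSITIVE
print(polarity_to_label(-0.3))  # NEGATIVE
print(polarity_to_label(0.0))   # NEUTRAL
```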

Build the API

FastAPI is a modern, high-performance web framework for building APIs with Python. It’s known for its speed, simplicity, and automatic generation of interactive API documentation.

import os

import uvicorn
from fastapi import FastAPI
from model import SentimentModel
import mysql.connector

app = FastAPI()
model = SentimentModel()

# Connect to the MySQL database
# (settings fall back to the values defined in docker-compose.yaml)
def connect_to_database():
    try:
        connection = mysql.connector.connect(
            host=os.environ.get('DB_HOST', 'db'),    # Docker service name of the database container
            port=os.environ.get('DB_PORT', '3306'),  # Port exposed by the database container
            user=os.environ.get('DB_USER', 'mlusername'),
            password=os.environ.get('DB_PASSWORD', 'mlpassword'),
            database=os.environ.get('DB_NAME', 'mldb')
        )
        return connection
    except mysql.connector.Error as err:
        print("Error:", err)
        return None

# Create the results table if it does not exist
def create_table(connection):
    try:
        cursor = connection.cursor()
        create_table_query = """
            CREATE TABLE IF NOT EXISTS mldb.ml_results (
                id INT AUTO_INCREMENT PRIMARY KEY,
                Input VARCHAR(255),
                Sentiment VARCHAR(255)
            )
        """
        cursor.execute(create_table_query)
        connection.commit()
        print("Table created successfully.")
    except mysql.connector.Error as err:
        print("Error:", err)

# Insert a row into the results table
def insert_data(connection, data):
    try:
        cursor = connection.cursor()
        sql = "INSERT INTO mldb.ml_results (Input, Sentiment) VALUES (%s, %s)"
        cursor.execute(sql, data)
        connection.commit()
        print("Data inserted successfully.")
    except mysql.connector.Error as err:
        print("Error:", err)

@app.post('/predict')
def predict(input_sentence: str):
    polarity, subjectivity, result = model.get_sentiment_analysis(input_sentence)

    # Connect to the database
    db_connection = connect_to_database()

    if db_connection:
        # Create the table if needed, then store the input and its sentiment
        create_table(db_connection)
        insert_data(db_connection, (input_sentence, result))

        # Close the database connection
        db_connection.close()
    else:
        print("Failed to connect to the database.")

    return {'Sentiment': result}


if __name__ == '__main__':
    uvicorn.run(app, host='0.0.0.0', port=8000)
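The connect/create/insert pattern above follows the Python DB-API, so the same shape works across drivers. Here is a minimal runnable sketch using the standard-library sqlite3 module, so you can exercise the table logic without a MySQL server (table and column names mirror the MySQL code; note SQLite uses ? placeholders where mysql.connector uses %s, and the in-memory database is just for illustration):

```python
import sqlite3

def create_table(connection):
    # Same DDL shape as the MySQL version (AUTO_INCREMENT -> AUTOINCREMENT in SQLite)
    connection.execute("""
        CREATE TABLE IF NOT EXISTS ml_results (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            Input TEXT,
            Sentiment TEXT
        )
    """)
    connection.commit()

def insert_data(connection, data):
    # Parameterized query: never interpolate user input into SQL strings
    connection.execute(
        "INSERT INTO ml_results (Input, Sentiment) VALUES (?, ?)", data
    )
    connection.commit()

conn = sqlite3.connect(":memory:")
create_table(conn)
insert_data(conn, ("I love Docker Compose", "POSITIVE"))
rows = conn.execute("SELECT Input, Sentiment FROM ml_results").fetchall()
print(rows)  # [('I love Docker Compose', 'POSITIVE')]
```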

Build the Dockerfile & docker-compose.yaml

You can write the Dockerfile by hand with the respective contents, or you can let VS Code simplify the work. Ensure that you have the Docker extension by Microsoft installed in VS Code, then launch Docker Desktop on your machine.

Navigate to VS Code and press Command + Shift + P (Ctrl + Shift + P on Windows/Linux) to access the command palette. Type “Add Docker Files to Workspace” to find the option for adding a Dockerfile to your project.

Follow the prompts and keep the application port as 8000.

The next essential step is crafting the docker-compose.yaml file with the following contents. Within the db service, note the binding of the host location ./db_data to the container location /var/lib/mysql. This configuration ensures that MySQL’s data files are stored on the host in ./db_data, so even if the container is shut down or removed, the data remains readily accessible once the application is live again.

version: '3.4'

services:
  mldockercomposeapp:
    image: mldockercomposeapp
    build:
      context: .
      dockerfile: ./Dockerfile
    ports:
      - 8000:8000
    depends_on:
      - db
    environment:
      - DB_HOST=db
      - DB_PORT=3306
      - DB_USER=mlusername
      - DB_PASSWORD=mlpassword
      - DB_NAME=mldb
  db:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: root_password
      MYSQL_DATABASE: mldb
      MYSQL_USER: mlusername
      MYSQL_PASSWORD: mlpassword
    ports:
      - "3306:3306"
    volumes:
      - ./db_data:/var/lib/mysql
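A bind mount like ./db_data works fine, but it leaves a MySQL-owned data directory inside your project folder. As an alternative sketch, Compose also supports named volumes, which Docker manages for you and which give the same persistence guarantee (mldb_data is a volume name I’ve made up for illustration):

```yaml
services:
  db:
    image: mysql
    volumes:
      - mldb_data:/var/lib/mysql

volumes:
  mldb_data:
```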

Here is a snippet of the resulting project structure for clarity.

Now that we have configured the application and db services, the next step is to let Docker Compose do its job. Let’s hit the up command.

docker compose up -d

At the end, you should see output confirming that both containers have started.

As you may have observed, the FastAPI application is set to operate on port 8000, where you can submit input for analysis. You might be wondering about the purpose of the MySQL database provisioned here. As the app.py script shows, its role is to capture both the input and the output of the sentiment analysis app, storing them in the ml_results table within the mldb database.

The API can be accessed by navigating to localhost:8000/docs

FastAPI docs UI of the local application
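Besides the interactive docs UI, you can call the endpoint directly. Since input_sentence is a plain function parameter, FastAPI treats it as a query parameter, so a request looks roughly like this (the response shape follows the predict function above; the exact label depends on the model’s polarity score):

```shell
curl -X POST "http://localhost:8000/predict?input_sentence=I%20love%20Docker%20Compose"
# response shape: {"Sentiment": "..."}
```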

You can capture the logs in the docker desktop application.

Let’s verify whether the data is indeed inserted into the table within the MySQL database named mldb. To accomplish this, we can utilize a Docker interactive command to enter the db container and execute MySQL commands.

docker exec -it <containername> mysql -p
docker exec -it ml_docker_compose_app-db-1 mysql -p
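Once inside the MySQL prompt, a few commands to inspect the stored results (the database and table names come from the code above):

```sql
USE mldb;
SHOW TABLES;
SELECT * FROM ml_results;
```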

Upon executing the above command, it will prompt for a password, where you can enter the MYSQL_ROOT_PASSWORD. You can then proceed to execute MySQL commands and examine the results. In my case, the data consumed by the ML API and the corresponding output generated are stored in the ml_results table.

While the data insertion is now functioning properly, our primary objective is to ensure its persistence, so let’s verify that. I am going to bring down both containers (the ML app and the MySQL DB, i.e. ml_docker_compose_app-mldockercomposeapp-1 and ml_docker_compose_app-db-1) using the command below:

docker compose down

Output of docker compose down

You can confirm there are no running containers using the command docker container ls or docker ps.

Restart the containers by utilizing the docker compose up -d command. Once they are running again, you can verify the presence of the data by executing the mysql command in interactive mode, as previously discussed.

Congratulations, we achieved our objective of persisting the data. With data persistence integrated into our Machine Learning application using Docker Compose, we have a robust solution for storing and managing user inputs and sentiment analysis results. By combining the power of FastAPI, MySQL, and Docker, we’ve built a scalable and reliable platform for sentiment analysis tasks that is well-equipped to handle real-world scenarios. Happy coding, and may your ML endeavors be ever successful!

Feel free to applaud and comment.

GitHub Resource

HemachandranD/ML_Docker_Compose_App (github.com)


Hemachandran Dhinakaran

Senior AI/ML Engineer | Crafting AI Systems & Algorithms for Tomorrow's Innovations | GenAI | ML/DL | MLOps | LLMOps