How to build a complete end-to-end ML model, backend REST API using FastAPI, and front-end UI using Streamlit

Balwant Gorad
10 min read · Oct 14, 2022

Hey guys! Welcome, all.

My name is Balwant Gorad, and I work as a Sr. Lead AI/ML IT Architect.

In this tutorial I am going to explain how we can build a complete end-to-end regression ML model, a backend REST API to serve this model, and a front-end UI so the end user can interact with it through a web browser. The complete implementation is done in Python.

This can be a small end-to-end project for Data Science and AI/ML beginners who want to know

  1. What the process of building an ML model is (in our case regression, but the process is quite similar for classification and other ML techniques as well)
  2. How a backend REST API can be developed to serve the model
  3. How a front-end UI can be designed using Streamlit

Normally, independent end-to-end ML applications can be developed in 3 steps:

  1. Build the ML model (in our case a regression model for car price prediction; you may choose other ML techniques like classification, clustering, etc.)
  2. Develop a REST API as the backend (we are going to use the FastAPI framework in Python; you may choose other frameworks like Django, Flask, etc.)
  3. Develop the front-end UI (we are going to use Streamlit, but you can use other technologies like HTML, CSS, etc.)

There are a couple of IDEs in which the entire application could be developed, but here we are going to use VS Code. We prefer VS Code because we have to write the REST API and the front-end UI as well, but you can choose whichever IDE you like.

Before discussing each step in detail, let us set up the project structure.

Create a CARPRICEPREDICTOR directory anywhere on your machine (we created it on the Desktop). Inside this directory create 3 folders:

  1. api — will contain the REST API code
  2. model — will contain the dataset, the model-building Jupyter notebook (.ipynb), and the generated model (.pkl file) that is served by the backend API
  3. ui — will contain the front-end UI files written using Streamlit

Now we will go through each step in detail.

Build ML Model

For this tutorial we are going to build a car selling price prediction system based on various car features such as manufacturing year, km_driven, present price, fuel_type, transmission, and a few others.

The dataset we are going to use for this tutorial can be found at this link.

We recommend creating a new virtual environment so it will not mess up the existing Python packages in your base system. As you might be aware, there are various ways to create virtual environments: venv, virtualenv, conda, etc.

We are going to create the virtual environment using conda, so open an Anaconda prompt and run the following command:

conda create -n test_env

I already have a test_env, so I have not created the environment again, but you can create one.

To activate the environment, run the following command:

conda activate test_env

We need a few Python packages for model building and for the API and UI. To install packages into the activated environment, run the following commands:

conda install pip

pip install numpy pandas matplotlib seaborn scikit-learn lightgbm xgboost catboost

Now go inside the carpricepredictor directory created earlier using the following command:

cd Desktop\carpricepredictor

and open VS Code in the same directory using the following command:

code .

Now create the 3 folders mentioned earlier inside carpricepredictor.

We will discuss the API and UI parts later in this tutorial; first we will build the model. To create the model we need a dataset, so download the cars.csv file from the URL given above and place it under the model directory. Now create a car_price_predictor.ipynb file under the model folder. When you create this file, VS Code will prompt you to install the ipykernel package; just click the install button. If it does not prompt you, install it the same way as the other packages:

conda install ipykernel or pip install ipykernel

Now let us start the model-building process.

Step 1: Import the packages

Step 2: Read the dataset using pandas
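Steps 1 and 2 together look roughly like this. The column names below are assumptions based on the feature list described in the EDA step; check them against the cars.csv you downloaded. To keep the sketch self-contained we build a few illustrative rows inline instead of reading the file:

```python
import pandas as pd

# In the real notebook you simply read the downloaded file:
#   data = pd.read_csv("cars.csv")
# Here we build a few illustrative rows so the snippet is self-contained.
data = pd.DataFrame({
    "Car_Name":      ["ritz", "sx4", "ciaz", "wagon r", "swift"],
    "Year":          [2014, 2013, 2017, 2011, 2014],
    "Selling_Price": [3.35, 4.75, 7.25, 2.85, 4.60],
    "Present_Price": [5.59, 9.54, 9.85, 4.15, 6.87],
    "Kms_Driven":    [27000, 43000, 6900, 5200, 42450],
    "Fuel_Type":     ["Petrol", "Diesel", "Petrol", "Petrol", "Diesel"],
    "Seller_Type":   ["Dealer", "Dealer", "Dealer", "Dealer", "Dealer"],
    "Transmission":  ["Manual", "Manual", "Manual", "Manual", "Manual"],
    "Owner":         [0, 0, 0, 0, 0],
})

print(data.shape)   # 5 illustrative rows, 9 columns (the real file has 301 rows)
print(data.head())
```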

Step 3: Do the basic EDA

We have data for 301 cars in total, with 9 features: car name, year, present price, kms driven, fuel type, seller type, transmission, owner, and selling price.

This dataset is quite clean, and it was chosen on purpose: we don't want to focus too much on data preprocessing; rather, we want to see the end-to-end process of an ML project.

Let us check the correlation between variables using the data.corr() method.
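As a sketch, again on a few illustrative rows, the correlation check looks like this; the real notebook runs data.corr() on the full frame and visualizes it with a seaborn heatmap:

```python
import pandas as pd

# A few illustrative rows with the numeric columns from the dataset
data = pd.DataFrame({
    "Year":          [2014, 2013, 2017, 2011, 2014],
    "Selling_Price": [3.35, 4.75, 7.25, 2.85, 4.60],
    "Present_Price": [5.59, 9.54, 9.85, 4.15, 6.87],
    "Kms_Driven":    [27000, 43000, 6900, 5200, 42450],
})

corr = data.corr()
print(corr["Selling_Price"].sort_values(ascending=False))

# In the notebook the matrix is visualised as a heatmap:
#   import seaborn as sns
#   sns.heatmap(corr, annot=True)
```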

From the heatmap above, it is clear that the selling price depends most strongly on the present price, year, etc.

Step 4: Do the basic preprocessing

Here a few columns (fuel type and transmission) contain text data, so we will apply label encoding.

In our case the selling price is the target feature and all other features are input features.

Let us divide the data into train and test sets; we keep 80% for training and 20% for testing.
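Putting Step 4 together, a minimal sketch (on stand-in rows, with assumed column names) is:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder

# Illustrative rows standing in for cars.csv (Car_Name dropped,
# as it is not used as an input feature)
data = pd.DataFrame({
    "Year":          [2014, 2013, 2017, 2011, 2014],
    "Present_Price": [5.59, 9.54, 9.85, 4.15, 6.87],
    "Kms_Driven":    [27000, 43000, 6900, 5200, 42450],
    "Fuel_Type":     ["Petrol", "Diesel", "Petrol", "Petrol", "Diesel"],
    "Transmission":  ["Manual", "Manual", "Manual", "Automatic", "Manual"],
    "Owner":         [0, 0, 0, 0, 0],
    "Selling_Price": [3.35, 4.75, 7.25, 2.85, 4.60],
})

# Label-encode the text columns
for col in ["Fuel_Type", "Transmission"]:
    data[col] = LabelEncoder().fit_transform(data[col])

# Selling_Price is the target; everything else is an input feature
X = data.drop("Selling_Price", axis=1)
y = data["Selling_Price"]

# 80% training, 20% testing
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

print(X_train.shape, X_test.shape)
```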

Step 5: Build the models

Here we are going to build 7 different regression models and check which one performs best. We could add a few more models, and we could also hyper-tune them, but we are not going to do that here. The models we will use are Linear Regression, Decision Tree Regression, Random Forest Regression, Gradient Boosting Machine Regression, Light Gradient Boosting Machine Regression, Extreme Gradient Boosting (XGBoost) Regression, and CatBoost Regression.
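A sketch of the training loop, using the four models that ship with scikit-learn and synthetic stand-in training data; the remaining three models come from their own packages, noted in the comments:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for the X_train/y_train produced in the previous step
rng = np.random.default_rng(42)
X_train = rng.random((40, 6))
y_train = X_train @ np.array([0.5, 3.0, -1.0, 0.2, 0.8, 1.5]) + 2.0

models = {
    "Linear Regression": LinearRegression(),
    "Decision Tree":     DecisionTreeRegressor(random_state=42),
    "Random Forest":     RandomForestRegressor(n_estimators=100, random_state=42),
    "Gradient Boosting": GradientBoostingRegressor(random_state=42),
    # The remaining three models need their own packages:
    #   from lightgbm import LGBMRegressor
    #   from xgboost import XGBRegressor
    #   from catboost import CatBoostRegressor
}

scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = model.score(X_train, y_train)  # R2 on the training data
    print(f"{name}: R2 = {scores[name]:.3f}")
```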

Step 6: Evaluate Models

Now it's time to evaluate these models.

Various metrics can be used for evaluation; here we will use MAE, MSE, RMSE, R2 score, and MAPE.
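A sketch of the metric computations, on a toy pair of true/predicted vectors:

```python
import numpy as np
from sklearn.metrics import (mean_absolute_error,
                             mean_absolute_percentage_error,
                             mean_squared_error, r2_score)

# Toy true values and predictions standing in for y_test / model output
y_test = np.array([3.35, 4.75, 7.25, 2.85])
y_pred = np.array([3.10, 5.00, 7.00, 3.00])

mae = mean_absolute_error(y_test, y_pred)
mse = mean_squared_error(y_test, y_pred)
rmse = np.sqrt(mse)
r2 = r2_score(y_test, y_pred)
mape = mean_absolute_percentage_error(y_test, y_pred)

print(f"MAE={mae:.3f}  MSE={mse:.4f}  RMSE={rmse:.3f}  R2={r2:.3f}  MAPE={mape:.3f}")
```

In the notebook these metrics are computed in a loop over all 7 fitted models so the results can be compared side by side.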

This prints the evaluation metrics for each model.

Step 7: Prediction

Let us run predictions for the test data and for new input data and check the results. Based on the evaluation above, the CatBoost model scores comparatively well on MAE and R2, so let us use that model.
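A sketch of prediction on a new input row. Here we fit a RandomForestRegressor on stand-in rows only because it ships with scikit-learn; in the notebook the chosen model is CatBoost, and the call pattern is the same:

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

feature_cols = ["Year", "Present_Price", "Kms_Driven",
                "Fuel_Type", "Transmission", "Owner"]

# Tiny stand-in training set (text columns already label-encoded)
X_train = pd.DataFrame(
    [[2014, 5.59, 27000, 2, 1, 0],
     [2013, 9.54, 43000, 1, 1, 0],
     [2017, 9.85, 6900, 2, 1, 0],
     [2011, 4.15, 5200, 2, 0, 0],
     [2014, 6.87, 42450, 1, 1, 0]],
    columns=feature_cols)
y_train = [3.35, 4.75, 7.25, 2.85, 4.60]

model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Predict the selling price for a brand-new input
new_car = pd.DataFrame([[2015, 7.5, 30000, 2, 1, 0]], columns=feature_cols)
predicted_price = model.predict(new_car)[0]
print(f"Predicted selling price: {predicted_price:.2f} lakhs")
```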

Step 8: Save the model

Finally, save the model using the pickle package. Pickle lets you serialize your machine learning model and save the serialized format to a file.

Later you can load this file to deserialize your model and use it to make new predictions.
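The save/load round trip looks like this (with a trivial stand-in model trained on made-up numbers, so the snippet is self-contained):

```python
import pickle

import numpy as np
from sklearn.linear_model import LinearRegression

# Trivial stand-in for the trained regressor from the previous steps
X = np.arange(10).reshape(-1, 1)
y = 2 * X.ravel() + 1
model = LinearRegression().fit(X, y)

# Serialize the fitted model to disk
with open("car_price_predictor_model.pkl", "wb") as f:
    pickle.dump(model, f)

# Later (e.g. in the API) deserialize it and predict as usual
with open("car_price_predictor_model.pkl", "rb") as f:
    loaded_model = pickle.load(f)

print(loaded_model.predict([[12]]))  # same predictions as the original model
```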

At the end of this step, a car_price_predictor_model.pkl file will be created inside the model folder, which we will load in the backend REST API.

Develop RestAPI using FastAPI

Now let us develop the REST API using FastAPI, which will load the model created in the previous step.

FastAPI is a web framework for developing RESTful APIs in Python. It is based on Pydantic and type hints to validate, serialize, and deserialize data, and it automatically generates OpenAPI documents.

Let us create a new Python file named main.py under the api folder. We need a few packages installed into the activated environment:

pip install fastapi uvicorn pydantic

Step 1: Import the required packages

Uvicorn is an ASGI web server implementation for Python. Pydantic allows custom data types to be defined, and you can extend validation with methods on a model decorated with the validator decorator.

Step 2: Load the model

Step 3: Create the FastAPI instance

Here the app variable is an instance of the FastAPI class. This will be the main point of interaction for creating all of your API.

Step 4: Create the input schema using a Pydantic BaseModel

Pydantic models are structures that ingest data, parse it, and make sure it conforms to the field constraints defined on them.
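A sketch of the schema; the field names and label encodings here are our assumptions and must match whatever your UI sends:

```python
from pydantic import BaseModel

class CarFeatures(BaseModel):
    year: int
    present_price: float
    km_driven: float
    fuel_type: int      # label-encoded, e.g. 0 = CNG, 1 = Diesel, 2 = Petrol
    transmission: int   # label-encoded, e.g. 0 = Automatic, 1 = Manual
    owner: int

# Pydantic parses and validates the incoming values, coercing where possible
car = CarFeatures(year="2015", present_price=7.5, km_driven=30000,
                  fuel_type=2, transmission=1, owner=0)
print(car.year, car.present_price)  # the string "2015" was coerced to int
```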

Step 5: Create the routes

Here we will create a home route (/) which just returns a “Car Price Predictor” message, and a predict route (/predict).

Finally, we will run the uvicorn server.

That's it! This is the API file, which we can run using the following command:

python api\main.py

After running this you will see uvicorn's startup output in the terminal.

Now the uvicorn web server has started, and the Swagger UI can be accessed in a web browser at http://127.0.0.1:8000/docs

If we expand the /predict route, supply the input data, and click Execute, we will get the prediction.
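For example, a request body of the shape the /predict route expects (field names and label encodings assumed from our schema sketch; adjust to match your own):

```json
{
  "year": 2015,
  "present_price": 7.5,
  "km_driven": 30000,
  "fuel_type": 2,
  "transmission": 1,
  "owner": 0
}
```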

For example, here we have received a price of 8.23 lakhs.

Hooray!

Up to this point we have completed 2 steps: building the ML model and integrating the ML model into a REST API. But does any app exist without a UI? No, right? If it did, how would end users interact with it?

So let us move to the last step, where we will develop a simple UI using the streamlit package.

Develop Front-end UI using Streamlit

Streamlit is an open-source app framework in the Python language. It helps us create web apps for data science and machine learning in a short time. It is compatible with major Python libraries such as scikit-learn, Keras, PyTorch, SymPy (LaTeX), NumPy, pandas, Matplotlib, etc.

End users always interact with the UI. They will enter their car details and click the Predict button. When the button is clicked, the data entered into the various UI components is collected and passed to the API. To call the API we will use the requests module, and for data parsing we will use the json module. If these modules are not installed, install them with the following command:

pip install streamlit requests

Now create a new file under the ui folder named car_price.py.

Step 1: Import the packages

The requests module allows you to send HTTP requests using Python. The HTTP request returns a Response object with all the response data.

Step 2: Create a dedicated function

Set a title for the app and use number_input components to accept year, km_driven, and present_price. Use select boxes for fuel_type, transmission, and owner.

Now create a dictionary of all the values that we have to pass to the API.

Now call the API with the requests module, using the POST method, the URL we got in the previous step, and the input data.

Finally, call the main function.

Step 3: Run Streamlit App

Save the file, open a new terminal, and run it using the following command. Also ensure your FastAPI server is running in another terminal.

streamlit run ui\car_price.py

Once you run this command, you will notice that Streamlit has started and printed a local URL.

Now go to the browser, give it some input, and test the entire app.

This was a basic end-to-end ML app with an API and a UI. In upcoming tutorials I will come up with a few more interesting end-to-end ML projects.

For the complete code and dataset, please click on this link. If you liked it, star me on GitHub.

That's it, guys… Thank you for reading this tutorial! If you liked it, follow me on Medium, YouTube, and LinkedIn.

I know this tutorial was very long, but I hope you have understood the entire process of “How to build a complete end-to-end ML model, backend API using FastAPI and front-end UI using Streamlit”.

Thank you all…!!


Balwant Gorad

Sr. Lead AI/ML at Tata Communications | Machine Learning | Artificial Intelligence | Data Science | NLP