How To E8Lattice | ML Engineering and ML Ops for Fun and Science — Part 1
E8Lattice — ML Ops for Serious Data Scientists and Machine Learning Engineers
Background
Jupyter Notebooks have long been the tool of choice for developing and training machine learning models. However, the deployment of those models has received far less attention.
Elements of ML Systems
Machine Learning platforms comprise many elements, including data collection, data verification, configuration, feature engineering, model analysis, resource planning, process planning, metadata management, testing, debugging, automation, and maintenance and monitoring of serving infrastructure.
Machine Learning Engineering is the discipline in which machine learning engineers build and deploy models themselves, without necessarily relying on DevOps engineers. Platforms like HuggingFace expect ML engineers to deploy models directly rather than handing them off to a DevOps team.
Best Practice MLE Development Tools
VSCode is the suggested IDE for Machine Learning Engineers. We need to install the Jupyter VSCode extension for the steps in this tutorial.
In this series of tutorials, we shall deploy a machine learning model as an API called E8Lattice, so that it can be integrated into a mobile or web application.
E8Lattice is a small language model that can be used to generate names of startups, names of products or names of anything really.
Tech Stack
Our tech stack shall consist of Python, Conda, Pip, Jupyter, VSCode, FastAPI and Docker
Setup
We first need to set up our local environment with the dependencies for our tech stack, which include Jupyter Notebook and Conda.
For M1 Macs (which are ARM64-architecture Macs), we need to install the miniforge distribution of Conda using Homebrew, as below —
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
brew install miniforge
After installation is complete, you're almost ready to use the conda command. You first need to source the changes to your .bashrc or .zshrc. Run the appropriate command below, restart your terminal, and you're done.
conda init bash
OR
conda init zsh
Open a new terminal to load the updated shell profile into your environment. Then, create a conda environment using the following command
conda create --name e8lattice python=3.8 pip
This should give us output and a prompt like the below
==> conda create --name e8lattice python=3.8 pip
Collecting package metadata (current_repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /opt/homebrew/Caskroom/miniforge/base/envs/e8lattice
added / updated specs:
- pip
- python=3.8
The following packages will be downloaded:
package | build
---------------------------|-----------------
python-3.8.15 |h3ba56d0_1_cpython 11.2 MB conda-forge
setuptools-66.1.1 | pyhd8ed1ab_0 630 KB conda-forge
------------------------------------------------------------
Total: 11.8 MB
The following NEW packages will be INSTALLED:
bzip2 conda-forge/osx-arm64::bzip2-1.0.8-h3422bc3_4
ca-certificates conda-forge/osx-arm64::ca-certificates-2022.12.7-h4653dfc_0
libffi conda-forge/osx-arm64::libffi-3.4.2-h3422bc3_5
libsqlite conda-forge/osx-arm64::libsqlite-3.40.0-h76d750c_0
libzlib conda-forge/osx-arm64::libzlib-1.2.13-h03a7124_4
ncurses conda-forge/osx-arm64::ncurses-6.3-h07bb92c_1
openssl conda-forge/osx-arm64::openssl-3.0.7-h03a7124_2
pip conda-forge/noarch::pip-22.3.1-pyhd8ed1ab_0
python conda-forge/osx-arm64::python-3.8.15-h3ba56d0_1_cpython
readline conda-forge/osx-arm64::readline-8.1.2-h46ed386_0
setuptools conda-forge/noarch::setuptools-66.1.1-pyhd8ed1ab_0
tk conda-forge/osx-arm64::tk-8.6.12-he1e0b03_0
wheel conda-forge/noarch::wheel-0.38.4-pyhd8ed1ab_0
xz conda-forge/osx-arm64::xz-5.2.6-h57fd34a_0
Proceed ([y]/n)?
After the packages have downloaded and extracted, one may activate the conda environment as below
conda activate e8lattice
One may also export the yml file of the conda environment with the below command
conda env export > environment.yml
The yaml file should look as below
name: e8lattice
channels:
- conda-forge
dependencies:
- bzip2=1.0.8=h3422bc3_4
- ca-certificates=2022.12.7=h4653dfc_0
- libffi=3.4.2=h3422bc3_5
- libsqlite=3.40.0=h76d750c_0
- libzlib=1.2.13=h03a7124_4
- ncurses=6.3=h07bb92c_1
- openssl=3.0.7=h03a7124_2
- pip=22.3.1=pyhd8ed1ab_0
- python=3.8.15=h3ba56d0_1_cpython
- readline=8.1.2=h46ed386_0
- setuptools=66.1.1=pyhd8ed1ab_0
- tk=8.6.12=he1e0b03_0
- wheel=0.38.4=pyhd8ed1ab_0
- xz=5.2.6=h57fd34a_0
prefix: /opt/homebrew/Caskroom/miniforge/base/envs/e8lattice
Press Cmd-Shift-P to open the Command Palette in VSCode and select the Python: Select Interpreter menu item
Then, select the Conda -> e8lattice interpreter from the list
Run the below command to install jupyter and notebook
conda install jupyter notebook
Create a test ipynb called test.ipynb as follows
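The notebook's contents were shown as a screenshot in the original article, which is not reproduced here; a minimal sanity-check cell along these lines would confirm the environment is wired up correctly:

```python
# Sanity-check cell for test.ipynb: confirm the notebook is running on
# the Python 3.8 interpreter from the e8lattice conda environment.
import sys

print(sys.executable)        # path should contain "envs/e8lattice"
print(sys.version_info[:2])  # should be (3, 8) for this environment
```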
Make sure to select the e8lattice kernel at the top right
If one looks in the app folder of the repo below, one can find a main.py for the FastAPI endpoints as well as a pickle file of the pre-trained model, wine.pkl
The wine dataset has 178 samples, 13 dimensions of features and 3 classes
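These numbers can be verified directly against scikit-learn's built-in copy of the dataset:

```python
# Load scikit-learn's built-in wine dataset and confirm its dimensions.
from sklearn.datasets import load_wine

data = load_wine()
print(data.data.shape)        # (178, 13): 178 samples, 13 features
print(len(set(data.target)))  # 3 classes
```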
We deploy the FastAPI webserver as follows. First, install the dependencies with pip
pip install numpy
pip install -U scikit-learn scipy matplotlib
pip install "fastapi[all]"
Run the webserver as follows
uvicorn main:app --reload
We can now open localhost:8000/docs in our browser and see our API docs as below
We may now provide the below sample data to the FastAPI endpoint
{
  "alcohol": 12.6,
  "malic_acid": 1.34,
  "ash": 1.9,
  "alcalinity_of_ash": 18.5,
  "magnesium": 88.0,
  "total_phenols": 1.45,
  "flavanoids": 1.36,
  "nonflavanoid_phenols": 0.29,
  "proanthocyanins": 1.35,
  "color_intensity": 2.45,
  "hue": 1.04,
  "od280_od315_of_diluted_wines": 2.77,
  "proline": 562.0
}
We see that it makes the correct prediction, as seen below
In the next part, we shall continue to train and deploy ML models to our E8Lattice endpoint, so stay tuned.