Meta Ensemble Self-Learning Model with Optimization

Ajay Arunachalam
May 6

Meta-Self-Ensemble Learner Package (pip install meta-self-learner) — https://github.com/ajayarunachalam/meta-self-learner

Hello, friends. In this blog post, I present a meta-learner ensemble design. The meta-ensemble learning model aims to fit complex data better, lowering the uncertainty in estimation. Its two self-learner algorithms aim to find the optimal weights that minimize the objective function.

This design is integrated as part of the package “meta-self-learner”. More details, along with all the code from this blog post, can be found on the GitHub page.

USP of this package:-

“Meta-Self-Learner” provides several ensemble-learner functionalities for quick predictive-modeling prototyping. Generally, predictions become unreliable when the input sample is outside the training distribution, biased towards the data distribution, error-prone to noise, and so on. Current approaches by and large require changes to the network architecture, model fine-tuning, balanced data, an increased model size, etc. Moreover, the selection of algorithms plays a vital role, while scalability and learning ability decrease with complex datasets. In this package, I have developed an ensemble framework for minimizing the generalization error of the learning algorithm irrespective of the data distribution, number of classes, choice of algorithms, number of models, complexity of the datasets, and so on. In summary, with this framework one can infer better and generalize well. Another key takeaway of the package is the intuitive pipeline that complements building models in a more stable fashion while minimizing under-fitting/over-fitting, which is critical to the overall outcome.

Quickly set up the package:-

Download the automation script (setup.sh) from the GitHub repository.

Run the following command in the terminal:

sudo bash setup.sh

Installation:-

Using pip:

pip install meta-self-learner

Using notebook:

!pip install meta-self-learner

Meta Self Learner Workflow:-

The designed framework pipeline workflow is as given in the figure.

Meta-Self-Learner Ensemble Pipeline

The first layer comprises several individual classifiers. We have used the following base classifiers: Logistic Regression (LR), K-Nearest Neighbors (KNN), Random Forest (RF), Support Vector Machine (SVM), Extra Trees Classifier (ETC), and Gradient Boosting Machine (GBM). The two self-learners (i.e., Ensemble-1 and Ensemble-2) aim to find the optimal coefficients that minimize the objective function, i.e., the log-loss function. Given the set of predictions obtained in the previous layer, the two meta-learners define two different linear problems and optimize the objective function to find the optimal coefficients that lower the loss.
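To make the meta-learner idea concrete, here is a minimal sketch (not the package's internal code) of finding convex combination weights over base-model class probabilities that minimize the multiclass log-loss, using `scipy.optimize.minimize`. The data here is synthetic; the simplex constraint and SLSQP solver are my assumptions for illustration.

```python
# Hypothetical sketch: optimize blending weights over base-model class
# probabilities to minimize log-loss, subject to weights on the simplex.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_samples, n_classes, n_models = 200, 4, 3
y = rng.integers(0, n_classes, size=n_samples)

# Stand-ins for per-model class-probability predictions from Layer-1.
preds = rng.dirichlet(np.ones(n_classes), size=(n_models, n_samples))

def log_loss(weights):
    # Weighted average of the base predictions, then multiclass log-loss.
    blend = np.tensordot(weights, preds, axes=1)   # (n_samples, n_classes)
    blend = np.clip(blend, 1e-15, 1 - 1e-15)
    return -np.mean(np.log(blend[np.arange(n_samples), y]))

# Weights constrained to be non-negative and sum to one.
cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
res = minimize(log_loss, x0=np.full(n_models, 1.0 / n_models),
               bounds=[(0.0, 1.0)] * n_models, constraints=cons,
               method='SLSQP')
optimal_weights = res.x
```

The optimizer starts from uniform weights, so the resulting blend can only match or improve on a plain average of the base predictions.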

The pre-processed data is input to Layer-1 of the model. ‘T’ and ‘P’ represent training data and predictions, respectively. In Layer-1, many standalone base learners are used. The input to Layer-2 is the set of predictions from Layer-1.

Two meta-self-learner ensemble schemes are used. Layer-3 combines the Layer-2 predictions as a simple weighted average (WA). Model evaluation and result interpretation are done in the final stage of the pipeline.

The details of the meta-self-learning architecture are as follows:-

LAYER-1:-

Six classifiers are used (LR, SVM, RF, ETC, GBM, and KNN). Here, one can use any machine learning algorithms of their choice and build any number of models. All the classifiers are applied twice: 1) The classifiers are trained on (X_train, y_train) and used to predict the class probabilities of X_valid. 2) The classifiers are trained on (X = (X_train + X_valid), y = (y_train + y_valid)) and used to predict the class probabilities of X_test.
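A rough sketch of this two-pass scheme, using two of the listed classifiers on the digits data (illustrative only, not the package's internal code; the split proportions are my assumption):

```python
# Each base classifier is fit twice: first on the training split to score
# X_valid, then on train + valid to score X_test.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

X, y = load_digits(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, test_size=0.4, random_state=0)
X_valid, X_test, y_valid, y_test = train_test_split(
    X_rest, y_rest, test_size=0.5, random_state=0)

base_models = [LogisticRegression(max_iter=5000),
               RandomForestClassifier(random_state=0)]
valid_preds, test_preds = [], []
for model in base_models:
    # Pass 1: train on X_train, predict class probabilities for X_valid.
    model.fit(X_train, y_train)
    valid_preds.append(model.predict_proba(X_valid))
    # Pass 2: retrain on train + valid, predict probabilities for X_test.
    model.fit(np.vstack([X_train, X_valid]),
              np.concatenate([y_train, y_valid]))
    test_preds.append(model.predict_proba(X_test))
```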

LAYER-2:-

The predictions from the previous layer on X_valid are concatenated and used to create a new training set (XV, y_valid). The predictions on X_test are concatenated to create a new test set (XT, y_test). The two proposed ensemble methods and their calibrated versions are trained on (XV, y_valid) and used to predict the class probabilities of XT.
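The stacking step above can be sketched as follows (a minimal illustration with synthetic Layer-1 outputs; the use of a logistic-regression meta-learner here is my assumption, standing in for the package's two ensemble methods):

```python
# Layer-2 sketch: concatenate per-model class probabilities into new
# feature matrices XV/XT, then fit a meta-learner on (XV, y_valid).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_valid, n_test, n_classes, n_models = 120, 60, 4, 3
y_valid = rng.integers(0, n_classes, size=n_valid)

# Stand-ins for the per-model Layer-1 probability predictions.
valid_preds = [rng.dirichlet(np.ones(n_classes), size=n_valid)
               for _ in range(n_models)]
test_preds = [rng.dirichlet(np.ones(n_classes), size=n_test)
              for _ in range(n_models)]

XV = np.hstack(valid_preds)   # (n_valid, n_models * n_classes)
XT = np.hstack(test_preds)    # (n_test, n_models * n_classes)

meta = LogisticRegression(max_iter=1000).fit(XV, y_valid)
layer2_test_proba = meta.predict_proba(XT)
```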

LAYER-3:-

The predictions from Layer-2 are then linearly combined using a weighted average.
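The weighted-average combination amounts to a few lines of NumPy; in this sketch the two prediction matrices and the weights are illustrative values, not outputs of the package:

```python
# Layer-3 sketch: weighted average of the two Layer-2 ensemble predictions.
import numpy as np

# Stand-ins for predictions from the two Layer-2 meta-learners.
ens1 = np.array([[0.7, 0.2, 0.1],
                 [0.1, 0.6, 0.3]])
ens2 = np.array([[0.5, 0.3, 0.2],
                 [0.2, 0.5, 0.3]])

w1, w2 = 0.6, 0.4               # weights sum to 1
final = w1 * ens1 + w2 * ens2   # each row still sums to 1
```

Because the weights sum to one and each input row is a probability distribution, the blended rows remain valid probability distributions.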

In this way, a hybrid architecture is designed and deployed, where the predictions of the standalone classifiers are combined by the meta-self-learner methods, thereby reducing the risk of under-fitting/over-fitting.

Let’s get some hands-on with an example.

Getting started

1. Import the required libraries

2. Load your dataset

3. Split the data into training, validation & test sets
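One way to produce the three-way split this step calls for is two calls to scikit-learn's `train_test_split`; the 60/20/20 proportions and stratification are my assumptions, not mandated by the package:

```python
# Split digits into ~60% train, ~20% validation, ~20% test.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, test_size=0.4, stratify=y, random_state=42)
X_valid, X_test, y_valid, y_test = train_test_split(
    X_rest, y_rest, test_size=0.5, stratify=y_rest, random_state=42)
```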

4. Set the class configuration of the meta self-ensembler. Here, we have taken four classes from the digits dataset as a quick example.
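Restricting the digits data to four classes can be done with a boolean mask; picking digits 0–3 here is my assumption for illustration:

```python
# Keep only the samples belonging to four of the ten digit classes.
import numpy as np
from sklearn.datasets import load_digits

X, y = load_digits(return_X_y=True)
keep = np.isin(y, [0, 1, 2, 3])
X4, y4 = X[keep], y[keep]
n_classes = len(np.unique(y4))
```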

5. Building the meta-self-learner architecture layer-by-layer in the pipeline

a. Create First Layer

b. Create Second Layer

c. Create third/final layer

6. Performance evaluation & creating graph plots for log loss metrics
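The log-loss metric reported in this step is the same quantity computed by `sklearn.metrics.log_loss`; here is a toy example with made-up predicted probabilities:

```python
# Multiclass log-loss on toy predictions: lower is better, 0 is perfect.
from sklearn.metrics import log_loss

y_true = [0, 1, 2, 1]
proba = [[0.8, 0.1, 0.1],
         [0.1, 0.7, 0.2],
         [0.2, 0.2, 0.6],
         [0.3, 0.5, 0.2]]
score = log_loss(y_true, proba)
```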

Log-loss evaluation metrics for different ensemble algorithms with meta-self-learner model

7. Plot ROC, Confusion Matrix, and displaying classification report
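The confusion matrix and classification report in this step come straight from `sklearn.metrics`; a small sketch with illustrative labels:

```python
# Evaluation sketch: confusion matrix and per-class precision/recall report.
from sklearn.metrics import confusion_matrix, classification_report

y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]

cm = confusion_matrix(y_true, y_pred)     # rows: true class, cols: predicted
report = classification_report(y_true, y_pred)
print(cm)
print(report)
```

The diagonal of the confusion matrix counts the correct predictions per class, which the classification report summarizes as precision, recall, and F1.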

ROC, Confusion Matrix, and classification report

Important Links

Complete Demo Notebook:-

https://github.com/ajayarunachalam/meta-self-learner/blob/main/Full_Demo_Tested.ipynb

Terminal Launch:

Download the file & run the following command in the terminal:

python tested_example.py

CONNECT

You can reach me at ajay.arunachalam08@gmail.com

Thanks for reading. Hope this article will be useful :)

