TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.


LightGBM: The Fastest Option of Gradient Boosting

Learn how to implement a fast and effective Gradient Boosting model using Python

Gustavo R Santos

Published in TDS Archive

7 min read · Jan 12, 2025


LightGBM is a faster option | Image generated by AI. Meta Llama, 2025. https://meta.ai

Introduction

When we talk about Gradient Boosting Models (GBMs), Kaggle often comes to mind. The algorithm is very powerful, offering many tuning parameters that can lead to very high accuracy metrics, which has helped many people win competitions on that platform.

However, we are here to talk about real life. Or at least an implementation that we can apply to problems faced by companies.

Gradient Boosting is an algorithm that creates many models in sequence: each new model is trained on the errors of the previous iteration, scaled by a learning rate set by the data scientist, until the ensemble reaches a plateau and can no longer improve the evaluation metric.

The Gradient Boosting algorithm creates sequential models, each trying to decrease the previous iteration's error.
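The sequential error-correction idea described above can be sketched in a few lines. This is a hand-rolled illustration (not the article's code), assuming scikit-learn's `DecisionTreeRegressor` as the base learner and a synthetic dataset:

```python
# Minimal sketch of the boosting idea: each new tree fits the residuals
# (errors) of the ensemble built so far, scaled by a learning rate.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
prediction = np.full_like(y, y.mean())  # start from a constant model
trees = []

for _ in range(100):
    residuals = y - prediction                      # error of the previous iteration
    tree = DecisionTreeRegressor(max_depth=3).fit(X, residuals)
    prediction += learning_rate * tree.predict(X)   # sequential update
    trees.append(tree)

mse = np.mean((y - prediction) ** 2)
print(f"training MSE after boosting: {mse:.4f}")
```

Note how each iteration depends on `prediction` from the previous one, which is exactly the sequential dependency discussed next.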

The downside of GBMs is also what makes them so effective: the sequential construction.

If each new iteration is in sequence, the algorithm must wait for the completion of one iteration before it can start another, increasing…




Written by Gustavo R Santos

Data Scientist | I solve business challenges through the power of data. | Visit my site: https://gustavorsantos.me
