XGBoost in Python, Part 2

pritesh

6. XGBoost Feature Importance

XGBoost provides an easy way to visualize feature importance, which helps identify the features that contribute most to the model's predictions.

Example: Feature Importance Plot

import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
import matplotlib.pyplot as plt
# Load the Breast Cancer dataset
data = load_breast_cancer()
X = data.data
y = data.target
# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Initialize the XGBClassifier
model = xgb.XGBClassifier(objective='binary:logistic', max_depth=3, learning_rate=0.1, n_estimators=100)
# Fit the model
model.fit(X_train, y_train)
# Plot the feature importance
xgb.plot_importance(model)
plt.show()

7. Early Stopping with XGBoost

Early stopping prevents overfitting by halting training once the model's performance on a validation set stops improving. XGBoost supports it via the early_stopping_rounds parameter (passed to the estimator constructor in recent versions of the scikit-learn API; older releases accepted it in fit()).

Example: Early Stopping in XGBoost

import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.datasets…
