Gradient Boosting Machines


Gradient Boosting Machines are an ensemble learning method that combines many weak learners, typically shallow decision trees, into a single strong predictive model for classification (and regression) tasks. The algorithm works iteratively: each new tree is fit to the residual errors of the ensemble so far, i.e., to the negative gradient of the loss function, so that every iteration corrects the mistakes of the previous ones and progressively "boosts" the model's performance. Popular implementations include XGBoost, LightGBM, CatBoost, and scikit-learn's original GradientBoostingClassifier, each offering variations in regularization, tree growth strategies, and handling of categorical features.
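The iterative process described above can be sketched with scikit-learn's GradientBoostingClassifier, one of the implementations mentioned. This is a minimal illustration, not a tuned model; the synthetic dataset and hyperparameter values are arbitrary choices for the example.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (values chosen for illustration).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Trees are added one at a time; each new shallow tree is fit to the
# errors (negative gradient of the loss) left by the ensemble so far.
model = GradientBoostingClassifier(
    n_estimators=100,   # number of boosting iterations (trees)
    learning_rate=0.1,  # shrinks each tree's contribution
    max_depth=3,        # weak learners: shallow trees
    random_state=0,
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```

Lowering `learning_rate` while raising `n_estimators` typically trades training time for better generalization, which is the core regularization knob shared by all the implementations listed above.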

Visit the following resources to learn more: